Dec 02 22:42:16 crc systemd[1]: Starting Kubernetes Kubelet...
Dec 02 22:42:16 crc restorecon[4690]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 02 22:42:16 crc restorecon[4690]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Dec 02 22:42:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc 
restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 02 22:42:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 02 22:42:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 02 22:42:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 02 22:42:16 crc 
restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 02 
22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 02 22:42:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 02 22:42:16 crc 
restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 02 22:42:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 02 22:42:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 02 22:42:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 02 22:42:16 crc 
restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 02 22:42:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16
crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Dec 02 22:42:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 22:42:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 22:42:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 22:42:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 
22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 22:42:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 22:42:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 02 22:42:16 crc 
restorecon[4690]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc 
restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:42:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 02 22:42:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 02 22:42:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 02 22:42:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 02 22:42:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 02 22:42:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 02 22:42:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 
02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 
crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc 
restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:42:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc 
restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc 
restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc 
restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:17 crc restorecon[4690]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:17 crc restorecon[4690]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:17 crc restorecon[4690]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:17 crc restorecon[4690]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:17 crc restorecon[4690]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:17 crc restorecon[4690]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:17 crc restorecon[4690]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:17 crc restorecon[4690]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:17 crc restorecon[4690]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:17 crc restorecon[4690]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:17 crc restorecon[4690]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 22:42:17 crc restorecon[4690]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 22:42:17 crc 
restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 22:42:17 crc restorecon[4690]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 02 22:42:17 crc restorecon[4690]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 02 22:42:17 crc restorecon[4690]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Dec 02 22:42:17 crc kubenswrapper[4696]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 02 22:42:17 crc kubenswrapper[4696]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Dec 02 22:42:17 crc kubenswrapper[4696]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 02 22:42:17 crc kubenswrapper[4696]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Dec 02 22:42:17 crc kubenswrapper[4696]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Dec 02 22:42:17 crc kubenswrapper[4696]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.246680 4696 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.252143 4696 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.252174 4696 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.252181 4696 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.252187 4696 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.252192 4696 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.252200 4696 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.252205 4696 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.252211 4696 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.252218 4696 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 02 22:42:17 crc kubenswrapper[4696]: 
W1202 22:42:17.252224 4696 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.252231 4696 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.252240 4696 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.252247 4696 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.252253 4696 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.252258 4696 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.252263 4696 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.252269 4696 feature_gate.go:330] unrecognized feature gate: OVNObservability
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.252274 4696 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.252279 4696 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.252284 4696 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.252289 4696 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.252296 4696 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.252301 4696 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.252305 4696 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.252310 4696 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.252315 4696 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.252320 4696 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.252326 4696 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.252331 4696 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.252336 4696 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.252341 4696 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.252357 4696 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.252363 4696 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.252369 4696 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.252376 4696 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.252383 4696 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.252390 4696 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.252396 4696 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.252402 4696 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.252407 4696 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.252412 4696 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.252417 4696 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.252424 4696 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.252429 4696 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.252434 4696 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.252440 4696 feature_gate.go:330] unrecognized feature gate: PinnedImages
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.252445 4696 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.252450 4696 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.252455 4696 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.252460 4696 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.252465 4696 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.252471 4696 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.252476 4696 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.252481 4696 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.252488 4696 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.252494 4696 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.252500 4696 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.252506 4696 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.252512 4696 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.252518 4696 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.252523 4696 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.252528 4696 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.252533 4696 feature_gate.go:330] unrecognized feature gate: Example
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.252540 4696 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.252547 4696 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.252553 4696 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.252558 4696 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.252562 4696 feature_gate.go:330] unrecognized feature gate: NewOLM
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.252568 4696 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.252572 4696 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.252578 4696 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253116 4696 flags.go:64] FLAG: --address="0.0.0.0"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253165 4696 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253176 4696 flags.go:64] FLAG: --anonymous-auth="true"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253186 4696 flags.go:64] FLAG: --application-metrics-count-limit="100"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253195 4696 flags.go:64] FLAG: --authentication-token-webhook="false"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253201 4696 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253210 4696 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253218 4696 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253225 4696 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253231 4696 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253238 4696 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253245 4696 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253251 4696 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253257 4696 flags.go:64] FLAG: --cgroup-root=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253263 4696 flags.go:64] FLAG: --cgroups-per-qos="true"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253268 4696 flags.go:64] FLAG: --client-ca-file=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253274 4696 flags.go:64] FLAG: --cloud-config=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253279 4696 flags.go:64] FLAG: --cloud-provider=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253285 4696 flags.go:64] FLAG: --cluster-dns="[]"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253294 4696 flags.go:64] FLAG: --cluster-domain=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253299 4696 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253305 4696 flags.go:64] FLAG: --config-dir=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253311 4696 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253317 4696 flags.go:64] FLAG: --container-log-max-files="5"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253325 4696 flags.go:64] FLAG: --container-log-max-size="10Mi"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253331 4696 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253338 4696 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253344 4696 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253351 4696 flags.go:64] FLAG: --contention-profiling="false"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253356 4696 flags.go:64] FLAG: --cpu-cfs-quota="true"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253362 4696 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253371 4696 flags.go:64] FLAG: --cpu-manager-policy="none"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253377 4696 flags.go:64] FLAG: --cpu-manager-policy-options=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253384 4696 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253399 4696 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253406 4696 flags.go:64] FLAG: --enable-debugging-handlers="true"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253412 4696 flags.go:64] FLAG: --enable-load-reader="false"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253418 4696 flags.go:64] FLAG: --enable-server="true"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253424 4696 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253432 4696 flags.go:64] FLAG: --event-burst="100"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253438 4696 flags.go:64] FLAG: --event-qps="50"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253444 4696 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253450 4696 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253457 4696 flags.go:64] FLAG: --eviction-hard=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253466 4696 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253472 4696 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253478 4696 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253484 4696 flags.go:64] FLAG: --eviction-soft=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253490 4696 flags.go:64] FLAG: --eviction-soft-grace-period=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253495 4696 flags.go:64] FLAG: --exit-on-lock-contention="false"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253501 4696 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253506 4696 flags.go:64] FLAG: --experimental-mounter-path=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253512 4696 flags.go:64] FLAG: --fail-cgroupv1="false"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253518 4696 flags.go:64] FLAG: --fail-swap-on="true"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253524 4696 flags.go:64] FLAG: --feature-gates=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253531 4696 flags.go:64] FLAG: --file-check-frequency="20s"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253537 4696 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253543 4696 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253549 4696 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253555 4696 flags.go:64] FLAG: --healthz-port="10248"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253561 4696 flags.go:64] FLAG: --help="false"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253567 4696 flags.go:64] FLAG: --hostname-override=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253572 4696 flags.go:64] FLAG: --housekeeping-interval="10s"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253579 4696 flags.go:64] FLAG: --http-check-frequency="20s"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253585 4696 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253591 4696 flags.go:64] FLAG: --image-credential-provider-config=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253597 4696 flags.go:64] FLAG: --image-gc-high-threshold="85"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253603 4696 flags.go:64] FLAG: --image-gc-low-threshold="80"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253609 4696 flags.go:64] FLAG: --image-service-endpoint=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253615 4696 flags.go:64] FLAG: --kernel-memcg-notification="false"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253621 4696 flags.go:64] FLAG: --kube-api-burst="100"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253627 4696 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253633 4696 flags.go:64] FLAG: --kube-api-qps="50"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253639 4696 flags.go:64] FLAG: --kube-reserved=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253645 4696 flags.go:64] FLAG: --kube-reserved-cgroup=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253650 4696 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253656 4696 flags.go:64] FLAG: --kubelet-cgroups=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253662 4696 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253667 4696 flags.go:64] FLAG: --lock-file=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253674 4696 flags.go:64] FLAG: --log-cadvisor-usage="false"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253680 4696 flags.go:64] FLAG: --log-flush-frequency="5s"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253686 4696 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253695 4696 flags.go:64] FLAG: --log-json-split-stream="false"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253701 4696 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253707 4696 flags.go:64] FLAG: --log-text-split-stream="false"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253713 4696 flags.go:64] FLAG: --logging-format="text"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253719 4696 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253725 4696 flags.go:64] FLAG: --make-iptables-util-chains="true"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253731 4696 flags.go:64] FLAG: --manifest-url=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253759 4696 flags.go:64] FLAG: --manifest-url-header=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253768 4696 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253775 4696 flags.go:64] FLAG: --max-open-files="1000000"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253782 4696 flags.go:64] FLAG: --max-pods="110"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253788 4696 flags.go:64] FLAG: --maximum-dead-containers="-1"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253795 4696 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253803 4696 flags.go:64] FLAG: --memory-manager-policy="None"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253811 4696 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253819 4696 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253826 4696 flags.go:64] FLAG: --node-ip="192.168.126.11"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253833 4696 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253851 4696 flags.go:64] FLAG: --node-status-max-images="50"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253858 4696 flags.go:64] FLAG: --node-status-update-frequency="10s"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253865 4696 flags.go:64] FLAG: --oom-score-adj="-999"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253872 4696 flags.go:64] FLAG: --pod-cidr=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253878 4696 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253891 4696 flags.go:64] FLAG: --pod-manifest-path=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253897 4696 flags.go:64] FLAG: --pod-max-pids="-1"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253904 4696 flags.go:64] FLAG: --pods-per-core="0"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253910 4696 flags.go:64] FLAG: --port="10250"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253916 4696 flags.go:64] FLAG: --protect-kernel-defaults="false"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253922 4696 flags.go:64] FLAG: --provider-id=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253928 4696 flags.go:64] FLAG: --qos-reserved=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253934 4696 flags.go:64] FLAG: --read-only-port="10255"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253940 4696 flags.go:64] FLAG: --register-node="true"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253946 4696 flags.go:64] FLAG: --register-schedulable="true"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253953 4696 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253965 4696 flags.go:64] FLAG: --registry-burst="10"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253971 4696 flags.go:64] FLAG: --registry-qps="5"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253976 4696 flags.go:64] FLAG: --reserved-cpus=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253982 4696 flags.go:64] FLAG: --reserved-memory=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.253997 4696 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.254003 4696 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.254009 4696 flags.go:64] FLAG: --rotate-certificates="false"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.254015 4696 flags.go:64] FLAG: --rotate-server-certificates="false"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.254020 4696 flags.go:64] FLAG: --runonce="false"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.254026 4696 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.254032 4696 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.254040 4696 flags.go:64] FLAG: --seccomp-default="false"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.254047 4696 flags.go:64] FLAG: --serialize-image-pulls="true"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.254053 4696 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.254059 4696 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.254065 4696 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.254072 4696 flags.go:64] FLAG: --storage-driver-password="root"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.254078 4696 flags.go:64] FLAG: --storage-driver-secure="false"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.254084 4696 flags.go:64] FLAG: --storage-driver-table="stats"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.254091 4696 flags.go:64] FLAG: --storage-driver-user="root"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.254096 4696 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.254102 4696 flags.go:64] FLAG: --sync-frequency="1m0s"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.254108 4696 flags.go:64] FLAG: --system-cgroups=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.254114 4696 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.254123 4696 flags.go:64] FLAG: --system-reserved-cgroup=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.254129 4696 flags.go:64] FLAG: --tls-cert-file=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.254135 4696 flags.go:64] FLAG: --tls-cipher-suites="[]"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.254142 4696 flags.go:64] FLAG: --tls-min-version=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.254148 4696 flags.go:64] FLAG: --tls-private-key-file=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.254154 4696 flags.go:64] FLAG: --topology-manager-policy="none"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.254160 4696 flags.go:64] FLAG: --topology-manager-policy-options=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.254166 4696 flags.go:64] FLAG: --topology-manager-scope="container"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.254172 4696 flags.go:64] FLAG: --v="2"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.254180 4696 flags.go:64] FLAG: --version="false"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.254189 4696 flags.go:64] FLAG: --vmodule=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.254201 4696 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.254208 4696 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.254360 4696 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.254367 4696 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.254373 4696 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.254378 4696 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.254383 4696 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.254390 4696 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.254396 4696 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.254401 4696 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.254406 4696 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.254411 4696 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.254416 4696 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.254421 4696 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.254425 4696 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.254430 4696 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.254435 4696 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.254440 4696 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.254445 4696 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.254450 4696 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.254457 4696 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.254463 4696 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.254468 4696 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.254474 4696 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.254479 4696 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.254484 4696 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.254489 4696 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.254493 4696 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.254498 4696 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.254503 4696 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.254508 4696 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.254512 4696 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.254517 4696 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.254523 4696 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.254529 4696 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.254534 4696 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.254540 4696 feature_gate.go:330] unrecognized feature gate: PinnedImages
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.254545 4696 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.254551 4696 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.254560 4696 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.254565 4696 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.254571 4696 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.254576 4696 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.254581 4696 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.254588 4696 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.254594 4696 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.254599 4696 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.254604 4696 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.254610 4696 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.254616 4696 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.254621 4696 feature_gate.go:330] unrecognized feature gate: Example
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.254627 4696 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.254631 4696 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.254636 4696 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.254642 4696 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.254647 4696 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.254652 4696 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.254658 4696 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.254662 4696 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.254668 4696 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.254672 4696 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.254677 4696 feature_gate.go:330] unrecognized feature gate: SignatureStores
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.254682 4696 feature_gate.go:330] unrecognized feature gate: OVNObservability
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.254687 4696 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.254692 4696 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.254697 4696 feature_gate.go:330] unrecognized feature gate: NewOLM
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.254702 4696 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.254708 4696 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.254714 4696 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.254719 4696 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.254724 4696 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.254732 4696 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.254763 4696 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.254782 4696 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.268467 4696 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.268511 4696 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.268713 4696 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.268735 4696 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.268780 4696 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.268793 4696 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.268804 4696 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.268816 4696 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.268830 4696 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.268841 4696 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.268853 4696 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.268864 4696 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.268875 4696 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.268886 4696 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.268896 4696 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.268906 4696 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.268915 4696 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.268926 4696 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.268936 4696 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.268947 4696 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.268958
4696 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.268970 4696 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.268981 4696 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.268993 4696 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.269005 4696 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.269015 4696 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.269026 4696 feature_gate.go:330] unrecognized feature gate: Example Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.269041 4696 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.269055 4696 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.269068 4696 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.269082 4696 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.269095 4696 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.269107 4696 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.269118 4696 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.269127 4696 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.269170 4696 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.269178 4696 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.269188 4696 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.269196 4696 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.269205 4696 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.269218 4696 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.269229 4696 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.269240 4696 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.269250 4696 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.269260 4696 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.269270 4696 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.269280 4696 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.269289 4696 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.269298 4696 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.269310 4696 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.269322 4696 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.269333 4696 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.269343 4696 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.269353 4696 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.269363 4696 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.269374 4696 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.269386 4696 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.269396 4696 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.269406 4696 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.269415 4696 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.269425 4696 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.269434 4696 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.269443 4696 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.269456 4696 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 02 
22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.269465 4696 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.269474 4696 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.269483 4696 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.269493 4696 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.269503 4696 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.269512 4696 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.269521 4696 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.269529 4696 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.269538 4696 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.269552 4696 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.269935 4696 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.269956 4696 feature_gate.go:330] 
unrecognized feature gate: GCPLabelsTags Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.269969 4696 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.269982 4696 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.269995 4696 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.270004 4696 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.270016 4696 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.270028 4696 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.270039 4696 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.270050 4696 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.270061 4696 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.270070 4696 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.270078 4696 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.270087 4696 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.270096 4696 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.270104 4696 feature_gate.go:330] unrecognized feature 
gate: ExternalOIDC Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.270113 4696 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.270123 4696 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.270137 4696 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.270152 4696 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.270167 4696 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.270179 4696 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.270191 4696 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.270203 4696 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.270214 4696 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.270230 4696 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.270243 4696 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.270254 4696 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.270267 4696 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.270277 4696 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.270286 4696 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.270295 4696 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.270303 4696 feature_gate.go:330] unrecognized feature gate: Example Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.270312 4696 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.270321 4696 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.270331 4696 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.270340 4696 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.270349 4696 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.270357 4696 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.270365 4696 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.270374 4696 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.270382 4696 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.270390 4696 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.270399 4696 
feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.270407 4696 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.270416 4696 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.270425 4696 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.270433 4696 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.270443 4696 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.270451 4696 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.270460 4696 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.270468 4696 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.270477 4696 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.270485 4696 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.270493 4696 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.270502 4696 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.270510 4696 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.270518 4696 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 02 22:42:17 crc 
kubenswrapper[4696]: W1202 22:42:17.270527 4696 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.270535 4696 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.270543 4696 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.270554 4696 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.270562 4696 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.270570 4696 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.270579 4696 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.270588 4696 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.270599 4696 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.270610 4696 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.270619 4696 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.270628 4696 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.270637 4696 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.270650 4696 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true 
DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.271254 4696 server.go:940] "Client rotation is on, will bootstrap in background" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.276586 4696 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.276795 4696 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.277820 4696 server.go:997] "Starting client certificate rotation" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.277878 4696 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.278408 4696 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-11-17 13:51:31.06218909 +0000 UTC Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.278619 4696 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.285890 4696 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 02 22:42:17 crc kubenswrapper[4696]: E1202 22:42:17.287853 4696 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create 
certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.9:6443: connect: connection refused" logger="UnhandledError" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.289573 4696 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.298114 4696 log.go:25] "Validated CRI v1 runtime API" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.322638 4696 log.go:25] "Validated CRI v1 image API" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.325232 4696 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.328084 4696 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-12-02-22-36-38-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.328126 4696 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.347906 4696 manager.go:217] Machine: {Timestamp:2025-12-02 22:42:17.34634374 +0000 UTC m=+0.227023781 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654124544 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 
AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:be4c1bf2-c508-4b46-be45-7efaea566193 BootID:b680025d-da08-4b46-a4a4-b21ac19e4f7b Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108169 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827060224 Type:vfs Inodes:4108169 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:ac:15:c9 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:ac:15:c9 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:16:e2:2c Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:9f:e2:35 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:ef:5d:6a Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:35:8b:f2 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:22:ed:7e:99:04:2f Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:5a:ee:ae:ff:0f:bf Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654124544 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified 
Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 
Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.348222 4696 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.348476 4696 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.349636 4696 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.350044 4696 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.350095 4696 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.350365 4696 topology_manager.go:138] "Creating topology manager with none policy"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.350380 4696 container_manager_linux.go:303] "Creating device plugin manager"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.350644 4696 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.350695 4696 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.350960 4696 state_mem.go:36] "Initialized new in-memory state store"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.351084 4696 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.351840 4696 kubelet.go:418] "Attempting to sync node with API server"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.351869 4696 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.351902 4696 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.351919 4696 kubelet.go:324] "Adding apiserver pod source"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.351936 4696 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.353996 4696 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.354920 4696 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.354841 4696 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.9:6443: connect: connection refused
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.355066 4696 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.9:6443: connect: connection refused
Dec 02 22:42:17 crc kubenswrapper[4696]: E1202 22:42:17.355231 4696 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.9:6443: connect: connection refused" logger="UnhandledError"
Dec 02 22:42:17 crc kubenswrapper[4696]: E1202 22:42:17.355057 4696 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.9:6443: connect: connection refused" logger="UnhandledError"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.356127 4696 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.356705 4696 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.356729 4696 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.356737 4696 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.356757 4696 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.356770 4696 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.356778 4696 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.356787 4696 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.356801 4696 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.356813 4696 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.356822 4696 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.356835 4696 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.356843 4696 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.357246 4696 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.357875 4696 server.go:1280] "Started kubelet"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.358453 4696 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.9:6443: connect: connection refused
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.358529 4696 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.359815 4696 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Dec 02 22:42:17 crc systemd[1]: Started Kubernetes Kubelet.
Dec 02 22:42:17 crc kubenswrapper[4696]: E1202 22:42:17.363262 4696 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.9:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187d873fdce5be25 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-02 22:42:17.357835813 +0000 UTC m=+0.238515814,LastTimestamp:2025-12-02 22:42:17.357835813 +0000 UTC m=+0.238515814,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.364245 4696 server.go:460] "Adding debug handlers to kubelet server"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.364350 4696 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.366868 4696 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.366930 4696 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.369334 4696 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-28 23:21:11.138235246 +0000 UTC
Dec 02 22:42:17 crc kubenswrapper[4696]: E1202 22:42:17.369621 4696 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.369856 4696 volume_manager.go:287] "The desired_state_of_world populator starts"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.369891 4696 volume_manager.go:289] "Starting Kubelet Volume Manager"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.370346 4696 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.370593 4696 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.9:6443: connect: connection refused
Dec 02 22:42:17 crc kubenswrapper[4696]: E1202 22:42:17.370801 4696 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.9:6443: connect: connection refused" logger="UnhandledError"
Dec 02 22:42:17 crc kubenswrapper[4696]: E1202 22:42:17.371942 4696 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.9:6443: connect: connection refused" interval="200ms"
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.373286 4696 factory.go:55] Registering systemd factory
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.373321 4696 factory.go:221] Registration of the systemd container factory successfully
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.373941 4696 factory.go:153] Registering CRI-O factory
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.373962 4696 factory.go:221] Registration of the crio container factory successfully
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.374032 4696 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.374059 4696 factory.go:103] Registering Raw factory
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.374079 4696 manager.go:1196] Started watching for new ooms in manager
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.374810 4696 manager.go:319] Starting recovery of all containers
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.393298 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.393380 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.393410 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.393430 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.393453 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.393475 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.393496 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.393519 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.393543 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.393566 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.393581 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.393597 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.393616 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.393642 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.393657 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.393673 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.393690 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.393708 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.393733 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.393768 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.393785 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.393809 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.393835 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.393859 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.393872 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.393889 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.393913 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.393946 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.393964 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.393986 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.394001 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.394017 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.394038 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.394054 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.394073 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.394089 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.394105 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.394126 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.394143 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.394162 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.394177 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.394193 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.394211 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.394228 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.394245 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.394268 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.394287 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.394311 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.394329 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.394346 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.394367 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.394380 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.394407 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.394430 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.394450 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.394473 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.394491 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.394511 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.394527 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.394547 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.394564 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.394799 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.395062 4696 manager.go:324] Recovery completed
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.395019 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.395834 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.396076 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.396105 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.396140 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.396200 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.396365 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.396398 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.396421 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.396448 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.396469 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.396546 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.396600 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.397259 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.397300 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.397339 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.397402 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.397482 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.397516 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.397557 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.397588 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext=""
Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.398862 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d"
volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.398907 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.398925 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.398942 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.398958 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.398974 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.398990 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" 
volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.399008 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.399025 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.399041 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.399056 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.399071 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.399087 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" 
volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.399103 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.399135 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.399150 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.399166 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.399181 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.399196 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" 
volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.399213 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.400055 4696 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.400103 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.400141 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.400165 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.400187 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.400206 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.400225 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.400245 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.400264 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.400283 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.400300 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" 
volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.400317 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.400338 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.400354 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.400389 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.400407 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.400421 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.400436 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.400451 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.400466 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.400482 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.400497 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.400513 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" 
seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.400527 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.400541 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.400554 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.400569 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.400584 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.400607 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 
22:42:17.400621 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.400636 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.400650 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.400666 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.400679 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.400694 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.400707 4696 reconstruct.go:130] "Volume is marked as uncertain and added 
into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.400723 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.400754 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.400769 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.400786 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.400802 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.400816 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.400831 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.400845 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.400858 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.400878 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.400892 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.400905 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" 
volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.400920 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.400934 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.400949 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.400963 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.400978 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.400991 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.401005 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.401021 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.401036 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.401050 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.401064 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.401079 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" 
seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.401093 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.401107 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.401122 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.401139 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.401154 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.401167 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 
22:42:17.401204 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.401218 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.401232 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.401247 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.401261 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.401273 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.401287 4696 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.401303 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.401316 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.401329 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.401346 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.401361 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.401376 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.401391 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.401407 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.401421 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.401436 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.401452 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.401468 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.401485 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.401499 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.401517 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.401534 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.401549 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.401563 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" 
seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.401581 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.401598 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.401614 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.401630 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.401646 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.401662 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.401676 4696 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.401690 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.401707 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.401723 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.401752 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.401799 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.401819 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.401841 4696 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.401854 4696 reconstruct.go:97] "Volume reconstruction finished" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.401863 4696 reconciler.go:26] "Reconciler: start to sync state" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.407564 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.410121 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.410163 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.410175 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.411436 4696 cpu_manager.go:225] "Starting CPU manager" policy="none" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.411451 4696 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.411470 4696 state_mem.go:36] "Initialized new in-memory state store" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.424341 4696 policy_none.go:49] "None policy: Start" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.427160 4696 kubelet_network_linux.go:50] 
"Initialized iptables rules." protocol="IPv4" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.429543 4696 memory_manager.go:170] "Starting memorymanager" policy="None" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.429573 4696 state_mem.go:35] "Initializing new in-memory state store" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.430193 4696 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.430347 4696 status_manager.go:217] "Starting to sync pod status with apiserver" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.430398 4696 kubelet.go:2335] "Starting kubelet main sync loop" Dec 02 22:42:17 crc kubenswrapper[4696]: E1202 22:42:17.430464 4696 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Dec 02 22:42:17 crc kubenswrapper[4696]: W1202 22:42:17.432039 4696 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.9:6443: connect: connection refused Dec 02 22:42:17 crc kubenswrapper[4696]: E1202 22:42:17.432135 4696 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.9:6443: connect: connection refused" logger="UnhandledError" Dec 02 22:42:17 crc kubenswrapper[4696]: E1202 22:42:17.469812 4696 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 02 22:42:17 crc kubenswrapper[4696]: E1202 22:42:17.530561 4696 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have 
completed yet" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.531814 4696 manager.go:334] "Starting Device Plugin manager" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.531897 4696 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.531915 4696 server.go:79] "Starting device plugin registration server" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.532511 4696 eviction_manager.go:189] "Eviction manager: starting control loop" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.532540 4696 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.532719 4696 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.532850 4696 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.532868 4696 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Dec 02 22:42:17 crc kubenswrapper[4696]: E1202 22:42:17.539522 4696 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 02 22:42:17 crc kubenswrapper[4696]: E1202 22:42:17.573956 4696 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.9:6443: connect: connection refused" interval="400ms" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.632732 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.634820 4696 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.634894 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.634927 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.634967 4696 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 02 22:42:17 crc kubenswrapper[4696]: E1202 22:42:17.635778 4696 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.9:6443: connect: connection refused" node="crc" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.731349 4696 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.731660 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.733479 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.733555 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.733578 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.733848 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume 
controller attach/detach" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.734578 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.734670 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.735644 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.735718 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.735767 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.736065 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.736190 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.736245 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.736237 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.736308 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.736322 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.737453 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.737502 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.737520 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.738457 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.738524 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.738545 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.739113 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.739330 
4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.739393 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.740965 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.741011 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.741026 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.741214 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.741257 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.741284 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.741508 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.742034 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.742106 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.743440 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.743499 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.743514 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.744446 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.744499 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.744587 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.744912 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.744970 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.746147 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.746202 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.746224 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.806983 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.807039 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.807070 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 
22:42:17.807093 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.807118 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.807149 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.807173 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.807197 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.807307 4696 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.807389 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.807455 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.807538 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.807631 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.807802 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" 
(UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.807847 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.836023 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.838223 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.838448 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.838576 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.838717 4696 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 02 22:42:17 crc kubenswrapper[4696]: E1202 22:42:17.840579 4696 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.9:6443: connect: connection refused" node="crc" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.909104 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.909194 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.909238 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.909263 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.909301 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.909324 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.909344 4696 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.909398 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.909422 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.909465 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.909489 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.909511 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod 
\"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.909548 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.909572 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.909614 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.910131 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.910218 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.910193 4696 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.910241 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.910314 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.910317 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.910375 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.910353 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.910283 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.910134 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.910407 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.910378 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.910316 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.910498 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" 
(UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 22:42:17 crc kubenswrapper[4696]: I1202 22:42:17.910511 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 02 22:42:17 crc kubenswrapper[4696]: E1202 22:42:17.975139 4696 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.9:6443: connect: connection refused" interval="800ms" Dec 02 22:42:18 crc kubenswrapper[4696]: I1202 22:42:18.065227 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 02 22:42:18 crc kubenswrapper[4696]: I1202 22:42:18.095132 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 02 22:42:18 crc kubenswrapper[4696]: W1202 22:42:18.095523 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-aaf85c7a7db986deb3633a03c9131985397ac38bf69bda1a673fedda006f8a92 WatchSource:0}: Error finding container aaf85c7a7db986deb3633a03c9131985397ac38bf69bda1a673fedda006f8a92: Status 404 returned error can't find the container with id aaf85c7a7db986deb3633a03c9131985397ac38bf69bda1a673fedda006f8a92 Dec 02 22:42:18 crc kubenswrapper[4696]: I1202 22:42:18.118971 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 22:42:18 crc kubenswrapper[4696]: I1202 22:42:18.134421 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 22:42:18 crc kubenswrapper[4696]: W1202 22:42:18.138546 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-e12f3623c63ace64d922d0f6377b4199d438606849f07d31f52e214ed684a0b1 WatchSource:0}: Error finding container e12f3623c63ace64d922d0f6377b4199d438606849f07d31f52e214ed684a0b1: Status 404 returned error can't find the container with id e12f3623c63ace64d922d0f6377b4199d438606849f07d31f52e214ed684a0b1 Dec 02 22:42:18 crc kubenswrapper[4696]: I1202 22:42:18.143624 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 02 22:42:18 crc kubenswrapper[4696]: W1202 22:42:18.149165 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-544986d07a2773374c1a967a6f64eedef220929dd02fb3929fae02f5d57cbeb6 WatchSource:0}: Error finding container 544986d07a2773374c1a967a6f64eedef220929dd02fb3929fae02f5d57cbeb6: Status 404 returned error can't find the container with id 544986d07a2773374c1a967a6f64eedef220929dd02fb3929fae02f5d57cbeb6 Dec 02 22:42:18 crc kubenswrapper[4696]: W1202 22:42:18.167580 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-5eedde3ef76db3e2bfce7137762f69060aa2f1db67ec42a54c445cf8588ece99 WatchSource:0}: Error finding container 5eedde3ef76db3e2bfce7137762f69060aa2f1db67ec42a54c445cf8588ece99: Status 404 returned error can't find 
the container with id 5eedde3ef76db3e2bfce7137762f69060aa2f1db67ec42a54c445cf8588ece99 Dec 02 22:42:18 crc kubenswrapper[4696]: W1202 22:42:18.239139 4696 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.9:6443: connect: connection refused Dec 02 22:42:18 crc kubenswrapper[4696]: E1202 22:42:18.239303 4696 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.9:6443: connect: connection refused" logger="UnhandledError" Dec 02 22:42:18 crc kubenswrapper[4696]: I1202 22:42:18.241583 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 22:42:18 crc kubenswrapper[4696]: I1202 22:42:18.244478 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:18 crc kubenswrapper[4696]: I1202 22:42:18.244532 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:18 crc kubenswrapper[4696]: I1202 22:42:18.244552 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:18 crc kubenswrapper[4696]: I1202 22:42:18.244595 4696 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 02 22:42:18 crc kubenswrapper[4696]: E1202 22:42:18.245313 4696 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.9:6443: connect: connection refused" node="crc" Dec 02 22:42:18 crc kubenswrapper[4696]: I1202 22:42:18.361167 4696 csi_plugin.go:884] Failed to contact 
API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.9:6443: connect: connection refused Dec 02 22:42:18 crc kubenswrapper[4696]: I1202 22:42:18.370150 4696 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-19 02:35:43.034022898 +0000 UTC Dec 02 22:42:18 crc kubenswrapper[4696]: I1202 22:42:18.370251 4696 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 387h53m24.663774392s for next certificate rotation Dec 02 22:42:18 crc kubenswrapper[4696]: I1202 22:42:18.436030 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"aaf85c7a7db986deb3633a03c9131985397ac38bf69bda1a673fedda006f8a92"} Dec 02 22:42:18 crc kubenswrapper[4696]: I1202 22:42:18.437195 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"5eedde3ef76db3e2bfce7137762f69060aa2f1db67ec42a54c445cf8588ece99"} Dec 02 22:42:18 crc kubenswrapper[4696]: I1202 22:42:18.438708 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"544986d07a2773374c1a967a6f64eedef220929dd02fb3929fae02f5d57cbeb6"} Dec 02 22:42:18 crc kubenswrapper[4696]: I1202 22:42:18.439679 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"e12f3623c63ace64d922d0f6377b4199d438606849f07d31f52e214ed684a0b1"} Dec 02 22:42:18 crc kubenswrapper[4696]: I1202 22:42:18.440520 
4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"d2148a1def06c638855601cc9774a3083859fd70ce62eb14882ec6e36069f404"} Dec 02 22:42:18 crc kubenswrapper[4696]: W1202 22:42:18.454563 4696 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.9:6443: connect: connection refused Dec 02 22:42:18 crc kubenswrapper[4696]: E1202 22:42:18.454661 4696 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.9:6443: connect: connection refused" logger="UnhandledError" Dec 02 22:42:18 crc kubenswrapper[4696]: W1202 22:42:18.666067 4696 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.9:6443: connect: connection refused Dec 02 22:42:18 crc kubenswrapper[4696]: E1202 22:42:18.666433 4696 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.9:6443: connect: connection refused" logger="UnhandledError" Dec 02 22:42:18 crc kubenswrapper[4696]: E1202 22:42:18.777189 4696 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.9:6443: connect: 
connection refused" interval="1.6s" Dec 02 22:42:18 crc kubenswrapper[4696]: W1202 22:42:18.854654 4696 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.9:6443: connect: connection refused Dec 02 22:42:18 crc kubenswrapper[4696]: E1202 22:42:18.854840 4696 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.9:6443: connect: connection refused" logger="UnhandledError" Dec 02 22:42:19 crc kubenswrapper[4696]: I1202 22:42:19.046161 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 22:42:19 crc kubenswrapper[4696]: I1202 22:42:19.048225 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:19 crc kubenswrapper[4696]: I1202 22:42:19.048298 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:19 crc kubenswrapper[4696]: I1202 22:42:19.048318 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:19 crc kubenswrapper[4696]: I1202 22:42:19.048354 4696 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 02 22:42:19 crc kubenswrapper[4696]: E1202 22:42:19.048986 4696 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.9:6443: connect: connection refused" node="crc" Dec 02 22:42:19 crc kubenswrapper[4696]: I1202 22:42:19.361560 4696 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.9:6443: connect: connection refused Dec 02 22:42:19 crc kubenswrapper[4696]: I1202 22:42:19.448089 4696 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="7321d8133c89c6a58d6e42e8df0ba1e4bbdf9cc06fd01054a25afd64934d855a" exitCode=0 Dec 02 22:42:19 crc kubenswrapper[4696]: I1202 22:42:19.448206 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"7321d8133c89c6a58d6e42e8df0ba1e4bbdf9cc06fd01054a25afd64934d855a"} Dec 02 22:42:19 crc kubenswrapper[4696]: I1202 22:42:19.448291 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 22:42:19 crc kubenswrapper[4696]: I1202 22:42:19.449310 4696 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Dec 02 22:42:19 crc kubenswrapper[4696]: E1202 22:42:19.451087 4696 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.9:6443: connect: connection refused" logger="UnhandledError" Dec 02 22:42:19 crc kubenswrapper[4696]: I1202 22:42:19.452847 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:19 crc kubenswrapper[4696]: I1202 22:42:19.452903 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:19 crc kubenswrapper[4696]: I1202 22:42:19.452922 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Dec 02 22:42:19 crc kubenswrapper[4696]: I1202 22:42:19.456667 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"67bf899df90f18f36310994af135552aa9758f0460117d62d00fdc1de067ce79"} Dec 02 22:42:19 crc kubenswrapper[4696]: I1202 22:42:19.460228 4696 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="055bef75732eec4908f0baf8a52cd3921fab03cf88e36d2e0d442344fb1af0c5" exitCode=0 Dec 02 22:42:19 crc kubenswrapper[4696]: I1202 22:42:19.460336 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"055bef75732eec4908f0baf8a52cd3921fab03cf88e36d2e0d442344fb1af0c5"} Dec 02 22:42:19 crc kubenswrapper[4696]: I1202 22:42:19.460428 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 22:42:19 crc kubenswrapper[4696]: I1202 22:42:19.462061 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:19 crc kubenswrapper[4696]: I1202 22:42:19.462119 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:19 crc kubenswrapper[4696]: I1202 22:42:19.462141 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:19 crc kubenswrapper[4696]: I1202 22:42:19.463710 4696 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="4a565cb3800c1a50f784c327719158de2f6f29d20a33670478b97c1c360e0f6e" exitCode=0 Dec 02 22:42:19 crc kubenswrapper[4696]: I1202 22:42:19.463979 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume 
controller attach/detach" Dec 02 22:42:19 crc kubenswrapper[4696]: I1202 22:42:19.464017 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"4a565cb3800c1a50f784c327719158de2f6f29d20a33670478b97c1c360e0f6e"} Dec 02 22:42:19 crc kubenswrapper[4696]: I1202 22:42:19.465647 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:19 crc kubenswrapper[4696]: I1202 22:42:19.465718 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:19 crc kubenswrapper[4696]: I1202 22:42:19.465777 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:19 crc kubenswrapper[4696]: I1202 22:42:19.465947 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 22:42:19 crc kubenswrapper[4696]: I1202 22:42:19.466394 4696 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="73634f725abb1c9e590f4fc31e5ea79d879b17daad24832385ceb37d9aa93224" exitCode=0 Dec 02 22:42:19 crc kubenswrapper[4696]: I1202 22:42:19.466428 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"73634f725abb1c9e590f4fc31e5ea79d879b17daad24832385ceb37d9aa93224"} Dec 02 22:42:19 crc kubenswrapper[4696]: I1202 22:42:19.466501 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 22:42:19 crc kubenswrapper[4696]: I1202 22:42:19.468052 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:19 crc kubenswrapper[4696]: I1202 22:42:19.468075 4696 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:19 crc kubenswrapper[4696]: I1202 22:42:19.468085 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:19 crc kubenswrapper[4696]: I1202 22:42:19.468407 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:19 crc kubenswrapper[4696]: I1202 22:42:19.468450 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:19 crc kubenswrapper[4696]: I1202 22:42:19.468471 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:20 crc kubenswrapper[4696]: E1202 22:42:20.359011 4696 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.9:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187d873fdce5be25 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-02 22:42:17.357835813 +0000 UTC m=+0.238515814,LastTimestamp:2025-12-02 22:42:17.357835813 +0000 UTC m=+0.238515814,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 02 22:42:20 crc kubenswrapper[4696]: I1202 22:42:20.361557 4696 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.9:6443: connect: connection refused Dec 02 22:42:20 crc kubenswrapper[4696]: 
E1202 22:42:20.378654 4696 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.9:6443: connect: connection refused" interval="3.2s" Dec 02 22:42:20 crc kubenswrapper[4696]: I1202 22:42:20.473287 4696 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="29a2b9cc76fc45b6874d85b2ab92165a02804807837501ac60190c232bab928d" exitCode=0 Dec 02 22:42:20 crc kubenswrapper[4696]: I1202 22:42:20.473407 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"29a2b9cc76fc45b6874d85b2ab92165a02804807837501ac60190c232bab928d"} Dec 02 22:42:20 crc kubenswrapper[4696]: I1202 22:42:20.473463 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 22:42:20 crc kubenswrapper[4696]: I1202 22:42:20.475020 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:20 crc kubenswrapper[4696]: I1202 22:42:20.475078 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:20 crc kubenswrapper[4696]: I1202 22:42:20.475097 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:20 crc kubenswrapper[4696]: I1202 22:42:20.477549 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 22:42:20 crc kubenswrapper[4696]: I1202 22:42:20.477604 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" 
event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"eaa2f02ac937663ab0258468d91f90599f4d6c2d1f68e36a71c6f9424289562a"} Dec 02 22:42:20 crc kubenswrapper[4696]: I1202 22:42:20.479087 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:20 crc kubenswrapper[4696]: I1202 22:42:20.479146 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:20 crc kubenswrapper[4696]: I1202 22:42:20.479167 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:20 crc kubenswrapper[4696]: I1202 22:42:20.481688 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"6abcdac2e562f88dfe0b9c02e2a2efb68c3b4d5438abfa08606b27dde0da76b7"} Dec 02 22:42:20 crc kubenswrapper[4696]: I1202 22:42:20.484855 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"b361514e74dd950d7c3e1da82304b7c75603829ce7bf03c1fbccab1607d65800"} Dec 02 22:42:20 crc kubenswrapper[4696]: I1202 22:42:20.487604 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"d3112751f86a62634d02f281d467d3eeb27878e27c245871210cca37eaeb6e53"} Dec 02 22:42:20 crc kubenswrapper[4696]: W1202 22:42:20.615602 4696 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.9:6443: connect: connection refused Dec 02 22:42:20 crc 
kubenswrapper[4696]: E1202 22:42:20.615727 4696 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.9:6443: connect: connection refused" logger="UnhandledError" Dec 02 22:42:20 crc kubenswrapper[4696]: W1202 22:42:20.633815 4696 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.9:6443: connect: connection refused Dec 02 22:42:20 crc kubenswrapper[4696]: E1202 22:42:20.633901 4696 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.9:6443: connect: connection refused" logger="UnhandledError" Dec 02 22:42:20 crc kubenswrapper[4696]: I1202 22:42:20.649074 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 22:42:20 crc kubenswrapper[4696]: I1202 22:42:20.651100 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:20 crc kubenswrapper[4696]: I1202 22:42:20.651152 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:20 crc kubenswrapper[4696]: I1202 22:42:20.651170 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:20 crc kubenswrapper[4696]: I1202 22:42:20.651208 4696 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 02 22:42:20 crc kubenswrapper[4696]: E1202 22:42:20.651727 4696 
kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.9:6443: connect: connection refused" node="crc" Dec 02 22:42:20 crc kubenswrapper[4696]: W1202 22:42:20.920976 4696 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.9:6443: connect: connection refused Dec 02 22:42:20 crc kubenswrapper[4696]: E1202 22:42:20.921129 4696 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.9:6443: connect: connection refused" logger="UnhandledError" Dec 02 22:42:21 crc kubenswrapper[4696]: W1202 22:42:21.310063 4696 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.9:6443: connect: connection refused Dec 02 22:42:21 crc kubenswrapper[4696]: E1202 22:42:21.310184 4696 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.9:6443: connect: connection refused" logger="UnhandledError" Dec 02 22:42:21 crc kubenswrapper[4696]: I1202 22:42:21.361304 4696 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.9:6443: connect: connection refused Dec 02 22:42:21 crc kubenswrapper[4696]: I1202 
22:42:21.490860 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 22:42:21 crc kubenswrapper[4696]: I1202 22:42:21.492486 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:21 crc kubenswrapper[4696]: I1202 22:42:21.492534 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:21 crc kubenswrapper[4696]: I1202 22:42:21.492551 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:22 crc kubenswrapper[4696]: I1202 22:42:22.497199 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"7e41b58b3df7b93de109c3904416a2943c8dca8cc7e9edc628333d4da461412d"} Dec 02 22:42:22 crc kubenswrapper[4696]: I1202 22:42:22.497262 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"95c2f64bb4118ce11d7f91dd949ab6e377708029cb0b3c93bd650ac70ea2b650"} Dec 02 22:42:22 crc kubenswrapper[4696]: I1202 22:42:22.497282 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 22:42:22 crc kubenswrapper[4696]: I1202 22:42:22.498366 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:22 crc kubenswrapper[4696]: I1202 22:42:22.498396 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:22 crc kubenswrapper[4696]: I1202 22:42:22.498405 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:22 crc 
kubenswrapper[4696]: I1202 22:42:22.500558 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"a2fa27b4eeb8b50da5d49c905eb98070362e998c3e3e4aceac7cc09aaef5b5ce"} Dec 02 22:42:22 crc kubenswrapper[4696]: I1202 22:42:22.500587 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"9d15263c6c1f8208a998cbacdba376a2eb3b2e16134e8fc40f7d56d36daae5ea"} Dec 02 22:42:22 crc kubenswrapper[4696]: I1202 22:42:22.500660 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 22:42:22 crc kubenswrapper[4696]: I1202 22:42:22.501609 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:22 crc kubenswrapper[4696]: I1202 22:42:22.501628 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:22 crc kubenswrapper[4696]: I1202 22:42:22.501636 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:22 crc kubenswrapper[4696]: I1202 22:42:22.504362 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"666f81c81a7cad67c01b2aebf2d05f162c33feea27f64c12f92761d6d2a6056e"} Dec 02 22:42:22 crc kubenswrapper[4696]: I1202 22:42:22.504386 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ce744ea765d5bbcb0a9926f439a0256774c16863344f23ddfb0074b4ccbdb4ad"} Dec 02 22:42:22 crc 
kubenswrapper[4696]: I1202 22:42:22.504398 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"1108d710330b056e2228205b2012df070ffbd660701ddd577eafbf0e99c3bf9f"} Dec 02 22:42:22 crc kubenswrapper[4696]: I1202 22:42:22.506487 4696 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="fb3311ccf2a924605464d657193627798dad26a232fab0a2091fb2c387f3bd0b" exitCode=0 Dec 02 22:42:22 crc kubenswrapper[4696]: I1202 22:42:22.506520 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"fb3311ccf2a924605464d657193627798dad26a232fab0a2091fb2c387f3bd0b"} Dec 02 22:42:22 crc kubenswrapper[4696]: I1202 22:42:22.506598 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 22:42:22 crc kubenswrapper[4696]: I1202 22:42:22.507327 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:22 crc kubenswrapper[4696]: I1202 22:42:22.507355 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:22 crc kubenswrapper[4696]: I1202 22:42:22.507363 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:23 crc kubenswrapper[4696]: I1202 22:42:23.513931 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"f25e7ebcca1d904cd0940a660fd97c6b4a3540fd63700abd0ad973e9f9c8f764"} Dec 02 22:42:23 crc kubenswrapper[4696]: I1202 22:42:23.513995 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"a0b2deed2650220f1d607cbb667b26879d3a8373146244feb486c882032f344b"} Dec 02 22:42:23 crc kubenswrapper[4696]: I1202 22:42:23.517666 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"6268352b995d99d383ab500b5ccd80b0612cf33716a752aeb1d92cb775a1971e"} Dec 02 22:42:23 crc kubenswrapper[4696]: I1202 22:42:23.517806 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 22:42:23 crc kubenswrapper[4696]: I1202 22:42:23.517999 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 22:42:23 crc kubenswrapper[4696]: I1202 22:42:23.518790 4696 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 02 22:42:23 crc kubenswrapper[4696]: I1202 22:42:23.518846 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 22:42:23 crc kubenswrapper[4696]: I1202 22:42:23.518807 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:23 crc kubenswrapper[4696]: I1202 22:42:23.518927 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:23 crc kubenswrapper[4696]: I1202 22:42:23.518941 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:23 crc kubenswrapper[4696]: I1202 22:42:23.519201 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:23 crc kubenswrapper[4696]: I1202 22:42:23.519247 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:23 crc 
kubenswrapper[4696]: I1202 22:42:23.519263 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:23 crc kubenswrapper[4696]: I1202 22:42:23.519508 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:23 crc kubenswrapper[4696]: I1202 22:42:23.519559 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:23 crc kubenswrapper[4696]: I1202 22:42:23.519570 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:23 crc kubenswrapper[4696]: I1202 22:42:23.612848 4696 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Dec 02 22:42:23 crc kubenswrapper[4696]: I1202 22:42:23.753903 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 22:42:23 crc kubenswrapper[4696]: I1202 22:42:23.852533 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 22:42:23 crc kubenswrapper[4696]: I1202 22:42:23.854181 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:23 crc kubenswrapper[4696]: I1202 22:42:23.854243 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:23 crc kubenswrapper[4696]: I1202 22:42:23.854260 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:23 crc kubenswrapper[4696]: I1202 22:42:23.854294 4696 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 02 22:42:24 crc kubenswrapper[4696]: I1202 22:42:24.532725 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"ba390cca9997cf469ae88a4895532df556f0b947ce7b34ef7e1992ee236a2e16"} Dec 02 22:42:24 crc kubenswrapper[4696]: I1202 22:42:24.532841 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"08fc65f2c8162f27bc40cd12b5801ca21c778313271f2cb5dd251ca4312842d1"} Dec 02 22:42:24 crc kubenswrapper[4696]: I1202 22:42:24.533372 4696 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 02 22:42:24 crc kubenswrapper[4696]: I1202 22:42:24.533447 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 22:42:24 crc kubenswrapper[4696]: I1202 22:42:24.534546 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:24 crc kubenswrapper[4696]: I1202 22:42:24.534584 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:24 crc kubenswrapper[4696]: I1202 22:42:24.534594 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:24 crc kubenswrapper[4696]: I1202 22:42:24.648930 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 22:42:24 crc kubenswrapper[4696]: I1202 22:42:24.649177 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 22:42:24 crc kubenswrapper[4696]: I1202 22:42:24.650405 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:24 crc kubenswrapper[4696]: I1202 22:42:24.650476 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:24 
crc kubenswrapper[4696]: I1202 22:42:24.650517 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:24 crc kubenswrapper[4696]: I1202 22:42:24.677076 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 22:42:24 crc kubenswrapper[4696]: I1202 22:42:24.759871 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 02 22:42:24 crc kubenswrapper[4696]: I1202 22:42:24.760148 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 22:42:24 crc kubenswrapper[4696]: I1202 22:42:24.761839 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:24 crc kubenswrapper[4696]: I1202 22:42:24.761903 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:24 crc kubenswrapper[4696]: I1202 22:42:24.761922 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:24 crc kubenswrapper[4696]: I1202 22:42:24.859918 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 22:42:25 crc kubenswrapper[4696]: I1202 22:42:25.542158 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 22:42:25 crc kubenswrapper[4696]: I1202 22:42:25.543055 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"e0ccd6eaff0fa7ac496aadc31afffd6e7fee2776e9cfd1652af85fcec268d895"} Dec 02 22:42:25 crc kubenswrapper[4696]: I1202 22:42:25.543128 4696 prober_manager.go:312] "Failed to trigger a 
manual run" probe="Readiness" Dec 02 22:42:25 crc kubenswrapper[4696]: I1202 22:42:25.543185 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 22:42:25 crc kubenswrapper[4696]: I1202 22:42:25.543254 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 22:42:25 crc kubenswrapper[4696]: I1202 22:42:25.543765 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:25 crc kubenswrapper[4696]: I1202 22:42:25.543824 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:25 crc kubenswrapper[4696]: I1202 22:42:25.543839 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:25 crc kubenswrapper[4696]: I1202 22:42:25.544878 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:25 crc kubenswrapper[4696]: I1202 22:42:25.544921 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:25 crc kubenswrapper[4696]: I1202 22:42:25.544932 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:25 crc kubenswrapper[4696]: I1202 22:42:25.544926 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:25 crc kubenswrapper[4696]: I1202 22:42:25.544953 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:25 crc kubenswrapper[4696]: I1202 22:42:25.544966 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:26 crc kubenswrapper[4696]: I1202 22:42:26.021374 4696 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 22:42:26 crc kubenswrapper[4696]: I1202 22:42:26.544997 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 22:42:26 crc kubenswrapper[4696]: I1202 22:42:26.545092 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 22:42:26 crc kubenswrapper[4696]: I1202 22:42:26.546645 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:26 crc kubenswrapper[4696]: I1202 22:42:26.546700 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:26 crc kubenswrapper[4696]: I1202 22:42:26.546772 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:26 crc kubenswrapper[4696]: I1202 22:42:26.546972 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:26 crc kubenswrapper[4696]: I1202 22:42:26.547017 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:26 crc kubenswrapper[4696]: I1202 22:42:26.547034 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:27 crc kubenswrapper[4696]: I1202 22:42:27.119091 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 22:42:27 crc kubenswrapper[4696]: I1202 22:42:27.119268 4696 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 02 22:42:27 crc kubenswrapper[4696]: I1202 22:42:27.119333 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 22:42:27 crc 
kubenswrapper[4696]: I1202 22:42:27.121247 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:27 crc kubenswrapper[4696]: I1202 22:42:27.121456 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:27 crc kubenswrapper[4696]: I1202 22:42:27.121633 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:27 crc kubenswrapper[4696]: I1202 22:42:27.196867 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Dec 02 22:42:27 crc kubenswrapper[4696]: I1202 22:42:27.334074 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 22:42:27 crc kubenswrapper[4696]: E1202 22:42:27.539808 4696 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 02 22:42:27 crc kubenswrapper[4696]: I1202 22:42:27.547593 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 22:42:27 crc kubenswrapper[4696]: I1202 22:42:27.547780 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 22:42:27 crc kubenswrapper[4696]: I1202 22:42:27.548720 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 22:42:27 crc kubenswrapper[4696]: I1202 22:42:27.549589 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:27 crc kubenswrapper[4696]: I1202 22:42:27.549658 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:27 crc kubenswrapper[4696]: I1202 22:42:27.549689 4696 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:27 crc kubenswrapper[4696]: I1202 22:42:27.550040 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:27 crc kubenswrapper[4696]: I1202 22:42:27.550092 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:27 crc kubenswrapper[4696]: I1202 22:42:27.550110 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:27 crc kubenswrapper[4696]: I1202 22:42:27.550220 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:27 crc kubenswrapper[4696]: I1202 22:42:27.550260 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:27 crc kubenswrapper[4696]: I1202 22:42:27.550280 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:29 crc kubenswrapper[4696]: I1202 22:42:29.125631 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Dec 02 22:42:29 crc kubenswrapper[4696]: I1202 22:42:29.125949 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 22:42:29 crc kubenswrapper[4696]: I1202 22:42:29.127787 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:29 crc kubenswrapper[4696]: I1202 22:42:29.127880 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:29 crc kubenswrapper[4696]: I1202 22:42:29.127900 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:30 crc kubenswrapper[4696]: I1202 
22:42:30.360270 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 22:42:30 crc kubenswrapper[4696]: I1202 22:42:30.360503 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 22:42:30 crc kubenswrapper[4696]: I1202 22:42:30.362319 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:30 crc kubenswrapper[4696]: I1202 22:42:30.362369 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:30 crc kubenswrapper[4696]: I1202 22:42:30.362383 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:30 crc kubenswrapper[4696]: I1202 22:42:30.364158 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 22:42:30 crc kubenswrapper[4696]: I1202 22:42:30.557419 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 22:42:30 crc kubenswrapper[4696]: I1202 22:42:30.558685 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:30 crc kubenswrapper[4696]: I1202 22:42:30.558766 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:30 crc kubenswrapper[4696]: I1202 22:42:30.558787 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:32 crc kubenswrapper[4696]: I1202 22:42:32.362460 4696 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake 
timeout Dec 02 22:42:32 crc kubenswrapper[4696]: I1202 22:42:32.956319 4696 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Dec 02 22:42:32 crc kubenswrapper[4696]: I1202 22:42:32.956420 4696 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Dec 02 22:42:32 crc kubenswrapper[4696]: I1202 22:42:32.967939 4696 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Dec 02 22:42:32 crc kubenswrapper[4696]: I1202 22:42:32.968038 4696 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Dec 02 22:42:33 crc kubenswrapper[4696]: I1202 22:42:33.360643 4696 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 02 22:42:33 crc 
kubenswrapper[4696]: I1202 22:42:33.360807 4696 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 02 22:42:34 crc kubenswrapper[4696]: I1202 22:42:34.495770 4696 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Dec 02 22:42:34 crc kubenswrapper[4696]: I1202 22:42:34.495863 4696 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Dec 02 22:42:34 crc kubenswrapper[4696]: I1202 22:42:34.870065 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 22:42:34 crc kubenswrapper[4696]: I1202 22:42:34.870395 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 22:42:34 crc kubenswrapper[4696]: I1202 22:42:34.871000 4696 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Dec 02 22:42:34 crc kubenswrapper[4696]: I1202 22:42:34.871106 4696 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Dec 02 22:42:34 crc kubenswrapper[4696]: I1202 22:42:34.872951 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:34 crc kubenswrapper[4696]: I1202 22:42:34.873018 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:34 crc kubenswrapper[4696]: I1202 22:42:34.873040 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:34 crc kubenswrapper[4696]: I1202 22:42:34.878945 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 22:42:35 crc kubenswrapper[4696]: I1202 22:42:35.680379 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 22:42:35 crc kubenswrapper[4696]: I1202 22:42:35.682082 4696 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Dec 02 22:42:35 crc kubenswrapper[4696]: I1202 22:42:35.682198 4696 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Dec 02 22:42:35 crc kubenswrapper[4696]: I1202 22:42:35.684541 4696 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:35 crc kubenswrapper[4696]: I1202 22:42:35.684703 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:35 crc kubenswrapper[4696]: I1202 22:42:35.684733 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:36 crc kubenswrapper[4696]: I1202 22:42:36.021983 4696 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Dec 02 22:42:36 crc kubenswrapper[4696]: I1202 22:42:36.022083 4696 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Dec 02 22:42:37 crc kubenswrapper[4696]: E1202 22:42:37.540515 4696 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 02 22:42:37 crc kubenswrapper[4696]: E1202 22:42:37.958055 4696 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Dec 02 22:42:37 crc kubenswrapper[4696]: I1202 22:42:37.959990 4696 trace.go:236] Trace[1470835569]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (02-Dec-2025 22:42:25.156) (total time: 12803ms): Dec 02 22:42:37 crc kubenswrapper[4696]: Trace[1470835569]: ---"Objects 
listed" error: 12803ms (22:42:37.959) Dec 02 22:42:37 crc kubenswrapper[4696]: Trace[1470835569]: [12.803828523s] [12.803828523s] END Dec 02 22:42:37 crc kubenswrapper[4696]: I1202 22:42:37.960029 4696 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 02 22:42:37 crc kubenswrapper[4696]: I1202 22:42:37.960102 4696 trace.go:236] Trace[2038932611]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (02-Dec-2025 22:42:24.587) (total time: 13372ms): Dec 02 22:42:37 crc kubenswrapper[4696]: Trace[2038932611]: ---"Objects listed" error: 13372ms (22:42:37.959) Dec 02 22:42:37 crc kubenswrapper[4696]: Trace[2038932611]: [13.372155562s] [13.372155562s] END Dec 02 22:42:37 crc kubenswrapper[4696]: I1202 22:42:37.960132 4696 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 02 22:42:37 crc kubenswrapper[4696]: E1202 22:42:37.960168 4696 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Dec 02 22:42:37 crc kubenswrapper[4696]: I1202 22:42:37.961075 4696 trace.go:236] Trace[531537780]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (02-Dec-2025 22:42:25.869) (total time: 12091ms): Dec 02 22:42:37 crc kubenswrapper[4696]: Trace[531537780]: ---"Objects listed" error: 12091ms (22:42:37.961) Dec 02 22:42:37 crc kubenswrapper[4696]: Trace[531537780]: [12.091264241s] [12.091264241s] END Dec 02 22:42:37 crc kubenswrapper[4696]: I1202 22:42:37.961099 4696 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 02 22:42:37 crc kubenswrapper[4696]: I1202 22:42:37.961515 4696 trace.go:236] Trace[1010393662]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (02-Dec-2025 22:42:25.620) (total time: 12340ms): Dec 02 22:42:37 crc 
kubenswrapper[4696]: Trace[1010393662]: ---"Objects listed" error: 12340ms (22:42:37.961) Dec 02 22:42:37 crc kubenswrapper[4696]: Trace[1010393662]: [12.340503371s] [12.340503371s] END Dec 02 22:42:37 crc kubenswrapper[4696]: I1202 22:42:37.961541 4696 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 02 22:42:37 crc kubenswrapper[4696]: I1202 22:42:37.963101 4696 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Dec 02 22:42:37 crc kubenswrapper[4696]: I1202 22:42:37.966890 4696 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Dec 02 22:42:37 crc kubenswrapper[4696]: I1202 22:42:37.988476 4696 csr.go:261] certificate signing request csr-klhr4 is approved, waiting to be issued Dec 02 22:42:37 crc kubenswrapper[4696]: I1202 22:42:37.995938 4696 csr.go:257] certificate signing request csr-klhr4 is issued Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.365358 4696 apiserver.go:52] "Watching apiserver" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.386173 4696 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.386542 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf"] Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.387032 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.387154 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.387192 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.387225 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 02 22:42:38 crc kubenswrapper[4696]: E1202 22:42:38.387417 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.387505 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.389077 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:42:38 crc kubenswrapper[4696]: E1202 22:42:38.389690 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 22:42:38 crc kubenswrapper[4696]: E1202 22:42:38.389252 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.393549 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.393568 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.393552 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.393624 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.393643 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.393570 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.394006 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.394190 4696 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.394337 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.428459 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.446934 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.459373 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.471828 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.472273 4696 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.487077 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.498227 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.510941 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.567519 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.567991 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod 
"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.568308 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.568430 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.568469 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.568498 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.568525 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 02 22:42:38 crc 
kubenswrapper[4696]: I1202 22:42:38.568554 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.568582 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.568610 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.568643 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.568670 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.568700 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.568726 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.568778 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.568810 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.568949 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.568981 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: 
\"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.569007 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.569113 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.569104 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.569172 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.569196 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.569213 4696 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.569231 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.569250 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.569288 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.569305 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.569322 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: 
\"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.569339 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.569357 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.569374 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.569389 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.569407 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.569424 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.569441 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.569458 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.569474 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.569493 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.569510 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 02 22:42:38 crc 
kubenswrapper[4696]: I1202 22:42:38.569529 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.569544 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.569598 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.569616 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.569632 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.569650 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") 
pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.569666 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.569684 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.569700 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.569718 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.569758 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.569782 4696 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.569804 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.569128 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.569214 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.569443 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.573024 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.569530 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.569623 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.569688 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.569688 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: E1202 22:42:38.569835 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 22:42:39.069809744 +0000 UTC m=+21.950489755 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.569994 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.570083 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.570351 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.570398 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.570699 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.570731 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.570801 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.570839 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.571090 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.571501 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.571626 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.571834 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.572131 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.572159 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.572303 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.572387 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.573351 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.573361 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.573378 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.573431 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.573468 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.573511 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.573547 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.573577 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.573607 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.573640 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.573669 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.573696 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" 
(UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.573720 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.573767 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.573791 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.573815 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.573836 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 
22:42:38.572388 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.573862 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.573889 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.573917 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.573941 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.573968 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.573990 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.574011 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.574040 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.574078 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.574102 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 02 
22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.574124 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.574146 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.574167 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.574195 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.574216 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.574238 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.574261 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.574290 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.574316 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.574337 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.574359 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") 
" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.574386 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.574406 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.574427 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.574450 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.574472 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.574495 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: 
\"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.574523 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.574544 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.574569 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.574591 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.574611 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 02 22:42:38 crc 
kubenswrapper[4696]: I1202 22:42:38.574634 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.574654 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.574712 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.574736 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.575277 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.575301 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.575323 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.575346 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.575367 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.575417 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.575443 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 
22:42:38.575469 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.575495 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.575520 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.575542 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.575563 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.575584 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: 
\"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.575607 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.575632 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.575654 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.575680 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.575703 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.575724 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: 
\"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.575764 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.575788 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.575810 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.575832 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.575855 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 02 22:42:38 crc 
kubenswrapper[4696]: I1202 22:42:38.575878 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.575900 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.575922 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.575946 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.575970 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.575993 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.576019 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.576042 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.576065 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.576088 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.576111 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: 
\"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.576133 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.576162 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.576186 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.576208 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.576232 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.576256 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.576280 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.576304 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.576328 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.576350 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.576374 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.576403 4696 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.576427 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.576452 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.576478 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.576500 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.576525 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: 
\"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.576552 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.576575 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.576600 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.576623 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.576648 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.576673 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.576694 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.576717 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.576758 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.576785 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.576814 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.576839 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.576862 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.576886 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.576910 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.576933 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: 
\"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.576960 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.576987 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.577010 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.577036 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.577060 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.577098 4696 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.577123 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.577147 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.577169 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.577193 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.577223 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod 
\"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.577252 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.577280 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.577307 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.577334 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.577359 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.577385 4696 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.577410 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.577445 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.577472 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.577497 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.577522 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") 
pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.577548 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.577577 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.577600 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.577656 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.577695 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 
02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.577722 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.577771 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.577810 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.577838 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.577865 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" 
(UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.577899 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.577932 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.577963 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.577991 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.578018 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.578042 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.578068 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.578140 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.578157 4696 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.578174 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.578189 4696 reconciler_common.go:293] "Volume detached for 
volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.578203 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.578218 4696 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.578233 4696 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.578247 4696 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.578261 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.578275 4696 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.578290 4696 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.578303 4696 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.578319 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.578333 4696 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.578364 4696 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.578387 4696 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.578402 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.578417 4696 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.578431 4696 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.578446 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.578459 4696 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.578472 4696 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.578486 4696 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.578500 4696 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.578514 4696 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 22:42:38 
crc kubenswrapper[4696]: I1202 22:42:38.578530 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.578544 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.578558 4696 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.579871 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.572802 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.572927 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.572941 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.572976 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.573263 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.573456 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.572399 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.573614 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.573630 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.573832 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.573958 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.574012 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.574330 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.574696 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.575456 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.575613 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.575632 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.576138 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.576268 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.576633 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.576714 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.577180 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.577242 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.577274 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.583907 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.577538 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.577733 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.577784 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.577993 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.578041 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.578142 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.578404 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.578499 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.578628 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.578648 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.579134 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.579173 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.579283 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.580107 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.580143 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.580253 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.580274 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.580580 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.580658 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.581243 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.581299 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.582568 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.582616 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.582669 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.582694 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.583161 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.583222 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.583370 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.583527 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.583756 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.583914 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.584479 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.584622 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.584735 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.585031 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.585654 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.585724 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.585827 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.586263 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.586333 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.586416 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.586452 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.586662 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.586843 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.587029 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.587255 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.587312 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.587292 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.587372 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.587724 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.587780 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.587814 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.587912 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.587833 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.588342 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.588380 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.588448 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.588560 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.589282 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.589376 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.589398 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.589469 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: E1202 22:42:38.589574 4696 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 02 22:42:38 crc kubenswrapper[4696]: E1202 22:42:38.589666 4696 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.589701 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: E1202 22:42:38.589710 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 22:42:39.089679681 +0000 UTC m=+21.970359722 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 02 22:42:38 crc kubenswrapper[4696]: E1202 22:42:38.589809 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 22:42:39.089771364 +0000 UTC m=+21.970451375 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.589910 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.590201 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.590303 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.590334 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.590821 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.592787 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.590813 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.590872 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.590865 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.590979 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.591376 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.591709 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.590710 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.591855 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.592145 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.592654 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.592896 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.593217 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.593647 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.593920 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.594075 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.595126 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.595350 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.590834 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.591291 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.592688 4696 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.601777 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.602187 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.602344 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.602561 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.602643 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.602663 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.602877 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.602953 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.602978 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.604700 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.604907 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.606033 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.606979 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 02 22:42:38 crc kubenswrapper[4696]: E1202 22:42:38.607522 4696 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 02 22:42:38 crc kubenswrapper[4696]: E1202 22:42:38.607551 4696 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 22:42:38 crc kubenswrapper[4696]: E1202 22:42:38.607568 4696 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 22:42:38 crc kubenswrapper[4696]: E1202 22:42:38.607635 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-02 22:42:39.107613084 +0000 UTC m=+21.988293095 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.612875 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.614063 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.614225 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.615072 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.619262 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: E1202 22:42:38.619410 4696 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 02 22:42:38 crc kubenswrapper[4696]: E1202 22:42:38.619488 4696 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 22:42:38 crc kubenswrapper[4696]: E1202 22:42:38.619505 4696 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 22:42:38 crc kubenswrapper[4696]: E1202 22:42:38.619586 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr 
podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-02 22:42:39.119557749 +0000 UTC m=+22.000237740 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.620362 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.620147 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.620468 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.620922 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.620897 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.622243 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.622465 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.627199 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.631801 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.632150 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.632526 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.632701 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.631976 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.632866 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.633462 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.634014 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.634054 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.634292 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.634927 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.635278 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.635890 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.635984 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.638844 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.639957 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.640201 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.642425 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.642573 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.642783 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.642951 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.643057 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.643215 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.643279 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.643482 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.643882 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.645185 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.647008 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.651402 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.683511 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.683554 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.683596 4696 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.683610 4696 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.683621 4696 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.683630 4696 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.683639 4696 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.683648 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.683657 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.683668 4696 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.683676 4696 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.683686 4696 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.683694 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.683704 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.683713 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.683722 4696 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.683731 4696 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.683759 4696 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.683768 4696 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.683777 4696 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.683787 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.683795 4696 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.683804 4696 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.683818 4696 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.683827 4696 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.683836 4696 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.683847 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.683857 4696 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.683866 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.683875 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.683883 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.683892 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.683901 4696 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.683909 4696 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.683918 4696 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.683927 4696 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.683939 4696 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.683949 4696 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.683961 4696 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.683972 4696 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.683984 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.683993 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.684003 4696 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.684013 4696 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.684022 4696 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.684031 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.684040 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.684049 4696 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.684058 4696 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.684067 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.684076 4696 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.684085 4696 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.684094 4696 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.684102 4696 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.684111 4696 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.684121 4696 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.684130 4696 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.684139 4696 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.684148 4696 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.684157 4696 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.684165 4696 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.684175 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.684186 4696 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.684196 4696 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.684204 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.684213 4696 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.684222 4696 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.684231 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.684241 4696 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.684249 4696 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.684258 4696 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.684266 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.684275 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.684284 4696 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.684292 4696 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.684302 4696 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.684311 4696 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.684321 4696 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.684332 4696 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.684341 4696 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.684352 4696 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.684360 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.684369 4696 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.684378 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.684387 4696 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.684395 4696 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.684404 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.684412 4696 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.684420 4696 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.684429 4696 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.684438 4696 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.684446 4696 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.684455 4696 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.684464 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.684476 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.684485 4696 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.684494 4696 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.684502 4696 reconciler_common.go:293] "Volume detached
for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.684511 4696 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.684519 4696 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.684527 4696 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.684536 4696 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.684545 4696 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.684555 4696 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.684564 4696 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.684573 4696 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.684581 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.684590 4696 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.684601 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.684610 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.684618 4696 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.684629 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: 
\"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.684638 4696 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.684647 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.684656 4696 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.684664 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.684673 4696 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.684681 4696 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.684690 4696 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Dec 02 22:42:38 
crc kubenswrapper[4696]: I1202 22:42:38.684698 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.684707 4696 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.684716 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.684724 4696 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.684732 4696 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.684755 4696 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.684763 4696 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.684773 4696 reconciler_common.go:293] "Volume detached for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.684782 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.684791 4696 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.684799 4696 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.684808 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.684817 4696 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.684825 4696 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.684835 4696 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Dec 02 
22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.684844 4696 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.684852 4696 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.684862 4696 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.684871 4696 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.684879 4696 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.684889 4696 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.684897 4696 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.684907 4696 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.684916 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.684926 4696 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.684935 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.684944 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.684953 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.684963 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.684971 4696 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node 
\"crc\" DevicePath \"\"" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.684980 4696 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.684988 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.684999 4696 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.685007 4696 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.685016 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.685024 4696 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.685033 4696 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" 
Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.685042 4696 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.685052 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.685060 4696 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.685068 4696 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.685076 4696 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.685084 4696 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.685093 4696 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.685101 4696 reconciler_common.go:293] "Volume detached for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.685205 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.685377 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.697977 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.703842 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.705113 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.709691 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.712723 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.712820 4696 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="6268352b995d99d383ab500b5ccd80b0612cf33716a752aeb1d92cb775a1971e" exitCode=255 Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.712866 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"6268352b995d99d383ab500b5ccd80b0612cf33716a752aeb1d92cb775a1971e"} Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.717258 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.722533 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.724050 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.736611 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.746601 4696 scope.go:117] "RemoveContainer" containerID="6268352b995d99d383ab500b5ccd80b0612cf33716a752aeb1d92cb775a1971e" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.747205 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.755920 4696 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.780369 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.786381 4696 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.786415 4696 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.786426 4696 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.786440 4696 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.800056 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.814127 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.826803 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.998137 4696 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-12-02 22:37:37 +0000 UTC, rotation deadline is 2026-09-09 02:08:55.394176594 +0000 UTC Dec 02 22:42:38 crc kubenswrapper[4696]: I1202 22:42:38.998233 4696 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6723h26m16.395945865s for next certificate rotation Dec 02 22:42:39 crc kubenswrapper[4696]: I1202 22:42:39.089385 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 22:42:39 crc kubenswrapper[4696]: E1202 22:42:39.089674 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-02 22:42:40.089636542 +0000 UTC m=+22.970316543 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:42:39 crc kubenswrapper[4696]: I1202 22:42:39.166970 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Dec 02 22:42:39 crc kubenswrapper[4696]: I1202 22:42:39.184090 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Dec 02 22:42:39 crc kubenswrapper[4696]: I1202 22:42:39.184254 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 22:42:39 crc kubenswrapper[4696]: I1202 22:42:39.189953 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:42:39 crc kubenswrapper[4696]: I1202 22:42:39.189996 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:42:39 crc kubenswrapper[4696]: I1202 22:42:39.190017 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:42:39 crc kubenswrapper[4696]: I1202 22:42:39.190043 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:42:39 crc kubenswrapper[4696]: E1202 22:42:39.190169 4696 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 02 22:42:39 crc kubenswrapper[4696]: E1202 22:42:39.190172 4696 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 02 22:42:39 crc kubenswrapper[4696]: E1202 22:42:39.190230 4696 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 02 22:42:39 crc kubenswrapper[4696]: E1202 22:42:39.190185 4696 projected.go:288] Couldn't get configMap 
openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 22:42:39 crc kubenswrapper[4696]: E1202 22:42:39.190328 4696 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 22:42:39 crc kubenswrapper[4696]: E1202 22:42:39.190345 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 22:42:40.190306405 +0000 UTC m=+23.070986446 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 02 22:42:39 crc kubenswrapper[4696]: E1202 22:42:39.190382 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-02 22:42:40.190365927 +0000 UTC m=+23.071045928 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 22:42:39 crc kubenswrapper[4696]: E1202 22:42:39.190402 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 22:42:40.190394537 +0000 UTC m=+23.071074538 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 02 22:42:39 crc kubenswrapper[4696]: I1202 22:42:39.190859 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Dec 02 22:42:39 crc kubenswrapper[4696]: E1202 22:42:39.191009 4696 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 02 22:42:39 crc kubenswrapper[4696]: E1202 22:42:39.191024 4696 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 22:42:39 crc kubenswrapper[4696]: E1202 22:42:39.191038 4696 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod 
openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 22:42:39 crc kubenswrapper[4696]: E1202 22:42:39.191305 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-02 22:42:40.191282742 +0000 UTC m=+23.071962963 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 22:42:39 crc kubenswrapper[4696]: I1202 22:42:39.203507 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 22:42:39 crc kubenswrapper[4696]: I1202 22:42:39.232223 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 22:42:39 crc kubenswrapper[4696]: I1202 22:42:39.270885 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 22:42:39 crc kubenswrapper[4696]: I1202 22:42:39.305098 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 22:42:39 crc kubenswrapper[4696]: I1202 22:42:39.317878 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"158d80a5-8d90-46bb-83e7-ab3263f6d28f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3112751f86a62634d02f281d467d3eeb27878e27c245871210cca37eaeb6e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce744ea765d5bbcb0a9926f439a0256774c16863344f23ddfb0074b4ccbdb4ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1108d710330b056e2228205b2012df070ffbd660701ddd577eafbf0e99c3bf9f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6268352b995d99d383ab500b5ccd80b0612cf33716a752aeb1d92cb775a1971e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6268352b995d99d383ab500b5ccd80b0612cf33716a752aeb1d92cb775a1971e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"le observer\\\\nW1202 22:42:37.965183 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 22:42:37.965319 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 22:42:37.967400 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2369937012/tls.crt::/tmp/serving-cert-2369937012/tls.key\\\\\\\"\\\\nI1202 22:42:38.329833 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 22:42:38.331704 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 22:42:38.331724 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 22:42:38.331766 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 22:42:38.331782 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 22:42:38.336692 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 22:42:38.336734 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:42:38.336761 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:42:38.336769 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 22:42:38.336775 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 22:42:38.336779 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 22:42:38.336783 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 22:42:38.337216 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 22:42:38.338194 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://666f81c81a7cad67c01b2aebf2d05f162c33feea27f64c12f92761d6d2a6056e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055bef75732eec4908f0baf8a52cd3921fab03cf88e36d2e0d442344fb1af0c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://055bef75732eec4908f0baf8a52cd3921fab03cf88e36d2e0d442344fb1af0c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 22:42:39 crc kubenswrapper[4696]: I1202 22:42:39.331642 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 22:42:39 crc kubenswrapper[4696]: I1202 22:42:39.345486 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 22:42:39 crc kubenswrapper[4696]: I1202 22:42:39.365078 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3172221-5c4c-4409-b50f-5cd21d8b5030\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f25e7ebcca1d904cd0940a660fd97c6b4a3540fd63700abd0ad973e9f9c8f764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08fc65f2c8162f27bc40cd12b5801ca21c778313271f2cb5dd251ca4312842d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba390cca9997cf469ae88a4895532df556f0b947ce7b34ef7e1992ee236a2e16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0ccd6eaff0fa7ac496aadc31afffd6e7fee2776e9cfd1652af85fcec268d895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b2deed2650220f1d607cbb667b26879d3a8373146244feb486c882032f344b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a565cb3800c1a50f784c327719158de2f6f29d20a33670478b97c1c360e0f6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a565cb3800c1a50f784c327719158de2f6f29d20a33670478b97c1c360e0f6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-02T22:42:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29a2b9cc76fc45b6874d85b2ab92165a02804807837501ac60190c232bab928d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29a2b9cc76fc45b6874d85b2ab92165a02804807837501ac60190c232bab928d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fb3311ccf2a924605464d657193627798dad26a232fab0a2091fb2c387f3bd0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb3311ccf2a924605464d657193627798dad26a232fab0a2091fb2c387f3bd0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 22:42:39 crc kubenswrapper[4696]: I1202 22:42:39.379713 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"158d80a5-8d90-46bb-83e7-ab3263f6d28f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3112751f86a62634d02f281d467d3eeb27878e27c245871210cca37eaeb6e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce744ea765d5bbcb0a9926f439a0256774c16863344f23ddfb0074b4ccbdb4ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://1108d710330b056e2228205b2012df070ffbd660701ddd577eafbf0e99c3bf9f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6268352b995d99d383ab500b5ccd80b0612cf33716a752aeb1d92cb775a1971e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6268352b995d99d383ab500b5ccd80b0612cf33716a752aeb1d92cb775a1971e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"le observer\\\\nW1202 22:42:37.965183 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 22:42:37.965319 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 22:42:37.967400 1 dynamic_serving_content.go:116] \\\\\\\"Loaded 
a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2369937012/tls.crt::/tmp/serving-cert-2369937012/tls.key\\\\\\\"\\\\nI1202 22:42:38.329833 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 22:42:38.331704 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 22:42:38.331724 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 22:42:38.331766 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 22:42:38.331782 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 22:42:38.336692 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 22:42:38.336734 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:42:38.336761 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:42:38.336769 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 22:42:38.336775 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 22:42:38.336779 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 22:42:38.336783 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 22:42:38.337216 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 22:42:38.338194 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://666f81c81a7cad67c01b2aebf2d05f162c33feea27f64c12f92761d6d2a6056e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055bef75732eec4908f0baf8a52cd3921fab03cf88e36d2e0d442344fb1af0c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://055bef75732eec4908f0baf8a52cd3921fab03cf88e36d2e0d442344fb1af0c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 22:42:39 crc kubenswrapper[4696]: I1202 22:42:39.394944 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 22:42:39 crc kubenswrapper[4696]: I1202 22:42:39.408806 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 22:42:39 crc kubenswrapper[4696]: I1202 22:42:39.428663 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 22:42:39 crc kubenswrapper[4696]: I1202 22:42:39.436492 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Dec 02 22:42:39 crc kubenswrapper[4696]: I1202 22:42:39.437683 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Dec 02 22:42:39 crc kubenswrapper[4696]: I1202 22:42:39.440237 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Dec 02 22:42:39 crc kubenswrapper[4696]: I1202 22:42:39.441661 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Dec 02 22:42:39 crc kubenswrapper[4696]: I1202 22:42:39.443919 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Dec 02 22:42:39 crc kubenswrapper[4696]: I1202 22:42:39.445098 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Dec 02 22:42:39 crc kubenswrapper[4696]: I1202 22:42:39.446497 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Dec 02 22:42:39 crc kubenswrapper[4696]: I1202 22:42:39.446581 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Dec 02 22:42:39 crc kubenswrapper[4696]: I1202 22:42:39.450504 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Dec 02 22:42:39 crc kubenswrapper[4696]: I1202 22:42:39.452091 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Dec 02 22:42:39 crc kubenswrapper[4696]: I1202 22:42:39.454176 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Dec 02 22:42:39 crc kubenswrapper[4696]: I1202 22:42:39.455345 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Dec 02 22:42:39 crc kubenswrapper[4696]: I1202 22:42:39.457623 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Dec 02 22:42:39 crc kubenswrapper[4696]: I1202 22:42:39.458691 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Dec 02 22:42:39 crc kubenswrapper[4696]: I1202 22:42:39.459275 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 22:42:39 crc kubenswrapper[4696]: I1202 22:42:39.459410 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Dec 02 22:42:39 crc kubenswrapper[4696]: I1202 22:42:39.461025 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Dec 02 22:42:39 crc kubenswrapper[4696]: I1202 22:42:39.461721 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Dec 02 22:42:39 crc kubenswrapper[4696]: I1202 22:42:39.462480 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" 
path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Dec 02 22:42:39 crc kubenswrapper[4696]: I1202 22:42:39.463500 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Dec 02 22:42:39 crc kubenswrapper[4696]: I1202 22:42:39.464270 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Dec 02 22:42:39 crc kubenswrapper[4696]: I1202 22:42:39.465580 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Dec 02 22:42:39 crc kubenswrapper[4696]: I1202 22:42:39.466169 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Dec 02 22:42:39 crc kubenswrapper[4696]: I1202 22:42:39.466895 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Dec 02 22:42:39 crc kubenswrapper[4696]: I1202 22:42:39.467964 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Dec 02 22:42:39 crc kubenswrapper[4696]: I1202 22:42:39.468760 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Dec 02 22:42:39 crc kubenswrapper[4696]: I1202 22:42:39.469874 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" 
path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Dec 02 22:42:39 crc kubenswrapper[4696]: I1202 22:42:39.470568 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Dec 02 22:42:39 crc kubenswrapper[4696]: I1202 22:42:39.471817 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Dec 02 22:42:39 crc kubenswrapper[4696]: I1202 22:42:39.472391 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Dec 02 22:42:39 crc kubenswrapper[4696]: I1202 22:42:39.473265 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Dec 02 22:42:39 crc kubenswrapper[4696]: I1202 22:42:39.474232 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Dec 02 22:42:39 crc kubenswrapper[4696]: I1202 22:42:39.474679 4696 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Dec 02 22:42:39 crc kubenswrapper[4696]: I1202 22:42:39.474792 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Dec 02 22:42:39 crc kubenswrapper[4696]: I1202 22:42:39.476815 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Dec 02 22:42:39 crc kubenswrapper[4696]: I1202 22:42:39.477439 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Dec 02 22:42:39 crc kubenswrapper[4696]: I1202 22:42:39.477902 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Dec 02 22:42:39 crc kubenswrapper[4696]: I1202 22:42:39.480077 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Dec 02 22:42:39 crc kubenswrapper[4696]: I1202 22:42:39.480686 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Dec 02 22:42:39 crc kubenswrapper[4696]: I1202 22:42:39.481674 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Dec 02 22:42:39 crc kubenswrapper[4696]: I1202 22:42:39.482351 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Dec 02 22:42:39 crc kubenswrapper[4696]: I1202 22:42:39.483375 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Dec 02 22:42:39 crc kubenswrapper[4696]: I1202 22:42:39.484166 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Dec 02 22:42:39 crc kubenswrapper[4696]: I1202 22:42:39.484906 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Dec 02 22:42:39 crc kubenswrapper[4696]: I1202 22:42:39.486079 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Dec 02 22:42:39 crc kubenswrapper[4696]: I1202 22:42:39.487035 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Dec 02 22:42:39 crc kubenswrapper[4696]: I1202 22:42:39.487514 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Dec 02 22:42:39 crc kubenswrapper[4696]: I1202 22:42:39.488571 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Dec 02 22:42:39 crc kubenswrapper[4696]: I1202 22:42:39.489367 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Dec 02 22:42:39 crc kubenswrapper[4696]: I1202 22:42:39.490590 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Dec 02 22:42:39 crc kubenswrapper[4696]: I1202 22:42:39.491142 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Dec 02 22:42:39 crc kubenswrapper[4696]: I1202 22:42:39.492138 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Dec 02 22:42:39 crc kubenswrapper[4696]: I1202 22:42:39.492709 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Dec 02 22:42:39 crc kubenswrapper[4696]: I1202 22:42:39.493287 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Dec 02 22:42:39 crc kubenswrapper[4696]: I1202 22:42:39.494292 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Dec 02 22:42:39 crc kubenswrapper[4696]: I1202 22:42:39.494888 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Dec 02 22:42:39 crc kubenswrapper[4696]: I1202 22:42:39.716707 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"36f70ca25f5a8db42227aa92ad1b004055d98eb689e21069d00bdf792a254f45"} Dec 02 22:42:39 crc kubenswrapper[4696]: I1202 22:42:39.718597 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"f1db9cf23af7fc9ff1373908e11df0bea7d6e03cd3ed18947cc583fc09a79a05"} Dec 02 22:42:39 crc kubenswrapper[4696]: I1202 22:42:39.718681 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"eb2650f3baf4a1c0b22fadf91b6bdf0a086a1fbb677d55ee1f3e2ffe5c56e827"} Dec 02 22:42:39 crc kubenswrapper[4696]: I1202 22:42:39.720487 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 02 22:42:39 crc kubenswrapper[4696]: I1202 22:42:39.722507 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"9d92ce9d53d9367503c76c5fbb45f340a5ddd93f3c8e1738ac434fe7b0210d1d"} Dec 02 22:42:39 crc kubenswrapper[4696]: I1202 22:42:39.722794 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 22:42:39 crc kubenswrapper[4696]: I1202 22:42:39.724461 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"ce2fa3867298545942c4a2d2e68899c61a4f99a64b0974e75ef86851074ddfce"} Dec 02 22:42:39 crc kubenswrapper[4696]: I1202 22:42:39.724486 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"4db552c4bc9b395f977cb4c1b1b7fcbb1c7e9f326d81a7903d88eab180bcfb3b"} Dec 02 22:42:39 crc kubenswrapper[4696]: I1202 22:42:39.724497 4696 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"58011b3252f3ccf8d0a6761302d8a0dab7d0458a6f4480e02a5595200c53e9d2"} Dec 02 22:42:39 crc kubenswrapper[4696]: I1202 22:42:39.753319 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"158d80a5-8d90-46bb-83e7-ab3263f6d28f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3112751f86a62634d02f281d467d3eeb27878e27c245871210cca37eaeb6e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce744ea765d5bbcb0a9926f439a0256774c16863344f23ddfb0074b4ccbdb4ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://1108d710330b056e2228205b2012df070ffbd660701ddd577eafbf0e99c3bf9f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6268352b995d99d383ab500b5ccd80b0612cf33716a752aeb1d92cb775a1971e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6268352b995d99d383ab500b5ccd80b0612cf33716a752aeb1d92cb775a1971e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"le observer\\\\nW1202 22:42:37.965183 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 22:42:37.965319 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 22:42:37.967400 1 dynamic_serving_content.go:116] \\\\\\\"Loaded 
a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2369937012/tls.crt::/tmp/serving-cert-2369937012/tls.key\\\\\\\"\\\\nI1202 22:42:38.329833 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 22:42:38.331704 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 22:42:38.331724 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 22:42:38.331766 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 22:42:38.331782 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 22:42:38.336692 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 22:42:38.336734 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:42:38.336761 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:42:38.336769 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 22:42:38.336775 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 22:42:38.336779 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 22:42:38.336783 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 22:42:38.337216 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 22:42:38.338194 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://666f81c81a7cad67c01b2aebf2d05f162c33feea27f64c12f92761d6d2a6056e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055bef75732eec4908f0baf8a52cd3921fab03cf88e36d2e0d442344fb1af0c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://055bef75732eec4908f0baf8a52cd3921fab03cf88e36d2e0d442344fb1af0c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:39Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:39 crc kubenswrapper[4696]: I1202 22:42:39.771928 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:39Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:39 crc kubenswrapper[4696]: I1202 22:42:39.822163 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3172221-5c4c-4409-b50f-5cd21d8b5030\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f25e7ebcca1d904cd0940a660fd97c6b4a3540fd63700abd0ad973e9f9c8f764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08fc65f2c8162f27bc40cd12b5801ca21c778313271f2cb5dd251ca4312842d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba390cca9997cf469ae88a4895532df556f0b947ce7b34ef7e1992ee236a2e16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0ccd6eaff0fa7ac496aadc31afffd6e7fee2776e9cfd1652af85fcec268d895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b2deed2650220f1d607cbb667b26879d3a8373146244feb486c882032f344b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a565cb3800c1a50f784c327719158de2f6f29d20a33670478b97c1c360e0f6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a565cb3800c1a50f784c327719158de2f6f29d20a33670478b97c1c360e0f6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-02T22:42:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29a2b9cc76fc45b6874d85b2ab92165a02804807837501ac60190c232bab928d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29a2b9cc76fc45b6874d85b2ab92165a02804807837501ac60190c232bab928d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fb3311ccf2a924605464d657193627798dad26a232fab0a2091fb2c387f3bd0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb3311ccf2a924605464d657193627798dad26a232fab0a2091fb2c387f3bd0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:39Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:39 crc kubenswrapper[4696]: I1202 22:42:39.845576 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:39Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:39 crc kubenswrapper[4696]: I1202 22:42:39.890425 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:39Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:39 crc kubenswrapper[4696]: I1202 22:42:39.925196 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:39Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:39 crc kubenswrapper[4696]: I1202 22:42:39.990593 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1db9cf23af7fc9ff1373908e11df0bea7d6e03cd3ed18947cc583fc09a79a05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:39Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.030551 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:40Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.076007 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3172221-5c4c-4409-b50f-5cd21d8b5030\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f25e7ebcca1d904cd0940a660fd97c6b4a3540fd63700abd0ad973e9f9c8f764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08fc65f2c8162f27bc40cd12b5801ca21c778313271f2cb5dd251ca4312842d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba390cca9997cf469ae88a4895532df556f0b947ce7b34ef7e1992ee236a2e16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0ccd6eaff0fa7ac496aadc31afffd6e7fee2776e9cfd1652af85fcec268d895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b2deed2650220f1d607cbb667b26879d3a8373146244feb486c882032f344b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a565cb3800c1a50f784c327719158de2f6f29d20a33670478b97c1c360e0f6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a565cb3800c1a50f784c327719158de2f6f29d20a33670478b97c1c360e0f6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-02T22:42:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29a2b9cc76fc45b6874d85b2ab92165a02804807837501ac60190c232bab928d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29a2b9cc76fc45b6874d85b2ab92165a02804807837501ac60190c232bab928d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fb3311ccf2a924605464d657193627798dad26a232fab0a2091fb2c387f3bd0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb3311ccf2a924605464d657193627798dad26a232fab0a2091fb2c387f3bd0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:40Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.099985 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 22:42:40 crc kubenswrapper[4696]: E1202 22:42:40.100239 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 22:42:42.100192602 +0000 UTC m=+24.980872603 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.102522 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"158d80a5-8d90-46bb-83e7-ab3263f6d28f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3112751f86a62634d02f281d467d3eeb27878e27c245871210cca37eaeb6e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce744ea765d5bbcb0a9926f439a0256774c16863344f23ddfb0074b4ccbdb4ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://1108d710330b056e2228205b2012df070ffbd660701ddd577eafbf0e99c3bf9f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d92ce9d53d9367503c76c5fbb45f340a5ddd93f3c8e1738ac434fe7b0210d1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6268352b995d99d383ab500b5ccd80b0612cf33716a752aeb1d92cb775a1971e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"le observer\\\\nW1202 22:42:37.965183 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 22:42:37.965319 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 22:42:37.967400 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2369937012/tls.crt::/tmp/serving-cert-2369937012/tls.key\\\\\\\"\\\\nI1202 22:42:38.329833 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 22:42:38.331704 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 22:42:38.331724 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 22:42:38.331766 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 22:42:38.331782 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 22:42:38.336692 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 22:42:38.336734 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:42:38.336761 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:42:38.336769 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 22:42:38.336775 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 22:42:38.336779 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 22:42:38.336783 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 22:42:38.337216 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 22:42:38.338194 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://666f81c81a7cad67c01b2aebf2d05f162c33feea27f64c12f92761d6d2a6056e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055bef75732eec4908f0baf8a52cd3921fab03cf88e36d2e0d442344fb1af0c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://055bef75732eec4908f0baf8a52cd3921fab03cf88e36d2e0d442344fb1af0c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:40Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.121598 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:40Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.125820 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-f57qk"] Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.126129 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-wthxr"] Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.126326 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-f57qk" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.126362 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-wthxr" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.129091 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.129502 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.129836 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.130030 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.130366 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.130541 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.130564 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.132054 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.143686 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1db9cf23af7fc9ff1373908e11df0bea7d6e03cd3ed18947cc583fc09a79a05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:40Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.164767 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:40Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.202825 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.202890 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/86a37d2a-37c5-4fbd-b10b-f5e4706772f4-system-cni-dir\") pod \"multus-wthxr\" (UID: \"86a37d2a-37c5-4fbd-b10b-f5e4706772f4\") " pod="openshift-multus/multus-wthxr" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.202922 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/86a37d2a-37c5-4fbd-b10b-f5e4706772f4-cni-binary-copy\") pod \"multus-wthxr\" (UID: \"86a37d2a-37c5-4fbd-b10b-f5e4706772f4\") " pod="openshift-multus/multus-wthxr" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.202952 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/86a37d2a-37c5-4fbd-b10b-f5e4706772f4-host-run-netns\") pod \"multus-wthxr\" (UID: \"86a37d2a-37c5-4fbd-b10b-f5e4706772f4\") " pod="openshift-multus/multus-wthxr" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.202982 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/cf8ea236-9be9-4eb2-904a-103c4c279f28-hosts-file\") pod \"node-resolver-f57qk\" (UID: \"cf8ea236-9be9-4eb2-904a-103c4c279f28\") " pod="openshift-dns/node-resolver-f57qk" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.203005 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/86a37d2a-37c5-4fbd-b10b-f5e4706772f4-multus-socket-dir-parent\") pod \"multus-wthxr\" (UID: \"86a37d2a-37c5-4fbd-b10b-f5e4706772f4\") " pod="openshift-multus/multus-wthxr" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.203037 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.203063 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" 
(UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.203084 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/86a37d2a-37c5-4fbd-b10b-f5e4706772f4-host-var-lib-kubelet\") pod \"multus-wthxr\" (UID: \"86a37d2a-37c5-4fbd-b10b-f5e4706772f4\") " pod="openshift-multus/multus-wthxr" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.203106 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/86a37d2a-37c5-4fbd-b10b-f5e4706772f4-multus-daemon-config\") pod \"multus-wthxr\" (UID: \"86a37d2a-37c5-4fbd-b10b-f5e4706772f4\") " pod="openshift-multus/multus-wthxr" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.203128 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/86a37d2a-37c5-4fbd-b10b-f5e4706772f4-multus-conf-dir\") pod \"multus-wthxr\" (UID: \"86a37d2a-37c5-4fbd-b10b-f5e4706772f4\") " pod="openshift-multus/multus-wthxr" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.203151 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/86a37d2a-37c5-4fbd-b10b-f5e4706772f4-host-run-multus-certs\") pod \"multus-wthxr\" (UID: \"86a37d2a-37c5-4fbd-b10b-f5e4706772f4\") " pod="openshift-multus/multus-wthxr" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.203178 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.203205 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/86a37d2a-37c5-4fbd-b10b-f5e4706772f4-host-var-lib-cni-bin\") pod \"multus-wthxr\" (UID: \"86a37d2a-37c5-4fbd-b10b-f5e4706772f4\") " pod="openshift-multus/multus-wthxr" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.203226 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/86a37d2a-37c5-4fbd-b10b-f5e4706772f4-host-var-lib-cni-multus\") pod \"multus-wthxr\" (UID: \"86a37d2a-37c5-4fbd-b10b-f5e4706772f4\") " pod="openshift-multus/multus-wthxr" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.203250 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/86a37d2a-37c5-4fbd-b10b-f5e4706772f4-hostroot\") pod \"multus-wthxr\" (UID: \"86a37d2a-37c5-4fbd-b10b-f5e4706772f4\") " pod="openshift-multus/multus-wthxr" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.203271 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/86a37d2a-37c5-4fbd-b10b-f5e4706772f4-cnibin\") pod \"multus-wthxr\" (UID: \"86a37d2a-37c5-4fbd-b10b-f5e4706772f4\") " pod="openshift-multus/multus-wthxr" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.203291 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/86a37d2a-37c5-4fbd-b10b-f5e4706772f4-host-run-k8s-cni-cncf-io\") pod \"multus-wthxr\" (UID: \"86a37d2a-37c5-4fbd-b10b-f5e4706772f4\") " pod="openshift-multus/multus-wthxr" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.203315 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxprw\" (UniqueName: \"kubernetes.io/projected/cf8ea236-9be9-4eb2-904a-103c4c279f28-kube-api-access-vxprw\") pod \"node-resolver-f57qk\" (UID: \"cf8ea236-9be9-4eb2-904a-103c4c279f28\") " pod="openshift-dns/node-resolver-f57qk" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.203337 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/86a37d2a-37c5-4fbd-b10b-f5e4706772f4-multus-cni-dir\") pod \"multus-wthxr\" (UID: \"86a37d2a-37c5-4fbd-b10b-f5e4706772f4\") " pod="openshift-multus/multus-wthxr" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.203360 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/86a37d2a-37c5-4fbd-b10b-f5e4706772f4-os-release\") pod \"multus-wthxr\" (UID: \"86a37d2a-37c5-4fbd-b10b-f5e4706772f4\") " pod="openshift-multus/multus-wthxr" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.203387 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/86a37d2a-37c5-4fbd-b10b-f5e4706772f4-etc-kubernetes\") pod \"multus-wthxr\" (UID: \"86a37d2a-37c5-4fbd-b10b-f5e4706772f4\") " pod="openshift-multus/multus-wthxr" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.203407 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqg5z\" (UniqueName: 
\"kubernetes.io/projected/86a37d2a-37c5-4fbd-b10b-f5e4706772f4-kube-api-access-xqg5z\") pod \"multus-wthxr\" (UID: \"86a37d2a-37c5-4fbd-b10b-f5e4706772f4\") " pod="openshift-multus/multus-wthxr" Dec 02 22:42:40 crc kubenswrapper[4696]: E1202 22:42:40.203567 4696 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 02 22:42:40 crc kubenswrapper[4696]: E1202 22:42:40.203591 4696 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 22:42:40 crc kubenswrapper[4696]: E1202 22:42:40.203605 4696 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 22:42:40 crc kubenswrapper[4696]: E1202 22:42:40.203655 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-02 22:42:42.203635973 +0000 UTC m=+25.084315984 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 22:42:40 crc kubenswrapper[4696]: E1202 22:42:40.203767 4696 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 02 22:42:40 crc kubenswrapper[4696]: E1202 22:42:40.203803 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 22:42:42.203792587 +0000 UTC m=+25.084472598 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 02 22:42:40 crc kubenswrapper[4696]: E1202 22:42:40.203863 4696 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 02 22:42:40 crc kubenswrapper[4696]: E1202 22:42:40.203876 4696 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 22:42:40 crc kubenswrapper[4696]: E1202 22:42:40.203887 4696 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 22:42:40 crc kubenswrapper[4696]: E1202 22:42:40.203912 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-02 22:42:42.20390442 +0000 UTC m=+25.084584431 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 22:42:40 crc kubenswrapper[4696]: E1202 22:42:40.203993 4696 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 02 22:42:40 crc kubenswrapper[4696]: E1202 22:42:40.204023 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 22:42:42.204014193 +0000 UTC m=+25.084694204 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.207330 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce2fa3867298545942c4a2d2e68899c61a4f99a64b0974e75ef86851074ddfce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mo
untPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4db552c4bc9b395f977cb4c1b1b7fcbb1c7e9f326d81a7903d88eab180bcfb3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:40Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.219478 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:40Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.232264 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:40Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.253240 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3172221-5c4c-4409-b50f-5cd21d8b5030\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f25e7ebcca1d904cd0940a660fd97c6b4a3540fd63700abd0ad973e9f9c8f764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08fc65f2c8162f27bc40cd12b5801ca21c778313271f2cb5dd251ca4312842d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba390cca9997cf469ae88a4895532df556f0b947ce7b34ef7e1992ee236a2e16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0ccd6eaff0fa7ac496aadc31afffd6e7fee2776e9cfd1652af85fcec268d895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b2deed2650220f1d607cbb667b26879d3a8373146244feb486c882032f344b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a565cb3800c1a50f784c327719158de2f6f29d20a33670478b97c1c360e0f6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a565cb3800c1a50f784c327719158de2f6f29d20a33670478b97c1c360e0f6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-02T22:42:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29a2b9cc76fc45b6874d85b2ab92165a02804807837501ac60190c232bab928d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29a2b9cc76fc45b6874d85b2ab92165a02804807837501ac60190c232bab928d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fb3311ccf2a924605464d657193627798dad26a232fab0a2091fb2c387f3bd0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb3311ccf2a924605464d657193627798dad26a232fab0a2091fb2c387f3bd0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:40Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.272421 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"158d80a5-8d90-46bb-83e7-ab3263f6d28f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3112751f86a62634d02f281d467d3eeb27878e27c245871210cca37eaeb6e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce744ea765d5bbcb0a9926f439a0256774c16863344f23ddfb0074b4ccbdb4ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://1108d710330b056e2228205b2012df070ffbd660701ddd577eafbf0e99c3bf9f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d92ce9d53d9367503c76c5fbb45f340a5ddd93f3c8e1738ac434fe7b0210d1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6268352b995d99d383ab500b5ccd80b0612cf33716a752aeb1d92cb775a1971e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"le observer\\\\nW1202 22:42:37.965183 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 22:42:37.965319 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 22:42:37.967400 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2369937012/tls.crt::/tmp/serving-cert-2369937012/tls.key\\\\\\\"\\\\nI1202 22:42:38.329833 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 22:42:38.331704 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 22:42:38.331724 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 22:42:38.331766 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 22:42:38.331782 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 22:42:38.336692 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 22:42:38.336734 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:42:38.336761 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:42:38.336769 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 22:42:38.336775 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 22:42:38.336779 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 22:42:38.336783 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 22:42:38.337216 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 22:42:38.338194 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://666f81c81a7cad67c01b2aebf2d05f162c33feea27f64c12f92761d6d2a6056e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055bef75732eec4908f0baf8a52cd3921fab03cf88e36d2e0d442344fb1af0c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://055bef75732eec4908f0baf8a52cd3921fab03cf88e36d2e0d442344fb1af0c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:40Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.290471 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:40Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.302727 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f57qk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf8ea236-9be9-4eb2-904a-103c4c279f28\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vxprw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f57qk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:40Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.303841 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/86a37d2a-37c5-4fbd-b10b-f5e4706772f4-multus-socket-dir-parent\") pod \"multus-wthxr\" (UID: \"86a37d2a-37c5-4fbd-b10b-f5e4706772f4\") " pod="openshift-multus/multus-wthxr" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.303893 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/86a37d2a-37c5-4fbd-b10b-f5e4706772f4-host-var-lib-kubelet\") pod \"multus-wthxr\" (UID: \"86a37d2a-37c5-4fbd-b10b-f5e4706772f4\") " pod="openshift-multus/multus-wthxr" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.303914 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/86a37d2a-37c5-4fbd-b10b-f5e4706772f4-multus-daemon-config\") pod \"multus-wthxr\" (UID: \"86a37d2a-37c5-4fbd-b10b-f5e4706772f4\") " pod="openshift-multus/multus-wthxr" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.303942 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/86a37d2a-37c5-4fbd-b10b-f5e4706772f4-host-var-lib-cni-bin\") pod \"multus-wthxr\" (UID: \"86a37d2a-37c5-4fbd-b10b-f5e4706772f4\") " pod="openshift-multus/multus-wthxr" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.303971 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/86a37d2a-37c5-4fbd-b10b-f5e4706772f4-host-var-lib-cni-multus\") pod \"multus-wthxr\" (UID: \"86a37d2a-37c5-4fbd-b10b-f5e4706772f4\") " pod="openshift-multus/multus-wthxr" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.303981 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/86a37d2a-37c5-4fbd-b10b-f5e4706772f4-host-var-lib-kubelet\") pod \"multus-wthxr\" (UID: \"86a37d2a-37c5-4fbd-b10b-f5e4706772f4\") " pod="openshift-multus/multus-wthxr" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.304030 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/86a37d2a-37c5-4fbd-b10b-f5e4706772f4-host-var-lib-cni-bin\") pod \"multus-wthxr\" (UID: 
\"86a37d2a-37c5-4fbd-b10b-f5e4706772f4\") " pod="openshift-multus/multus-wthxr" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.304036 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/86a37d2a-37c5-4fbd-b10b-f5e4706772f4-hostroot\") pod \"multus-wthxr\" (UID: \"86a37d2a-37c5-4fbd-b10b-f5e4706772f4\") " pod="openshift-multus/multus-wthxr" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.303991 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/86a37d2a-37c5-4fbd-b10b-f5e4706772f4-hostroot\") pod \"multus-wthxr\" (UID: \"86a37d2a-37c5-4fbd-b10b-f5e4706772f4\") " pod="openshift-multus/multus-wthxr" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.304064 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/86a37d2a-37c5-4fbd-b10b-f5e4706772f4-host-var-lib-cni-multus\") pod \"multus-wthxr\" (UID: \"86a37d2a-37c5-4fbd-b10b-f5e4706772f4\") " pod="openshift-multus/multus-wthxr" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.304087 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/86a37d2a-37c5-4fbd-b10b-f5e4706772f4-multus-conf-dir\") pod \"multus-wthxr\" (UID: \"86a37d2a-37c5-4fbd-b10b-f5e4706772f4\") " pod="openshift-multus/multus-wthxr" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.304112 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/86a37d2a-37c5-4fbd-b10b-f5e4706772f4-host-run-multus-certs\") pod \"multus-wthxr\" (UID: \"86a37d2a-37c5-4fbd-b10b-f5e4706772f4\") " pod="openshift-multus/multus-wthxr" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.304131 4696 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/86a37d2a-37c5-4fbd-b10b-f5e4706772f4-cnibin\") pod \"multus-wthxr\" (UID: \"86a37d2a-37c5-4fbd-b10b-f5e4706772f4\") " pod="openshift-multus/multus-wthxr" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.304147 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/86a37d2a-37c5-4fbd-b10b-f5e4706772f4-host-run-k8s-cni-cncf-io\") pod \"multus-wthxr\" (UID: \"86a37d2a-37c5-4fbd-b10b-f5e4706772f4\") " pod="openshift-multus/multus-wthxr" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.304143 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/86a37d2a-37c5-4fbd-b10b-f5e4706772f4-multus-conf-dir\") pod \"multus-wthxr\" (UID: \"86a37d2a-37c5-4fbd-b10b-f5e4706772f4\") " pod="openshift-multus/multus-wthxr" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.304172 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxprw\" (UniqueName: \"kubernetes.io/projected/cf8ea236-9be9-4eb2-904a-103c4c279f28-kube-api-access-vxprw\") pod \"node-resolver-f57qk\" (UID: \"cf8ea236-9be9-4eb2-904a-103c4c279f28\") " pod="openshift-dns/node-resolver-f57qk" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.304187 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/86a37d2a-37c5-4fbd-b10b-f5e4706772f4-host-run-k8s-cni-cncf-io\") pod \"multus-wthxr\" (UID: \"86a37d2a-37c5-4fbd-b10b-f5e4706772f4\") " pod="openshift-multus/multus-wthxr" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.304197 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/86a37d2a-37c5-4fbd-b10b-f5e4706772f4-multus-cni-dir\") pod \"multus-wthxr\" (UID: \"86a37d2a-37c5-4fbd-b10b-f5e4706772f4\") " pod="openshift-multus/multus-wthxr" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.304222 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/86a37d2a-37c5-4fbd-b10b-f5e4706772f4-os-release\") pod \"multus-wthxr\" (UID: \"86a37d2a-37c5-4fbd-b10b-f5e4706772f4\") " pod="openshift-multus/multus-wthxr" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.304245 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/86a37d2a-37c5-4fbd-b10b-f5e4706772f4-etc-kubernetes\") pod \"multus-wthxr\" (UID: \"86a37d2a-37c5-4fbd-b10b-f5e4706772f4\") " pod="openshift-multus/multus-wthxr" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.304274 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqg5z\" (UniqueName: \"kubernetes.io/projected/86a37d2a-37c5-4fbd-b10b-f5e4706772f4-kube-api-access-xqg5z\") pod \"multus-wthxr\" (UID: \"86a37d2a-37c5-4fbd-b10b-f5e4706772f4\") " pod="openshift-multus/multus-wthxr" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.304272 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/86a37d2a-37c5-4fbd-b10b-f5e4706772f4-host-run-multus-certs\") pod \"multus-wthxr\" (UID: \"86a37d2a-37c5-4fbd-b10b-f5e4706772f4\") " pod="openshift-multus/multus-wthxr" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.304306 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/86a37d2a-37c5-4fbd-b10b-f5e4706772f4-cni-binary-copy\") pod \"multus-wthxr\" (UID: \"86a37d2a-37c5-4fbd-b10b-f5e4706772f4\") " 
pod="openshift-multus/multus-wthxr" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.304358 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/86a37d2a-37c5-4fbd-b10b-f5e4706772f4-etc-kubernetes\") pod \"multus-wthxr\" (UID: \"86a37d2a-37c5-4fbd-b10b-f5e4706772f4\") " pod="openshift-multus/multus-wthxr" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.304404 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/86a37d2a-37c5-4fbd-b10b-f5e4706772f4-host-run-netns\") pod \"multus-wthxr\" (UID: \"86a37d2a-37c5-4fbd-b10b-f5e4706772f4\") " pod="openshift-multus/multus-wthxr" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.304436 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/86a37d2a-37c5-4fbd-b10b-f5e4706772f4-system-cni-dir\") pod \"multus-wthxr\" (UID: \"86a37d2a-37c5-4fbd-b10b-f5e4706772f4\") " pod="openshift-multus/multus-wthxr" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.304440 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/86a37d2a-37c5-4fbd-b10b-f5e4706772f4-multus-cni-dir\") pod \"multus-wthxr\" (UID: \"86a37d2a-37c5-4fbd-b10b-f5e4706772f4\") " pod="openshift-multus/multus-wthxr" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.304533 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/cf8ea236-9be9-4eb2-904a-103c4c279f28-hosts-file\") pod \"node-resolver-f57qk\" (UID: \"cf8ea236-9be9-4eb2-904a-103c4c279f28\") " pod="openshift-dns/node-resolver-f57qk" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.304472 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" 
(UniqueName: \"kubernetes.io/host-path/86a37d2a-37c5-4fbd-b10b-f5e4706772f4-host-run-netns\") pod \"multus-wthxr\" (UID: \"86a37d2a-37c5-4fbd-b10b-f5e4706772f4\") " pod="openshift-multus/multus-wthxr" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.304468 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/cf8ea236-9be9-4eb2-904a-103c4c279f28-hosts-file\") pod \"node-resolver-f57qk\" (UID: \"cf8ea236-9be9-4eb2-904a-103c4c279f28\") " pod="openshift-dns/node-resolver-f57qk" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.304585 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/86a37d2a-37c5-4fbd-b10b-f5e4706772f4-system-cni-dir\") pod \"multus-wthxr\" (UID: \"86a37d2a-37c5-4fbd-b10b-f5e4706772f4\") " pod="openshift-multus/multus-wthxr" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.304537 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/86a37d2a-37c5-4fbd-b10b-f5e4706772f4-cnibin\") pod \"multus-wthxr\" (UID: \"86a37d2a-37c5-4fbd-b10b-f5e4706772f4\") " pod="openshift-multus/multus-wthxr" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.304718 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/86a37d2a-37c5-4fbd-b10b-f5e4706772f4-multus-daemon-config\") pod \"multus-wthxr\" (UID: \"86a37d2a-37c5-4fbd-b10b-f5e4706772f4\") " pod="openshift-multus/multus-wthxr" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.304769 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/86a37d2a-37c5-4fbd-b10b-f5e4706772f4-os-release\") pod \"multus-wthxr\" (UID: \"86a37d2a-37c5-4fbd-b10b-f5e4706772f4\") " pod="openshift-multus/multus-wthxr" Dec 02 22:42:40 crc 
kubenswrapper[4696]: I1202 22:42:40.304789 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/86a37d2a-37c5-4fbd-b10b-f5e4706772f4-multus-socket-dir-parent\") pod \"multus-wthxr\" (UID: \"86a37d2a-37c5-4fbd-b10b-f5e4706772f4\") " pod="openshift-multus/multus-wthxr" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.304942 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/86a37d2a-37c5-4fbd-b10b-f5e4706772f4-cni-binary-copy\") pod \"multus-wthxr\" (UID: \"86a37d2a-37c5-4fbd-b10b-f5e4706772f4\") " pod="openshift-multus/multus-wthxr" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.318389 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1db9cf23af7fc9ff1373908e11df0bea7d6e03cd3ed18947cc583fc09a79a05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\
",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:40Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.323197 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqg5z\" (UniqueName: \"kubernetes.io/projected/86a37d2a-37c5-4fbd-b10b-f5e4706772f4-kube-api-access-xqg5z\") pod \"multus-wthxr\" (UID: \"86a37d2a-37c5-4fbd-b10b-f5e4706772f4\") " pod="openshift-multus/multus-wthxr" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.325075 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxprw\" (UniqueName: \"kubernetes.io/projected/cf8ea236-9be9-4eb2-904a-103c4c279f28-kube-api-access-vxprw\") pod \"node-resolver-f57qk\" (UID: \"cf8ea236-9be9-4eb2-904a-103c4c279f28\") " pod="openshift-dns/node-resolver-f57qk" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.333827 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:40Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.347102 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce2fa3867298545942c4a2d2e68899c61a4f99a64b0974e75ef86851074ddfce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4db552c4bc9b395f977cb4c1b1b7fcbb1c7e9f326d81a7903d88eab180bcfb3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:40Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.364302 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with 
unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:40Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.367897 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.376834 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.387673 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:40Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.409514 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wthxr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"86a37d2a-37c5-4fbd-b10b-f5e4706772f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xqg5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wthxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:40Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.422787 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:40Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.431312 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.431378 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.431447 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:42:40 crc kubenswrapper[4696]: E1202 22:42:40.431468 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 22:42:40 crc kubenswrapper[4696]: E1202 22:42:40.431544 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 22:42:40 crc kubenswrapper[4696]: E1202 22:42:40.431607 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.433715 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f57qk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf8ea236-9be9-4eb2-904a-103c4c279f28\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vxprw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f57qk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:40Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.442411 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-f57qk" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.448651 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-wthxr" Dec 02 22:42:40 crc kubenswrapper[4696]: W1202 22:42:40.458923 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf8ea236_9be9_4eb2_904a_103c4c279f28.slice/crio-8d75a7537c0d366522c15679b054d00da565113fd1bbe9658a3c0224b432e258 WatchSource:0}: Error finding container 8d75a7537c0d366522c15679b054d00da565113fd1bbe9658a3c0224b432e258: Status 404 returned error can't find the container with id 8d75a7537c0d366522c15679b054d00da565113fd1bbe9658a3c0224b432e258 Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.461004 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3172221-5c4c-4409-b50f-5cd21d8b5030\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f25e7ebcca1d904cd0940a660fd97c6b4a3540fd63700abd0ad973e9f9c8f764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ec
d6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08fc65f2c8162f27bc40cd12b5801ca21c778313271f2cb5dd251ca4312842d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba390cca9997cf469ae88a4895532df556f0b947ce7b34ef7e1992ee236a2e16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\
\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0ccd6eaff0fa7ac496aadc31afffd6e7fee2776e9cfd1652af85fcec268d895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b2deed2650220f1d607cbb667b26879d3a8373146244feb486c882032f344b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"host
IP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a565cb3800c1a50f784c327719158de2f6f29d20a33670478b97c1c360e0f6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a565cb3800c1a50f784c327719158de2f6f29d20a33670478b97c1c360e0f6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29a2b9cc76fc45b6874d85b2ab92165a02804807837501ac60190c232bab928d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29a2b9cc76fc45b6874d85b2ab92165a02804807837501ac60190c232bab928d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fb3311ccf2a924605464d657193627798dad26a232fab0a2091fb2c387f3bd0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b
90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb3311ccf2a924605464d657193627798dad26a232fab0a2091fb2c387f3bd0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:40Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:40 crc kubenswrapper[4696]: W1202 22:42:40.469974 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86a37d2a_37c5_4fbd_b10b_f5e4706772f4.slice/crio-fa7b2e6d113c68303fc5395966e17ddf9ab4214a25b220009033ec5053dc8ad2 WatchSource:0}: Error finding container fa7b2e6d113c68303fc5395966e17ddf9ab4214a25b220009033ec5053dc8ad2: Status 404 returned error can't find the container with id fa7b2e6d113c68303fc5395966e17ddf9ab4214a25b220009033ec5053dc8ad2 Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 
22:42:40.470314 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.517811 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"158d80a5-8d90-46bb-83e7-ab3263f6d28f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3112751f86a62634d02f281d467d3eeb27878e27c245871210cca37eaeb6e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce744ea765d5bbcb0a9926f439a0256774c16863344f23ddfb0074b4ccbdb4ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://1108d710330b056e2228205b2012df070ffbd660701ddd577eafbf0e99c3bf9f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d92ce9d53d9367503c76c5fbb45f340a5ddd93f3c8e1738ac434fe7b0210d1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6268352b995d99d383ab500b5ccd80b0612cf33716a752aeb1d92cb775a1971e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"le observer\\\\nW1202 22:42:37.965183 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 22:42:37.965319 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 22:42:37.967400 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2369937012/tls.crt::/tmp/serving-cert-2369937012/tls.key\\\\\\\"\\\\nI1202 22:42:38.329833 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 22:42:38.331704 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 22:42:38.331724 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 22:42:38.331766 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 22:42:38.331782 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 22:42:38.336692 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 22:42:38.336734 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:42:38.336761 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:42:38.336769 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 22:42:38.336775 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 22:42:38.336779 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 22:42:38.336783 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 22:42:38.337216 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 22:42:38.338194 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://666f81c81a7cad67c01b2aebf2d05f162c33feea27f64c12f92761d6d2a6056e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055bef75732eec4908f0baf8a52cd3921fab03cf88e36d2e0d442344fb1af0c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://055bef75732eec4908f0baf8a52cd3921fab03cf88e36d2e0d442344fb1af0c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:40Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.555944 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wthxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86a37d2a-37c5-4fbd-b10b-f5e4706772f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xqg5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wthxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:40Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.570423 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-sbjst"] Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.571078 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-sbjst" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.577296 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-qb2zq"] Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.578128 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.581021 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.585963 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.593435 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-chq65"] Dec 02 22:42:40 crc kubenswrapper[4696]: W1202 22:42:40.593845 4696 reflector.go:561] object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl": failed to list *v1.Secret: secrets "ovn-kubernetes-node-dockercfg-pwtwl" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-ovn-kubernetes": no relationship found between node 'crc' and this object Dec 02 22:42:40 crc kubenswrapper[4696]: E1202 22:42:40.593885 4696 reflector.go:158] "Unhandled Error" err="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-pwtwl\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"ovn-kubernetes-node-dockercfg-pwtwl\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-ovn-kubernetes\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.593901 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-chq65" Dec 02 22:42:40 crc kubenswrapper[4696]: W1202 22:42:40.593948 4696 reflector.go:561] object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-ovn-kubernetes": no relationship found between node 'crc' and this object Dec 02 22:42:40 crc kubenswrapper[4696]: E1202 22:42:40.593963 4696 reflector.go:158] "Unhandled Error" err="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-ovn-kubernetes\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 02 22:42:40 crc kubenswrapper[4696]: W1202 22:42:40.594013 4696 reflector.go:561] object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert": failed to list *v1.Secret: secrets "ovn-node-metrics-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-ovn-kubernetes": no relationship found between node 'crc' and this object Dec 02 22:42:40 crc kubenswrapper[4696]: E1202 22:42:40.594024 4696 reflector.go:158] "Unhandled Error" err="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"ovn-node-metrics-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-ovn-kubernetes\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 02 22:42:40 crc kubenswrapper[4696]: W1202 22:42:40.594080 4696 reflector.go:561] 
object-"openshift-ovn-kubernetes"/"ovnkube-config": failed to list *v1.ConfigMap: configmaps "ovnkube-config" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-ovn-kubernetes": no relationship found between node 'crc' and this object Dec 02 22:42:40 crc kubenswrapper[4696]: E1202 22:42:40.594092 4696 reflector.go:158] "Unhandled Error" err="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"ovnkube-config\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-ovn-kubernetes\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 02 22:42:40 crc kubenswrapper[4696]: W1202 22:42:40.594141 4696 reflector.go:561] object-"openshift-ovn-kubernetes"/"env-overrides": failed to list *v1.ConfigMap: configmaps "env-overrides" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-ovn-kubernetes": no relationship found between node 'crc' and this object Dec 02 22:42:40 crc kubenswrapper[4696]: E1202 22:42:40.594154 4696 reflector.go:158] "Unhandled Error" err="object-\"openshift-ovn-kubernetes\"/\"env-overrides\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"env-overrides\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-ovn-kubernetes\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 02 22:42:40 crc kubenswrapper[4696]: W1202 22:42:40.594218 4696 reflector.go:561] object-"openshift-ovn-kubernetes"/"ovnkube-script-lib": failed to list *v1.ConfigMap: configmaps "ovnkube-script-lib" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-ovn-kubernetes": no relationship found 
between node 'crc' and this object Dec 02 22:42:40 crc kubenswrapper[4696]: E1202 22:42:40.594234 4696 reflector.go:158] "Unhandled Error" err="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"ovnkube-script-lib\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-ovn-kubernetes\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 02 22:42:40 crc kubenswrapper[4696]: W1202 22:42:40.594381 4696 reflector.go:561] object-"openshift-ovn-kubernetes"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-ovn-kubernetes": no relationship found between node 'crc' and this object Dec 02 22:42:40 crc kubenswrapper[4696]: E1202 22:42:40.594404 4696 reflector.go:158] "Unhandled Error" err="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-ovn-kubernetes\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.603229 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.603461 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.603568 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 02 22:42:40 crc 
kubenswrapper[4696]: I1202 22:42:40.603673 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.603803 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.608163 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-host-cni-netd\") pod \"ovnkube-node-qb2zq\" (UID: \"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.608200 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-env-overrides\") pod \"ovnkube-node-qb2zq\" (UID: \"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.608221 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5a0d758e-6da6-4382-99f1-dd295b63eb98-system-cni-dir\") pod \"multus-additional-cni-plugins-sbjst\" (UID: \"5a0d758e-6da6-4382-99f1-dd295b63eb98\") " pod="openshift-multus/multus-additional-cni-plugins-sbjst" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.608245 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-etc-openvswitch\") pod \"ovnkube-node-qb2zq\" (UID: \"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" Dec 02 
22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.608260 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-host-cni-bin\") pod \"ovnkube-node-qb2zq\" (UID: \"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.608276 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/53353260-c7c9-435c-91eb-3d5a1b441c4a-rootfs\") pod \"machine-config-daemon-chq65\" (UID: \"53353260-c7c9-435c-91eb-3d5a1b441c4a\") " pod="openshift-machine-config-operator/machine-config-daemon-chq65" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.608292 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7tfb\" (UniqueName: \"kubernetes.io/projected/53353260-c7c9-435c-91eb-3d5a1b441c4a-kube-api-access-q7tfb\") pod \"machine-config-daemon-chq65\" (UID: \"53353260-c7c9-435c-91eb-3d5a1b441c4a\") " pod="openshift-machine-config-operator/machine-config-daemon-chq65" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.608307 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5a0d758e-6da6-4382-99f1-dd295b63eb98-cni-binary-copy\") pod \"multus-additional-cni-plugins-sbjst\" (UID: \"5a0d758e-6da6-4382-99f1-dd295b63eb98\") " pod="openshift-multus/multus-additional-cni-plugins-sbjst" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.608323 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-ovnkube-config\") pod \"ovnkube-node-qb2zq\" (UID: 
\"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.608341 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qb2zq\" (UID: \"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.608358 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dk6j\" (UniqueName: \"kubernetes.io/projected/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-kube-api-access-4dk6j\") pod \"ovnkube-node-qb2zq\" (UID: \"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.608387 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-ovn-node-metrics-cert\") pod \"ovnkube-node-qb2zq\" (UID: \"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.608404 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-systemd-units\") pod \"ovnkube-node-qb2zq\" (UID: \"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.608421 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-run-openvswitch\") pod \"ovnkube-node-qb2zq\" (UID: \"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.608437 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-log-socket\") pod \"ovnkube-node-qb2zq\" (UID: \"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.608470 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-run-systemd\") pod \"ovnkube-node-qb2zq\" (UID: \"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.608485 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/53353260-c7c9-435c-91eb-3d5a1b441c4a-mcd-auth-proxy-config\") pod \"machine-config-daemon-chq65\" (UID: \"53353260-c7c9-435c-91eb-3d5a1b441c4a\") " pod="openshift-machine-config-operator/machine-config-daemon-chq65" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.608504 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-run-ovn\") pod \"ovnkube-node-qb2zq\" (UID: \"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.608518 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"node-log\" (UniqueName: \"kubernetes.io/host-path/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-node-log\") pod \"ovnkube-node-qb2zq\" (UID: \"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.608534 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-host-run-ovn-kubernetes\") pod \"ovnkube-node-qb2zq\" (UID: \"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.608552 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-host-slash\") pod \"ovnkube-node-qb2zq\" (UID: \"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.608569 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-host-run-netns\") pod \"ovnkube-node-qb2zq\" (UID: \"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.608585 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/53353260-c7c9-435c-91eb-3d5a1b441c4a-proxy-tls\") pod \"machine-config-daemon-chq65\" (UID: \"53353260-c7c9-435c-91eb-3d5a1b441c4a\") " pod="openshift-machine-config-operator/machine-config-daemon-chq65" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.608599 4696 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-host-kubelet\") pod \"ovnkube-node-qb2zq\" (UID: \"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.608615 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5a0d758e-6da6-4382-99f1-dd295b63eb98-os-release\") pod \"multus-additional-cni-plugins-sbjst\" (UID: \"5a0d758e-6da6-4382-99f1-dd295b63eb98\") " pod="openshift-multus/multus-additional-cni-plugins-sbjst" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.608632 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-ovnkube-script-lib\") pod \"ovnkube-node-qb2zq\" (UID: \"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.608647 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5a0d758e-6da6-4382-99f1-dd295b63eb98-cnibin\") pod \"multus-additional-cni-plugins-sbjst\" (UID: \"5a0d758e-6da6-4382-99f1-dd295b63eb98\") " pod="openshift-multus/multus-additional-cni-plugins-sbjst" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.608666 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5a0d758e-6da6-4382-99f1-dd295b63eb98-tuning-conf-dir\") pod \"multus-additional-cni-plugins-sbjst\" (UID: \"5a0d758e-6da6-4382-99f1-dd295b63eb98\") " pod="openshift-multus/multus-additional-cni-plugins-sbjst" Dec 02 22:42:40 
crc kubenswrapper[4696]: I1202 22:42:40.608686 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/5a0d758e-6da6-4382-99f1-dd295b63eb98-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-sbjst\" (UID: \"5a0d758e-6da6-4382-99f1-dd295b63eb98\") " pod="openshift-multus/multus-additional-cni-plugins-sbjst" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.608702 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwcnd\" (UniqueName: \"kubernetes.io/projected/5a0d758e-6da6-4382-99f1-dd295b63eb98-kube-api-access-qwcnd\") pod \"multus-additional-cni-plugins-sbjst\" (UID: \"5a0d758e-6da6-4382-99f1-dd295b63eb98\") " pod="openshift-multus/multus-additional-cni-plugins-sbjst" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.608721 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-var-lib-openvswitch\") pod \"ovnkube-node-qb2zq\" (UID: \"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.635485 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1db9cf23af7fc9ff1373908e11df0bea7d6e03cd3ed18947cc583fc09a79a05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:40Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.666042 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:40Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.709079 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-ovnkube-config\") pod \"ovnkube-node-qb2zq\" (UID: \"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.709118 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5a0d758e-6da6-4382-99f1-dd295b63eb98-cni-binary-copy\") pod \"multus-additional-cni-plugins-sbjst\" (UID: \"5a0d758e-6da6-4382-99f1-dd295b63eb98\") " pod="openshift-multus/multus-additional-cni-plugins-sbjst" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.709137 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qb2zq\" (UID: \"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.709153 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dk6j\" (UniqueName: \"kubernetes.io/projected/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-kube-api-access-4dk6j\") pod \"ovnkube-node-qb2zq\" (UID: \"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.709177 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-ovn-node-metrics-cert\") pod \"ovnkube-node-qb2zq\" (UID: \"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.709201 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-systemd-units\") pod \"ovnkube-node-qb2zq\" (UID: \"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.709218 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-run-openvswitch\") pod \"ovnkube-node-qb2zq\" (UID: \"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.709232 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-log-socket\") pod \"ovnkube-node-qb2zq\" (UID: \"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.709257 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-run-systemd\") pod \"ovnkube-node-qb2zq\" (UID: \"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.709282 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-run-ovn\") pod \"ovnkube-node-qb2zq\" (UID: \"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.709294 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-node-log\") pod \"ovnkube-node-qb2zq\" (UID: \"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.709308 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-host-run-ovn-kubernetes\") pod \"ovnkube-node-qb2zq\" (UID: \"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.709325 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/53353260-c7c9-435c-91eb-3d5a1b441c4a-mcd-auth-proxy-config\") 
pod \"machine-config-daemon-chq65\" (UID: \"53353260-c7c9-435c-91eb-3d5a1b441c4a\") " pod="openshift-machine-config-operator/machine-config-daemon-chq65" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.709339 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-host-slash\") pod \"ovnkube-node-qb2zq\" (UID: \"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.709490 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-host-run-netns\") pod \"ovnkube-node-qb2zq\" (UID: \"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.709506 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/53353260-c7c9-435c-91eb-3d5a1b441c4a-proxy-tls\") pod \"machine-config-daemon-chq65\" (UID: \"53353260-c7c9-435c-91eb-3d5a1b441c4a\") " pod="openshift-machine-config-operator/machine-config-daemon-chq65" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.709520 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-host-kubelet\") pod \"ovnkube-node-qb2zq\" (UID: \"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.709534 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5a0d758e-6da6-4382-99f1-dd295b63eb98-os-release\") pod \"multus-additional-cni-plugins-sbjst\" (UID: 
\"5a0d758e-6da6-4382-99f1-dd295b63eb98\") " pod="openshift-multus/multus-additional-cni-plugins-sbjst" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.709550 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-ovnkube-script-lib\") pod \"ovnkube-node-qb2zq\" (UID: \"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.709566 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5a0d758e-6da6-4382-99f1-dd295b63eb98-cnibin\") pod \"multus-additional-cni-plugins-sbjst\" (UID: \"5a0d758e-6da6-4382-99f1-dd295b63eb98\") " pod="openshift-multus/multus-additional-cni-plugins-sbjst" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.709582 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5a0d758e-6da6-4382-99f1-dd295b63eb98-tuning-conf-dir\") pod \"multus-additional-cni-plugins-sbjst\" (UID: \"5a0d758e-6da6-4382-99f1-dd295b63eb98\") " pod="openshift-multus/multus-additional-cni-plugins-sbjst" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.709600 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/5a0d758e-6da6-4382-99f1-dd295b63eb98-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-sbjst\" (UID: \"5a0d758e-6da6-4382-99f1-dd295b63eb98\") " pod="openshift-multus/multus-additional-cni-plugins-sbjst" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.709620 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwcnd\" (UniqueName: \"kubernetes.io/projected/5a0d758e-6da6-4382-99f1-dd295b63eb98-kube-api-access-qwcnd\") pod 
\"multus-additional-cni-plugins-sbjst\" (UID: \"5a0d758e-6da6-4382-99f1-dd295b63eb98\") " pod="openshift-multus/multus-additional-cni-plugins-sbjst" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.709637 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-var-lib-openvswitch\") pod \"ovnkube-node-qb2zq\" (UID: \"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.709653 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-etc-openvswitch\") pod \"ovnkube-node-qb2zq\" (UID: \"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.709667 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-host-cni-bin\") pod \"ovnkube-node-qb2zq\" (UID: \"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.709682 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-host-cni-netd\") pod \"ovnkube-node-qb2zq\" (UID: \"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.709700 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-env-overrides\") pod \"ovnkube-node-qb2zq\" (UID: 
\"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.709716 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5a0d758e-6da6-4382-99f1-dd295b63eb98-system-cni-dir\") pod \"multus-additional-cni-plugins-sbjst\" (UID: \"5a0d758e-6da6-4382-99f1-dd295b63eb98\") " pod="openshift-multus/multus-additional-cni-plugins-sbjst" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.709732 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/53353260-c7c9-435c-91eb-3d5a1b441c4a-rootfs\") pod \"machine-config-daemon-chq65\" (UID: \"53353260-c7c9-435c-91eb-3d5a1b441c4a\") " pod="openshift-machine-config-operator/machine-config-daemon-chq65" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.709765 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7tfb\" (UniqueName: \"kubernetes.io/projected/53353260-c7c9-435c-91eb-3d5a1b441c4a-kube-api-access-q7tfb\") pod \"machine-config-daemon-chq65\" (UID: \"53353260-c7c9-435c-91eb-3d5a1b441c4a\") " pod="openshift-machine-config-operator/machine-config-daemon-chq65" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.710037 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-host-run-netns\") pod \"ovnkube-node-qb2zq\" (UID: \"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.710752 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5a0d758e-6da6-4382-99f1-dd295b63eb98-cni-binary-copy\") pod \"multus-additional-cni-plugins-sbjst\" 
(UID: \"5a0d758e-6da6-4382-99f1-dd295b63eb98\") " pod="openshift-multus/multus-additional-cni-plugins-sbjst" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.711146 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-host-cni-netd\") pod \"ovnkube-node-qb2zq\" (UID: \"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.711176 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qb2zq\" (UID: \"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.710982 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5a0d758e-6da6-4382-99f1-dd295b63eb98-tuning-conf-dir\") pod \"multus-additional-cni-plugins-sbjst\" (UID: \"5a0d758e-6da6-4382-99f1-dd295b63eb98\") " pod="openshift-multus/multus-additional-cni-plugins-sbjst" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.711301 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-systemd-units\") pod \"ovnkube-node-qb2zq\" (UID: \"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.710924 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5a0d758e-6da6-4382-99f1-dd295b63eb98-cnibin\") pod \"multus-additional-cni-plugins-sbjst\" (UID: 
\"5a0d758e-6da6-4382-99f1-dd295b63eb98\") " pod="openshift-multus/multus-additional-cni-plugins-sbjst" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.711052 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-host-kubelet\") pod \"ovnkube-node-qb2zq\" (UID: \"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.711374 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-run-openvswitch\") pod \"ovnkube-node-qb2zq\" (UID: \"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.711397 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-log-socket\") pod \"ovnkube-node-qb2zq\" (UID: \"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.711426 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-run-systemd\") pod \"ovnkube-node-qb2zq\" (UID: \"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.711450 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-run-ovn\") pod \"ovnkube-node-qb2zq\" (UID: \"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 
22:42:40.711469 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/5a0d758e-6da6-4382-99f1-dd295b63eb98-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-sbjst\" (UID: \"5a0d758e-6da6-4382-99f1-dd295b63eb98\") " pod="openshift-multus/multus-additional-cni-plugins-sbjst" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.711495 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-host-run-ovn-kubernetes\") pod \"ovnkube-node-qb2zq\" (UID: \"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.711475 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-node-log\") pod \"ovnkube-node-qb2zq\" (UID: \"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.711499 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-var-lib-openvswitch\") pod \"ovnkube-node-qb2zq\" (UID: \"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.711523 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-etc-openvswitch\") pod \"ovnkube-node-qb2zq\" (UID: \"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.711122 4696 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5a0d758e-6da6-4382-99f1-dd295b63eb98-os-release\") pod \"multus-additional-cni-plugins-sbjst\" (UID: \"5a0d758e-6da6-4382-99f1-dd295b63eb98\") " pod="openshift-multus/multus-additional-cni-plugins-sbjst" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.711551 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-host-cni-bin\") pod \"ovnkube-node-qb2zq\" (UID: \"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.711567 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-host-slash\") pod \"ovnkube-node-qb2zq\" (UID: \"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.711583 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5a0d758e-6da6-4382-99f1-dd295b63eb98-system-cni-dir\") pod \"multus-additional-cni-plugins-sbjst\" (UID: \"5a0d758e-6da6-4382-99f1-dd295b63eb98\") " pod="openshift-multus/multus-additional-cni-plugins-sbjst" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.711619 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/53353260-c7c9-435c-91eb-3d5a1b441c4a-rootfs\") pod \"machine-config-daemon-chq65\" (UID: \"53353260-c7c9-435c-91eb-3d5a1b441c4a\") " pod="openshift-machine-config-operator/machine-config-daemon-chq65" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.712005 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" 
(UniqueName: \"kubernetes.io/configmap/53353260-c7c9-435c-91eb-3d5a1b441c4a-mcd-auth-proxy-config\") pod \"machine-config-daemon-chq65\" (UID: \"53353260-c7c9-435c-91eb-3d5a1b441c4a\") " pod="openshift-machine-config-operator/machine-config-daemon-chq65" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.715289 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/53353260-c7c9-435c-91eb-3d5a1b441c4a-proxy-tls\") pod \"machine-config-daemon-chq65\" (UID: \"53353260-c7c9-435c-91eb-3d5a1b441c4a\") " pod="openshift-machine-config-operator/machine-config-daemon-chq65" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.721726 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce2fa3867298545942c4a2d2e68899c61a4f99a64b0974e75ef86851074ddfce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4db552c4bc9b395f977cb4c1b1b7fcbb1c7e9f326d81a7903d88eab180bcfb3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:40Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 
22:42:40.729142 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wthxr" event={"ID":"86a37d2a-37c5-4fbd-b10b-f5e4706772f4","Type":"ContainerStarted","Data":"c5ddd341a3a0b7b1d959ce57bb4d38153a4ca3ae5ec40ded309e0697db2be28d"} Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.729189 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wthxr" event={"ID":"86a37d2a-37c5-4fbd-b10b-f5e4706772f4","Type":"ContainerStarted","Data":"fa7b2e6d113c68303fc5395966e17ddf9ab4214a25b220009033ec5053dc8ad2"} Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.730415 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-f57qk" event={"ID":"cf8ea236-9be9-4eb2-904a-103c4c279f28","Type":"ContainerStarted","Data":"8d75a7537c0d366522c15679b054d00da565113fd1bbe9658a3c0224b432e258"} Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.740083 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwcnd\" (UniqueName: \"kubernetes.io/projected/5a0d758e-6da6-4382-99f1-dd295b63eb98-kube-api-access-qwcnd\") pod \"multus-additional-cni-plugins-sbjst\" (UID: \"5a0d758e-6da6-4382-99f1-dd295b63eb98\") " pod="openshift-multus/multus-additional-cni-plugins-sbjst" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.741304 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7tfb\" (UniqueName: \"kubernetes.io/projected/53353260-c7c9-435c-91eb-3d5a1b441c4a-kube-api-access-q7tfb\") pod \"machine-config-daemon-chq65\" (UID: \"53353260-c7c9-435c-91eb-3d5a1b441c4a\") " pod="openshift-machine-config-operator/machine-config-daemon-chq65" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.742773 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:40Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.764941 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:40Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.783081 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1db9cf23af7fc9ff1373908e11df0bea7d6e03cd3ed18947cc583fc09a79a05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:40Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.800673 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce2fa3867298545942c4a2d2e68899c61a4f99a64b0974e75ef86851074ddfce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://4db552c4bc9b395f977cb4c1b1b7fcbb1c7e9f326d81a7903d88eab180bcfb3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:40Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.817633 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:40Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.839020 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wthxr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"86a37d2a-37c5-4fbd-b10b-f5e4706772f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5ddd341a3a0b7b1d959ce57bb4d38153a4ca3ae5ec40ded309e0697db2be28d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xqg5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wthxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:40Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.881528 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3172221-5c4c-4409-b50f-5cd21d8b5030\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f25e7ebcca1d904cd0940a660fd97c6b4a3540fd63700abd0ad973e9f9c8f764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08fc65f2c8162f27bc40cd12b5801ca21c778313271f2cb5dd251ca4312842d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba390cca9997cf469ae88a4895532df556f0b947ce7b34ef7e1992ee236a2e16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0ccd6eaff0fa7ac496aadc31afffd6e7fee2776e9cfd1652af85fcec268d895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b2deed2650220f1d607cbb667b26879d3a8373146244feb486c882032f344b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a565cb3800c1a50f784c327719158de2f6f29d20a33670478b97c1c360e0f6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a565cb3800c1a50f784c327719158de2f6f29d20a33670478b97c1c360e0f6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-02T22:42:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29a2b9cc76fc45b6874d85b2ab92165a02804807837501ac60190c232bab928d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29a2b9cc76fc45b6874d85b2ab92165a02804807837501ac60190c232bab928d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fb3311ccf2a924605464d657193627798dad26a232fab0a2091fb2c387f3bd0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb3311ccf2a924605464d657193627798dad26a232fab0a2091fb2c387f3bd0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:40Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.894392 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-sbjst" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.905788 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"158d80a5-8d90-46bb-83e7-ab3263f6d28f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3112751f86a62634d02f281d467d3eeb27878e27c245871210cca37eaeb6e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce744ea765d5bbcb0a9926f439a0256774c16863344f23ddfb0074b4ccbdb4ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://1108d710330b056e2228205b2012df070ffbd660701ddd577eafbf0e99c3bf9f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d92ce9d53d9367503c76c5fbb45f340a5ddd93f3c8e1738ac434fe7b0210d1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6268352b995d99d383ab500b5ccd80b0612cf33716a752aeb1d92cb775a1971e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"le observer\\\\nW1202 22:42:37.965183 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 22:42:37.965319 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 22:42:37.967400 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2369937012/tls.crt::/tmp/serving-cert-2369937012/tls.key\\\\\\\"\\\\nI1202 22:42:38.329833 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 22:42:38.331704 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 22:42:38.331724 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 22:42:38.331766 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 22:42:38.331782 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 22:42:38.336692 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 22:42:38.336734 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:42:38.336761 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:42:38.336769 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 22:42:38.336775 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 22:42:38.336779 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 22:42:38.336783 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 22:42:38.337216 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 22:42:38.338194 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://666f81c81a7cad67c01b2aebf2d05f162c33feea27f64c12f92761d6d2a6056e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055bef75732eec4908f0baf8a52cd3921fab03cf88e36d2e0d442344fb1af0c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://055bef75732eec4908f0baf8a52cd3921fab03cf88e36d2e0d442344fb1af0c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:40Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:40 crc kubenswrapper[4696]: W1202 22:42:40.906599 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a0d758e_6da6_4382_99f1_dd295b63eb98.slice/crio-f4332c2b9b518ffed13e65c94e8d9dafcf46dbcefc5c6794bbb8ffe8afe0fc1b WatchSource:0}: Error finding container f4332c2b9b518ffed13e65c94e8d9dafcf46dbcefc5c6794bbb8ffe8afe0fc1b: Status 404 returned error can't find the container with id f4332c2b9b518ffed13e65c94e8d9dafcf46dbcefc5c6794bbb8ffe8afe0fc1b Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.929022 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-chq65" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.932376 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:40Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.942905 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f57qk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf8ea236-9be9-4eb2-904a-103c4c279f28\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vxprw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f57qk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:40Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.956952 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:40Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:40 crc kubenswrapper[4696]: I1202 22:42:40.968620 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:40Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:40 crc kubenswrapper[4696]: W1202 22:42:40.995055 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53353260_c7c9_435c_91eb_3d5a1b441c4a.slice/crio-4addef9fc9ad05a95357fad3587edf474fcc83bc3797ba0a7875940d48a5e3b7 WatchSource:0}: Error finding container 4addef9fc9ad05a95357fad3587edf474fcc83bc3797ba0a7875940d48a5e3b7: Status 404 
returned error can't find the container with id 4addef9fc9ad05a95357fad3587edf474fcc83bc3797ba0a7875940d48a5e3b7 Dec 02 22:42:41 crc kubenswrapper[4696]: I1202 22:42:41.002767 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qb2zq\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:41Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:41 crc kubenswrapper[4696]: I1202 22:42:41.018664 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05940584-6baa-49e1-8959-610d50b6eeb5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b361514e74dd950d7c3e1da82304b7c75603829ce7bf03c1fbccab1607d65800\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"
mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67bf899df90f18f36310994af135552aa9758f0460117d62d00fdc1de067ce79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d15263c6c1f8208a998cbacdba376a2eb3b2e16134e8fc40f7d56d36daae5ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2fa27b4eeb8b50da5d49c905eb98070362e998c3e3e4aceac7cc09aaef5b5ce\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:41Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:41 crc kubenswrapper[4696]: I1202 22:42:41.040907 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sbjst" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a0d758e-6da6-4382-99f1-dd295b63eb98\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sbjst\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:41Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:41 crc kubenswrapper[4696]: I1202 22:42:41.065185 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-chq65" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53353260-c7c9-435c-91eb-3d5a1b441c4a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7tfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7tfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-chq65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:41Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:41 crc kubenswrapper[4696]: I1202 22:42:41.492908 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 02 22:42:41 crc kubenswrapper[4696]: I1202 22:42:41.524576 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 02 22:42:41 crc kubenswrapper[4696]: I1202 22:42:41.568772 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 02 22:42:41 crc kubenswrapper[4696]: I1202 22:42:41.572615 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-env-overrides\") pod \"ovnkube-node-qb2zq\" (UID: \"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" Dec 02 22:42:41 crc kubenswrapper[4696]: I1202 22:42:41.614351 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 02 22:42:41 crc kubenswrapper[4696]: I1202 22:42:41.626148 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dk6j\" (UniqueName: \"kubernetes.io/projected/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-kube-api-access-4dk6j\") pod \"ovnkube-node-qb2zq\" (UID: \"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" Dec 02 22:42:41 crc kubenswrapper[4696]: I1202 22:42:41.629163 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 02 22:42:41 crc kubenswrapper[4696]: I1202 22:42:41.631643 4696 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-ovnkube-script-lib\") pod \"ovnkube-node-qb2zq\" (UID: \"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" Dec 02 22:42:41 crc kubenswrapper[4696]: E1202 22:42:41.711201 4696 configmap.go:193] Couldn't get configMap openshift-ovn-kubernetes/ovnkube-config: failed to sync configmap cache: timed out waiting for the condition Dec 02 22:42:41 crc kubenswrapper[4696]: E1202 22:42:41.711569 4696 secret.go:188] Couldn't get secret openshift-ovn-kubernetes/ovn-node-metrics-cert: failed to sync secret cache: timed out waiting for the condition Dec 02 22:42:41 crc kubenswrapper[4696]: E1202 22:42:41.711691 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-ovn-node-metrics-cert podName:c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b nodeName:}" failed. No retries permitted until 2025-12-02 22:42:42.211657242 +0000 UTC m=+25.092337243 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "ovn-node-metrics-cert" (UniqueName: "kubernetes.io/secret/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-ovn-node-metrics-cert") pod "ovnkube-node-qb2zq" (UID: "c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b") : failed to sync secret cache: timed out waiting for the condition Dec 02 22:42:41 crc kubenswrapper[4696]: E1202 22:42:41.711947 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-ovnkube-config podName:c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b nodeName:}" failed. No retries permitted until 2025-12-02 22:42:42.21193084 +0000 UTC m=+25.092610841 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "ovnkube-config" (UniqueName: "kubernetes.io/configmap/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-ovnkube-config") pod "ovnkube-node-qb2zq" (UID: "c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b") : failed to sync configmap cache: timed out waiting for the condition Dec 02 22:42:41 crc kubenswrapper[4696]: I1202 22:42:41.736883 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"27fb05354e4e9a6ad09946fad7dbabf07456fe56a9a66578fda0eb71dde2d199"} Dec 02 22:42:41 crc kubenswrapper[4696]: I1202 22:42:41.740043 4696 generic.go:334] "Generic (PLEG): container finished" podID="5a0d758e-6da6-4382-99f1-dd295b63eb98" containerID="495f007b6af2d409e3ed6cf70dfa90aec5a38c34361c9da9c4fd71d7732cdd76" exitCode=0 Dec 02 22:42:41 crc kubenswrapper[4696]: I1202 22:42:41.740143 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sbjst" event={"ID":"5a0d758e-6da6-4382-99f1-dd295b63eb98","Type":"ContainerDied","Data":"495f007b6af2d409e3ed6cf70dfa90aec5a38c34361c9da9c4fd71d7732cdd76"} Dec 02 22:42:41 crc kubenswrapper[4696]: I1202 22:42:41.740176 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sbjst" event={"ID":"5a0d758e-6da6-4382-99f1-dd295b63eb98","Type":"ContainerStarted","Data":"f4332c2b9b518ffed13e65c94e8d9dafcf46dbcefc5c6794bbb8ffe8afe0fc1b"} Dec 02 22:42:41 crc kubenswrapper[4696]: I1202 22:42:41.742106 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-f57qk" event={"ID":"cf8ea236-9be9-4eb2-904a-103c4c279f28","Type":"ContainerStarted","Data":"3106b29a0eb0a17827fd331aefd6c00b0c58f4ec0d0ca778bd3f4842730d5f7f"} Dec 02 22:42:41 crc kubenswrapper[4696]: I1202 22:42:41.746461 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-chq65" event={"ID":"53353260-c7c9-435c-91eb-3d5a1b441c4a","Type":"ContainerStarted","Data":"d2d1f154b9c9139bf991adf86bd99e7ff14510c24ae93b87c1c184437a087f17"} Dec 02 22:42:41 crc kubenswrapper[4696]: I1202 22:42:41.746509 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-chq65" event={"ID":"53353260-c7c9-435c-91eb-3d5a1b441c4a","Type":"ContainerStarted","Data":"fa016430a0a36d0628e0e5b0c4baf2fa724c888116e5d70517c8ffc4be1c37a4"} Dec 02 22:42:41 crc kubenswrapper[4696]: I1202 22:42:41.746520 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-chq65" event={"ID":"53353260-c7c9-435c-91eb-3d5a1b441c4a","Type":"ContainerStarted","Data":"4addef9fc9ad05a95357fad3587edf474fcc83bc3797ba0a7875940d48a5e3b7"} Dec 02 22:42:41 crc kubenswrapper[4696]: I1202 22:42:41.750186 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"05940584-6baa-49e1-8959-610d50b6eeb5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b361514e74dd950d7c3e1da82304b7c75603829ce7bf03c1fbccab1607d65800\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67bf899df90f18f36310994af135552aa9758f0460117d62d00fdc1de067ce79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d15263c6c1f8208a998cbacdba376a2eb3b2e16134e8fc40f7d56d36daae5ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2fa27b4eeb8b50da5d49c905eb98070362e998c3e3e4aceac7cc09aaef5b5ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:41Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:41 crc kubenswrapper[4696]: I1202 22:42:41.761855 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 02 22:42:41 crc kubenswrapper[4696]: I1202 22:42:41.769258 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sbjst" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a0d758e-6da6-4382-99f1-dd295b63eb98\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin 
routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"nam
e\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"
readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sbjst\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:41Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:41 crc kubenswrapper[4696]: I1202 22:42:41.787216 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-chq65" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53353260-c7c9-435c-91eb-3d5a1b441c4a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7tfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7tfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-chq65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:41Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:41 crc kubenswrapper[4696]: I1202 22:42:41.806184 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1db9cf23af7fc9ff1373908e11df0bea7d6e03cd3ed18947cc583fc09a79a05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/ser
viceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:41Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:41 crc kubenswrapper[4696]: I1202 22:42:41.822010 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce2fa3867298545942c4a2d2e68899c61a4f99a64b0974e75ef86851074ddfce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39
Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4db552c4bc9b395f977cb4c1b1b7fcbb1c7e9f326d81a7903d88eab180bcfb3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:41Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:41 crc kubenswrapper[4696]: I1202 22:42:41.835002 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27fb05354e4e9a6ad09946fad7dbabf07456fe56a9a66578fda0eb71dde2d199\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:41Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:41 crc kubenswrapper[4696]: I1202 22:42:41.883100 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wthxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86a37d2a-37c5-4fbd-b10b-f5e4706772f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5ddd341a3a0b7b1d959ce57bb4d38153a4ca3ae5ec40ded309e0697db2be28d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xqg5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wthxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-12-02T22:42:41Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:41 crc kubenswrapper[4696]: I1202 22:42:41.960689 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3172221-5c4c-4409-b50f-5cd21d8b5030\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f25e7ebcca1d904cd0940a660fd97c6b4a3540fd63700abd0ad973e9f9c8f764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kuber
netes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08fc65f2c8162f27bc40cd12b5801ca21c778313271f2cb5dd251ca4312842d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba390cca9997cf469ae88a4895532df556f0b947ce7b34ef7e1992ee236a2e16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0ccd6eaff0fa7ac496aadc31afffd6e7fee2776e9cfd1652af85fcec268d895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f
8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b2deed2650220f1d607cbb667b26879d3a8373146244feb486c882032f344b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a565cb3800c1a50f784c327719158de2f6f29d20a33670478b97c1c360e0f6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a6731
4731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a565cb3800c1a50f784c327719158de2f6f29d20a33670478b97c1c360e0f6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29a2b9cc76fc45b6874d85b2ab92165a02804807837501ac60190c232bab928d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29a2b9cc76fc45b6874d85b2ab92165a02804807837501ac60190c232bab928d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fb3311ccf2a924605464d657193627798dad26a232fab0a2091fb2c387f3bd0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb3311ccf2a924605464d657193627798dad26a232fab0a2091fb2c387f3bd0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-
02T22:42:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:41Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:41 crc kubenswrapper[4696]: I1202 22:42:41.977148 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"158d80a5-8d90-46bb-83e7-ab3263f6d28f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3112751f86a62634d02f281d467d3eeb27878e27c245871210cca37eaeb6e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce744ea765d5bbcb0a9926f439a0256774c16863344f23ddfb0074b4ccbdb4ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://1108d710330b056e2228205b2012df070ffbd660701ddd577eafbf0e99c3bf9f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d92ce9d53d9367503c76c5fbb45f340a5ddd93f3c8e1738ac434fe7b0210d1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6268352b995d99d383ab500b5ccd80b0612cf33716a752aeb1d92cb775a1971e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"le observer\\\\nW1202 22:42:37.965183 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 22:42:37.965319 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 22:42:37.967400 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2369937012/tls.crt::/tmp/serving-cert-2369937012/tls.key\\\\\\\"\\\\nI1202 22:42:38.329833 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 22:42:38.331704 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 22:42:38.331724 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 22:42:38.331766 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 22:42:38.331782 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 22:42:38.336692 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 22:42:38.336734 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:42:38.336761 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:42:38.336769 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 22:42:38.336775 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 22:42:38.336779 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 22:42:38.336783 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 22:42:38.337216 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 22:42:38.338194 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://666f81c81a7cad67c01b2aebf2d05f162c33feea27f64c12f92761d6d2a6056e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055bef75732eec4908f0baf8a52cd3921fab03cf88e36d2e0d442344fb1af0c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://055bef75732eec4908f0baf8a52cd3921fab03cf88e36d2e0d442344fb1af0c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:41Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:41 crc kubenswrapper[4696]: I1202 22:42:41.989786 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:41Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:42 crc kubenswrapper[4696]: I1202 22:42:42.004424 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f57qk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf8ea236-9be9-4eb2-904a-103c4c279f28\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vxprw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f57qk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:42Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:42 crc kubenswrapper[4696]: I1202 22:42:42.018209 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:42Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:42 crc kubenswrapper[4696]: I1202 22:42:42.026140 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 02 22:42:42 crc kubenswrapper[4696]: I1202 22:42:42.030061 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:42Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:42 crc kubenswrapper[4696]: I1202 22:42:42.059088 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qb2zq\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:42Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:42 crc kubenswrapper[4696]: I1202 22:42:42.079800 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3172221-5c4c-4409-b50f-5cd21d8b5030\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f25e7ebcca1d904cd0940a660fd97c6b4a3540fd63700abd0ad973e9f9c8f764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\
":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08fc65f2c8162f27bc40cd12b5801ca21c778313271f2cb5dd251ca4312842d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba390cca9997cf469ae88a4895532df556f0b947ce7b34ef7e1992ee236a2e16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0ccd6eaff0fa7ac496aadc31afffd6e7
fee2776e9cfd1652af85fcec268d895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b2deed2650220f1d607cbb667b26879d3a8373146244feb486c882032f344b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a565cb3800c1a50f784c327719158de2f6f29d20a33670478b97c1c360e0f6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e
49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a565cb3800c1a50f784c327719158de2f6f29d20a33670478b97c1c360e0f6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29a2b9cc76fc45b6874d85b2ab92165a02804807837501ac60190c232bab928d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29a2b9cc76fc45b6874d85b2ab92165a02804807837501ac60190c232bab928d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fb3311ccf2a924605464d657193627798dad26a232fab0a2091fb2c387f3bd0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminate
d\\\":{\\\"containerID\\\":\\\"cri-o://fb3311ccf2a924605464d657193627798dad26a232fab0a2091fb2c387f3bd0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:42Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:42 crc kubenswrapper[4696]: I1202 22:42:42.094239 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"158d80a5-8d90-46bb-83e7-ab3263f6d28f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3112751f86a62634d02f281d467d3eeb27878e27c245871210cca37eaeb6e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce744ea765d5bbcb0a9926f439a0256774c16863344f23ddfb0074b4ccbdb4ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1108d710330b056e2228205b2012df070ffbd660701ddd577eafbf0e99c3bf9f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d92ce9d53d9367503c76c5fbb45f340a5ddd93f3c8e1738ac434fe7b0210d1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6268352b995d99d383ab500b5ccd80b0612cf33716a752aeb1d92cb775a1971e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"le observer\\\\nW1202 22:42:37.965183 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 22:42:37.965319 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 22:42:37.967400 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2369937012/tls.crt::/tmp/serving-cert-2369937012/tls.key\\\\\\\"\\\\nI1202 22:42:38.329833 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 22:42:38.331704 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 22:42:38.331724 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 22:42:38.331766 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 22:42:38.331782 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 22:42:38.336692 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 22:42:38.336734 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:42:38.336761 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:42:38.336769 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 22:42:38.336775 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 22:42:38.336779 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 22:42:38.336783 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 22:42:38.337216 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 22:42:38.338194 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://666f81c81a7cad67c01b2aebf2d05f162c33feea27f64c12f92761d6d2a6056e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055bef75732eec4908f0baf8a52cd3921fab03cf88e36d2e0d442344fb1af0c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://055bef75732eec4908f0baf8a52cd3921fab03cf88e36d2e0d442344fb1af0c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:42Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:42 crc kubenswrapper[4696]: I1202 22:42:42.108282 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:42Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:42 crc kubenswrapper[4696]: I1202 22:42:42.118812 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f57qk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf8ea236-9be9-4eb2-904a-103c4c279f28\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3106b29a0eb0a17827fd331aefd6c00b0c58f4ec0d0ca778bd3f4842730d5f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vxprw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f57qk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:42Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:42 crc kubenswrapper[4696]: I1202 22:42:42.126409 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 22:42:42 crc kubenswrapper[4696]: E1202 22:42:42.126682 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 22:42:46.1266619 +0000 UTC m=+29.007341901 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:42:42 crc kubenswrapper[4696]: I1202 22:42:42.134014 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:42Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:42 crc kubenswrapper[4696]: I1202 22:42:42.149041 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:42Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:42 crc kubenswrapper[4696]: I1202 22:42:42.169553 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qb2zq\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:42Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:42 crc kubenswrapper[4696]: I1202 22:42:42.182097 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05940584-6baa-49e1-8959-610d50b6eeb5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b361514e74dd950d7c3e1da82304b7c75603829ce7bf03c1fbccab1607d65800\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"
mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67bf899df90f18f36310994af135552aa9758f0460117d62d00fdc1de067ce79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d15263c6c1f8208a998cbacdba376a2eb3b2e16134e8fc40f7d56d36daae5ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2fa27b4eeb8b50da5d49c905eb98070362e998c3e3e4aceac7cc09aaef5b5ce\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:42Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:42 crc kubenswrapper[4696]: I1202 22:42:42.196176 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sbjst" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a0d758e-6da6-4382-99f1-dd295b63eb98\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://495f007b6af2d409e3ed6cf70dfa90aec5a38c34361c9da9c4fd71d7732cdd76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://495f007b6af2d409e3ed6cf70dfa90aec5a38c34361c9da9c4fd71d7732cdd76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sbjst\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:42Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:42 crc kubenswrapper[4696]: I1202 22:42:42.209537 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-chq65" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53353260-c7c9-435c-91eb-3d5a1b441c4a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2d1f154b9c9139bf991adf86bd99e7ff14510c24ae93b87c1c184437a087f17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7tfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa016430a0a36d0628e0e5b0c4baf2fa724c8881
16e5d70517c8ffc4be1c37a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7tfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-chq65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:42Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:42 crc kubenswrapper[4696]: I1202 22:42:42.224577 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1db9cf23af7fc9ff1373908e11df0bea7d6e03cd3ed18947cc583fc09a79a05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:42Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:42 crc kubenswrapper[4696]: I1202 22:42:42.227428 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-ovnkube-config\") pod \"ovnkube-node-qb2zq\" (UID: \"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" Dec 02 22:42:42 crc kubenswrapper[4696]: I1202 22:42:42.227465 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:42:42 crc kubenswrapper[4696]: I1202 22:42:42.227502 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-ovn-node-metrics-cert\") pod \"ovnkube-node-qb2zq\" (UID: \"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" Dec 02 22:42:42 crc kubenswrapper[4696]: I1202 22:42:42.227525 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:42:42 crc kubenswrapper[4696]: I1202 22:42:42.227547 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:42:42 crc kubenswrapper[4696]: I1202 22:42:42.227580 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:42:42 crc kubenswrapper[4696]: E1202 22:42:42.227694 4696 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 02 22:42:42 crc kubenswrapper[4696]: E1202 22:42:42.227784 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 22:42:46.227767615 +0000 UTC m=+29.108447616 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 02 22:42:42 crc kubenswrapper[4696]: E1202 22:42:42.228139 4696 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 02 22:42:42 crc kubenswrapper[4696]: E1202 22:42:42.228182 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 22:42:46.228172357 +0000 UTC m=+29.108852358 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 02 22:42:42 crc kubenswrapper[4696]: E1202 22:42:42.228269 4696 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 02 22:42:42 crc kubenswrapper[4696]: E1202 22:42:42.228295 4696 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 02 22:42:42 crc kubenswrapper[4696]: E1202 22:42:42.228352 4696 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 22:42:42 crc kubenswrapper[4696]: E1202 22:42:42.228377 4696 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 22:42:42 crc kubenswrapper[4696]: E1202 22:42:42.228470 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-02 22:42:46.228440794 +0000 UTC m=+29.109120835 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 22:42:42 crc kubenswrapper[4696]: E1202 22:42:42.228305 4696 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 22:42:42 crc kubenswrapper[4696]: E1202 22:42:42.228518 4696 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 22:42:42 crc kubenswrapper[4696]: E1202 22:42:42.228563 4696 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-02 22:42:46.228550527 +0000 UTC m=+29.109230558 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 22:42:42 crc kubenswrapper[4696]: I1202 22:42:42.228974 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-ovnkube-config\") pod \"ovnkube-node-qb2zq\" (UID: \"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" Dec 02 22:42:42 crc kubenswrapper[4696]: I1202 22:42:42.237451 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce2fa3867298545942c4a2d2e68899c61a4f99a64b0974e75ef86851074ddfce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4db552c4bc9b395f977cb4c1b1b7fcbb1c7e9f326d81a7903d88eab180bcfb3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:42Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:42 crc kubenswrapper[4696]: I1202 22:42:42.238889 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-ovn-node-metrics-cert\") pod \"ovnkube-node-qb2zq\" (UID: \"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" Dec 02 22:42:42 crc kubenswrapper[4696]: I1202 22:42:42.258417 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27fb05354e4e9a6ad09946fad7dbabf07456fe56a9a66578fda0eb71dde2d199\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T22:42:42Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:42 crc kubenswrapper[4696]: I1202 22:42:42.280152 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wthxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86a37d2a-37c5-4fbd-b10b-f5e4706772f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5ddd341a3a0b7b1d959ce57bb4d38153a4ca3ae5ec40ded309e0697db2be28d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xqg5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wthxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T22:42:42Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:42 crc kubenswrapper[4696]: I1202 22:42:42.417549 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" Dec 02 22:42:42 crc kubenswrapper[4696]: I1202 22:42:42.431149 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:42:42 crc kubenswrapper[4696]: I1202 22:42:42.431200 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:42:42 crc kubenswrapper[4696]: E1202 22:42:42.431347 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 22:42:42 crc kubenswrapper[4696]: I1202 22:42:42.431460 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:42:42 crc kubenswrapper[4696]: E1202 22:42:42.431643 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 22:42:42 crc kubenswrapper[4696]: E1202 22:42:42.431829 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 22:42:42 crc kubenswrapper[4696]: W1202 22:42:42.473181 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9feb35d_e44f_4ebe_a363_a7bffc6d1f3b.slice/crio-81fc9d2c388d599b51bca4c279af177a4e77c75ce58bdaccdc7ce2e7343a4c7f WatchSource:0}: Error finding container 81fc9d2c388d599b51bca4c279af177a4e77c75ce58bdaccdc7ce2e7343a4c7f: Status 404 returned error can't find the container with id 81fc9d2c388d599b51bca4c279af177a4e77c75ce58bdaccdc7ce2e7343a4c7f Dec 02 22:42:42 crc kubenswrapper[4696]: I1202 22:42:42.753295 4696 generic.go:334] "Generic (PLEG): container finished" podID="5a0d758e-6da6-4382-99f1-dd295b63eb98" containerID="d5b3b6699e2f0b9a8d0f32c1f97c9b3e16d9f5479ad2f355633c25897dbe1bb1" exitCode=0 Dec 02 22:42:42 crc kubenswrapper[4696]: I1202 22:42:42.753405 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sbjst" event={"ID":"5a0d758e-6da6-4382-99f1-dd295b63eb98","Type":"ContainerDied","Data":"d5b3b6699e2f0b9a8d0f32c1f97c9b3e16d9f5479ad2f355633c25897dbe1bb1"} Dec 02 22:42:42 crc kubenswrapper[4696]: I1202 22:42:42.756839 4696 generic.go:334] "Generic (PLEG): container finished" podID="c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b" containerID="492cfa3fb8646c09da06d92ec8a303562da558d4d944eaf68ca95de6f5b94da8" exitCode=0 Dec 02 22:42:42 crc 
kubenswrapper[4696]: I1202 22:42:42.756883 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" event={"ID":"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b","Type":"ContainerDied","Data":"492cfa3fb8646c09da06d92ec8a303562da558d4d944eaf68ca95de6f5b94da8"} Dec 02 22:42:42 crc kubenswrapper[4696]: I1202 22:42:42.756933 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" event={"ID":"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b","Type":"ContainerStarted","Data":"81fc9d2c388d599b51bca4c279af177a4e77c75ce58bdaccdc7ce2e7343a4c7f"} Dec 02 22:42:42 crc kubenswrapper[4696]: I1202 22:42:42.772001 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05940584-6baa-49e1-8959-610d50b6eeb5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b361514e74dd950d7c3e1da82304b7c75603829ce7bf03c1fbccab1607d65800\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/opens
hift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67bf899df90f18f36310994af135552aa9758f0460117d62d00fdc1de067ce79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d15263c6c1f8208a998cbacdba376a2eb3b2e16134e8fc40f7d56d36daae5ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},
\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2fa27b4eeb8b50da5d49c905eb98070362e998c3e3e4aceac7cc09aaef5b5ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:42Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:42 crc kubenswrapper[4696]: I1202 22:42:42.799451 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sbjst" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a0d758e-6da6-4382-99f1-dd295b63eb98\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://495f007b6af2d409e3ed6cf70dfa90aec5a38c34361c9da9c4fd71d7732cdd76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://495f007b6af2d409e3ed6cf70dfa90aec5a38c34361c9da9c4fd71d7732cdd76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5b3b6699e2f0b9a8d0f32c1f97c9b3e16d9f5479ad2f355633c25897dbe1bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5b3b6699e2f0b9a8d0f32c1f97c9b3e16d9f5479ad2f355633c25897dbe1bb1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sbjst\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:42Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:42 crc kubenswrapper[4696]: I1202 22:42:42.813131 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-chq65" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53353260-c7c9-435c-91eb-3d5a1b441c4a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2d1f154b9c9139bf991adf86bd99e7ff14510c24ae93b87c1c184437a087f17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7tfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa016430a0a36d0628e0e5b0c4baf2fa724c8881
16e5d70517c8ffc4be1c37a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7tfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-chq65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:42Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:42 crc kubenswrapper[4696]: I1202 22:42:42.827993 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1db9cf23af7fc9ff1373908e11df0bea7d6e03cd3ed18947cc583fc09a79a05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:42Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:42 crc kubenswrapper[4696]: I1202 22:42:42.847896 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce2fa3867298545942c4a2d2e68899c61a4f99a64b0974e75ef86851074ddfce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://4db552c4bc9b395f977cb4c1b1b7fcbb1c7e9f326d81a7903d88eab180bcfb3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:42Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:42 crc kubenswrapper[4696]: I1202 22:42:42.861536 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27fb05354e4e9a6ad09946fad7dbabf07456fe56a9a66578fda0eb71dde2d199\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T22:42:42Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:42 crc kubenswrapper[4696]: I1202 22:42:42.881122 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wthxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86a37d2a-37c5-4fbd-b10b-f5e4706772f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5ddd341a3a0b7b1d959ce57bb4d38153a4ca3ae5ec40ded309e0697db2be28d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xqg5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wthxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T22:42:42Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:42 crc kubenswrapper[4696]: I1202 22:42:42.893479 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f57qk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf8ea236-9be9-4eb2-904a-103c4c279f28\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3106b29a0eb0a17827fd331aefd6c00b0c58f4ec0d0ca778bd3f4842730d5f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-vxprw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f57qk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:42Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:42 crc kubenswrapper[4696]: I1202 22:42:42.920297 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3172221-5c4c-4409-b50f-5cd21d8b5030\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f25e7ebcca1d904cd0940a660fd97c6b4a3540fd63700abd0ad973e9f9c8f764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c68
77441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08fc65f2c8162f27bc40cd12b5801ca21c778313271f2cb5dd251ca4312842d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba390cca9997cf469ae88a4895532df556f0b947ce7b34ef7e1992ee236a2e16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e7790
36cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0ccd6eaff0fa7ac496aadc31afffd6e7fee2776e9cfd1652af85fcec268d895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b2deed2650220f1d607cbb667b26879d3a8373146244feb486c882032f344b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/e
tc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a565cb3800c1a50f784c327719158de2f6f29d20a33670478b97c1c360e0f6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a565cb3800c1a50f784c327719158de2f6f29d20a33670478b97c1c360e0f6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29a2b9cc76fc45b6874d85b2ab92165a02804807837501ac60190c232bab928d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29a2b9cc76fc45b6874d85b2ab92165a02804807837501ac60190c232bab928d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fb3311ccf2a924605464d6571
93627798dad26a232fab0a2091fb2c387f3bd0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb3311ccf2a924605464d657193627798dad26a232fab0a2091fb2c387f3bd0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:42Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:42 crc kubenswrapper[4696]: I1202 22:42:42.931728 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-rgk7n"] Dec 02 22:42:42 crc kubenswrapper[4696]: I1202 22:42:42.932211 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-rgk7n" Dec 02 22:42:42 crc kubenswrapper[4696]: I1202 22:42:42.941926 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 02 22:42:42 crc kubenswrapper[4696]: I1202 22:42:42.941965 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 02 22:42:42 crc kubenswrapper[4696]: I1202 22:42:42.942380 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 02 22:42:42 crc kubenswrapper[4696]: I1202 22:42:42.942598 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 02 22:42:42 crc kubenswrapper[4696]: I1202 22:42:42.949077 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"158d80a5-8d90-46bb-83e7-ab3263f6d28f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3112751f86a62634d02f281d467d3eeb27878e27c245871210cca37eaeb6e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce744ea765d5bbcb0a9926f439a0256774c16863344f23ddfb0074b4ccbdb4ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://1108d710330b056e2228205b2012df070ffbd660701ddd577eafbf0e99c3bf9f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d92ce9d53d9367503c76c5fbb45f340a5ddd93f3c8e1738ac434fe7b0210d1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6268352b995d99d383ab500b5ccd80b0612cf33716a752aeb1d92cb775a1971e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"le observer\\\\nW1202 22:42:37.965183 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 22:42:37.965319 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 22:42:37.967400 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2369937012/tls.crt::/tmp/serving-cert-2369937012/tls.key\\\\\\\"\\\\nI1202 22:42:38.329833 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 22:42:38.331704 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 22:42:38.331724 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 22:42:38.331766 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 22:42:38.331782 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 22:42:38.336692 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 22:42:38.336734 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:42:38.336761 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:42:38.336769 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 22:42:38.336775 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 22:42:38.336779 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 22:42:38.336783 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 22:42:38.337216 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 22:42:38.338194 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://666f81c81a7cad67c01b2aebf2d05f162c33feea27f64c12f92761d6d2a6056e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055bef75732eec4908f0baf8a52cd3921fab03cf88e36d2e0d442344fb1af0c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://055bef75732eec4908f0baf8a52cd3921fab03cf88e36d2e0d442344fb1af0c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:42Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:42 crc kubenswrapper[4696]: I1202 22:42:42.967282 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:42Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:42 crc kubenswrapper[4696]: I1202 22:42:42.984475 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:42Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:42 crc kubenswrapper[4696]: I1202 22:42:42.999785 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:42Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:43 crc kubenswrapper[4696]: I1202 22:42:43.023179 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qb2zq\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:43Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:43 crc kubenswrapper[4696]: I1202 22:42:43.040398 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05940584-6baa-49e1-8959-610d50b6eeb5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b361514e74dd950d7c3e1da82304b7c75603829ce7bf03c1fbccab1607d65800\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"
mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67bf899df90f18f36310994af135552aa9758f0460117d62d00fdc1de067ce79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d15263c6c1f8208a998cbacdba376a2eb3b2e16134e8fc40f7d56d36daae5ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2fa27b4eeb8b50da5d49c905eb98070362e998c3e3e4aceac7cc09aaef5b5ce\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:43Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:43 crc kubenswrapper[4696]: I1202 22:42:43.042864 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdd8x\" (UniqueName: \"kubernetes.io/projected/07db09ff-2489-4357-af38-aca9655ac1d7-kube-api-access-fdd8x\") pod \"node-ca-rgk7n\" (UID: \"07db09ff-2489-4357-af38-aca9655ac1d7\") " pod="openshift-image-registry/node-ca-rgk7n" Dec 02 22:42:43 crc kubenswrapper[4696]: I1202 22:42:43.042939 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/07db09ff-2489-4357-af38-aca9655ac1d7-serviceca\") pod \"node-ca-rgk7n\" (UID: \"07db09ff-2489-4357-af38-aca9655ac1d7\") " pod="openshift-image-registry/node-ca-rgk7n" Dec 02 22:42:43 crc kubenswrapper[4696]: I1202 22:42:43.042977 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/07db09ff-2489-4357-af38-aca9655ac1d7-host\") pod \"node-ca-rgk7n\" (UID: \"07db09ff-2489-4357-af38-aca9655ac1d7\") " pod="openshift-image-registry/node-ca-rgk7n" Dec 02 22:42:43 crc kubenswrapper[4696]: I1202 22:42:43.071409 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sbjst" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a0d758e-6da6-4382-99f1-dd295b63eb98\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers 
with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://495f007b6af2d409e3ed6cf70dfa90aec5a38c34361c9da9c4fd71d7732cdd76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://495f007b6af2d409e3ed6cf70dfa90aec5a38c34361c9da9c4fd71d7732cdd76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursi
veReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5b3b6699e2f0b9a8d0f32c1f97c9b3e16d9f5479ad2f355633c25897dbe1bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5b3b6699e2f0b9a8d0f32c1f97c9b3e16d9f5479ad2f355633c25897dbe1bb1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\
"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sbjst\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:43Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:43 crc kubenswrapper[4696]: I1202 22:42:43.087325 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-chq65" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53353260-c7c9-435c-91eb-3d5a1b441c4a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2d1f154b9c9139bf991adf86bd99e7ff14510c24ae93b87c1c184437a087f17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7tfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa016430a0a36d0628e0e5b0c4baf2fa724c8881
16e5d70517c8ffc4be1c37a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7tfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-chq65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:43Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:43 crc kubenswrapper[4696]: I1202 22:42:43.098455 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rgk7n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07db09ff-2489-4357-af38-aca9655ac1d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdd8x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rgk7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:43Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:43 crc kubenswrapper[4696]: I1202 22:42:43.116845 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce2fa3867298545942c4a2d2e68899c61a4f99a64b0974e75ef86851074ddfce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4db552c4bc9b395f977cb4c1b1b7fcbb1c7e9f326d81a7903d88eab180bcfb3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:43Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:43 crc kubenswrapper[4696]: I1202 22:42:43.130830 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27fb05354e4e9a6ad09946fad7dbabf07456fe56a9a66578fda0eb71dde2d199\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T22:42:43Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:43 crc kubenswrapper[4696]: I1202 22:42:43.143871 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdd8x\" (UniqueName: \"kubernetes.io/projected/07db09ff-2489-4357-af38-aca9655ac1d7-kube-api-access-fdd8x\") pod \"node-ca-rgk7n\" (UID: \"07db09ff-2489-4357-af38-aca9655ac1d7\") " pod="openshift-image-registry/node-ca-rgk7n" Dec 02 22:42:43 crc kubenswrapper[4696]: I1202 22:42:43.143931 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/07db09ff-2489-4357-af38-aca9655ac1d7-serviceca\") pod \"node-ca-rgk7n\" (UID: \"07db09ff-2489-4357-af38-aca9655ac1d7\") " pod="openshift-image-registry/node-ca-rgk7n" Dec 02 22:42:43 crc kubenswrapper[4696]: I1202 22:42:43.143958 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/07db09ff-2489-4357-af38-aca9655ac1d7-host\") pod \"node-ca-rgk7n\" (UID: \"07db09ff-2489-4357-af38-aca9655ac1d7\") " pod="openshift-image-registry/node-ca-rgk7n" Dec 02 22:42:43 crc kubenswrapper[4696]: I1202 22:42:43.144041 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/07db09ff-2489-4357-af38-aca9655ac1d7-host\") pod \"node-ca-rgk7n\" (UID: \"07db09ff-2489-4357-af38-aca9655ac1d7\") " pod="openshift-image-registry/node-ca-rgk7n" Dec 02 22:42:43 crc kubenswrapper[4696]: I1202 22:42:43.144166 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wthxr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"86a37d2a-37c5-4fbd-b10b-f5e4706772f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5ddd341a3a0b7b1d959ce57bb4d38153a4ca3ae5ec40ded309e0697db2be28d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xqg5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wthxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:43Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:43 crc kubenswrapper[4696]: I1202 22:42:43.145025 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/07db09ff-2489-4357-af38-aca9655ac1d7-serviceca\") pod \"node-ca-rgk7n\" (UID: \"07db09ff-2489-4357-af38-aca9655ac1d7\") " pod="openshift-image-registry/node-ca-rgk7n" Dec 02 22:42:43 crc kubenswrapper[4696]: I1202 22:42:43.164172 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1db9cf23af7fc9ff1373908e11df0bea7d6e03cd3ed18947cc583fc09a79a05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:43Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:43 crc kubenswrapper[4696]: I1202 22:42:43.164913 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdd8x\" (UniqueName: \"kubernetes.io/projected/07db09ff-2489-4357-af38-aca9655ac1d7-kube-api-access-fdd8x\") pod \"node-ca-rgk7n\" (UID: \"07db09ff-2489-4357-af38-aca9655ac1d7\") " pod="openshift-image-registry/node-ca-rgk7n" Dec 02 22:42:43 crc kubenswrapper[4696]: I1202 22:42:43.195507 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"158d80a5-8d90-46bb-83e7-ab3263f6d28f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3112751f86a62634d02f281d467d3eeb27878e27c245871210cca37eaeb6e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce744ea765d5bbcb0a9926f439a0256774c16863344f23ddfb0074b4ccbdb4ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://1108d710330b056e2228205b2012df070ffbd660701ddd577eafbf0e99c3bf9f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d92ce9d53d9367503c76c5fbb45f340a5ddd93f3c8e1738ac434fe7b0210d1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6268352b995d99d383ab500b5ccd80b0612cf33716a752aeb1d92cb775a1971e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"le observer\\\\nW1202 22:42:37.965183 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 22:42:37.965319 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 22:42:37.967400 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2369937012/tls.crt::/tmp/serving-cert-2369937012/tls.key\\\\\\\"\\\\nI1202 22:42:38.329833 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 22:42:38.331704 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 22:42:38.331724 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 22:42:38.331766 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 22:42:38.331782 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 22:42:38.336692 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 22:42:38.336734 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:42:38.336761 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:42:38.336769 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 22:42:38.336775 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 22:42:38.336779 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 22:42:38.336783 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 22:42:38.337216 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 22:42:38.338194 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://666f81c81a7cad67c01b2aebf2d05f162c33feea27f64c12f92761d6d2a6056e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055bef75732eec4908f0baf8a52cd3921fab03cf88e36d2e0d442344fb1af0c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://055bef75732eec4908f0baf8a52cd3921fab03cf88e36d2e0d442344fb1af0c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:43Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:43 crc kubenswrapper[4696]: I1202 22:42:43.215393 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:43Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:43 crc kubenswrapper[4696]: I1202 22:42:43.229203 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f57qk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf8ea236-9be9-4eb2-904a-103c4c279f28\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3106b29a0eb0a17827fd331aefd6c00b0c58f4ec0d0ca778bd3f4842730d5f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vxprw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f57qk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:43Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:43 crc kubenswrapper[4696]: I1202 22:42:43.248498 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3172221-5c4c-4409-b50f-5cd21d8b5030\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f25e7ebcca1d904cd0940a660fd97c6b4a3540fd63700abd0ad973e9f9c8f764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08fc65f2c8162f27bc40cd12b5801ca21c778313271f2cb5dd251ca4312842d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba390cca9997cf469ae88a4895532df556f0b947ce7b34ef7e1992ee236a2e16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd
/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0ccd6eaff0fa7ac496aadc31afffd6e7fee2776e9cfd1652af85fcec268d895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b2deed2650220f1d607cbb667b26879d3a8373146244feb486c882032f344b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://4a565cb3800c1a50f784c327719158de2f6f29d20a33670478b97c1c360e0f6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a565cb3800c1a50f784c327719158de2f6f29d20a33670478b97c1c360e0f6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29a2b9cc76fc45b6874d85b2ab92165a02804807837501ac60190c232bab928d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29a2b9cc76fc45b6874d85b2ab92165a02804807837501ac60190c232bab928d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fb3311ccf2a924605464d657193627798dad26a232fab0a2091fb2c387f3bd0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb3311ccf2a924605464d657193627798dad26a232fab0a2091fb2c387f3bd0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:43Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:43 crc kubenswrapper[4696]: I1202 22:42:43.253528 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-rgk7n" Dec 02 22:42:43 crc kubenswrapper[4696]: I1202 22:42:43.264870 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:43Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:43 crc kubenswrapper[4696]: W1202 22:42:43.267346 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod07db09ff_2489_4357_af38_aca9655ac1d7.slice/crio-bc1e78377af0a9422b1270073eb5eb564cf0d940fff59cf2d0d0b0ca254dac65 WatchSource:0}: Error finding container bc1e78377af0a9422b1270073eb5eb564cf0d940fff59cf2d0d0b0ca254dac65: Status 404 returned error can't find the container with id bc1e78377af0a9422b1270073eb5eb564cf0d940fff59cf2d0d0b0ca254dac65 Dec 02 22:42:43 crc kubenswrapper[4696]: I1202 22:42:43.289615 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://492cfa3fb8646c09da06d92ec8a303562da558d4d944eaf68ca95de6f5b94da8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://492cfa3fb8646c09da06d92ec8a303562da558d4d944eaf68ca95de6f5b94da8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qb2zq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:43Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:43 crc kubenswrapper[4696]: I1202 22:42:43.310180 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:43Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:43 crc kubenswrapper[4696]: I1202 22:42:43.765268 4696 generic.go:334] "Generic (PLEG): container finished" podID="5a0d758e-6da6-4382-99f1-dd295b63eb98" containerID="7d4a10437554cfb08f90a48d457ef157309ed0ab379c5b26deb62d3e8b73291b" exitCode=0 Dec 02 22:42:43 crc kubenswrapper[4696]: I1202 22:42:43.765367 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sbjst" 
event={"ID":"5a0d758e-6da6-4382-99f1-dd295b63eb98","Type":"ContainerDied","Data":"7d4a10437554cfb08f90a48d457ef157309ed0ab379c5b26deb62d3e8b73291b"} Dec 02 22:42:43 crc kubenswrapper[4696]: I1202 22:42:43.769489 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-rgk7n" event={"ID":"07db09ff-2489-4357-af38-aca9655ac1d7","Type":"ContainerStarted","Data":"9928cbc6ed1d9bb5d0280a8d3264d4490c601fc7d09d6f293f1403615cc4cde5"} Dec 02 22:42:43 crc kubenswrapper[4696]: I1202 22:42:43.769534 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-rgk7n" event={"ID":"07db09ff-2489-4357-af38-aca9655ac1d7","Type":"ContainerStarted","Data":"bc1e78377af0a9422b1270073eb5eb564cf0d940fff59cf2d0d0b0ca254dac65"} Dec 02 22:42:43 crc kubenswrapper[4696]: I1202 22:42:43.815495 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27fb05354e4e9a6ad09946fad7dbabf07456fe56a9a66578fda0eb71dde2d199\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39a
ed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:43Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:43 crc kubenswrapper[4696]: I1202 22:42:43.816112 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" event={"ID":"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b","Type":"ContainerStarted","Data":"64b0c2025c377ab19b3853017c94cb501f472f2485093afc60749f7739ba25d7"} Dec 02 22:42:43 crc kubenswrapper[4696]: I1202 22:42:43.816251 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" event={"ID":"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b","Type":"ContainerStarted","Data":"22ce112f515d4ef59dca4c21bdc8623e88a8a0df222fb2ed0990a2f68e3f75f9"} Dec 02 22:42:43 crc kubenswrapper[4696]: I1202 22:42:43.816276 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" 
event={"ID":"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b","Type":"ContainerStarted","Data":"85fbce2a7ea0550f53c1216216022e60a4e2eae0bb1d5c88254ace025fbd4503"} Dec 02 22:42:43 crc kubenswrapper[4696]: I1202 22:42:43.816297 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" event={"ID":"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b","Type":"ContainerStarted","Data":"36feb7e9f6a6bad97e87fe54ff9b661be62f6f4962352d55bcc905b80320a48d"} Dec 02 22:42:43 crc kubenswrapper[4696]: I1202 22:42:43.816347 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" event={"ID":"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b","Type":"ContainerStarted","Data":"52f3d5529aafac7c27c963a50b0318e99d9a9ca35e142d4c273431105c495e68"} Dec 02 22:42:43 crc kubenswrapper[4696]: I1202 22:42:43.816371 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" event={"ID":"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b","Type":"ContainerStarted","Data":"32717a82edb718f5e690fcecd578f48279d134a5dc12f00824fa25f7ec727156"} Dec 02 22:42:43 crc kubenswrapper[4696]: I1202 22:42:43.864517 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wthxr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"86a37d2a-37c5-4fbd-b10b-f5e4706772f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5ddd341a3a0b7b1d959ce57bb4d38153a4ca3ae5ec40ded309e0697db2be28d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xqg5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wthxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:43Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:43 crc kubenswrapper[4696]: I1202 22:42:43.877482 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1db9cf23af7fc9ff1373908e11df0bea7d6e03cd3ed18947cc583fc09a79a05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:43Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:43 crc kubenswrapper[4696]: I1202 22:42:43.890381 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce2fa3867298545942c4a2d2e68899c61a4f99a64b0974e75ef86851074ddfce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\
\"containerID\\\":\\\"cri-o://4db552c4bc9b395f977cb4c1b1b7fcbb1c7e9f326d81a7903d88eab180bcfb3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:43Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:43 crc kubenswrapper[4696]: I1202 22:42:43.905356 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:43Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:43 crc kubenswrapper[4696]: I1202 22:42:43.921192 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f57qk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf8ea236-9be9-4eb2-904a-103c4c279f28\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3106b29a0eb0a17827fd331aefd6c00b0c58f4ec0d0ca778bd3f4842730d5f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vxprw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f57qk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:43Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:43 crc kubenswrapper[4696]: I1202 22:42:43.948649 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3172221-5c4c-4409-b50f-5cd21d8b5030\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f25e7ebcca1d904cd0940a660fd97c6b4a3540fd63700abd0ad973e9f9c8f764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08fc65f2c8162f27bc40cd12b5801ca21c778313271f2cb5dd251ca4312842d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba390cca9997cf469ae88a4895532df556f0b947ce7b34ef7e1992ee236a2e16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd
/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0ccd6eaff0fa7ac496aadc31afffd6e7fee2776e9cfd1652af85fcec268d895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b2deed2650220f1d607cbb667b26879d3a8373146244feb486c882032f344b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://4a565cb3800c1a50f784c327719158de2f6f29d20a33670478b97c1c360e0f6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a565cb3800c1a50f784c327719158de2f6f29d20a33670478b97c1c360e0f6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29a2b9cc76fc45b6874d85b2ab92165a02804807837501ac60190c232bab928d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29a2b9cc76fc45b6874d85b2ab92165a02804807837501ac60190c232bab928d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fb3311ccf2a924605464d657193627798dad26a232fab0a2091fb2c387f3bd0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb3311ccf2a924605464d657193627798dad26a232fab0a2091fb2c387f3bd0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:43Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:43 crc kubenswrapper[4696]: I1202 22:42:43.965792 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"158d80a5-8d90-46bb-83e7-ab3263f6d28f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3112751f86a62634d02f281d467d3eeb27878e27c245871210cca37eaeb6e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce744ea765d5bbcb0a9926f439a0256774c16863344f23ddfb0074b4ccbdb4ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1108d710330b056e2228205b2012df070ffbd660701ddd577eafbf0e99c3bf9f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d92ce9d53d9367503c76c5fbb45f340a5ddd93f3c8e1738ac434fe7b0210d1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6268352b995d99d383ab500b5ccd80b0612cf33716a752aeb1d92cb775a1971e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"le observer\\\\nW1202 22:42:37.965183 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 22:42:37.965319 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 22:42:37.967400 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2369937012/tls.crt::/tmp/serving-cert-2369937012/tls.key\\\\\\\"\\\\nI1202 22:42:38.329833 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 22:42:38.331704 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 22:42:38.331724 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 22:42:38.331766 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 22:42:38.331782 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 22:42:38.336692 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 22:42:38.336734 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:42:38.336761 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:42:38.336769 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 22:42:38.336775 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 22:42:38.336779 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 22:42:38.336783 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 22:42:38.337216 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 22:42:38.338194 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://666f81c81a7cad67c01b2aebf2d05f162c33feea27f64c12f92761d6d2a6056e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055bef75732eec4908f0baf8a52cd3921fab03cf88e36d2e0d442344fb1af0c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://055bef75732eec4908f0baf8a52cd3921fab03cf88e36d2e0d442344fb1af0c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:43Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:43 crc kubenswrapper[4696]: I1202 22:42:43.987595 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://492cfa3fb8646c09da06d92ec8a303562da558d4d944eaf68ca95de6f5b94da8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://492cfa3fb8646c09da06d92ec8a303562da558d4d944eaf68ca95de6f5b94da8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qb2zq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:43Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:44 crc kubenswrapper[4696]: I1202 22:42:44.005077 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:44Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:44 crc kubenswrapper[4696]: I1202 22:42:44.022370 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:44Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:44 crc kubenswrapper[4696]: I1202 22:42:44.042006 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sbjst" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a0d758e-6da6-4382-99f1-dd295b63eb98\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://495f007b6af2d409e3ed6cf70dfa90aec5a38c34361c9da9c4fd71d7732cdd76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://495f007b6af2d409e3ed6cf70dfa90aec5a38c34361c9da9c4fd71d7732cdd76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5b3b6699e2f0b9a8d0f32c1f97c9b3e16d9f5479ad2f355633c25897dbe1bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5b3b6699e2f0b9a8d0f32c1f97c9b3e16d9f5479ad2f355633c25897dbe1bb1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d4a10437554cfb08f90a48d457ef157309ed0ab379c5b26deb62d3e8b73291b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d4a10437554cfb08f90a48d457ef157309ed0ab379c5b26deb62d3e8b73291b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\
\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sbjst\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:44Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:44 crc kubenswrapper[4696]: I1202 22:42:44.054447 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-chq65" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53353260-c7c9-435c-91eb-3d5a1b441c4a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2d1f154b9c9139bf991adf86bd99e7ff14510c24ae93b87c1c184437a087f17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d
66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7tfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa016430a0a36d0628e0e5b0c4baf2fa724c888116e5d70517c8ffc4be1c37a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7tfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-chq65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:44Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:44 crc kubenswrapper[4696]: I1202 22:42:44.071590 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rgk7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07db09ff-2489-4357-af38-aca9655ac1d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdd8x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rgk7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:44Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:44 crc kubenswrapper[4696]: I1202 22:42:44.094762 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"05940584-6baa-49e1-8959-610d50b6eeb5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b361514e74dd950d7c3e1da82304b7c75603829ce7bf03c1fbccab1607d65800\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67bf899df90f18f36310994af135552aa9758f0460117d62d00fdc1de067ce79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d15263c6c1f8208a998cbacdba376a2eb3b2e16134e8fc40f7d56d36daae5ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2fa27b4eeb8b50da5d49c905eb98070362e998c3e3e4aceac7cc09aaef5b5ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:44Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:44 crc kubenswrapper[4696]: I1202 22:42:44.108719 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"05940584-6baa-49e1-8959-610d50b6eeb5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b361514e74dd950d7c3e1da82304b7c75603829ce7bf03c1fbccab1607d65800\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67bf899df90f18f36310994af135552aa9758f0460117d62d00fdc1de067ce79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d15263c6c1f8208a998cbacdba376a2eb3b2e16134e8fc40f7d56d36daae5ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2fa27b4eeb8b50da5d49c905eb98070362e998c3e3e4aceac7cc09aaef5b5ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:44Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:44 crc kubenswrapper[4696]: I1202 22:42:44.125591 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sbjst" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a0d758e-6da6-4382-99f1-dd295b63eb98\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://495f007b6af2d409e3ed6cf70dfa90aec5a38c34361c9da9c4fd71d7732cdd76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://495f007b6af2d409e3ed6cf70dfa90aec5a38c34361c9da9c4fd71d7732cdd76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5b3b6699e2f0b9a8d0f32c1f97c9b3e16d9f5479ad2f355633c25897dbe1bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5b3b6699e2f0b9a8d0f32c1f97c9b3e16d9f5479ad2f355633c25897dbe1bb1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d4a10437554cfb08f90a48d457ef157309ed0ab379c5b26deb62d3e8b73291b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d4a10437554cfb08f90a48d457ef157309ed0ab379c5b26deb62d3e8b73291b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\
\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sbjst\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:44Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:44 crc kubenswrapper[4696]: I1202 22:42:44.137956 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-chq65" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53353260-c7c9-435c-91eb-3d5a1b441c4a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2d1f154b9c9139bf991adf86bd99e7ff14510c24ae93b87c1c184437a087f17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d
66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7tfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa016430a0a36d0628e0e5b0c4baf2fa724c888116e5d70517c8ffc4be1c37a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7tfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-chq65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:44Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:44 crc kubenswrapper[4696]: I1202 22:42:44.151149 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rgk7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07db09ff-2489-4357-af38-aca9655ac1d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9928cbc6ed1d9bb5d0280a8d3264d4490c601fc7d09d6f293f1403615cc4cde5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\
",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdd8x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rgk7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:44Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:44 crc kubenswrapper[4696]: I1202 22:42:44.166071 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce2fa3867298545942c4a2d2e68899c61a4f99a64b0974e75ef86851074ddfce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4db552c4bc9b395f977cb4c1b1b7fcbb1c7e9f326d81a7903d88eab180bcfb3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:44Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:44 crc kubenswrapper[4696]: I1202 22:42:44.182813 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27fb05354e4e9a6ad09946fad7dbabf07456fe56a9a66578fda0eb71dde2d199\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T22:42:44Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:44 crc kubenswrapper[4696]: I1202 22:42:44.203305 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wthxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86a37d2a-37c5-4fbd-b10b-f5e4706772f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5ddd341a3a0b7b1d959ce57bb4d38153a4ca3ae5ec40ded309e0697db2be28d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xqg5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wthxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T22:42:44Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:44 crc kubenswrapper[4696]: I1202 22:42:44.217474 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1db9cf23af7fc9ff1373908e11df0bea7d6e03cd3ed18947cc583fc09a79a05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:44Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:44 crc kubenswrapper[4696]: I1202 22:42:44.231974 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"158d80a5-8d90-46bb-83e7-ab3263f6d28f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3112751f86a62634d02f281d467d3eeb27878e27c245871210cca37eaeb6e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce744ea765d5bbcb0a9926f439a0256774c16863344f23ddfb0074b4ccbdb4ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://1108d710330b056e2228205b2012df070ffbd660701ddd577eafbf0e99c3bf9f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d92ce9d53d9367503c76c5fbb45f340a5ddd93f3c8e1738ac434fe7b0210d1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6268352b995d99d383ab500b5ccd80b0612cf33716a752aeb1d92cb775a1971e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"le observer\\\\nW1202 22:42:37.965183 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 22:42:37.965319 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 22:42:37.967400 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2369937012/tls.crt::/tmp/serving-cert-2369937012/tls.key\\\\\\\"\\\\nI1202 22:42:38.329833 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 22:42:38.331704 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 22:42:38.331724 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 22:42:38.331766 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 22:42:38.331782 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 22:42:38.336692 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 22:42:38.336734 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:42:38.336761 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:42:38.336769 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 22:42:38.336775 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 22:42:38.336779 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 22:42:38.336783 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 22:42:38.337216 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 22:42:38.338194 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://666f81c81a7cad67c01b2aebf2d05f162c33feea27f64c12f92761d6d2a6056e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055bef75732eec4908f0baf8a52cd3921fab03cf88e36d2e0d442344fb1af0c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://055bef75732eec4908f0baf8a52cd3921fab03cf88e36d2e0d442344fb1af0c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:44Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:44 crc kubenswrapper[4696]: I1202 22:42:44.249380 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:44Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:44 crc kubenswrapper[4696]: I1202 22:42:44.259629 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f57qk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf8ea236-9be9-4eb2-904a-103c4c279f28\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3106b29a0eb0a17827fd331aefd6c00b0c58f4ec0d0ca778bd3f4842730d5f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vxprw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f57qk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:44Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:44 crc kubenswrapper[4696]: I1202 22:42:44.282944 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3172221-5c4c-4409-b50f-5cd21d8b5030\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f25e7ebcca1d904cd0940a660fd97c6b4a3540fd63700abd0ad973e9f9c8f764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08fc65f2c8162f27bc40cd12b5801ca21c778313271f2cb5dd251ca4312842d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba390cca9997cf469ae88a4895532df556f0b947ce7b34ef7e1992ee236a2e16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd
/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0ccd6eaff0fa7ac496aadc31afffd6e7fee2776e9cfd1652af85fcec268d895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b2deed2650220f1d607cbb667b26879d3a8373146244feb486c882032f344b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://4a565cb3800c1a50f784c327719158de2f6f29d20a33670478b97c1c360e0f6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a565cb3800c1a50f784c327719158de2f6f29d20a33670478b97c1c360e0f6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29a2b9cc76fc45b6874d85b2ab92165a02804807837501ac60190c232bab928d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29a2b9cc76fc45b6874d85b2ab92165a02804807837501ac60190c232bab928d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fb3311ccf2a924605464d657193627798dad26a232fab0a2091fb2c387f3bd0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb3311ccf2a924605464d657193627798dad26a232fab0a2091fb2c387f3bd0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:44Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:44 crc kubenswrapper[4696]: I1202 22:42:44.296345 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:44Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:44 crc kubenswrapper[4696]: I1202 22:42:44.314059 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://492cfa3fb8646c09da06d92ec8a303562da558d4d944eaf68ca95de6f5b94da8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://492cfa3fb8646c09da06d92ec8a303562da558d4d944eaf68ca95de6f5b94da8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qb2zq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:44Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:44 crc kubenswrapper[4696]: I1202 22:42:44.327261 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:44Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:44 crc kubenswrapper[4696]: I1202 22:42:44.360357 4696 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 22:42:44 crc kubenswrapper[4696]: I1202 22:42:44.363266 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:44 crc kubenswrapper[4696]: I1202 22:42:44.363330 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:44 crc kubenswrapper[4696]: I1202 22:42:44.363350 
4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:44 crc kubenswrapper[4696]: I1202 22:42:44.363445 4696 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 02 22:42:44 crc kubenswrapper[4696]: I1202 22:42:44.377604 4696 kubelet_node_status.go:115] "Node was previously registered" node="crc" Dec 02 22:42:44 crc kubenswrapper[4696]: I1202 22:42:44.378107 4696 kubelet_node_status.go:79] "Successfully registered node" node="crc" Dec 02 22:42:44 crc kubenswrapper[4696]: I1202 22:42:44.379705 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:44 crc kubenswrapper[4696]: I1202 22:42:44.379790 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:44 crc kubenswrapper[4696]: I1202 22:42:44.379808 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:44 crc kubenswrapper[4696]: I1202 22:42:44.379834 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:44 crc kubenswrapper[4696]: I1202 22:42:44.379850 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:44Z","lastTransitionTime":"2025-12-02T22:42:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:44 crc kubenswrapper[4696]: E1202 22:42:44.396791 4696 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:42:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:42:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:42:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:42:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b680025d-da08-4b46-a4a4-b21ac19e4f7b\\\",\\\"systemUUID\\\":\\\"be4c1bf2-c508-4b46-be45-7efaea566193\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:44Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:44 crc kubenswrapper[4696]: I1202 22:42:44.401780 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:44 crc kubenswrapper[4696]: I1202 22:42:44.401838 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:44 crc kubenswrapper[4696]: I1202 22:42:44.401854 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:44 crc kubenswrapper[4696]: I1202 22:42:44.401875 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:44 crc kubenswrapper[4696]: I1202 22:42:44.401888 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:44Z","lastTransitionTime":"2025-12-02T22:42:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:44 crc kubenswrapper[4696]: E1202 22:42:44.416205 4696 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:42:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:42:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:42:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:42:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b680025d-da08-4b46-a4a4-b21ac19e4f7b\\\",\\\"systemUUID\\\":\\\"be4c1bf2-c508-4b46-be45-7efaea566193\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:44Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:44 crc kubenswrapper[4696]: I1202 22:42:44.420513 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:44 crc kubenswrapper[4696]: I1202 22:42:44.420563 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:44 crc kubenswrapper[4696]: I1202 22:42:44.420578 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:44 crc kubenswrapper[4696]: I1202 22:42:44.420600 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:44 crc kubenswrapper[4696]: I1202 22:42:44.420615 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:44Z","lastTransitionTime":"2025-12-02T22:42:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:42:44 crc kubenswrapper[4696]: I1202 22:42:44.430668 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:42:44 crc kubenswrapper[4696]: I1202 22:42:44.430674 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:42:44 crc kubenswrapper[4696]: I1202 22:42:44.430827 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:42:44 crc kubenswrapper[4696]: E1202 22:42:44.430860 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 22:42:44 crc kubenswrapper[4696]: E1202 22:42:44.430980 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 22:42:44 crc kubenswrapper[4696]: E1202 22:42:44.431177 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 22:42:44 crc kubenswrapper[4696]: E1202 22:42:44.435615 4696 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:42:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:42:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:42:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:42:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b680025d-da08-4b46-a4a4-b21ac19e4f7b\\\",\\\"systemUUID\\\":\\\"be4c1bf2-c508-4b46-be45-7efaea566193\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:44Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:44 crc kubenswrapper[4696]: I1202 22:42:44.440081 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:44 crc kubenswrapper[4696]: I1202 22:42:44.440133 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:44 crc kubenswrapper[4696]: I1202 22:42:44.440152 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:44 crc kubenswrapper[4696]: I1202 22:42:44.440177 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:44 crc kubenswrapper[4696]: I1202 22:42:44.440193 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:44Z","lastTransitionTime":"2025-12-02T22:42:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:44 crc kubenswrapper[4696]: E1202 22:42:44.458690 4696 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:42:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:42:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:42:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:42:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b680025d-da08-4b46-a4a4-b21ac19e4f7b\\\",\\\"systemUUID\\\":\\\"be4c1bf2-c508-4b46-be45-7efaea566193\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:44Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:44 crc kubenswrapper[4696]: I1202 22:42:44.464021 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:44 crc kubenswrapper[4696]: I1202 22:42:44.464084 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:44 crc kubenswrapper[4696]: I1202 22:42:44.464106 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:44 crc kubenswrapper[4696]: I1202 22:42:44.464135 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:44 crc kubenswrapper[4696]: I1202 22:42:44.464156 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:44Z","lastTransitionTime":"2025-12-02T22:42:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:44 crc kubenswrapper[4696]: E1202 22:42:44.479703 4696 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:42:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:42:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:42:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:42:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b680025d-da08-4b46-a4a4-b21ac19e4f7b\\\",\\\"systemUUID\\\":\\\"be4c1bf2-c508-4b46-be45-7efaea566193\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:44Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:44 crc kubenswrapper[4696]: E1202 22:42:44.479969 4696 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 02 22:42:44 crc kubenswrapper[4696]: I1202 22:42:44.482906 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:44 crc kubenswrapper[4696]: I1202 22:42:44.482988 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:44 crc kubenswrapper[4696]: I1202 22:42:44.483006 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:44 crc kubenswrapper[4696]: I1202 22:42:44.483068 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:44 crc kubenswrapper[4696]: I1202 22:42:44.483086 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:44Z","lastTransitionTime":"2025-12-02T22:42:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:42:44 crc kubenswrapper[4696]: I1202 22:42:44.587142 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:44 crc kubenswrapper[4696]: I1202 22:42:44.587231 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:44 crc kubenswrapper[4696]: I1202 22:42:44.587244 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:44 crc kubenswrapper[4696]: I1202 22:42:44.587263 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:44 crc kubenswrapper[4696]: I1202 22:42:44.587277 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:44Z","lastTransitionTime":"2025-12-02T22:42:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:44 crc kubenswrapper[4696]: I1202 22:42:44.692485 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:44 crc kubenswrapper[4696]: I1202 22:42:44.692979 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:44 crc kubenswrapper[4696]: I1202 22:42:44.693197 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:44 crc kubenswrapper[4696]: I1202 22:42:44.693342 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:44 crc kubenswrapper[4696]: I1202 22:42:44.693476 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:44Z","lastTransitionTime":"2025-12-02T22:42:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:44 crc kubenswrapper[4696]: I1202 22:42:44.797591 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:44 crc kubenswrapper[4696]: I1202 22:42:44.797663 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:44 crc kubenswrapper[4696]: I1202 22:42:44.797683 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:44 crc kubenswrapper[4696]: I1202 22:42:44.797711 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:44 crc kubenswrapper[4696]: I1202 22:42:44.797730 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:44Z","lastTransitionTime":"2025-12-02T22:42:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:44 crc kubenswrapper[4696]: I1202 22:42:44.824003 4696 generic.go:334] "Generic (PLEG): container finished" podID="5a0d758e-6da6-4382-99f1-dd295b63eb98" containerID="76b52b0824f65df456ead0ca134ada0a6c3120382a3cce18f0a5dfdcc98fa9ae" exitCode=0 Dec 02 22:42:44 crc kubenswrapper[4696]: I1202 22:42:44.824096 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sbjst" event={"ID":"5a0d758e-6da6-4382-99f1-dd295b63eb98","Type":"ContainerDied","Data":"76b52b0824f65df456ead0ca134ada0a6c3120382a3cce18f0a5dfdcc98fa9ae"} Dec 02 22:42:44 crc kubenswrapper[4696]: I1202 22:42:44.862540 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1db9cf23af7fc9ff1373908e11df0bea7d6e03cd3ed18947cc583fc09a79a05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCoun
t\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:44Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:44 crc kubenswrapper[4696]: I1202 22:42:44.886647 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce2fa3867298545942c4a2d2e68899c61a4f99a64b0974e75ef86851074ddfce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4db552c4bc9b395f977cb4c1b1b7fcbb1c7e9f326d81a7903d88eab180bcfb3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:44Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:44 crc kubenswrapper[4696]: I1202 22:42:44.902735 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:44 crc kubenswrapper[4696]: I1202 22:42:44.902851 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:44 crc kubenswrapper[4696]: I1202 22:42:44.902875 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:44 crc kubenswrapper[4696]: I1202 22:42:44.902911 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:44 crc kubenswrapper[4696]: I1202 22:42:44.902939 4696 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:44Z","lastTransitionTime":"2025-12-02T22:42:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:42:44 crc kubenswrapper[4696]: I1202 22:42:44.905074 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27fb05354e4e9a6ad09946fad7dbabf07456fe56a9a66578fda0eb71dde2d199\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:44Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:44 crc kubenswrapper[4696]: I1202 22:42:44.924203 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wthxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86a37d2a-37c5-4fbd-b10b-f5e4706772f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5ddd341a3a0b7b1d959ce57bb4d38153a4ca3ae5ec40ded309e0697db2be28d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xqg5z\\\",\\\"readOnly\\\":true
,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wthxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:44Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:44 crc kubenswrapper[4696]: I1202 22:42:44.939878 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f57qk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf8ea236-9be9-4eb2-904a-103c4c279f28\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3106b29a0eb0a17827fd331aefd6c00b0c58f4ec0d0ca778bd3f4842730d5f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vxprw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f57qk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:44Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:44 crc kubenswrapper[4696]: I1202 22:42:44.967866 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3172221-5c4c-4409-b50f-5cd21d8b5030\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f25e7ebcca1d904cd0940a660fd97c6b4a3540fd63700abd0ad973e9f9c8f764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08fc65f2c8162f27bc40cd12b5801ca21c778313271f2cb5dd251ca4312842d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba390cca9997cf469ae88a4895532df556f0b947ce7b34ef7e1992ee236a2e16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0ccd6eaff0fa7ac496aadc31afffd6e7fee2776e9cfd1652af85fcec268d895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b2deed2650220f1d607cbb667b26879d3a8373146244feb486c882032f344b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a565cb3800c1a50f784c327719158de2f6f29d20a33670478b97c1c360e0f6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a565cb3800c1a50f784c327719158de2f6f29d20a33670478b97c1c360e0f6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-02T22:42:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29a2b9cc76fc45b6874d85b2ab92165a02804807837501ac60190c232bab928d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29a2b9cc76fc45b6874d85b2ab92165a02804807837501ac60190c232bab928d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fb3311ccf2a924605464d657193627798dad26a232fab0a2091fb2c387f3bd0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb3311ccf2a924605464d657193627798dad26a232fab0a2091fb2c387f3bd0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:44Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:44 crc kubenswrapper[4696]: I1202 22:42:44.988585 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"158d80a5-8d90-46bb-83e7-ab3263f6d28f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3112751f86a62634d02f281d467d3eeb27878e27c245871210cca37eaeb6e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce744ea765d5bbcb0a9926f439a0256774c16863344f23ddfb0074b4ccbdb4ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://1108d710330b056e2228205b2012df070ffbd660701ddd577eafbf0e99c3bf9f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d92ce9d53d9367503c76c5fbb45f340a5ddd93f3c8e1738ac434fe7b0210d1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6268352b995d99d383ab500b5ccd80b0612cf33716a752aeb1d92cb775a1971e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"le observer\\\\nW1202 22:42:37.965183 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 22:42:37.965319 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 22:42:37.967400 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2369937012/tls.crt::/tmp/serving-cert-2369937012/tls.key\\\\\\\"\\\\nI1202 22:42:38.329833 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 22:42:38.331704 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 22:42:38.331724 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 22:42:38.331766 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 22:42:38.331782 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 22:42:38.336692 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 22:42:38.336734 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:42:38.336761 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:42:38.336769 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 22:42:38.336775 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 22:42:38.336779 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 22:42:38.336783 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 22:42:38.337216 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 22:42:38.338194 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://666f81c81a7cad67c01b2aebf2d05f162c33feea27f64c12f92761d6d2a6056e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055bef75732eec4908f0baf8a52cd3921fab03cf88e36d2e0d442344fb1af0c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://055bef75732eec4908f0baf8a52cd3921fab03cf88e36d2e0d442344fb1af0c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:44Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:45 crc kubenswrapper[4696]: I1202 22:42:45.006787 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:45 crc kubenswrapper[4696]: I1202 22:42:45.006815 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:45 crc kubenswrapper[4696]: I1202 22:42:45.006823 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:45 crc kubenswrapper[4696]: I1202 22:42:45.006838 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:45 crc kubenswrapper[4696]: I1202 22:42:45.006848 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:45Z","lastTransitionTime":"2025-12-02T22:42:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:45 crc kubenswrapper[4696]: I1202 22:42:45.006888 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:45Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:45 crc kubenswrapper[4696]: I1202 22:42:45.019388 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:45Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:45 crc kubenswrapper[4696]: I1202 22:42:45.031839 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:45Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:45 crc kubenswrapper[4696]: I1202 22:42:45.061812 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://492cfa3fb8646c09da06d92ec8a303562da558d4d944eaf68ca95de6f5b94da8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://492cfa3fb8646c09da06d92ec8a303562da558d4d944eaf68ca95de6f5b94da8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qb2zq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:45Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:45 crc kubenswrapper[4696]: I1202 22:42:45.074317 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rgk7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07db09ff-2489-4357-af38-aca9655ac1d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9928cbc6ed1d9bb5d0280a8d3264d4490c601fc7d09d6f293f1403615cc4cde5\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdd8x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rgk7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:45Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:45 crc kubenswrapper[4696]: I1202 22:42:45.101425 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"05940584-6baa-49e1-8959-610d50b6eeb5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b361514e74dd950d7c3e1da82304b7c75603829ce7bf03c1fbccab1607d65800\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67bf899df90f18f36310994af135552aa9758f0460117d62d00fdc1de067ce79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d15263c6c1f8208a998cbacdba376a2eb3b2e16134e8fc40f7d56d36daae5ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2fa27b4eeb8b50da5d49c905eb98070362e998c3e3e4aceac7cc09aaef5b5ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:45Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:45 crc kubenswrapper[4696]: I1202 22:42:45.109977 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:45 crc kubenswrapper[4696]: I1202 22:42:45.110074 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:45 crc kubenswrapper[4696]: I1202 22:42:45.110099 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:45 crc kubenswrapper[4696]: I1202 22:42:45.110132 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:45 crc kubenswrapper[4696]: I1202 22:42:45.110153 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:45Z","lastTransitionTime":"2025-12-02T22:42:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:42:45 crc kubenswrapper[4696]: I1202 22:42:45.119297 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sbjst" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a0d758e-6da6-4382-99f1-dd295b63eb98\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://495f007b6af2d409e3ed6cf70dfa90aec5a38c34361c9da9c4fd71d7732cdd76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://495f007b6af2d409e3ed6cf70dfa90aec5a38c34361c9da9c4fd71d7732cdd76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5b3b6699e2f0b9a8d0f32c1f97c9b3e16d9f5479ad2f355633c25897dbe1bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5b3b6699e2f0b9a8d0f32c1f97c9b3e16d9f5479ad2f355633c25897dbe1bb1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d4a10437554cfb08f90a48d457ef157309ed0ab379c5b26deb62d3e8b73291b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d4a10437554cfb08f90a48d457ef157309ed0ab379c5b26deb62d3e8b73291b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76b52b0824f65df456ead0ca134ada0a6c3120382a3cce18f0a5dfdcc98fa9ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76b52b0824f65df456ead0ca134ada0a6c3120382a3cce18f0a5dfdcc98fa9ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-sbjst\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:45Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:45 crc kubenswrapper[4696]: I1202 22:42:45.147222 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-chq65" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53353260-c7c9-435c-91eb-3d5a1b441c4a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2d1f154b9c9139bf991adf86bd99e7ff14510c24ae93b87c1c184437a087f17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-pr
oxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7tfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa016430a0a36d0628e0e5b0c4baf2fa724c888116e5d70517c8ffc4be1c37a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7tfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-chq65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-12-02T22:42:45Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:45 crc kubenswrapper[4696]: I1202 22:42:45.212988 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:45 crc kubenswrapper[4696]: I1202 22:42:45.213029 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:45 crc kubenswrapper[4696]: I1202 22:42:45.213041 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:45 crc kubenswrapper[4696]: I1202 22:42:45.213061 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:45 crc kubenswrapper[4696]: I1202 22:42:45.213073 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:45Z","lastTransitionTime":"2025-12-02T22:42:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:45 crc kubenswrapper[4696]: I1202 22:42:45.316395 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:45 crc kubenswrapper[4696]: I1202 22:42:45.316465 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:45 crc kubenswrapper[4696]: I1202 22:42:45.316485 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:45 crc kubenswrapper[4696]: I1202 22:42:45.316513 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:45 crc kubenswrapper[4696]: I1202 22:42:45.316533 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:45Z","lastTransitionTime":"2025-12-02T22:42:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:45 crc kubenswrapper[4696]: I1202 22:42:45.419438 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:45 crc kubenswrapper[4696]: I1202 22:42:45.419541 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:45 crc kubenswrapper[4696]: I1202 22:42:45.419559 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:45 crc kubenswrapper[4696]: I1202 22:42:45.419591 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:45 crc kubenswrapper[4696]: I1202 22:42:45.419610 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:45Z","lastTransitionTime":"2025-12-02T22:42:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:45 crc kubenswrapper[4696]: I1202 22:42:45.523484 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:45 crc kubenswrapper[4696]: I1202 22:42:45.523578 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:45 crc kubenswrapper[4696]: I1202 22:42:45.523598 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:45 crc kubenswrapper[4696]: I1202 22:42:45.523624 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:45 crc kubenswrapper[4696]: I1202 22:42:45.523643 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:45Z","lastTransitionTime":"2025-12-02T22:42:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:45 crc kubenswrapper[4696]: I1202 22:42:45.626322 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:45 crc kubenswrapper[4696]: I1202 22:42:45.626371 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:45 crc kubenswrapper[4696]: I1202 22:42:45.626384 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:45 crc kubenswrapper[4696]: I1202 22:42:45.626405 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:45 crc kubenswrapper[4696]: I1202 22:42:45.626420 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:45Z","lastTransitionTime":"2025-12-02T22:42:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:45 crc kubenswrapper[4696]: I1202 22:42:45.729093 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:45 crc kubenswrapper[4696]: I1202 22:42:45.729149 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:45 crc kubenswrapper[4696]: I1202 22:42:45.729158 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:45 crc kubenswrapper[4696]: I1202 22:42:45.729177 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:45 crc kubenswrapper[4696]: I1202 22:42:45.729190 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:45Z","lastTransitionTime":"2025-12-02T22:42:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:45 crc kubenswrapper[4696]: I1202 22:42:45.831413 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:45 crc kubenswrapper[4696]: I1202 22:42:45.831478 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:45 crc kubenswrapper[4696]: I1202 22:42:45.831501 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:45 crc kubenswrapper[4696]: I1202 22:42:45.831531 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:45 crc kubenswrapper[4696]: I1202 22:42:45.831552 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:45Z","lastTransitionTime":"2025-12-02T22:42:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:45 crc kubenswrapper[4696]: I1202 22:42:45.834098 4696 generic.go:334] "Generic (PLEG): container finished" podID="5a0d758e-6da6-4382-99f1-dd295b63eb98" containerID="d38fb242bd536b1359095c21c4df58b5a4c64058cfe986b38a581ae0c40a29d8" exitCode=0 Dec 02 22:42:45 crc kubenswrapper[4696]: I1202 22:42:45.834174 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sbjst" event={"ID":"5a0d758e-6da6-4382-99f1-dd295b63eb98","Type":"ContainerDied","Data":"d38fb242bd536b1359095c21c4df58b5a4c64058cfe986b38a581ae0c40a29d8"} Dec 02 22:42:45 crc kubenswrapper[4696]: I1202 22:42:45.839473 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" event={"ID":"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b","Type":"ContainerStarted","Data":"23b80358b654eec7dc1057ef35a6c09f88ccd7aca7a05336a3f29e7036bd55f8"} Dec 02 22:42:45 crc kubenswrapper[4696]: I1202 22:42:45.857725 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1db9cf23af7fc9ff1373908e11df0bea7d6e03cd3ed18947cc583fc09a79a05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:45Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:45 crc kubenswrapper[4696]: I1202 22:42:45.877420 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce2fa3867298545942c4a2d2e68899c61a4f99a64b0974e75ef86851074ddfce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://4db552c4bc9b395f977cb4c1b1b7fcbb1c7e9f326d81a7903d88eab180bcfb3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:45Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:45 crc kubenswrapper[4696]: I1202 22:42:45.891215 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27fb05354e4e9a6ad09946fad7dbabf07456fe56a9a66578fda0eb71dde2d199\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T22:42:45Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:45 crc kubenswrapper[4696]: I1202 22:42:45.910544 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wthxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86a37d2a-37c5-4fbd-b10b-f5e4706772f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5ddd341a3a0b7b1d959ce57bb4d38153a4ca3ae5ec40ded309e0697db2be28d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xqg5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wthxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T22:42:45Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:45 crc kubenswrapper[4696]: I1202 22:42:45.924929 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f57qk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf8ea236-9be9-4eb2-904a-103c4c279f28\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3106b29a0eb0a17827fd331aefd6c00b0c58f4ec0d0ca778bd3f4842730d5f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-vxprw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f57qk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:45Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:45 crc kubenswrapper[4696]: I1202 22:42:45.933869 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:45 crc kubenswrapper[4696]: I1202 22:42:45.933904 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:45 crc kubenswrapper[4696]: I1202 22:42:45.933917 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:45 crc kubenswrapper[4696]: I1202 22:42:45.933937 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:45 crc kubenswrapper[4696]: I1202 22:42:45.933950 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:45Z","lastTransitionTime":"2025-12-02T22:42:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:45 crc kubenswrapper[4696]: I1202 22:42:45.951872 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3172221-5c4c-4409-b50f-5cd21d8b5030\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f25e7ebcca1d904cd0940a660fd97c6b4a3540fd63700abd0ad973e9f9c8f764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08fc65f2c8162f27bc40cd12b5801ca21c778313271f2cb5dd251ca4312842d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba390cca9997cf469ae88a4895532df556f0b947ce7b34ef7e1992ee236a2e16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0ccd6eaff0fa7ac496aadc31afffd6e7fee2776e9cfd1652af85fcec268d895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b2deed2650220f1d607cbb667b26879d3a8373146244feb486c882032f344b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a565cb3800c1a50f784c327719158de2f6f29d20a33670478b97c1c360e0f6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a565cb3800c1a50f784c327719158de2f6f29d20a33670478b97c1c360e0f6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29a2b9cc76fc45b6874d85b2ab92165a02804807837501ac60190c232bab928d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29a2b9cc76fc45b6874d85b2ab92165a02804807837501ac60190c232bab928d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fb3311ccf2a924605464d657193627798dad26a232fab0a2091fb2c387f3bd0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb3311ccf2a924605464d657193627798dad26a232fab0a2091fb2c387f3bd0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:45Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:45 crc kubenswrapper[4696]: I1202 22:42:45.968610 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"158d80a5-8d90-46bb-83e7-ab3263f6d28f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3112751f86a62634d02f281d467d3eeb27878e27c245871210cca37eaeb6e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce744ea765d5bbcb0a9926f439a0256774c16863344f23ddfb0074b4ccbdb4ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://1108d710330b056e2228205b2012df070ffbd660701ddd577eafbf0e99c3bf9f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d92ce9d53d9367503c76c5fbb45f340a5ddd93f3c8e1738ac434fe7b0210d1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6268352b995d99d383ab500b5ccd80b0612cf33716a752aeb1d92cb775a1971e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"le observer\\\\nW1202 22:42:37.965183 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 22:42:37.965319 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 22:42:37.967400 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2369937012/tls.crt::/tmp/serving-cert-2369937012/tls.key\\\\\\\"\\\\nI1202 22:42:38.329833 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 22:42:38.331704 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 22:42:38.331724 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 22:42:38.331766 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 22:42:38.331782 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 22:42:38.336692 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 22:42:38.336734 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:42:38.336761 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:42:38.336769 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 22:42:38.336775 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 22:42:38.336779 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 22:42:38.336783 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 22:42:38.337216 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 22:42:38.338194 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://666f81c81a7cad67c01b2aebf2d05f162c33feea27f64c12f92761d6d2a6056e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055bef75732eec4908f0baf8a52cd3921fab03cf88e36d2e0d442344fb1af0c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://055bef75732eec4908f0baf8a52cd3921fab03cf88e36d2e0d442344fb1af0c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:45Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:45 crc kubenswrapper[4696]: I1202 22:42:45.984072 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:45Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:46 crc kubenswrapper[4696]: I1202 22:42:46.002948 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:45Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:46 crc kubenswrapper[4696]: I1202 22:42:46.017723 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:46Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:46 crc kubenswrapper[4696]: I1202 22:42:46.037979 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:46 crc kubenswrapper[4696]: I1202 22:42:46.038027 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:46 crc kubenswrapper[4696]: I1202 22:42:46.038040 4696 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:46 crc kubenswrapper[4696]: I1202 22:42:46.038063 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:46 crc kubenswrapper[4696]: I1202 22:42:46.038077 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:46Z","lastTransitionTime":"2025-12-02T22:42:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:42:46 crc kubenswrapper[4696]: I1202 22:42:46.038411 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://492cfa3fb8646c09da06d92ec8a303562da558d4d944eaf68ca95de6f5b94da8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://492cfa3fb8646c09da06d92ec8a303562da558d4d944eaf68ca95de6f5b94da8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qb2zq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:46Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:46 crc kubenswrapper[4696]: I1202 22:42:46.053592 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rgk7n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07db09ff-2489-4357-af38-aca9655ac1d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9928cbc6ed1d9bb5d0280a8d3264d4490c601fc7d09d6f293f1403615cc4cde5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdd8x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rgk7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:46Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:46 crc kubenswrapper[4696]: I1202 22:42:46.074967 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05940584-6baa-49e1-8959-610d50b6eeb5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b361514e74dd950d7c3e1da82304b7c75603829ce7bf03c1fbccab1607d65800\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67bf899df90f18f36310994af135552aa9758f0460117d62d00fdc1de067ce79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d15263c6c1f8208a998cbacdba376a2eb3b2e16134e8fc40f7d56d36daae5ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2fa27b4eeb8b50da5d49c905eb98070362e998c3e3e4aceac7cc09aaef5b5ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:46Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:46 crc kubenswrapper[4696]: I1202 22:42:46.097779 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sbjst" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a0d758e-6da6-4382-99f1-dd295b63eb98\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://495f007b6af2d409e3ed6cf70dfa90aec5a38c34361c9da9c4fd71d7732cdd76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://495f007b6af2d409e3ed6cf70dfa90aec5a38c34361c9da9c4fd71d7732cdd76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5b3b6699e2f0b9a8d0f32c1f97c9b3e16d9f5479ad2f355633c25897dbe1bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5b3b6699e2f0b9a8d0f32c1f97c9b3e16d9f5479ad2f355633c25897dbe1bb1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d4a10437554cfb08f90a48d457ef157309ed0ab379c5b26deb62d3e8b73291b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d4a10437554cfb08f90a48d457ef157309ed0ab379c5b26deb62d3e8b73291b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76b52b0824f65df456ead0ca134ada0a6c3120382a3cce18f0a5dfdcc98fa9ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76b52b0824f65df456ead0ca134ada0a6c3120382a3cce18f0a5dfdcc98fa9ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d38fb242bd536b1359095c21c4df58b5a4c64058cfe986b38a581ae0c40a29d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d38fb242bd536b1359095c21c4df58b5a4c64058cfe986b38a581ae0c40a29d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sbjst\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:46Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:46 crc kubenswrapper[4696]: I1202 22:42:46.117841 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-chq65" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53353260-c7c9-435c-91eb-3d5a1b441c4a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2d1f154b9c9139bf991adf86bd99e7ff14510c24ae93b87c1c184437a087f17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7tfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa016430a0a36d0628e0e5b0c4baf2fa724c8881
16e5d70517c8ffc4be1c37a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7tfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-chq65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:46Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:46 crc kubenswrapper[4696]: I1202 22:42:46.140907 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:46 crc kubenswrapper[4696]: I1202 22:42:46.140941 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:46 crc kubenswrapper[4696]: I1202 22:42:46.140952 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:46 crc 
kubenswrapper[4696]: I1202 22:42:46.140972 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:46 crc kubenswrapper[4696]: I1202 22:42:46.140986 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:46Z","lastTransitionTime":"2025-12-02T22:42:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:42:46 crc kubenswrapper[4696]: I1202 22:42:46.208561 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 22:42:46 crc kubenswrapper[4696]: E1202 22:42:46.208897 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 22:42:54.208849629 +0000 UTC m=+37.089529640 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:42:46 crc kubenswrapper[4696]: I1202 22:42:46.244299 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:46 crc kubenswrapper[4696]: I1202 22:42:46.244363 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:46 crc kubenswrapper[4696]: I1202 22:42:46.244383 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:46 crc kubenswrapper[4696]: I1202 22:42:46.244411 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:46 crc kubenswrapper[4696]: I1202 22:42:46.244438 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:46Z","lastTransitionTime":"2025-12-02T22:42:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:46 crc kubenswrapper[4696]: I1202 22:42:46.310260 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:42:46 crc kubenswrapper[4696]: I1202 22:42:46.310350 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:42:46 crc kubenswrapper[4696]: I1202 22:42:46.310411 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:42:46 crc kubenswrapper[4696]: I1202 22:42:46.310461 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:42:46 crc kubenswrapper[4696]: E1202 22:42:46.310584 4696 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" 
not registered Dec 02 22:42:46 crc kubenswrapper[4696]: E1202 22:42:46.310644 4696 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 22:42:46 crc kubenswrapper[4696]: E1202 22:42:46.310661 4696 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 22:42:46 crc kubenswrapper[4696]: E1202 22:42:46.310693 4696 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 02 22:42:46 crc kubenswrapper[4696]: E1202 22:42:46.310774 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-02 22:42:54.310727326 +0000 UTC m=+37.191407527 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 22:42:46 crc kubenswrapper[4696]: E1202 22:42:46.310786 4696 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 02 22:42:46 crc kubenswrapper[4696]: E1202 22:42:46.310838 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 22:42:54.310802358 +0000 UTC m=+37.191482529 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 02 22:42:46 crc kubenswrapper[4696]: E1202 22:42:46.310843 4696 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 02 22:42:46 crc kubenswrapper[4696]: E1202 22:42:46.310918 4696 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 22:42:46 crc kubenswrapper[4696]: E1202 22:42:46.310942 4696 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 22:42:46 crc kubenswrapper[4696]: E1202 22:42:46.310994 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 22:42:54.310950392 +0000 UTC m=+37.191630543 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 02 22:42:46 crc kubenswrapper[4696]: E1202 22:42:46.311044 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-02 22:42:54.311013724 +0000 UTC m=+37.191693925 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 22:42:46 crc kubenswrapper[4696]: I1202 22:42:46.350452 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:46 crc kubenswrapper[4696]: I1202 22:42:46.350508 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:46 crc kubenswrapper[4696]: I1202 22:42:46.350520 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:46 crc kubenswrapper[4696]: I1202 22:42:46.350543 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:46 crc kubenswrapper[4696]: I1202 22:42:46.350556 4696 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:46Z","lastTransitionTime":"2025-12-02T22:42:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:42:46 crc kubenswrapper[4696]: I1202 22:42:46.431301 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:42:46 crc kubenswrapper[4696]: I1202 22:42:46.431420 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:42:46 crc kubenswrapper[4696]: I1202 22:42:46.431515 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:42:46 crc kubenswrapper[4696]: E1202 22:42:46.431647 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 22:42:46 crc kubenswrapper[4696]: E1202 22:42:46.431851 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 22:42:46 crc kubenswrapper[4696]: E1202 22:42:46.431988 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 22:42:46 crc kubenswrapper[4696]: I1202 22:42:46.454152 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:46 crc kubenswrapper[4696]: I1202 22:42:46.454207 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:46 crc kubenswrapper[4696]: I1202 22:42:46.454221 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:46 crc kubenswrapper[4696]: I1202 22:42:46.454247 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:46 crc kubenswrapper[4696]: I1202 22:42:46.454264 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:46Z","lastTransitionTime":"2025-12-02T22:42:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:46 crc kubenswrapper[4696]: I1202 22:42:46.557934 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:46 crc kubenswrapper[4696]: I1202 22:42:46.557989 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:46 crc kubenswrapper[4696]: I1202 22:42:46.558001 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:46 crc kubenswrapper[4696]: I1202 22:42:46.558026 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:46 crc kubenswrapper[4696]: I1202 22:42:46.558041 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:46Z","lastTransitionTime":"2025-12-02T22:42:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:46 crc kubenswrapper[4696]: I1202 22:42:46.662143 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:46 crc kubenswrapper[4696]: I1202 22:42:46.662223 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:46 crc kubenswrapper[4696]: I1202 22:42:46.662247 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:46 crc kubenswrapper[4696]: I1202 22:42:46.662275 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:46 crc kubenswrapper[4696]: I1202 22:42:46.662293 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:46Z","lastTransitionTime":"2025-12-02T22:42:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:46 crc kubenswrapper[4696]: I1202 22:42:46.766198 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:46 crc kubenswrapper[4696]: I1202 22:42:46.766277 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:46 crc kubenswrapper[4696]: I1202 22:42:46.766301 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:46 crc kubenswrapper[4696]: I1202 22:42:46.766333 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:46 crc kubenswrapper[4696]: I1202 22:42:46.766356 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:46Z","lastTransitionTime":"2025-12-02T22:42:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:46 crc kubenswrapper[4696]: I1202 22:42:46.848908 4696 generic.go:334] "Generic (PLEG): container finished" podID="5a0d758e-6da6-4382-99f1-dd295b63eb98" containerID="f17767f87136d0916230872e59379331bb9e7f88f3648115905c6c0172f257cd" exitCode=0 Dec 02 22:42:46 crc kubenswrapper[4696]: I1202 22:42:46.848989 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sbjst" event={"ID":"5a0d758e-6da6-4382-99f1-dd295b63eb98","Type":"ContainerDied","Data":"f17767f87136d0916230872e59379331bb9e7f88f3648115905c6c0172f257cd"} Dec 02 22:42:46 crc kubenswrapper[4696]: I1202 22:42:46.869830 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:46 crc kubenswrapper[4696]: I1202 22:42:46.869894 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:46 crc kubenswrapper[4696]: I1202 22:42:46.869912 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:46 crc kubenswrapper[4696]: I1202 22:42:46.869939 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:46 crc kubenswrapper[4696]: I1202 22:42:46.869957 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:46Z","lastTransitionTime":"2025-12-02T22:42:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:46 crc kubenswrapper[4696]: I1202 22:42:46.882508 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sbjst" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a0d758e-6da6-4382-99f1-dd295b63eb98\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://495f007b6af2d409e3ed6cf70dfa90aec5a38c34361c9da9c4fd71d7732cdd76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://495f007b6af2d409e3ed6cf70dfa90aec5a38c34361c9da9c4fd71d7732cdd76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5b3b6699e2f0b9a8d0f32c1f97c9b3e16d9f5479ad2f355633c25897dbe1bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5b3b6699e2f0b9a8d0f32c1f97c9b3e16d9f5479ad2f355633c25897dbe1bb1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d4a10437554cfb08f90a48d457ef157309ed0ab379c5b26deb62d3e8b73291b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d4a10437554cfb08f90a48d457ef157309ed0ab379c5b26deb62d3e8b73291b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76b52b0824f65df456ead0ca134ada0a6c3120382a3cce18f0a5dfdcc98fa9ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76b52b0824f65df456ead0ca134ada0a6c3120382a3cce18f0a5dfdcc98fa9ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d38fb242bd536b1359095c21c4df58b5a4c64058cfe986b38a581ae0c40a29d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d38fb242bd536b1359095c21c4df58b5a4c64058cfe986b38a581ae0c40a29d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f17767f87136d0916230872e59379331bb9e7f88f3648115905c6c0172f257cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f17767f87136d0916230872e59379331bb9e7f88f3648115905c6c0172f257cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sbjst\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:46Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:46 crc kubenswrapper[4696]: I1202 22:42:46.903113 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-chq65" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53353260-c7c9-435c-91eb-3d5a1b441c4a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2d1f154b9c9139bf991adf86bd99e7ff14510c24ae93b87c1c184437a087f17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7tfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa016430a0a36d0628e0e5b0c4baf2fa724c8881
16e5d70517c8ffc4be1c37a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7tfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-chq65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:46Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:46 crc kubenswrapper[4696]: I1202 22:42:46.922138 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rgk7n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07db09ff-2489-4357-af38-aca9655ac1d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9928cbc6ed1d9bb5d0280a8d3264d4490c601fc7d09d6f293f1403615cc4cde5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdd8x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rgk7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:46Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:46 crc kubenswrapper[4696]: I1202 22:42:46.944978 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05940584-6baa-49e1-8959-610d50b6eeb5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b361514e74dd950d7c3e1da82304b7c75603829ce7bf03c1fbccab1607d65800\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67bf899df90f18f36310994af135552aa9758f0460117d62d00fdc1de067ce79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d15263c6c1f8208a998cbacdba376a2eb3b2e16134e8fc40f7d56d36daae5ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2fa27b4eeb8b50da5d49c905eb98070362e998c3e3e4aceac7cc09aaef5b5ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:46Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:46 crc kubenswrapper[4696]: I1202 22:42:46.967280 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27fb05354e4e9a6ad09946fad7dbabf07456fe56a9a66578fda0eb71dde2d199\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T22:42:46Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:46 crc kubenswrapper[4696]: I1202 22:42:46.973368 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:46 crc kubenswrapper[4696]: I1202 22:42:46.973497 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:46 crc kubenswrapper[4696]: I1202 22:42:46.973522 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:46 crc kubenswrapper[4696]: I1202 22:42:46.973553 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:46 crc kubenswrapper[4696]: I1202 22:42:46.973580 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:46Z","lastTransitionTime":"2025-12-02T22:42:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:46 crc kubenswrapper[4696]: I1202 22:42:46.993150 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wthxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86a37d2a-37c5-4fbd-b10b-f5e4706772f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5ddd341a3a0b7b1d959ce57bb4d38153a4ca3ae5ec40ded309e0697db2be28d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xqg5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wthxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:46Z 
is after 2025-08-24T17:21:41Z" Dec 02 22:42:47 crc kubenswrapper[4696]: I1202 22:42:47.013349 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1db9cf23af7fc9ff1373908e11df0bea7d6e03cd3ed18947cc583fc09a79a05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:47Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:47 crc kubenswrapper[4696]: I1202 22:42:47.031702 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce2fa3867298545942c4a2d2e68899c61a4f99a64b0974e75ef86851074ddfce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-id
entity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4db552c4bc9b395f977cb4c1b1b7fcbb1c7e9f326d81a7903d88eab180bcfb3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:47Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:47 crc kubenswrapper[4696]: I1202 22:42:47.044790 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:47Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:47 crc kubenswrapper[4696]: I1202 22:42:47.058119 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f57qk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf8ea236-9be9-4eb2-904a-103c4c279f28\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3106b29a0eb0a17827fd331aefd6c00b0c58f4ec0d0ca778bd3f4842730d5f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vxprw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f57qk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:47Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:47 crc kubenswrapper[4696]: I1202 22:42:47.078246 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:47 crc kubenswrapper[4696]: I1202 22:42:47.078400 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:47 crc kubenswrapper[4696]: I1202 22:42:47.078411 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:47 crc kubenswrapper[4696]: I1202 22:42:47.078428 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:47 crc kubenswrapper[4696]: I1202 22:42:47.078439 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:47Z","lastTransitionTime":"2025-12-02T22:42:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:47 crc kubenswrapper[4696]: I1202 22:42:47.080421 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3172221-5c4c-4409-b50f-5cd21d8b5030\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f25e7ebcca1d904cd0940a660fd97c6b4a3540fd63700abd0ad973e9f9c8f764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08fc65f2c8162f27bc40cd12b5801ca21c778313271f2cb5dd251ca4312842d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba390cca9997cf469ae88a4895532df556f0b947ce7b34ef7e1992ee236a2e16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0ccd6eaff0fa7ac496aadc31afffd6e7fee2776e9cfd1652af85fcec268d895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b2deed2650220f1d607cbb667b26879d3a8373146244feb486c882032f344b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a565cb3800c1a50f784c327719158de2f6f29d20a33670478b97c1c360e0f6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a565cb3800c1a50f784c327719158de2f6f29d20a33670478b97c1c360e0f6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29a2b9cc76fc45b6874d85b2ab92165a02804807837501ac60190c232bab928d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29a2b9cc76fc45b6874d85b2ab92165a02804807837501ac60190c232bab928d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fb3311ccf2a924605464d657193627798dad26a232fab0a2091fb2c387f3bd0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb3311ccf2a924605464d657193627798dad26a232fab0a2091fb2c387f3bd0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:47Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:47 crc kubenswrapper[4696]: I1202 22:42:47.098925 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"158d80a5-8d90-46bb-83e7-ab3263f6d28f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3112751f86a62634d02f281d467d3eeb27878e27c245871210cca37eaeb6e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce744ea765d5bbcb0a9926f439a0256774c16863344f23ddfb0074b4ccbdb4ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://1108d710330b056e2228205b2012df070ffbd660701ddd577eafbf0e99c3bf9f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d92ce9d53d9367503c76c5fbb45f340a5ddd93f3c8e1738ac434fe7b0210d1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6268352b995d99d383ab500b5ccd80b0612cf33716a752aeb1d92cb775a1971e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"le observer\\\\nW1202 22:42:37.965183 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 22:42:37.965319 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 22:42:37.967400 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2369937012/tls.crt::/tmp/serving-cert-2369937012/tls.key\\\\\\\"\\\\nI1202 22:42:38.329833 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 22:42:38.331704 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 22:42:38.331724 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 22:42:38.331766 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 22:42:38.331782 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 22:42:38.336692 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 22:42:38.336734 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:42:38.336761 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:42:38.336769 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 22:42:38.336775 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 22:42:38.336779 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 22:42:38.336783 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 22:42:38.337216 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 22:42:38.338194 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://666f81c81a7cad67c01b2aebf2d05f162c33feea27f64c12f92761d6d2a6056e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055bef75732eec4908f0baf8a52cd3921fab03cf88e36d2e0d442344fb1af0c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://055bef75732eec4908f0baf8a52cd3921fab03cf88e36d2e0d442344fb1af0c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:47Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:47 crc kubenswrapper[4696]: I1202 22:42:47.127180 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://492cfa3fb8646c09da06d92ec8a303562da558d4d944eaf68ca95de6f5b94da8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://492cfa3fb8646c09da06d92ec8a303562da558d4d944eaf68ca95de6f5b94da8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qb2zq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:47Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:47 crc kubenswrapper[4696]: I1202 22:42:47.144419 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready 
status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:47Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:47 crc kubenswrapper[4696]: I1202 22:42:47.163178 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:47Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:47 crc kubenswrapper[4696]: I1202 22:42:47.181595 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:47 crc kubenswrapper[4696]: I1202 22:42:47.181631 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:47 crc kubenswrapper[4696]: I1202 22:42:47.181639 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:47 crc kubenswrapper[4696]: I1202 22:42:47.181655 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:47 crc kubenswrapper[4696]: I1202 22:42:47.181667 4696 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:47Z","lastTransitionTime":"2025-12-02T22:42:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:42:47 crc kubenswrapper[4696]: I1202 22:42:47.280160 4696 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Dec 02 22:42:47 crc kubenswrapper[4696]: I1202 22:42:47.318611 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:47 crc kubenswrapper[4696]: I1202 22:42:47.318693 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:47 crc kubenswrapper[4696]: I1202 22:42:47.318718 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:47 crc kubenswrapper[4696]: I1202 22:42:47.318790 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:47 crc kubenswrapper[4696]: I1202 22:42:47.318818 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:47Z","lastTransitionTime":"2025-12-02T22:42:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:47 crc kubenswrapper[4696]: I1202 22:42:47.421601 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:47 crc kubenswrapper[4696]: I1202 22:42:47.421633 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:47 crc kubenswrapper[4696]: I1202 22:42:47.421643 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:47 crc kubenswrapper[4696]: I1202 22:42:47.421658 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:47 crc kubenswrapper[4696]: I1202 22:42:47.421668 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:47Z","lastTransitionTime":"2025-12-02T22:42:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:47 crc kubenswrapper[4696]: I1202 22:42:47.446287 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05940584-6baa-49e1-8959-610d50b6eeb5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b361514e74dd950d7c3e1da82304b7c75603829ce7bf03c1fbccab1607d65800\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67bf899df90
f18f36310994af135552aa9758f0460117d62d00fdc1de067ce79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d15263c6c1f8208a998cbacdba376a2eb3b2e16134e8fc40f7d56d36daae5ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2fa27b4eeb8b50da5d49c905eb98070362e998c3e3e4aceac7cc09aaef5b5ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:47Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:47 crc kubenswrapper[4696]: I1202 22:42:47.465332 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sbjst" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a0d758e-6da6-4382-99f1-dd295b63eb98\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://495f007b6af2d409e3ed6cf70dfa90aec5a38c34361c9da9c4fd71d7732cdd76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://495f007b6af2d409e3ed6cf70dfa90aec5a38c34361c9da9c4fd71d7732cdd76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5b3b6699e2f0b9a8d0f32c1f97c9b3e16d9f5479ad2f355633c25897dbe1bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5b3b6699e2f0b9a8d0f32c1f97c9b3e16d9f5479ad2f355633c25897dbe1bb1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d4a10437554cfb08f90a48d457ef157309ed0ab379c5b26deb62d3e8b73291b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d4a10437554cfb08f90a48d457ef157309ed0ab379c5b26deb62d3e8b73291b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76b52b0824f65df456ead0ca134ada0a6c3120382a3cce18f0a5dfdcc98fa9ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76b52b0824f65df456ead0ca134ada0a6c3120382a3cce18f0a5dfdcc98fa9ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d38fb242bd536b1359095c21c4df58b5a4c64058cfe986b38a581ae0c40a29d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d38fb242bd536b1359095c21c4df58b5a4c64058cfe986b38a581ae0c40a29d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f17767f87136d0916230872e59379331bb9e7f88f3648115905c6c0172f257cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f17767f87136d0916230872e59379331bb9e7f88f3648115905c6c0172f257cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sbjst\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:47Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:47 crc kubenswrapper[4696]: I1202 22:42:47.478719 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-chq65" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53353260-c7c9-435c-91eb-3d5a1b441c4a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2d1f154b9c9139bf991adf86bd99e7ff14510c24ae93b87c1c184437a087f17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7tfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa016430a0a36d0628e0e5b0c4baf2fa724c8881
16e5d70517c8ffc4be1c37a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7tfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-chq65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:47Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:47 crc kubenswrapper[4696]: I1202 22:42:47.489295 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rgk7n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07db09ff-2489-4357-af38-aca9655ac1d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9928cbc6ed1d9bb5d0280a8d3264d4490c601fc7d09d6f293f1403615cc4cde5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdd8x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rgk7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:47Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:47 crc kubenswrapper[4696]: I1202 22:42:47.502065 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1db9cf23af7fc9ff1373908e11df0bea7d6e03cd3ed18947cc583fc09a79a05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:47Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:47 crc kubenswrapper[4696]: I1202 22:42:47.514625 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce2fa3867298545942c4a2d2e68899c61a4f99a64b0974e75ef86851074ddfce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4db552c4bc9b395f977cb4c1b1b7fcbb1c7e9f326d81a7903d88eab180bcfb3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:47Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:47 crc kubenswrapper[4696]: I1202 22:42:47.524937 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:47 crc kubenswrapper[4696]: I1202 22:42:47.524986 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:47 crc kubenswrapper[4696]: I1202 22:42:47.525000 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:47 crc kubenswrapper[4696]: I1202 22:42:47.525022 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:47 crc kubenswrapper[4696]: I1202 22:42:47.525036 4696 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:47Z","lastTransitionTime":"2025-12-02T22:42:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:42:47 crc kubenswrapper[4696]: I1202 22:42:47.525607 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27fb05354e4e9a6ad09946fad7dbabf07456fe56a9a66578fda0eb71dde2d199\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:47Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:47 crc kubenswrapper[4696]: I1202 22:42:47.538027 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wthxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86a37d2a-37c5-4fbd-b10b-f5e4706772f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5ddd341a3a0b7b1d959ce57bb4d38153a4ca3ae5ec40ded309e0697db2be28d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xqg5z\\\",\\\"readOnly\\\":true
,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wthxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:47Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:47 crc kubenswrapper[4696]: I1202 22:42:47.557959 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3172221-5c4c-4409-b50f-5cd21d8b5030\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f25e7ebcca1d904cd0940a660fd97c6b4a3540fd63700abd0ad973e9f9c8f764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\
":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08fc65f2c8162f27bc40cd12b5801ca21c778313271f2cb5dd251ca4312842d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba390cca9997cf469ae88a4895532df556f0b947ce7b34ef7e1992ee236a2e16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-re
adyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0ccd6eaff0fa7ac496aadc31afffd6e7fee2776e9cfd1652af85fcec268d895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b2deed2650220f1d607cbb667b26879d3a8373146244feb486c882032f344b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mount
Path\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a565cb3800c1a50f784c327719158de2f6f29d20a33670478b97c1c360e0f6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a565cb3800c1a50f784c327719158de2f6f29d20a33670478b97c1c360e0f6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29a2b9cc76fc45b6874d85b2ab92165a02804807837501ac60190c232bab928d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29a2b9cc76fc45b6874d85b2ab92165a02804807837501ac60190c232bab928d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fb3311ccf2a924605464d657193627798dad26a232fab0a2091fb2c387f3bd0b\\\",\\\"image\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb3311ccf2a924605464d657193627798dad26a232fab0a2091fb2c387f3bd0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:47Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:47 crc kubenswrapper[4696]: I1202 22:42:47.576805 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"158d80a5-8d90-46bb-83e7-ab3263f6d28f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3112751f86a62634d02f281d467d3eeb27878e27c245871210cca37eaeb6e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce744ea765d5bbcb0a9926f439a0256774c16863344f23ddfb0074b4ccbdb4ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1108d710330b056e2228205b2012df070ffbd660701ddd577eafbf0e99c3bf9f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d92ce9d53d9367503c76c5fbb45f340a5ddd93f3c8e1738ac434fe7b0210d1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6268352b995d99d383ab500b5ccd80b0612cf33716a752aeb1d92cb775a1971e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"le observer\\\\nW1202 22:42:37.965183 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 22:42:37.965319 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 22:42:37.967400 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2369937012/tls.crt::/tmp/serving-cert-2369937012/tls.key\\\\\\\"\\\\nI1202 22:42:38.329833 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 22:42:38.331704 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 22:42:38.331724 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 22:42:38.331766 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 22:42:38.331782 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 22:42:38.336692 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 22:42:38.336734 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:42:38.336761 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:42:38.336769 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 22:42:38.336775 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 22:42:38.336779 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 22:42:38.336783 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 22:42:38.337216 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 22:42:38.338194 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://666f81c81a7cad67c01b2aebf2d05f162c33feea27f64c12f92761d6d2a6056e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055bef75732eec4908f0baf8a52cd3921fab03cf88e36d2e0d442344fb1af0c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://055bef75732eec4908f0baf8a52cd3921fab03cf88e36d2e0d442344fb1af0c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:47Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:47 crc kubenswrapper[4696]: I1202 22:42:47.592469 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:47Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:47 crc kubenswrapper[4696]: I1202 22:42:47.604406 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f57qk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf8ea236-9be9-4eb2-904a-103c4c279f28\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3106b29a0eb0a17827fd331aefd6c00b0c58f4ec0d0ca778bd3f4842730d5f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vxprw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f57qk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:47Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:47 crc kubenswrapper[4696]: I1202 22:42:47.616118 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:47Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:47 crc kubenswrapper[4696]: I1202 22:42:47.627724 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:47 crc kubenswrapper[4696]: I1202 22:42:47.627760 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:47Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:47 crc kubenswrapper[4696]: I1202 22:42:47.627809 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:47 crc kubenswrapper[4696]: I1202 
22:42:47.627822 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:47 crc kubenswrapper[4696]: I1202 22:42:47.627840 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:47 crc kubenswrapper[4696]: I1202 22:42:47.628198 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:47Z","lastTransitionTime":"2025-12-02T22:42:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:42:47 crc kubenswrapper[4696]: I1202 22:42:47.645320 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://492cfa3fb8646c09da06d92ec8a303562da558d4d944eaf68ca95de6f5b94da8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://492cfa3fb8646c09da06d92ec8a303562da558d4d944eaf68ca95de6f5b94da8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qb2zq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:47Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:47 crc kubenswrapper[4696]: I1202 22:42:47.731135 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:47 crc kubenswrapper[4696]: I1202 22:42:47.731201 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:47 crc kubenswrapper[4696]: I1202 22:42:47.731218 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:47 crc kubenswrapper[4696]: I1202 22:42:47.731245 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:47 crc kubenswrapper[4696]: I1202 22:42:47.731272 4696 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:47Z","lastTransitionTime":"2025-12-02T22:42:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:42:47 crc kubenswrapper[4696]: I1202 22:42:47.834820 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:47 crc kubenswrapper[4696]: I1202 22:42:47.834886 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:47 crc kubenswrapper[4696]: I1202 22:42:47.834901 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:47 crc kubenswrapper[4696]: I1202 22:42:47.834927 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:47 crc kubenswrapper[4696]: I1202 22:42:47.834940 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:47Z","lastTransitionTime":"2025-12-02T22:42:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:47 crc kubenswrapper[4696]: I1202 22:42:47.938701 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:47 crc kubenswrapper[4696]: I1202 22:42:47.939836 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:47 crc kubenswrapper[4696]: I1202 22:42:47.940333 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:47 crc kubenswrapper[4696]: I1202 22:42:47.940587 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:47 crc kubenswrapper[4696]: I1202 22:42:47.940815 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:47Z","lastTransitionTime":"2025-12-02T22:42:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:48 crc kubenswrapper[4696]: I1202 22:42:48.043689 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:48 crc kubenswrapper[4696]: I1202 22:42:48.043715 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:48 crc kubenswrapper[4696]: I1202 22:42:48.043723 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:48 crc kubenswrapper[4696]: I1202 22:42:48.043774 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:48 crc kubenswrapper[4696]: I1202 22:42:48.043786 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:48Z","lastTransitionTime":"2025-12-02T22:42:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:48 crc kubenswrapper[4696]: I1202 22:42:48.146426 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:48 crc kubenswrapper[4696]: I1202 22:42:48.146468 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:48 crc kubenswrapper[4696]: I1202 22:42:48.146477 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:48 crc kubenswrapper[4696]: I1202 22:42:48.146494 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:48 crc kubenswrapper[4696]: I1202 22:42:48.146505 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:48Z","lastTransitionTime":"2025-12-02T22:42:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:48 crc kubenswrapper[4696]: I1202 22:42:48.249838 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:48 crc kubenswrapper[4696]: I1202 22:42:48.249898 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:48 crc kubenswrapper[4696]: I1202 22:42:48.249910 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:48 crc kubenswrapper[4696]: I1202 22:42:48.249931 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:48 crc kubenswrapper[4696]: I1202 22:42:48.249946 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:48Z","lastTransitionTime":"2025-12-02T22:42:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:48 crc kubenswrapper[4696]: I1202 22:42:48.353918 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:48 crc kubenswrapper[4696]: I1202 22:42:48.353994 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:48 crc kubenswrapper[4696]: I1202 22:42:48.354006 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:48 crc kubenswrapper[4696]: I1202 22:42:48.354027 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:48 crc kubenswrapper[4696]: I1202 22:42:48.354039 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:48Z","lastTransitionTime":"2025-12-02T22:42:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:42:48 crc kubenswrapper[4696]: I1202 22:42:48.430669 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:42:48 crc kubenswrapper[4696]: E1202 22:42:48.430859 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 22:42:48 crc kubenswrapper[4696]: I1202 22:42:48.431275 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:42:48 crc kubenswrapper[4696]: E1202 22:42:48.431369 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 22:42:48 crc kubenswrapper[4696]: I1202 22:42:48.431608 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:42:48 crc kubenswrapper[4696]: E1202 22:42:48.431869 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 22:42:48 crc kubenswrapper[4696]: I1202 22:42:48.456796 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:48 crc kubenswrapper[4696]: I1202 22:42:48.456842 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:48 crc kubenswrapper[4696]: I1202 22:42:48.456852 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:48 crc kubenswrapper[4696]: I1202 22:42:48.456870 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:48 crc kubenswrapper[4696]: I1202 22:42:48.456882 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:48Z","lastTransitionTime":"2025-12-02T22:42:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:48 crc kubenswrapper[4696]: I1202 22:42:48.559793 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:48 crc kubenswrapper[4696]: I1202 22:42:48.559837 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:48 crc kubenswrapper[4696]: I1202 22:42:48.559846 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:48 crc kubenswrapper[4696]: I1202 22:42:48.559864 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:48 crc kubenswrapper[4696]: I1202 22:42:48.559874 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:48Z","lastTransitionTime":"2025-12-02T22:42:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:48 crc kubenswrapper[4696]: I1202 22:42:48.662939 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:48 crc kubenswrapper[4696]: I1202 22:42:48.662990 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:48 crc kubenswrapper[4696]: I1202 22:42:48.663001 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:48 crc kubenswrapper[4696]: I1202 22:42:48.663023 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:48 crc kubenswrapper[4696]: I1202 22:42:48.663035 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:48Z","lastTransitionTime":"2025-12-02T22:42:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:48 crc kubenswrapper[4696]: I1202 22:42:48.765681 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:48 crc kubenswrapper[4696]: I1202 22:42:48.765734 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:48 crc kubenswrapper[4696]: I1202 22:42:48.765768 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:48 crc kubenswrapper[4696]: I1202 22:42:48.765829 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:48 crc kubenswrapper[4696]: I1202 22:42:48.765844 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:48Z","lastTransitionTime":"2025-12-02T22:42:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:48 crc kubenswrapper[4696]: I1202 22:42:48.863214 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sbjst" event={"ID":"5a0d758e-6da6-4382-99f1-dd295b63eb98","Type":"ContainerStarted","Data":"5bfe4ce91e801752a3eae0d5cfdb5d0e7a4997952987a3a32c10c0292db170f3"} Dec 02 22:42:48 crc kubenswrapper[4696]: I1202 22:42:48.868908 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:48 crc kubenswrapper[4696]: I1202 22:42:48.868962 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:48 crc kubenswrapper[4696]: I1202 22:42:48.868982 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:48 crc kubenswrapper[4696]: I1202 22:42:48.869034 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:48 crc kubenswrapper[4696]: I1202 22:42:48.869057 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:48Z","lastTransitionTime":"2025-12-02T22:42:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:48 crc kubenswrapper[4696]: I1202 22:42:48.871483 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" event={"ID":"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b","Type":"ContainerStarted","Data":"bcbeb403dee928a5b37ed7662c4999e027b9b9fed6b8cb704c8e41b5c68c585c"} Dec 02 22:42:48 crc kubenswrapper[4696]: I1202 22:42:48.872263 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" Dec 02 22:42:48 crc kubenswrapper[4696]: I1202 22:42:48.872797 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" Dec 02 22:42:48 crc kubenswrapper[4696]: I1202 22:42:48.886554 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:48Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:48 crc kubenswrapper[4696]: I1202 22:42:48.951256 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" Dec 02 22:42:48 crc kubenswrapper[4696]: I1202 22:42:48.951349 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" Dec 02 22:42:48 crc kubenswrapper[4696]: I1202 22:42:48.954640 4696 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:48Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:48 crc kubenswrapper[4696]: I1202 22:42:48.971859 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:48 crc kubenswrapper[4696]: I1202 22:42:48.971895 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:48 crc kubenswrapper[4696]: I1202 22:42:48.971908 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:48 crc kubenswrapper[4696]: I1202 22:42:48.971924 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:48 crc kubenswrapper[4696]: I1202 22:42:48.971946 4696 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:48Z","lastTransitionTime":"2025-12-02T22:42:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:42:48 crc kubenswrapper[4696]: I1202 22:42:48.978436 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://492cfa3fb8646c09da06d92ec8a303562da558d4d944eaf68ca95de6f5b94da8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://492cfa3fb8646c09da06d92ec8a303562da558d4d944eaf68ca95de6f5b94da8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qb2zq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:48Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:48 crc kubenswrapper[4696]: I1202 22:42:48.993112 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05940584-6baa-49e1-8959-610d50b6eeb5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b361514e74dd950d7c3e1da82304b7c75603829ce7bf03c1fbccab1607d65800\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a7
9379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67bf899df90f18f36310994af135552aa9758f0460117d62d00fdc1de067ce79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d15263c6c1f8208a998cbacdba376a2eb3b2e16134e8fc40f7d56d36daae5ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2fa27b4eeb8b50da5d49c905eb98070362e998c3e3e4aceac7cc09aaef5b5ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:48Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:49 crc kubenswrapper[4696]: I1202 22:42:49.013421 4696 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-additional-cni-plugins-sbjst" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a0d758e-6da6-4382-99f1-dd295b63eb98\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bfe4ce91e801752a3eae0d5cfdb5d0e7a4997952987a3a32c10c0292db170f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"contai
nerID\\\":\\\"cri-o://495f007b6af2d409e3ed6cf70dfa90aec5a38c34361c9da9c4fd71d7732cdd76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://495f007b6af2d409e3ed6cf70dfa90aec5a38c34361c9da9c4fd71d7732cdd76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5b3b6699e2f0b9a8d0f32c1f97c9b3e16d9f5479ad2f355633c25897dbe1bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5b3b6699e2f0b9a8d0f32c1f97c9b3e16d9f5479ad2f355633c25897dbe1bb1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:42Z\\\
",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d4a10437554cfb08f90a48d457ef157309ed0ab379c5b26deb62d3e8b73291b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d4a10437554cfb08f90a48d457ef157309ed0ab379c5b26deb62d3e8b73291b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76b52b0824f65df456ead0ca134ada0a6c3120382a3cce18f0a5dfdcc98fa9ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76b52b0824f65df456ead0ca134ada0a6c3120382a3cce18f0a5dfdcc98fa9ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d38fb242bd536b1359095c21c4df58b5a4c64058cfe986b38a581ae0c40a29d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d38fb242bd536b1359095c21c4df58b5a4c64058cfe986b38a581ae0c40a29d8\\\",\
\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f17767f87136d0916230872e59379331bb9e7f88f3648115905c6c0172f257cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f17767f87136d0916230872e59379331bb9e7f88f3648115905c6c0172f257cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-sbjst\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:49Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:49 crc kubenswrapper[4696]: I1202 22:42:49.031510 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-chq65" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53353260-c7c9-435c-91eb-3d5a1b441c4a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2d1f154b9c9139bf991adf86bd99e7ff14510c24ae93b87c1c184437a087f17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-pr
oxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7tfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa016430a0a36d0628e0e5b0c4baf2fa724c888116e5d70517c8ffc4be1c37a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7tfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-chq65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-12-02T22:42:49Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:49 crc kubenswrapper[4696]: I1202 22:42:49.045504 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rgk7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07db09ff-2489-4357-af38-aca9655ac1d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9928cbc6ed1d9bb5d0280a8d3264d4490c601fc7d09d6f293f1403615cc4cde5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/sec
rets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdd8x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rgk7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:49Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:49 crc kubenswrapper[4696]: I1202 22:42:49.068380 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1db9cf23af7fc9ff1373908e11df0bea7d6e03cd3ed18947cc583fc09a79a05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:49Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:49 crc kubenswrapper[4696]: I1202 22:42:49.074931 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:49 crc kubenswrapper[4696]: I1202 22:42:49.074996 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:49 crc kubenswrapper[4696]: I1202 22:42:49.075024 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:49 crc kubenswrapper[4696]: I1202 22:42:49.075060 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:49 crc kubenswrapper[4696]: I1202 22:42:49.075087 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:49Z","lastTransitionTime":"2025-12-02T22:42:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:49 crc kubenswrapper[4696]: I1202 22:42:49.091935 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce2fa3867298545942c4a2d2e68899c61a4f99a64b0974e75ef86851074ddfce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4db552c4bc9b395f977cb4c1b1b7fcbb1c7e9f326d81a7903d88eab180bcfb3b\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:49Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:49 crc kubenswrapper[4696]: I1202 22:42:49.110571 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27fb05354e4e9a6ad09946fad7dbabf07456fe56a9a66578fda0eb71dde2d199\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T22:42:49Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:49 crc kubenswrapper[4696]: I1202 22:42:49.131008 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wthxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86a37d2a-37c5-4fbd-b10b-f5e4706772f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5ddd341a3a0b7b1d959ce57bb4d38153a4ca3ae5ec40ded309e0697db2be28d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xqg5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wthxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T22:42:49Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:49 crc kubenswrapper[4696]: I1202 22:42:49.155461 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3172221-5c4c-4409-b50f-5cd21d8b5030\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f25e7ebcca1d904cd0940a660fd97c6b4a3540fd63700abd0ad973e9f9c8f764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08fc65f2c8162f27bc40cd12b5801ca21c778313271f2cb5dd251ca4312842d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba390cca9997cf469ae88a4895532df556f0b947ce7b34ef7e1992ee236a2e16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0ccd6eaff0fa7ac496aadc31afffd6e7fee2776e9cfd1652af85fcec268d895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\
\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b2deed2650220f1d607cbb667b26879d3a8373146244feb486c882032f344b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a565cb3800c1a50f784c327719158de2f6f29d20a33670478b97c1c360e0f6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a565cb3800c1a50f784c327719158de2f6f29d20a33670478b97c1c360e0f6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29a2b9cc76fc45b6874d85b2ab92165a02804807837501ac60190c232bab928d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29a2b9cc76fc45b6874d85b2ab92165a02804807837501ac60190c232bab928d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fb3311ccf2a924605464d657193627798dad26a232fab0a2091fb2c387f3bd0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb3311ccf2a924605464d657193627798dad26a232fab0a2091fb2c387f3bd0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:21Z\\\"
,\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:49Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:49 crc kubenswrapper[4696]: I1202 22:42:49.178518 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:49 crc kubenswrapper[4696]: I1202 22:42:49.178568 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:49 crc kubenswrapper[4696]: I1202 22:42:49.178579 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:49 crc kubenswrapper[4696]: I1202 22:42:49.178599 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:49 crc kubenswrapper[4696]: I1202 22:42:49.178611 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:49Z","lastTransitionTime":"2025-12-02T22:42:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no 
CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:42:49 crc kubenswrapper[4696]: I1202 22:42:49.190404 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"158d80a5-8d90-46bb-83e7-ab3263f6d28f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3112751f86a62634d02f281d467d3eeb27878e27c245871210cca37eaeb6e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce744ea765d5bbcb0a9926f439a0256774c16863344f23ddfb0074b4ccbdb4ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://1108d710330b056e2228205b2012df070ffbd660701ddd577eafbf0e99c3bf9f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d92ce9d53d9367503c76c5fbb45f340a5ddd93f3c8e1738ac434fe7b0210d1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6268352b995d99d383ab500b5ccd80b0612cf33716a752aeb1d92cb775a1971e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"le observer\\\\nW1202 22:42:37.965183 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 22:42:37.965319 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 22:42:37.967400 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2369937012/tls.crt::/tmp/serving-cert-2369937012/tls.key\\\\\\\"\\\\nI1202 22:42:38.329833 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 22:42:38.331704 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 22:42:38.331724 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 22:42:38.331766 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 22:42:38.331782 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 22:42:38.336692 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 22:42:38.336734 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:42:38.336761 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:42:38.336769 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 22:42:38.336775 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 22:42:38.336779 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 22:42:38.336783 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 22:42:38.337216 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 22:42:38.338194 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://666f81c81a7cad67c01b2aebf2d05f162c33feea27f64c12f92761d6d2a6056e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055bef75732eec4908f0baf8a52cd3921fab03cf88e36d2e0d442344fb1af0c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://055bef75732eec4908f0baf8a52cd3921fab03cf88e36d2e0d442344fb1af0c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:49Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:49 crc kubenswrapper[4696]: I1202 22:42:49.214635 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:49Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:49 crc kubenswrapper[4696]: I1202 22:42:49.233601 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f57qk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf8ea236-9be9-4eb2-904a-103c4c279f28\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3106b29a0eb0a17827fd331aefd6c00b0c58f4ec0d0ca778bd3f4842730d5f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vxprw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f57qk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:49Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:49 crc kubenswrapper[4696]: I1202 22:42:49.258193 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1db9cf23af7fc9ff1373908e11df0bea7d6e03cd3ed18947cc583fc09a79a05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:49Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:49 crc kubenswrapper[4696]: I1202 22:42:49.278269 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce2fa3867298545942c4a2d2e68899c61a4f99a64b0974e75ef86851074ddfce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4db552c4bc9b395f977cb4c1b1b7fcbb1c7e9f326d81a7903d88eab180bcfb3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:49Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:49 crc kubenswrapper[4696]: I1202 22:42:49.282143 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:49 crc kubenswrapper[4696]: I1202 22:42:49.282213 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:49 crc kubenswrapper[4696]: I1202 22:42:49.282242 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:49 crc kubenswrapper[4696]: I1202 22:42:49.282284 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:49 crc kubenswrapper[4696]: I1202 22:42:49.282310 4696 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:49Z","lastTransitionTime":"2025-12-02T22:42:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:42:49 crc kubenswrapper[4696]: I1202 22:42:49.299998 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27fb05354e4e9a6ad09946fad7dbabf07456fe56a9a66578fda0eb71dde2d199\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:49Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:49 crc kubenswrapper[4696]: I1202 22:42:49.318954 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wthxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86a37d2a-37c5-4fbd-b10b-f5e4706772f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5ddd341a3a0b7b1d959ce57bb4d38153a4ca3ae5ec40ded309e0697db2be28d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xqg5z\\\",\\\"readOnly\\\":true
,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wthxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:49Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:49 crc kubenswrapper[4696]: I1202 22:42:49.354042 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3172221-5c4c-4409-b50f-5cd21d8b5030\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f25e7ebcca1d904cd0940a660fd97c6b4a3540fd63700abd0ad973e9f9c8f764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\
":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08fc65f2c8162f27bc40cd12b5801ca21c778313271f2cb5dd251ca4312842d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba390cca9997cf469ae88a4895532df556f0b947ce7b34ef7e1992ee236a2e16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-re
adyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0ccd6eaff0fa7ac496aadc31afffd6e7fee2776e9cfd1652af85fcec268d895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b2deed2650220f1d607cbb667b26879d3a8373146244feb486c882032f344b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mount
Path\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a565cb3800c1a50f784c327719158de2f6f29d20a33670478b97c1c360e0f6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a565cb3800c1a50f784c327719158de2f6f29d20a33670478b97c1c360e0f6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29a2b9cc76fc45b6874d85b2ab92165a02804807837501ac60190c232bab928d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29a2b9cc76fc45b6874d85b2ab92165a02804807837501ac60190c232bab928d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fb3311ccf2a924605464d657193627798dad26a232fab0a2091fb2c387f3bd0b\\\",\\\"image\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb3311ccf2a924605464d657193627798dad26a232fab0a2091fb2c387f3bd0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:49Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:49 crc kubenswrapper[4696]: I1202 22:42:49.381672 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"158d80a5-8d90-46bb-83e7-ab3263f6d28f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3112751f86a62634d02f281d467d3eeb27878e27c245871210cca37eaeb6e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce744ea765d5bbcb0a9926f439a0256774c16863344f23ddfb0074b4ccbdb4ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1108d710330b056e2228205b2012df070ffbd660701ddd577eafbf0e99c3bf9f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d92ce9d53d9367503c76c5fbb45f340a5ddd93f3c8e1738ac434fe7b0210d1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6268352b995d99d383ab500b5ccd80b0612cf33716a752aeb1d92cb775a1971e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"le observer\\\\nW1202 22:42:37.965183 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 22:42:37.965319 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 22:42:37.967400 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2369937012/tls.crt::/tmp/serving-cert-2369937012/tls.key\\\\\\\"\\\\nI1202 22:42:38.329833 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 22:42:38.331704 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 22:42:38.331724 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 22:42:38.331766 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 22:42:38.331782 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 22:42:38.336692 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 22:42:38.336734 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:42:38.336761 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:42:38.336769 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 22:42:38.336775 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 22:42:38.336779 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 22:42:38.336783 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 22:42:38.337216 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 22:42:38.338194 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://666f81c81a7cad67c01b2aebf2d05f162c33feea27f64c12f92761d6d2a6056e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055bef75732eec4908f0baf8a52cd3921fab03cf88e36d2e0d442344fb1af0c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://055bef75732eec4908f0baf8a52cd3921fab03cf88e36d2e0d442344fb1af0c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:49Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:49 crc kubenswrapper[4696]: I1202 22:42:49.385197 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:49 crc kubenswrapper[4696]: I1202 22:42:49.385252 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:49 crc kubenswrapper[4696]: I1202 22:42:49.385269 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:49 crc kubenswrapper[4696]: I1202 22:42:49.385293 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:49 crc kubenswrapper[4696]: I1202 22:42:49.385307 4696 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:49Z","lastTransitionTime":"2025-12-02T22:42:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:42:49 crc kubenswrapper[4696]: I1202 22:42:49.399568 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:49Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:49 crc kubenswrapper[4696]: I1202 22:42:49.414538 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f57qk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf8ea236-9be9-4eb2-904a-103c4c279f28\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3106b29a0eb0a17827fd331aefd6c00b0c58f4ec0d0ca778bd3f4842730d5f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vxprw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f57qk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:49Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:49 crc kubenswrapper[4696]: I1202 22:42:49.431729 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:49Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:49 crc kubenswrapper[4696]: I1202 22:42:49.446960 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:49Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:49 crc kubenswrapper[4696]: I1202 22:42:49.478111 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36feb7e9f6a6bad97e87fe54ff9b661be62f6f4962352d55bcc905b80320a48d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85fbce2a7ea0550f53c1216216022e60a4e2eae0bb1d5c88254ace025fbd4503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64b0c2025c377ab19b3853017c94cb501f472f2485093afc60749f7739ba25d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22ce112f515d4ef59dca4c21bdc8623e88a8a0df222fb2ed0990a2f68e3f75f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52f3d5529aafac7c27c963a50b0318e99d9a9ca35e142d4c273431105c495e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32717a82edb718f5e690fcecd578f48279d134a5dc12f00824fa25f7ec727156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcbeb403dee928a5b37ed7662c4999e027b9b9fed6b8cb704c8e41b5c68c585c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23b80358b654eec7dc1057ef35a6c09f88ccd7aca7a05336a3f29e7036bd55f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://492cfa3fb8646c09da06d92ec8a303562da558d4d944eaf68ca95de6f5b94da8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://492cfa3fb8646c09da06d92ec8a303562da558d4d944eaf68ca95de6f5b94da8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qb2zq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:49Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:49 crc kubenswrapper[4696]: I1202 22:42:49.488538 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:49 crc kubenswrapper[4696]: I1202 22:42:49.488606 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:49 crc kubenswrapper[4696]: I1202 22:42:49.488621 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:49 crc kubenswrapper[4696]: I1202 22:42:49.488646 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:49 crc kubenswrapper[4696]: I1202 22:42:49.488665 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:49Z","lastTransitionTime":"2025-12-02T22:42:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:49 crc kubenswrapper[4696]: I1202 22:42:49.494981 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05940584-6baa-49e1-8959-610d50b6eeb5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b361514e74dd950d7c3e1da82304b7c75603829ce7bf03c1fbccab1607d65800\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67bf899df90
f18f36310994af135552aa9758f0460117d62d00fdc1de067ce79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d15263c6c1f8208a998cbacdba376a2eb3b2e16134e8fc40f7d56d36daae5ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2fa27b4eeb8b50da5d49c905eb98070362e998c3e3e4aceac7cc09aaef5b5ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:49Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:49 crc kubenswrapper[4696]: I1202 22:42:49.516629 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sbjst" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a0d758e-6da6-4382-99f1-dd295b63eb98\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bfe4ce91e801752a3eae0d5cfdb5d0e7a4997952987a3a32c10c0292db170f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://495f007b6af2d409e3ed6cf70dfa90aec5a38c34361c9da9c4fd71d7732cdd76\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://495f007b6af2d409e3ed6cf70dfa90aec5a38c34361c9da9c4fd71d7732cdd76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5b3b6699e2f0b9a8d0f32c1f97c9b3e16d9f5479ad2f355633c25897dbe1bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5b3b6699e2f0b9a8d0f32c1f97c9b3e16d9f5479ad2f355633c25897dbe1bb1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:42Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d4a10437554cfb08f90a48d457ef157309ed0ab379c5b26deb62d3e8b73291b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d4a10437554cfb08f90a48d457ef157309ed0ab379c5b26deb62d3e8b73291b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76b52
b0824f65df456ead0ca134ada0a6c3120382a3cce18f0a5dfdcc98fa9ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76b52b0824f65df456ead0ca134ada0a6c3120382a3cce18f0a5dfdcc98fa9ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d38fb242bd536b1359095c21c4df58b5a4c64058cfe986b38a581ae0c40a29d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d38fb242bd536b1359095c21c4df58b5a4c64058cfe986b38a581ae0c40a29d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:45Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f17767f87136d0916230872e59379331bb9e7f88f3648115905c6c0172f257cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f17767f87136d0916230872e59379331bb9e7f88f3648115905c6c0172f257cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sbjst\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:49Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:49 crc kubenswrapper[4696]: I1202 22:42:49.533942 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-chq65" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53353260-c7c9-435c-91eb-3d5a1b441c4a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2d1f154b9c9139bf991adf86bd99e7ff14510c24ae93b87c1c184437a087f17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7tfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa016430a0a36d0628e0e5b0c4baf2fa724c888116e5d70517c8ffc4be1c37a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7tfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-chq65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:49Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:49 crc kubenswrapper[4696]: 
I1202 22:42:49.555959 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rgk7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07db09ff-2489-4357-af38-aca9655ac1d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9928cbc6ed1d9bb5d0280a8d3264d4490c601fc7d09d6f293f1403615cc4cde5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdd8x\\\",\\\"readOnly
\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rgk7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:49Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:49 crc kubenswrapper[4696]: I1202 22:42:49.592494 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:49 crc kubenswrapper[4696]: I1202 22:42:49.592561 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:49 crc kubenswrapper[4696]: I1202 22:42:49.592581 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:49 crc kubenswrapper[4696]: I1202 22:42:49.592617 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:49 crc kubenswrapper[4696]: I1202 22:42:49.592638 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:49Z","lastTransitionTime":"2025-12-02T22:42:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:49 crc kubenswrapper[4696]: I1202 22:42:49.694997 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:49 crc kubenswrapper[4696]: I1202 22:42:49.695060 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:49 crc kubenswrapper[4696]: I1202 22:42:49.695079 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:49 crc kubenswrapper[4696]: I1202 22:42:49.695104 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:49 crc kubenswrapper[4696]: I1202 22:42:49.695125 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:49Z","lastTransitionTime":"2025-12-02T22:42:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:49 crc kubenswrapper[4696]: I1202 22:42:49.798571 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:49 crc kubenswrapper[4696]: I1202 22:42:49.798645 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:49 crc kubenswrapper[4696]: I1202 22:42:49.798660 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:49 crc kubenswrapper[4696]: I1202 22:42:49.798689 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:49 crc kubenswrapper[4696]: I1202 22:42:49.798708 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:49Z","lastTransitionTime":"2025-12-02T22:42:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:49 crc kubenswrapper[4696]: I1202 22:42:49.875144 4696 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 02 22:42:49 crc kubenswrapper[4696]: I1202 22:42:49.902070 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:49 crc kubenswrapper[4696]: I1202 22:42:49.902553 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:49 crc kubenswrapper[4696]: I1202 22:42:49.902578 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:49 crc kubenswrapper[4696]: I1202 22:42:49.902616 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:49 crc kubenswrapper[4696]: I1202 22:42:49.902646 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:49Z","lastTransitionTime":"2025-12-02T22:42:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:50 crc kubenswrapper[4696]: I1202 22:42:50.005880 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:50 crc kubenswrapper[4696]: I1202 22:42:50.005955 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:50 crc kubenswrapper[4696]: I1202 22:42:50.005976 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:50 crc kubenswrapper[4696]: I1202 22:42:50.006009 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:50 crc kubenswrapper[4696]: I1202 22:42:50.006037 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:50Z","lastTransitionTime":"2025-12-02T22:42:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:50 crc kubenswrapper[4696]: I1202 22:42:50.109804 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:50 crc kubenswrapper[4696]: I1202 22:42:50.109869 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:50 crc kubenswrapper[4696]: I1202 22:42:50.109883 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:50 crc kubenswrapper[4696]: I1202 22:42:50.109913 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:50 crc kubenswrapper[4696]: I1202 22:42:50.109928 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:50Z","lastTransitionTime":"2025-12-02T22:42:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:50 crc kubenswrapper[4696]: I1202 22:42:50.213200 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:50 crc kubenswrapper[4696]: I1202 22:42:50.213262 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:50 crc kubenswrapper[4696]: I1202 22:42:50.213273 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:50 crc kubenswrapper[4696]: I1202 22:42:50.213292 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:50 crc kubenswrapper[4696]: I1202 22:42:50.213305 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:50Z","lastTransitionTime":"2025-12-02T22:42:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:50 crc kubenswrapper[4696]: I1202 22:42:50.316167 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:50 crc kubenswrapper[4696]: I1202 22:42:50.316231 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:50 crc kubenswrapper[4696]: I1202 22:42:50.316243 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:50 crc kubenswrapper[4696]: I1202 22:42:50.316264 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:50 crc kubenswrapper[4696]: I1202 22:42:50.316278 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:50Z","lastTransitionTime":"2025-12-02T22:42:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:50 crc kubenswrapper[4696]: I1202 22:42:50.419566 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:50 crc kubenswrapper[4696]: I1202 22:42:50.419607 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:50 crc kubenswrapper[4696]: I1202 22:42:50.419620 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:50 crc kubenswrapper[4696]: I1202 22:42:50.419639 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:50 crc kubenswrapper[4696]: I1202 22:42:50.419652 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:50Z","lastTransitionTime":"2025-12-02T22:42:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:42:50 crc kubenswrapper[4696]: I1202 22:42:50.431565 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:42:50 crc kubenswrapper[4696]: E1202 22:42:50.431690 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 22:42:50 crc kubenswrapper[4696]: I1202 22:42:50.432248 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:42:50 crc kubenswrapper[4696]: E1202 22:42:50.432327 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 22:42:50 crc kubenswrapper[4696]: I1202 22:42:50.432379 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:42:50 crc kubenswrapper[4696]: E1202 22:42:50.432432 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 22:42:50 crc kubenswrapper[4696]: I1202 22:42:50.523875 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:50 crc kubenswrapper[4696]: I1202 22:42:50.523977 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:50 crc kubenswrapper[4696]: I1202 22:42:50.524003 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:50 crc kubenswrapper[4696]: I1202 22:42:50.524038 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:50 crc kubenswrapper[4696]: I1202 22:42:50.524063 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:50Z","lastTransitionTime":"2025-12-02T22:42:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:50 crc kubenswrapper[4696]: I1202 22:42:50.628502 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:50 crc kubenswrapper[4696]: I1202 22:42:50.628576 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:50 crc kubenswrapper[4696]: I1202 22:42:50.628594 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:50 crc kubenswrapper[4696]: I1202 22:42:50.628623 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:50 crc kubenswrapper[4696]: I1202 22:42:50.628643 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:50Z","lastTransitionTime":"2025-12-02T22:42:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:50 crc kubenswrapper[4696]: I1202 22:42:50.731781 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:50 crc kubenswrapper[4696]: I1202 22:42:50.731840 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:50 crc kubenswrapper[4696]: I1202 22:42:50.731863 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:50 crc kubenswrapper[4696]: I1202 22:42:50.731894 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:50 crc kubenswrapper[4696]: I1202 22:42:50.731917 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:50Z","lastTransitionTime":"2025-12-02T22:42:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:50 crc kubenswrapper[4696]: I1202 22:42:50.835880 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:50 crc kubenswrapper[4696]: I1202 22:42:50.835939 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:50 crc kubenswrapper[4696]: I1202 22:42:50.835958 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:50 crc kubenswrapper[4696]: I1202 22:42:50.835985 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:50 crc kubenswrapper[4696]: I1202 22:42:50.836004 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:50Z","lastTransitionTime":"2025-12-02T22:42:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:50 crc kubenswrapper[4696]: I1202 22:42:50.879014 4696 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 02 22:42:50 crc kubenswrapper[4696]: I1202 22:42:50.938992 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:50 crc kubenswrapper[4696]: I1202 22:42:50.939056 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:50 crc kubenswrapper[4696]: I1202 22:42:50.939078 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:50 crc kubenswrapper[4696]: I1202 22:42:50.939116 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:50 crc kubenswrapper[4696]: I1202 22:42:50.939147 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:50Z","lastTransitionTime":"2025-12-02T22:42:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:51 crc kubenswrapper[4696]: I1202 22:42:51.043420 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:51 crc kubenswrapper[4696]: I1202 22:42:51.043510 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:51 crc kubenswrapper[4696]: I1202 22:42:51.043538 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:51 crc kubenswrapper[4696]: I1202 22:42:51.043571 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:51 crc kubenswrapper[4696]: I1202 22:42:51.043595 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:51Z","lastTransitionTime":"2025-12-02T22:42:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:51 crc kubenswrapper[4696]: I1202 22:42:51.147472 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:51 crc kubenswrapper[4696]: I1202 22:42:51.147568 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:51 crc kubenswrapper[4696]: I1202 22:42:51.147592 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:51 crc kubenswrapper[4696]: I1202 22:42:51.147626 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:51 crc kubenswrapper[4696]: I1202 22:42:51.147653 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:51Z","lastTransitionTime":"2025-12-02T22:42:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:51 crc kubenswrapper[4696]: I1202 22:42:51.251403 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:51 crc kubenswrapper[4696]: I1202 22:42:51.251477 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:51 crc kubenswrapper[4696]: I1202 22:42:51.251500 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:51 crc kubenswrapper[4696]: I1202 22:42:51.251535 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:51 crc kubenswrapper[4696]: I1202 22:42:51.251560 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:51Z","lastTransitionTime":"2025-12-02T22:42:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:51 crc kubenswrapper[4696]: I1202 22:42:51.354502 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:51 crc kubenswrapper[4696]: I1202 22:42:51.354545 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:51 crc kubenswrapper[4696]: I1202 22:42:51.354558 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:51 crc kubenswrapper[4696]: I1202 22:42:51.354581 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:51 crc kubenswrapper[4696]: I1202 22:42:51.354595 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:51Z","lastTransitionTime":"2025-12-02T22:42:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:51 crc kubenswrapper[4696]: I1202 22:42:51.458397 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:51 crc kubenswrapper[4696]: I1202 22:42:51.458491 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:51 crc kubenswrapper[4696]: I1202 22:42:51.458514 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:51 crc kubenswrapper[4696]: I1202 22:42:51.459080 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:51 crc kubenswrapper[4696]: I1202 22:42:51.459321 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:51Z","lastTransitionTime":"2025-12-02T22:42:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:51 crc kubenswrapper[4696]: I1202 22:42:51.563221 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:51 crc kubenswrapper[4696]: I1202 22:42:51.563289 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:51 crc kubenswrapper[4696]: I1202 22:42:51.563309 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:51 crc kubenswrapper[4696]: I1202 22:42:51.563343 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:51 crc kubenswrapper[4696]: I1202 22:42:51.563363 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:51Z","lastTransitionTime":"2025-12-02T22:42:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:51 crc kubenswrapper[4696]: I1202 22:42:51.667134 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:51 crc kubenswrapper[4696]: I1202 22:42:51.667209 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:51 crc kubenswrapper[4696]: I1202 22:42:51.667231 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:51 crc kubenswrapper[4696]: I1202 22:42:51.667263 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:51 crc kubenswrapper[4696]: I1202 22:42:51.667285 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:51Z","lastTransitionTime":"2025-12-02T22:42:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:51 crc kubenswrapper[4696]: I1202 22:42:51.771078 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:51 crc kubenswrapper[4696]: I1202 22:42:51.771146 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:51 crc kubenswrapper[4696]: I1202 22:42:51.771170 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:51 crc kubenswrapper[4696]: I1202 22:42:51.771201 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:51 crc kubenswrapper[4696]: I1202 22:42:51.771218 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:51Z","lastTransitionTime":"2025-12-02T22:42:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:51 crc kubenswrapper[4696]: I1202 22:42:51.874476 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:51 crc kubenswrapper[4696]: I1202 22:42:51.874564 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:51 crc kubenswrapper[4696]: I1202 22:42:51.874593 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:51 crc kubenswrapper[4696]: I1202 22:42:51.874627 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:51 crc kubenswrapper[4696]: I1202 22:42:51.874649 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:51Z","lastTransitionTime":"2025-12-02T22:42:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:51 crc kubenswrapper[4696]: I1202 22:42:51.977948 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:51 crc kubenswrapper[4696]: I1202 22:42:51.978038 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:51 crc kubenswrapper[4696]: I1202 22:42:51.978062 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:51 crc kubenswrapper[4696]: I1202 22:42:51.978596 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:51 crc kubenswrapper[4696]: I1202 22:42:51.978930 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:51Z","lastTransitionTime":"2025-12-02T22:42:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:52 crc kubenswrapper[4696]: I1202 22:42:52.084068 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:52 crc kubenswrapper[4696]: I1202 22:42:52.084141 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:52 crc kubenswrapper[4696]: I1202 22:42:52.084152 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:52 crc kubenswrapper[4696]: I1202 22:42:52.084172 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:52 crc kubenswrapper[4696]: I1202 22:42:52.084183 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:52Z","lastTransitionTime":"2025-12-02T22:42:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:52 crc kubenswrapper[4696]: I1202 22:42:52.187824 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:52 crc kubenswrapper[4696]: I1202 22:42:52.187897 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:52 crc kubenswrapper[4696]: I1202 22:42:52.187916 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:52 crc kubenswrapper[4696]: I1202 22:42:52.187949 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:52 crc kubenswrapper[4696]: I1202 22:42:52.187969 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:52Z","lastTransitionTime":"2025-12-02T22:42:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:52 crc kubenswrapper[4696]: I1202 22:42:52.292909 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:52 crc kubenswrapper[4696]: I1202 22:42:52.292979 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:52 crc kubenswrapper[4696]: I1202 22:42:52.292997 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:52 crc kubenswrapper[4696]: I1202 22:42:52.293028 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:52 crc kubenswrapper[4696]: I1202 22:42:52.293048 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:52Z","lastTransitionTime":"2025-12-02T22:42:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:52 crc kubenswrapper[4696]: I1202 22:42:52.396616 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:52 crc kubenswrapper[4696]: I1202 22:42:52.396692 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:52 crc kubenswrapper[4696]: I1202 22:42:52.396705 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:52 crc kubenswrapper[4696]: I1202 22:42:52.396729 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:52 crc kubenswrapper[4696]: I1202 22:42:52.396766 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:52Z","lastTransitionTime":"2025-12-02T22:42:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:42:52 crc kubenswrapper[4696]: I1202 22:42:52.431034 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:42:52 crc kubenswrapper[4696]: E1202 22:42:52.431199 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 22:42:52 crc kubenswrapper[4696]: I1202 22:42:52.431256 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:42:52 crc kubenswrapper[4696]: I1202 22:42:52.431410 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:42:52 crc kubenswrapper[4696]: E1202 22:42:52.431500 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 22:42:52 crc kubenswrapper[4696]: E1202 22:42:52.431685 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 22:42:52 crc kubenswrapper[4696]: I1202 22:42:52.498986 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:52 crc kubenswrapper[4696]: I1202 22:42:52.499029 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:52 crc kubenswrapper[4696]: I1202 22:42:52.499039 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:52 crc kubenswrapper[4696]: I1202 22:42:52.499059 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:52 crc kubenswrapper[4696]: I1202 22:42:52.499070 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:52Z","lastTransitionTime":"2025-12-02T22:42:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:52 crc kubenswrapper[4696]: I1202 22:42:52.545834 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" Dec 02 22:42:52 crc kubenswrapper[4696]: I1202 22:42:52.601498 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:52 crc kubenswrapper[4696]: I1202 22:42:52.601547 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:52 crc kubenswrapper[4696]: I1202 22:42:52.601559 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:52 crc kubenswrapper[4696]: I1202 22:42:52.601580 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:52 crc kubenswrapper[4696]: I1202 22:42:52.601598 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:52Z","lastTransitionTime":"2025-12-02T22:42:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:52 crc kubenswrapper[4696]: I1202 22:42:52.705214 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:52 crc kubenswrapper[4696]: I1202 22:42:52.705270 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:52 crc kubenswrapper[4696]: I1202 22:42:52.705289 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:52 crc kubenswrapper[4696]: I1202 22:42:52.705318 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:52 crc kubenswrapper[4696]: I1202 22:42:52.705340 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:52Z","lastTransitionTime":"2025-12-02T22:42:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:52 crc kubenswrapper[4696]: I1202 22:42:52.808515 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:52 crc kubenswrapper[4696]: I1202 22:42:52.808638 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:52 crc kubenswrapper[4696]: I1202 22:42:52.808659 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:52 crc kubenswrapper[4696]: I1202 22:42:52.808688 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:52 crc kubenswrapper[4696]: I1202 22:42:52.808707 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:52Z","lastTransitionTime":"2025-12-02T22:42:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:52 crc kubenswrapper[4696]: I1202 22:42:52.889562 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qb2zq_c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b/ovnkube-controller/0.log" Dec 02 22:42:52 crc kubenswrapper[4696]: I1202 22:42:52.895242 4696 generic.go:334] "Generic (PLEG): container finished" podID="c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b" containerID="bcbeb403dee928a5b37ed7662c4999e027b9b9fed6b8cb704c8e41b5c68c585c" exitCode=1 Dec 02 22:42:52 crc kubenswrapper[4696]: I1202 22:42:52.895355 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" event={"ID":"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b","Type":"ContainerDied","Data":"bcbeb403dee928a5b37ed7662c4999e027b9b9fed6b8cb704c8e41b5c68c585c"} Dec 02 22:42:52 crc kubenswrapper[4696]: I1202 22:42:52.899651 4696 scope.go:117] "RemoveContainer" containerID="bcbeb403dee928a5b37ed7662c4999e027b9b9fed6b8cb704c8e41b5c68c585c" Dec 02 22:42:52 crc kubenswrapper[4696]: I1202 22:42:52.911986 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:52 crc kubenswrapper[4696]: I1202 22:42:52.912020 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:52 crc kubenswrapper[4696]: I1202 22:42:52.912030 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:52 crc kubenswrapper[4696]: I1202 22:42:52.912046 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:52 crc kubenswrapper[4696]: I1202 22:42:52.912056 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:52Z","lastTransitionTime":"2025-12-02T22:42:52Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:42:52 crc kubenswrapper[4696]: I1202 22:42:52.934971 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3172221-5c4c-4409-b50f-5cd21d8b5030\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f25e7ebcca1d904cd0940a660fd97c6b4a3540fd63700abd0ad973e9f9c8f764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kuberne
tes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08fc65f2c8162f27bc40cd12b5801ca21c778313271f2cb5dd251ca4312842d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba390cca9997cf469ae88a4895532df556f0b947ce7b34ef7e1992ee236a2e16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0ccd6eaff0fa7ac496aadc31afffd6e7fee2776e9cfd1652af85fcec268d895\\\",\\\"image\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b2deed2650220f1d607cbb667b26879d3a8373146244feb486c882032f344b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a565cb3800c1a50f784c327719158de2f6f29d20a33670478b97c1c360e0f6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshi
ft-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a565cb3800c1a50f784c327719158de2f6f29d20a33670478b97c1c360e0f6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29a2b9cc76fc45b6874d85b2ab92165a02804807837501ac60190c232bab928d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29a2b9cc76fc45b6874d85b2ab92165a02804807837501ac60190c232bab928d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fb3311ccf2a924605464d657193627798dad26a232fab0a2091fb2c387f3bd0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb3311ccf2a924605464d6
57193627798dad26a232fab0a2091fb2c387f3bd0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:52Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:52 crc kubenswrapper[4696]: I1202 22:42:52.958423 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"158d80a5-8d90-46bb-83e7-ab3263f6d28f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3112751f86a62634d02f281d467d3eeb27878e27c245871210cca37eaeb6e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce744ea765d5bbcb0a9926f439a0256774c16863344f23ddfb0074b4ccbdb4ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1108d710330b056e2228205b2012df070ffbd660701ddd577eafbf0e99c3bf9f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d92ce9d53d9367503c76c5fbb45f340a5ddd93f3c8e1738ac434fe7b0210d1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6268352b995d99d383ab500b5ccd80b0612cf33716a752aeb1d92cb775a1971e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"le observer\\\\nW1202 22:42:37.965183 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 22:42:37.965319 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 22:42:37.967400 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2369937012/tls.crt::/tmp/serving-cert-2369937012/tls.key\\\\\\\"\\\\nI1202 22:42:38.329833 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 22:42:38.331704 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 22:42:38.331724 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 22:42:38.331766 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 22:42:38.331782 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 22:42:38.336692 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 22:42:38.336734 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:42:38.336761 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:42:38.336769 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 22:42:38.336775 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 22:42:38.336779 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 22:42:38.336783 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 22:42:38.337216 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 22:42:38.338194 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://666f81c81a7cad67c01b2aebf2d05f162c33feea27f64c12f92761d6d2a6056e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055bef75732eec4908f0baf8a52cd3921fab03cf88e36d2e0d442344fb1af0c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://055bef75732eec4908f0baf8a52cd3921fab03cf88e36d2e0d442344fb1af0c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:52Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:52 crc kubenswrapper[4696]: I1202 22:42:52.978871 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:52Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:52 crc kubenswrapper[4696]: I1202 22:42:52.993367 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f57qk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf8ea236-9be9-4eb2-904a-103c4c279f28\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3106b29a0eb0a17827fd331aefd6c00b0c58f4ec0d0ca778bd3f4842730d5f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vxprw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f57qk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:52Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:53 crc kubenswrapper[4696]: I1202 22:42:53.013707 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:53Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:53 crc kubenswrapper[4696]: I1202 22:42:53.015354 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:53 crc kubenswrapper[4696]: I1202 22:42:53.015407 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:53 crc kubenswrapper[4696]: I1202 22:42:53.015427 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:53 crc kubenswrapper[4696]: I1202 22:42:53.015455 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:53 crc kubenswrapper[4696]: I1202 22:42:53.015475 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:53Z","lastTransitionTime":"2025-12-02T22:42:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:42:53 crc kubenswrapper[4696]: I1202 22:42:53.032894 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:53Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:53 crc kubenswrapper[4696]: I1202 22:42:53.068025 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36feb7e9f6a6bad97e87fe54ff9b661be62f6f4962352d55bcc905b80320a48d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85fbce2a7ea0550f53c1216216022e60a4e2eae0bb1d5c88254ace025fbd4503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64b0c2025c377ab19b3853017c94cb501f472f2485093afc60749f7739ba25d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22ce112f515d4ef59dca4c21bdc8623e88a8a0df222fb2ed0990a2f68e3f75f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52f3d5529aafac7c27c963a50b0318e99d9a9ca35e142d4c273431105c495e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32717a82edb718f5e690fcecd578f48279d134a5dc12f00824fa25f7ec727156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcbeb403dee928a5b37ed7662c4999e027b9b9fed6b8cb704c8e41b5c68c585c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bcbeb403dee928a5b37ed7662c4999e027b9b9fed6b8cb704c8e41b5c68c585c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T22:42:52Z\\\",\\\"message\\\":\\\" from k8s.io/client-go/informers/factory.go:160\\\\nI1202 22:42:50.554281 6010 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 22:42:50.554443 6010 reflector.go:311] 
Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1202 22:42:50.554709 6010 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI1202 22:42:50.556928 6010 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1202 22:42:50.557016 6010 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1202 22:42:50.557094 6010 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1202 22:42:50.557153 6010 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1202 22:42:50.557100 6010 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1202 22:42:50.557332 6010 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1202 22:42:50.557365 6010 handler.go:208] Removed *v1.Node event handler 7\\\\nI1202 22:42:50.557430 6010 handler.go:208] Removed *v1.Node event handler 2\\\\nI1202 22:42:50.557553 6010 factory.go:656] Stopping 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kub
e-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23b80358b654eec7dc1057ef35a6c09f88ccd7aca7a05336a3f29e7036bd55f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://492cfa3fb8646c09da06d92ec8a303562da558d4d944eaf68ca95de6f5b94da8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://492cfa3fb8646c09da06d92ec8a303562da558d4d944eaf68ca95de6f5b94da8\\\
",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qb2zq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:53Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:53 crc kubenswrapper[4696]: I1202 22:42:53.090430 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"05940584-6baa-49e1-8959-610d50b6eeb5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b361514e74dd950d7c3e1da82304b7c75603829ce7bf03c1fbccab1607d65800\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67bf899df90f18f36310994af135552aa9758f0460117d62d00fdc1de067ce79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d15263c6c1f8208a998cbacdba376a2eb3b2e16134e8fc40f7d56d36daae5ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2fa27b4eeb8b50da5d49c905eb98070362e998c3e3e4aceac7cc09aaef5b5ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:53Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:53 crc kubenswrapper[4696]: I1202 22:42:53.117984 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sbjst" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a0d758e-6da6-4382-99f1-dd295b63eb98\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bfe4ce91e801752a3eae0d5cfdb5d0e7a4997952987a3a32c10c0292db170f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://495f007b6af2d409e3ed6cf70dfa90aec5a38c34361c9da9c4fd71d7732cdd76\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://495f007b6af2d409e3ed6cf70dfa90aec5a38c34361c9da9c4fd71d7732cdd76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5b3b6699e2f0b9a8d0f32c1f97c9b3e16d9f5479ad2f355633c25897dbe1bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5b3b6699e2f0b9a8d0f32c1f97c9b3e16d9f5479ad2f355633c25897dbe1bb1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:42Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d4a10437554cfb08f90a48d457ef157309ed0ab379c5b26deb62d3e8b73291b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d4a10437554cfb08f90a48d457ef157309ed0ab379c5b26deb62d3e8b73291b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76b52
b0824f65df456ead0ca134ada0a6c3120382a3cce18f0a5dfdcc98fa9ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76b52b0824f65df456ead0ca134ada0a6c3120382a3cce18f0a5dfdcc98fa9ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d38fb242bd536b1359095c21c4df58b5a4c64058cfe986b38a581ae0c40a29d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d38fb242bd536b1359095c21c4df58b5a4c64058cfe986b38a581ae0c40a29d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:45Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f17767f87136d0916230872e59379331bb9e7f88f3648115905c6c0172f257cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f17767f87136d0916230872e59379331bb9e7f88f3648115905c6c0172f257cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sbjst\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:53Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:53 crc kubenswrapper[4696]: I1202 22:42:53.119893 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:53 crc kubenswrapper[4696]: I1202 22:42:53.119959 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:53 crc kubenswrapper[4696]: I1202 22:42:53.119979 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:53 crc kubenswrapper[4696]: I1202 22:42:53.120006 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:53 crc kubenswrapper[4696]: I1202 22:42:53.120025 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:53Z","lastTransitionTime":"2025-12-02T22:42:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:53 crc kubenswrapper[4696]: I1202 22:42:53.136311 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-chq65" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53353260-c7c9-435c-91eb-3d5a1b441c4a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2d1f154b9c9139bf991adf86bd99e7ff14510c24ae93b87c1c184437a087f17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7tfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa016430a0a36d0628e0e5b0c4baf2fa724c888116e5d70517c8ffc4be1c37a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7tfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-chq65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:53Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:53 crc kubenswrapper[4696]: I1202 22:42:53.151636 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rgk7n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07db09ff-2489-4357-af38-aca9655ac1d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9928cbc6ed1d9bb5d0280a8d3264d4490c601fc7d09d6f293f1403615cc4cde5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdd8x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rgk7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:53Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:53 crc kubenswrapper[4696]: I1202 22:42:53.174617 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1db9cf23af7fc9ff1373908e11df0bea7d6e03cd3ed18947cc583fc09a79a05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:53Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:53 crc kubenswrapper[4696]: I1202 22:42:53.196806 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce2fa3867298545942c4a2d2e68899c61a4f99a64b0974e75ef86851074ddfce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4db552c4bc9b395f977cb4c1b1b7fcbb1c7e9f326d81a7903d88eab180bcfb3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:53Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:53 crc kubenswrapper[4696]: I1202 22:42:53.216806 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27fb05354e4e9a6ad09946fad7dbabf07456fe56a9a66578fda0eb71dde2d199\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T22:42:53Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:53 crc kubenswrapper[4696]: I1202 22:42:53.223237 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:53 crc kubenswrapper[4696]: I1202 22:42:53.223301 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:53 crc kubenswrapper[4696]: I1202 22:42:53.223322 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:53 crc kubenswrapper[4696]: I1202 22:42:53.223356 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:53 crc kubenswrapper[4696]: I1202 22:42:53.223382 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:53Z","lastTransitionTime":"2025-12-02T22:42:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:53 crc kubenswrapper[4696]: I1202 22:42:53.236306 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wthxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86a37d2a-37c5-4fbd-b10b-f5e4706772f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5ddd341a3a0b7b1d959ce57bb4d38153a4ca3ae5ec40ded309e0697db2be28d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xqg5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wthxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:53Z 
is after 2025-08-24T17:21:41Z" Dec 02 22:42:53 crc kubenswrapper[4696]: I1202 22:42:53.327702 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:53 crc kubenswrapper[4696]: I1202 22:42:53.327838 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:53 crc kubenswrapper[4696]: I1202 22:42:53.327859 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:53 crc kubenswrapper[4696]: I1202 22:42:53.327889 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:53 crc kubenswrapper[4696]: I1202 22:42:53.327911 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:53Z","lastTransitionTime":"2025-12-02T22:42:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:53 crc kubenswrapper[4696]: I1202 22:42:53.431072 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:53 crc kubenswrapper[4696]: I1202 22:42:53.431172 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:53 crc kubenswrapper[4696]: I1202 22:42:53.431200 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:53 crc kubenswrapper[4696]: I1202 22:42:53.431242 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:53 crc kubenswrapper[4696]: I1202 22:42:53.431275 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:53Z","lastTransitionTime":"2025-12-02T22:42:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:53 crc kubenswrapper[4696]: I1202 22:42:53.535686 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:53 crc kubenswrapper[4696]: I1202 22:42:53.535797 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:53 crc kubenswrapper[4696]: I1202 22:42:53.535826 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:53 crc kubenswrapper[4696]: I1202 22:42:53.535859 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:53 crc kubenswrapper[4696]: I1202 22:42:53.535880 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:53Z","lastTransitionTime":"2025-12-02T22:42:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:53 crc kubenswrapper[4696]: I1202 22:42:53.640017 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:53 crc kubenswrapper[4696]: I1202 22:42:53.640091 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:53 crc kubenswrapper[4696]: I1202 22:42:53.640112 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:53 crc kubenswrapper[4696]: I1202 22:42:53.640140 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:53 crc kubenswrapper[4696]: I1202 22:42:53.640158 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:53Z","lastTransitionTime":"2025-12-02T22:42:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:53 crc kubenswrapper[4696]: I1202 22:42:53.744391 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:53 crc kubenswrapper[4696]: I1202 22:42:53.744464 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:53 crc kubenswrapper[4696]: I1202 22:42:53.744483 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:53 crc kubenswrapper[4696]: I1202 22:42:53.744514 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:53 crc kubenswrapper[4696]: I1202 22:42:53.744535 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:53Z","lastTransitionTime":"2025-12-02T22:42:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:42:53 crc kubenswrapper[4696]: I1202 22:42:53.778722 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ttxcl"] Dec 02 22:42:53 crc kubenswrapper[4696]: I1202 22:42:53.779332 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ttxcl" Dec 02 22:42:53 crc kubenswrapper[4696]: I1202 22:42:53.783956 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 02 22:42:53 crc kubenswrapper[4696]: I1202 22:42:53.784236 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 02 22:42:53 crc kubenswrapper[4696]: I1202 22:42:53.806573 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:53Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:53 crc kubenswrapper[4696]: I1202 22:42:53.826586 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:53Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:53 crc kubenswrapper[4696]: I1202 22:42:53.855622 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:53 crc kubenswrapper[4696]: I1202 22:42:53.855848 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:53 crc kubenswrapper[4696]: I1202 22:42:53.855869 4696 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:53 crc kubenswrapper[4696]: I1202 22:42:53.855898 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:53 crc kubenswrapper[4696]: I1202 22:42:53.855917 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:53Z","lastTransitionTime":"2025-12-02T22:42:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:42:53 crc kubenswrapper[4696]: I1202 22:42:53.861327 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with 
unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36feb7e9f6a6bad97e87fe54ff9b661be62f6f4962352d55bcc905b80320a48d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85fbce2a7ea0550f53c1216216022e60a4e2eae0bb1d5c88254ace025fbd4503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-nod
e-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64b0c2025c377ab19b3853017c94cb501f472f2485093afc60749f7739ba25d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22ce112f515d4ef59dca4c21bdc8623e88a8a0df222fb2ed0990a2f68e3f75f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52f3d5529aafac7c27c963a50b0318e99d9a9ca35e142d4c273431105c495e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32717a82edb718f5e690fcecd578f48279d134a5dc12f00824fa25f7ec727156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ov
n-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcbeb403dee928a5b37ed7662c4999e027b9b9fed6b8cb704c8e41b5c68c585c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bcbeb403dee928a5b37ed7662c4999e027b9b9fed6b8cb704c8e41b5c68c585c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T22:42:52Z\\\",\\\"message\\\":\\\" from k8s.io/client-go/informers/factory.go:160\\\\nI1202 22:42:50.554281 6010 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 22:42:50.554443 6010 
reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1202 22:42:50.554709 6010 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI1202 22:42:50.556928 6010 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1202 22:42:50.557016 6010 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1202 22:42:50.557094 6010 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1202 22:42:50.557153 6010 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1202 22:42:50.557100 6010 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1202 22:42:50.557332 6010 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1202 22:42:50.557365 6010 handler.go:208] Removed *v1.Node event handler 7\\\\nI1202 22:42:50.557430 6010 handler.go:208] Removed *v1.Node event handler 2\\\\nI1202 22:42:50.557553 6010 factory.go:656] Stopping 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kub
e-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23b80358b654eec7dc1057ef35a6c09f88ccd7aca7a05336a3f29e7036bd55f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://492cfa3fb8646c09da06d92ec8a303562da558d4d944eaf68ca95de6f5b94da8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://492cfa3fb8646c09da06d92ec8a303562da558d4d944eaf68ca95de6f5b94da8\\\
",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qb2zq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:53Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:53 crc kubenswrapper[4696]: I1202 22:42:53.881863 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-chq65" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53353260-c7c9-435c-91eb-3d5a1b441c4a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2d1f154b9c9139bf991adf86bd99e7ff14510c24ae93b87c1c184437a087f17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7tfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa016430a0a36d0628e0e5b0c4baf2fa724c8881
16e5d70517c8ffc4be1c37a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7tfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-chq65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:53Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:53 crc kubenswrapper[4696]: I1202 22:42:53.896312 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rgk7n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07db09ff-2489-4357-af38-aca9655ac1d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9928cbc6ed1d9bb5d0280a8d3264d4490c601fc7d09d6f293f1403615cc4cde5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdd8x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rgk7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:53Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:53 crc kubenswrapper[4696]: I1202 22:42:53.904635 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qb2zq_c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b/ovnkube-controller/0.log" Dec 02 22:42:53 crc kubenswrapper[4696]: I1202 22:42:53.909652 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" event={"ID":"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b","Type":"ContainerStarted","Data":"4fb85a5789d7eb5dd8ac73a8a47b7241b37df3fae93b8ea2f241113883535f3b"} Dec 02 22:42:53 crc kubenswrapper[4696]: I1202 22:42:53.910311 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" Dec 02 22:42:53 crc kubenswrapper[4696]: I1202 22:42:53.912392 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ca48946e-a7e0-4729-8b02-b223a96990c6-env-overrides\") pod \"ovnkube-control-plane-749d76644c-ttxcl\" (UID: \"ca48946e-a7e0-4729-8b02-b223a96990c6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ttxcl" Dec 02 22:42:53 crc kubenswrapper[4696]: I1202 22:42:53.912554 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ca48946e-a7e0-4729-8b02-b223a96990c6-ovnkube-config\") pod 
\"ovnkube-control-plane-749d76644c-ttxcl\" (UID: \"ca48946e-a7e0-4729-8b02-b223a96990c6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ttxcl" Dec 02 22:42:53 crc kubenswrapper[4696]: I1202 22:42:53.912645 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkjwz\" (UniqueName: \"kubernetes.io/projected/ca48946e-a7e0-4729-8b02-b223a96990c6-kube-api-access-zkjwz\") pod \"ovnkube-control-plane-749d76644c-ttxcl\" (UID: \"ca48946e-a7e0-4729-8b02-b223a96990c6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ttxcl" Dec 02 22:42:53 crc kubenswrapper[4696]: I1202 22:42:53.912683 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ca48946e-a7e0-4729-8b02-b223a96990c6-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-ttxcl\" (UID: \"ca48946e-a7e0-4729-8b02-b223a96990c6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ttxcl" Dec 02 22:42:53 crc kubenswrapper[4696]: I1202 22:42:53.920698 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"05940584-6baa-49e1-8959-610d50b6eeb5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b361514e74dd950d7c3e1da82304b7c75603829ce7bf03c1fbccab1607d65800\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67bf899df90f18f36310994af135552aa9758f0460117d62d00fdc1de067ce79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d15263c6c1f8208a998cbacdba376a2eb3b2e16134e8fc40f7d56d36daae5ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2fa27b4eeb8b50da5d49c905eb98070362e998c3e3e4aceac7cc09aaef5b5ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:53Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:53 crc kubenswrapper[4696]: I1202 22:42:53.947994 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sbjst" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a0d758e-6da6-4382-99f1-dd295b63eb98\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bfe4ce91e801752a3eae0d5cfdb5d0e7a4997952987a3a32c10c0292db170f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://495f007b6af2d409e3ed6cf70dfa90aec5a38c34361c9da9c4fd71d7732cdd76\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://495f007b6af2d409e3ed6cf70dfa90aec5a38c34361c9da9c4fd71d7732cdd76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5b3b6699e2f0b9a8d0f32c1f97c9b3e16d9f5479ad2f355633c25897dbe1bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5b3b6699e2f0b9a8d0f32c1f97c9b3e16d9f5479ad2f355633c25897dbe1bb1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:42Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d4a10437554cfb08f90a48d457ef157309ed0ab379c5b26deb62d3e8b73291b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d4a10437554cfb08f90a48d457ef157309ed0ab379c5b26deb62d3e8b73291b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76b52
b0824f65df456ead0ca134ada0a6c3120382a3cce18f0a5dfdcc98fa9ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76b52b0824f65df456ead0ca134ada0a6c3120382a3cce18f0a5dfdcc98fa9ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d38fb242bd536b1359095c21c4df58b5a4c64058cfe986b38a581ae0c40a29d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d38fb242bd536b1359095c21c4df58b5a4c64058cfe986b38a581ae0c40a29d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:45Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f17767f87136d0916230872e59379331bb9e7f88f3648115905c6c0172f257cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f17767f87136d0916230872e59379331bb9e7f88f3648115905c6c0172f257cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sbjst\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:53Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:53 crc kubenswrapper[4696]: I1202 22:42:53.959206 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:53 crc kubenswrapper[4696]: I1202 22:42:53.959274 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:53 crc kubenswrapper[4696]: I1202 22:42:53.959294 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:53 crc kubenswrapper[4696]: I1202 22:42:53.959327 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:53 crc kubenswrapper[4696]: I1202 22:42:53.959349 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:53Z","lastTransitionTime":"2025-12-02T22:42:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:53 crc kubenswrapper[4696]: I1202 22:42:53.967090 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wthxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86a37d2a-37c5-4fbd-b10b-f5e4706772f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5ddd341a3a0b7b1d959ce57bb4d38153a4ca3ae5ec40ded309e0697db2be28d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xqg5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wthxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:53Z 
is after 2025-08-24T17:21:41Z" Dec 02 22:42:53 crc kubenswrapper[4696]: I1202 22:42:53.982674 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ttxcl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca48946e-a7e0-4729-8b02-b223a96990c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkjwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkjwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ttxcl\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:53Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.004838 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1db9cf23af7fc9ff1373908e11df0bea7d6e03cd3ed18947cc583fc09a79a05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\
\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:54Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.013207 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ca48946e-a7e0-4729-8b02-b223a96990c6-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-ttxcl\" (UID: \"ca48946e-a7e0-4729-8b02-b223a96990c6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ttxcl" Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.013263 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkjwz\" (UniqueName: \"kubernetes.io/projected/ca48946e-a7e0-4729-8b02-b223a96990c6-kube-api-access-zkjwz\") pod \"ovnkube-control-plane-749d76644c-ttxcl\" (UID: \"ca48946e-a7e0-4729-8b02-b223a96990c6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ttxcl" Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.013285 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ca48946e-a7e0-4729-8b02-b223a96990c6-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-ttxcl\" (UID: \"ca48946e-a7e0-4729-8b02-b223a96990c6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ttxcl" Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.013303 4696 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ca48946e-a7e0-4729-8b02-b223a96990c6-env-overrides\") pod \"ovnkube-control-plane-749d76644c-ttxcl\" (UID: \"ca48946e-a7e0-4729-8b02-b223a96990c6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ttxcl" Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.014697 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ca48946e-a7e0-4729-8b02-b223a96990c6-env-overrides\") pod \"ovnkube-control-plane-749d76644c-ttxcl\" (UID: \"ca48946e-a7e0-4729-8b02-b223a96990c6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ttxcl" Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.015360 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ca48946e-a7e0-4729-8b02-b223a96990c6-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-ttxcl\" (UID: \"ca48946e-a7e0-4729-8b02-b223a96990c6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ttxcl" Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.021660 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce2fa3867298545942c4a2d2e68899c61a4f99a64b0974e75ef86851074ddfce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4db552c4bc9b395f977cb4c1b1b7fcbb1c7e9f326d81a7903d88eab180bcfb3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:54Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.024537 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ca48946e-a7e0-4729-8b02-b223a96990c6-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-ttxcl\" (UID: \"ca48946e-a7e0-4729-8b02-b223a96990c6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ttxcl" Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.038413 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27fb05354e4e9a6ad09946fad7dbabf07456fe56a9a66578fda0eb71dde2d199\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T22:42:54Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.038625 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkjwz\" (UniqueName: \"kubernetes.io/projected/ca48946e-a7e0-4729-8b02-b223a96990c6-kube-api-access-zkjwz\") pod \"ovnkube-control-plane-749d76644c-ttxcl\" (UID: \"ca48946e-a7e0-4729-8b02-b223a96990c6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ttxcl" Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.056140 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:54Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.061974 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.062038 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.062052 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.062077 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.062093 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:54Z","lastTransitionTime":"2025-12-02T22:42:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.070383 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f57qk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf8ea236-9be9-4eb2-904a-103c4c279f28\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3106b29a0eb0a17827fd331aefd6c00b0c58f4ec0d0ca778bd3f4842730d5f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vxprw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f57qk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:54Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.091992 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3172221-5c4c-4409-b50f-5cd21d8b5030\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f25e7ebcca1d904cd0940a660fd97c6b4a3540fd63700abd0ad973e9f9c8f764\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08fc65f2c8162f27bc40cd12b5801ca21c778313271f2cb5dd251ca4312842d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba390cca9997cf469ae88a4895532df556f0b947ce7b34ef7e1992ee236a2e16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:
a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0ccd6eaff0fa7ac496aadc31afffd6e7fee2776e9cfd1652af85fcec268d895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b2deed2650220f1d607cbb667b26879d3a8373146244feb486c882032f344b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resou
rce-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a565cb3800c1a50f784c327719158de2f6f29d20a33670478b97c1c360e0f6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a565cb3800c1a50f784c327719158de2f6f29d20a33670478b97c1c360e0f6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29a2b9cc76fc45b6874d85b2ab92165a02804807837501ac60190c232bab928d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29a2b9cc76fc45b6874d85b2ab92165a02804807837501ac60190c232bab928d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}}},{\\\"containerID\\\"
:\\\"cri-o://fb3311ccf2a924605464d657193627798dad26a232fab0a2091fb2c387f3bd0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb3311ccf2a924605464d657193627798dad26a232fab0a2091fb2c387f3bd0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:54Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.096925 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ttxcl" Dec 02 22:42:54 crc kubenswrapper[4696]: W1202 22:42:54.112111 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca48946e_a7e0_4729_8b02_b223a96990c6.slice/crio-f616dc720b045fbbf45cfa46fe123db0a8ce5b92d20fdc38adf7f3e6da808f49 WatchSource:0}: Error finding container f616dc720b045fbbf45cfa46fe123db0a8ce5b92d20fdc38adf7f3e6da808f49: Status 404 returned error can't find the container with id f616dc720b045fbbf45cfa46fe123db0a8ce5b92d20fdc38adf7f3e6da808f49 Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.112594 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"158d80a5-8d90-46bb-83e7-ab3263f6d28f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3112751f86a62634d02f281d467d3eeb27878e27c245871210cca37eaeb6e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce744ea765d5bbcb0a9926f439a0256774c16863344f23ddfb0074b4ccbdb4ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://1108d710330b056e2228205b2012df070ffbd660701ddd577eafbf0e99c3bf9f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d92ce9d53d9367503c76c5fbb45f340a5ddd93f3c8e1738ac434fe7b0210d1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6268352b995d99d383ab500b5ccd80b0612cf33716a752aeb1d92cb775a1971e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"le observer\\\\nW1202 22:42:37.965183 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 22:42:37.965319 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 22:42:37.967400 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2369937012/tls.crt::/tmp/serving-cert-2369937012/tls.key\\\\\\\"\\\\nI1202 22:42:38.329833 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 22:42:38.331704 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 22:42:38.331724 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 22:42:38.331766 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 22:42:38.331782 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 22:42:38.336692 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 22:42:38.336734 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:42:38.336761 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:42:38.336769 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 22:42:38.336775 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 22:42:38.336779 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 22:42:38.336783 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 22:42:38.337216 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 22:42:38.338194 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://666f81c81a7cad67c01b2aebf2d05f162c33feea27f64c12f92761d6d2a6056e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055bef75732eec4908f0baf8a52cd3921fab03cf88e36d2e0d442344fb1af0c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://055bef75732eec4908f0baf8a52cd3921fab03cf88e36d2e0d442344fb1af0c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:54Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.130496 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1db9cf23af7fc9ff1373908e11df0bea7d6e03cd3ed18947cc583fc09a79a05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:54Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.151160 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce2fa3867298545942c4a2d2e68899c61a4f99a64b0974e75ef86851074ddfce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4db552c4bc9b395f977cb4c1b1b7fcbb1c7e9f326d81a7903d88eab180bcfb3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:54Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.165729 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.165802 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.165820 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.165842 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.165856 4696 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:54Z","lastTransitionTime":"2025-12-02T22:42:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.167031 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27fb05354e4e9a6ad09946fad7dbabf07456fe56a9a66578fda0eb71dde2d199\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:54Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.186268 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wthxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86a37d2a-37c5-4fbd-b10b-f5e4706772f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5ddd341a3a0b7b1d959ce57bb4d38153a4ca3ae5ec40ded309e0697db2be28d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xqg5z\\\",\\\"readOnly\\\":true
,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wthxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:54Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.201622 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ttxcl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca48946e-a7e0-4729-8b02-b223a96990c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkjwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkjwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ttxcl\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:54Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.215558 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 22:42:54 crc kubenswrapper[4696]: E1202 22:42:54.215709 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 22:43:10.215683008 +0000 UTC m=+53.096362999 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.226658 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3172221-5c4c-4409-b50f-5cd21d8b5030\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f25e7ebcca1d904cd0940a660fd97c6b4a3540fd63700abd0ad973e9f9c8f764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"sta
rted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08fc65f2c8162f27bc40cd12b5801ca21c778313271f2cb5dd251ca4312842d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba390cca9997cf469ae88a4895532df556f0b947ce7b34ef7e1992ee236a2e16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name
\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0ccd6eaff0fa7ac496aadc31afffd6e7fee2776e9cfd1652af85fcec268d895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b2deed2650220f1d607cbb667b26879d3a8373146244feb486c882032f344b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a5
65cb3800c1a50f784c327719158de2f6f29d20a33670478b97c1c360e0f6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a565cb3800c1a50f784c327719158de2f6f29d20a33670478b97c1c360e0f6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29a2b9cc76fc45b6874d85b2ab92165a02804807837501ac60190c232bab928d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29a2b9cc76fc45b6874d85b2ab92165a02804807837501ac60190c232bab928d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fb3311ccf2a924605464d657193627798dad26a232fab0a2091fb2c387f3bd0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272
e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb3311ccf2a924605464d657193627798dad26a232fab0a2091fb2c387f3bd0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:54Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.244441 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"158d80a5-8d90-46bb-83e7-ab3263f6d28f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"message\\\":\\\"containers 
with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3112751f86a62634d02f281d467d3eeb27878e27c245871210cca37eaeb6e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce744ea765d5bbcb0a9926f439a0256774c16863344f23ddfb0074b4ccbdb4ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1108d710330b056e2228205b2012df070ffbd660701ddd577eafbf0e99c3bf9f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d92ce9d53d9367503c76c5fbb45f340a5ddd93f3c8e1738ac434fe7b0210d1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6268352b995d99d383ab500b5ccd80b0612cf33716a752aeb1d92cb775a1971e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"le observer\\\\nW1202 22:42:37.965183 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 22:42:37.965319 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 22:42:37.967400 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2369937012/tls.crt::/tmp/serving-cert-2369937012/tls.key\\\\\\\"\\\\nI1202 22:42:38.329833 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 22:42:38.331704 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 22:42:38.331724 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 22:42:38.331766 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 22:42:38.331782 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 22:42:38.336692 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 22:42:38.336734 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:42:38.336761 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:42:38.336769 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 22:42:38.336775 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 22:42:38.336779 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 22:42:38.336783 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 22:42:38.337216 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 22:42:38.338194 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://666f81c81a7cad67c01b2aebf2d05f162c33feea27f64c12f92761d6d2a6056e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055bef75732eec4908f0baf8a52cd3921fab03cf88e36d2e0d442344fb1af0c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://055bef75732eec4908f0baf8a52cd3921fab03cf88e36d2e0d442344fb1af0c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:54Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.260735 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:54Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.270024 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.270079 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.270094 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:54 crc 
kubenswrapper[4696]: I1202 22:42:54.270116 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.270141 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:54Z","lastTransitionTime":"2025-12-02T22:42:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.277391 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f57qk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf8ea236-9be9-4eb2-904a-103c4c279f28\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3106b29a0eb0a17827fd331aefd6c00b0c58f4ec0d0ca778bd3f4842730d5f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7
f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vxprw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f57qk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:54Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.302955 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:54Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.316838 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.316876 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.316900 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.316922 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:42:54 crc kubenswrapper[4696]: E1202 22:42:54.317007 4696 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 02 22:42:54 crc kubenswrapper[4696]: E1202 22:42:54.317054 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 22:43:10.31704059 +0000 UTC m=+53.197720591 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 02 22:42:54 crc kubenswrapper[4696]: E1202 22:42:54.317343 4696 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 02 22:42:54 crc kubenswrapper[4696]: E1202 22:42:54.317363 4696 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 22:42:54 crc kubenswrapper[4696]: E1202 22:42:54.317374 4696 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 22:42:54 crc kubenswrapper[4696]: E1202 22:42:54.317401 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-02 22:43:10.31739237 +0000 UTC m=+53.198072371 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 22:42:54 crc kubenswrapper[4696]: E1202 22:42:54.317450 4696 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 02 22:42:54 crc kubenswrapper[4696]: E1202 22:42:54.317471 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 22:43:10.317465662 +0000 UTC m=+53.198145663 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 02 22:42:54 crc kubenswrapper[4696]: E1202 22:42:54.317512 4696 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 02 22:42:54 crc kubenswrapper[4696]: E1202 22:42:54.317523 4696 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 22:42:54 crc kubenswrapper[4696]: E1202 22:42:54.317532 4696 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 22:42:54 crc kubenswrapper[4696]: E1202 22:42:54.317554 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-02 22:43:10.317547725 +0000 UTC m=+53.198227726 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.319414 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:54Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.350895 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36feb7e9f6a6bad97e87fe54ff9b661be62f6f4962352d55bcc905b80320a48d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85fbce2a7ea0550f53c1216216022e60a4e2eae0bb1d5c88254ace025fbd4503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64b0c2025c377ab19b3853017c94cb501f472f2485093afc60749f7739ba25d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22ce112f515d4ef59dca4c21bdc8623e88a8a0df222fb2ed0990a2f68e3f75f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52f3d5529aafac7c27c963a50b0318e99d9a9ca35e142d4c273431105c495e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32717a82edb718f5e690fcecd578f48279d134a5dc12f00824fa25f7ec727156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fb85a5789d7eb5dd8ac73a8a47b7241b37df3fae93b8ea2f241113883535f3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bcbeb403dee928a5b37ed7662c4999e027b9b9fed6b8cb704c8e41b5c68c585c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T22:42:52Z\\\",\\\"message\\\":\\\" from k8s.io/client-go/informers/factory.go:160\\\\nI1202 22:42:50.554281 6010 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 22:42:50.554443 6010 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1202 22:42:50.554709 6010 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI1202 22:42:50.556928 6010 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1202 22:42:50.557016 6010 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1202 22:42:50.557094 6010 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1202 22:42:50.557153 6010 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1202 22:42:50.557100 6010 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1202 22:42:50.557332 6010 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1202 22:42:50.557365 6010 handler.go:208] Removed *v1.Node event handler 7\\\\nI1202 22:42:50.557430 6010 handler.go:208] Removed *v1.Node event handler 2\\\\nI1202 22:42:50.557553 6010 factory.go:656] Stopping 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":
\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23b80358b654eec7dc1057ef35a6c09f88ccd7aca7a05336a3f29e7036bd55f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://492cfa3fb8646c09da06d92ec8a303562da558d4d944eaf68ca95de6f5b94da8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\"
:true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://492cfa3fb8646c09da06d92ec8a303562da558d4d944eaf68ca95de6f5b94da8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qb2zq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:54Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.373149 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.373235 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.373246 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.373265 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.373289 4696 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:54Z","lastTransitionTime":"2025-12-02T22:42:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.382186 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05940584-6baa-49e1-8959-610d50b6eeb5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b361514e74dd950d7c3e1da82304b7c75603829ce7bf03c1fbccab1607d65800\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67bf899df90f18f36310994af135552aa9758f0460117d62d00fdc1de067ce79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d15263c6c1f8208a998cbacdba376a2eb3b2e16134e8fc40f7d56d36daae5ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://a2fa27b4eeb8b50da5d49c905eb98070362e998c3e3e4aceac7cc09aaef5b5ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:54Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.399643 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sbjst" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a0d758e-6da6-4382-99f1-dd295b63eb98\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bfe4ce91e801752a3eae0d5cfdb5d0e7a4997952987a3a32c10c0292db170f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://495f007b6af2d409e3ed6cf70dfa90aec5a38c34361c9da9c4fd71d7732cdd76\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://495f007b6af2d409e3ed6cf70dfa90aec5a38c34361c9da9c4fd71d7732cdd76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5b3b6699e2f0b9a8d0f32c1f97c9b3e16d9f5479ad2f355633c25897dbe1bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5b3b6699e2f0b9a8d0f32c1f97c9b3e16d9f5479ad2f355633c25897dbe1bb1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:42Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d4a10437554cfb08f90a48d457ef157309ed0ab379c5b26deb62d3e8b73291b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d4a10437554cfb08f90a48d457ef157309ed0ab379c5b26deb62d3e8b73291b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76b52
b0824f65df456ead0ca134ada0a6c3120382a3cce18f0a5dfdcc98fa9ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76b52b0824f65df456ead0ca134ada0a6c3120382a3cce18f0a5dfdcc98fa9ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d38fb242bd536b1359095c21c4df58b5a4c64058cfe986b38a581ae0c40a29d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d38fb242bd536b1359095c21c4df58b5a4c64058cfe986b38a581ae0c40a29d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:45Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f17767f87136d0916230872e59379331bb9e7f88f3648115905c6c0172f257cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f17767f87136d0916230872e59379331bb9e7f88f3648115905c6c0172f257cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sbjst\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:54Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.410354 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-chq65" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53353260-c7c9-435c-91eb-3d5a1b441c4a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2d1f154b9c9139bf991adf86bd99e7ff14510c24ae93b87c1c184437a087f17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7tfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa016430a0a36d0628e0e5b0c4baf2fa724c888116e5d70517c8ffc4be1c37a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7tfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-chq65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:54Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:54 crc kubenswrapper[4696]: 
I1202 22:42:54.419658 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rgk7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07db09ff-2489-4357-af38-aca9655ac1d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9928cbc6ed1d9bb5d0280a8d3264d4490c601fc7d09d6f293f1403615cc4cde5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdd8x\\\",\\\"readOnly
\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rgk7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:54Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.431007 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.431068 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.431025 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:42:54 crc kubenswrapper[4696]: E1202 22:42:54.431200 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 22:42:54 crc kubenswrapper[4696]: E1202 22:42:54.431274 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 22:42:54 crc kubenswrapper[4696]: E1202 22:42:54.431348 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.476576 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.476645 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.476662 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.476697 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.476714 4696 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:54Z","lastTransitionTime":"2025-12-02T22:42:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.554318 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-q9bfc"] Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.555159 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q9bfc" Dec 02 22:42:54 crc kubenswrapper[4696]: E1202 22:42:54.555268 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-q9bfc" podUID="ad00195c-ef4c-4d9b-941c-d01ebc498593" Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.575694 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:54Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.580459 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.580517 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.580534 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.580559 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.580578 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:54Z","lastTransitionTime":"2025-12-02T22:42:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.594995 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:54Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.620837 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zz6sf\" (UniqueName: \"kubernetes.io/projected/ad00195c-ef4c-4d9b-941c-d01ebc498593-kube-api-access-zz6sf\") pod \"network-metrics-daemon-q9bfc\" (UID: \"ad00195c-ef4c-4d9b-941c-d01ebc498593\") " pod="openshift-multus/network-metrics-daemon-q9bfc" Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.621021 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ad00195c-ef4c-4d9b-941c-d01ebc498593-metrics-certs\") pod \"network-metrics-daemon-q9bfc\" (UID: \"ad00195c-ef4c-4d9b-941c-d01ebc498593\") " pod="openshift-multus/network-metrics-daemon-q9bfc" Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.624910 4696 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36feb7e9f6a6bad97e87fe54ff9b661be62f6f4962352d55bcc905b80320a48d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85fbce2a7ea0550f53c1216216022e60a4e2eae0bb1d5c88254ace025fbd4503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64b0c2025c377ab19b3853017c94cb501f472f2485093afc60749f7739ba25d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22ce112f515d4ef59dca4c21bdc8623e88a8a0df222fb2ed0990a2f68e3f75f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52f3d5529aafac7c27c963a50b0318e99d9a9ca35e142d4c273431105c495e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32717a82edb718f5e690fcecd578f48279d134a5dc12f00824fa25f7ec727156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fb85a5789d7eb5dd8ac73a8a47b7241b37df3fae93b8ea2f241113883535f3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bcbeb403dee928a5b37ed7662c4999e027b9b9fed6b8cb704c8e41b5c68c585c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T22:42:52Z\\\",\\\"message\\\":\\\" from k8s.io/client-go/informers/factory.go:160\\\\nI1202 22:42:50.554281 6010 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 22:42:50.554443 6010 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1202 22:42:50.554709 6010 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI1202 22:42:50.556928 6010 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1202 22:42:50.557016 6010 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1202 22:42:50.557094 6010 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1202 22:42:50.557153 6010 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1202 22:42:50.557100 6010 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1202 22:42:50.557332 6010 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1202 22:42:50.557365 6010 handler.go:208] Removed *v1.Node event handler 7\\\\nI1202 22:42:50.557430 6010 handler.go:208] Removed *v1.Node event handler 2\\\\nI1202 22:42:50.557553 6010 factory.go:656] Stopping 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":
\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23b80358b654eec7dc1057ef35a6c09f88ccd7aca7a05336a3f29e7036bd55f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://492cfa3fb8646c09da06d92ec8a303562da558d4d944eaf68ca95de6f5b94da8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\"
:true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://492cfa3fb8646c09da06d92ec8a303562da558d4d944eaf68ca95de6f5b94da8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qb2zq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:54Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.639070 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q9bfc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad00195c-ef4c-4d9b-941c-d01ebc498593\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zz6sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zz6sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:54Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q9bfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:54Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:54 crc 
kubenswrapper[4696]: I1202 22:42:54.656329 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05940584-6baa-49e1-8959-610d50b6eeb5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b361514e74dd950d7c3e1da82304b7c75603829ce7bf03c1fbccab1607d65800\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67bf899df90f18f36310994af135552aa9758f0460117d62d00fdc1de067ce79\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d15263c6c1f8208a998cbacdba376a2eb3b2e16134e8fc40f7d56d36daae5ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2fa27b4eeb8b50da5d49c905eb98070362e998c3e3e4aceac7cc09aaef5b5ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:54Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.672651 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sbjst" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a0d758e-6da6-4382-99f1-dd295b63eb98\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bfe4ce91e801752a3eae0d5cfdb5d0e7a4997952987a3a32c10c0292db170f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://495f007b6af2d409e3ed6cf70dfa90aec5a38c34361c9da9c4fd71d7732cdd76\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://495f007b6af2d409e3ed6cf70dfa90aec5a38c34361c9da9c4fd71d7732cdd76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5b3b6699e2f0b9a8d0f32c1f97c9b3e16d9f5479ad2f355633c25897dbe1bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5b3b6699e2f0b9a8d0f32c1f97c9b3e16d9f5479ad2f355633c25897dbe1bb1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:42Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d4a10437554cfb08f90a48d457ef157309ed0ab379c5b26deb62d3e8b73291b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d4a10437554cfb08f90a48d457ef157309ed0ab379c5b26deb62d3e8b73291b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76b52
b0824f65df456ead0ca134ada0a6c3120382a3cce18f0a5dfdcc98fa9ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76b52b0824f65df456ead0ca134ada0a6c3120382a3cce18f0a5dfdcc98fa9ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d38fb242bd536b1359095c21c4df58b5a4c64058cfe986b38a581ae0c40a29d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d38fb242bd536b1359095c21c4df58b5a4c64058cfe986b38a581ae0c40a29d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:45Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f17767f87136d0916230872e59379331bb9e7f88f3648115905c6c0172f257cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f17767f87136d0916230872e59379331bb9e7f88f3648115905c6c0172f257cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sbjst\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:54Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.683254 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.683306 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.684117 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.684138 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.684178 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:54Z","lastTransitionTime":"2025-12-02T22:42:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.686105 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-chq65" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53353260-c7c9-435c-91eb-3d5a1b441c4a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2d1f154b9c9139bf991adf86bd99e7ff14510c24ae93b87c1c184437a087f17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7tfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa016430a0a36d0628e0e5b0c4baf2fa724c888116e5d70517c8ffc4be1c37a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7tfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-chq65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:54Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.697004 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rgk7n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07db09ff-2489-4357-af38-aca9655ac1d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9928cbc6ed1d9bb5d0280a8d3264d4490c601fc7d09d6f293f1403615cc4cde5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdd8x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rgk7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:54Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.709354 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1db9cf23af7fc9ff1373908e11df0bea7d6e03cd3ed18947cc583fc09a79a05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:54Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.722181 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zz6sf\" (UniqueName: \"kubernetes.io/projected/ad00195c-ef4c-4d9b-941c-d01ebc498593-kube-api-access-zz6sf\") pod \"network-metrics-daemon-q9bfc\" (UID: \"ad00195c-ef4c-4d9b-941c-d01ebc498593\") " pod="openshift-multus/network-metrics-daemon-q9bfc" Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.722312 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ad00195c-ef4c-4d9b-941c-d01ebc498593-metrics-certs\") pod \"network-metrics-daemon-q9bfc\" (UID: \"ad00195c-ef4c-4d9b-941c-d01ebc498593\") " pod="openshift-multus/network-metrics-daemon-q9bfc" Dec 02 22:42:54 crc kubenswrapper[4696]: E1202 22:42:54.722457 4696 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 02 22:42:54 crc kubenswrapper[4696]: E1202 22:42:54.722535 4696 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/ad00195c-ef4c-4d9b-941c-d01ebc498593-metrics-certs podName:ad00195c-ef4c-4d9b-941c-d01ebc498593 nodeName:}" failed. No retries permitted until 2025-12-02 22:42:55.222515261 +0000 UTC m=+38.103195272 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ad00195c-ef4c-4d9b-941c-d01ebc498593-metrics-certs") pod "network-metrics-daemon-q9bfc" (UID: "ad00195c-ef4c-4d9b-941c-d01ebc498593") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.724673 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce2fa3867298545942c4a2d2e68899c61a4f99a64b0974e75ef86851074ddfce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"starte
dAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4db552c4bc9b395f977cb4c1b1b7fcbb1c7e9f326d81a7903d88eab180bcfb3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:54Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.737829 4696 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27fb05354e4e9a6ad09946fad7dbabf07456fe56a9a66578fda0eb71dde2d199\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:54Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.739460 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zz6sf\" (UniqueName: \"kubernetes.io/projected/ad00195c-ef4c-4d9b-941c-d01ebc498593-kube-api-access-zz6sf\") pod \"network-metrics-daemon-q9bfc\" (UID: \"ad00195c-ef4c-4d9b-941c-d01ebc498593\") " pod="openshift-multus/network-metrics-daemon-q9bfc" Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.749886 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.749930 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.749942 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.749963 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.749980 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:54Z","lastTransitionTime":"2025-12-02T22:42:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.752211 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wthxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86a37d2a-37c5-4fbd-b10b-f5e4706772f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5ddd341a3a0b7b1d959ce57bb4d38153a4ca3ae5ec40ded309e0697db2be28d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xqg5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wthxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:54Z 
is after 2025-08-24T17:21:41Z" Dec 02 22:42:54 crc kubenswrapper[4696]: E1202 22:42:54.763396 4696 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:42:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:42:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:42:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:42:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b680025d-da08-4b46-a4a4-b21ac19e4f7b\\\",\\\"systemUUID\\\":\\\"be4c1bf2-c508-4b46-be45-7efaea566193\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:54Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.767692 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.767731 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.767757 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.767779 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.767793 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:54Z","lastTransitionTime":"2025-12-02T22:42:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.770441 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ttxcl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca48946e-a7e0-4729-8b02-b223a96990c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkjwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkjwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ttxcl\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:54Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:54 crc kubenswrapper[4696]: E1202 22:42:54.780946 4696 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:42:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:42:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:42:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:42:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin 
returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae66
9\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift
-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\
\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":
485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b680025d-da08-4b46-a4a4-b21ac19e4f7b\\\",\\\"sys
temUUID\\\":\\\"be4c1bf2-c508-4b46-be45-7efaea566193\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:54Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.785200 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.785283 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.785306 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.785336 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.785383 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:54Z","lastTransitionTime":"2025-12-02T22:42:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.790399 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3172221-5c4c-4409-b50f-5cd21d8b5030\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f25e7ebcca1d904cd0940a660fd97c6b4a3540fd63700abd0ad973e9f9c8f764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08fc65f2c8162f27bc40cd12b5801ca21c778313271f2cb5dd251ca4312842d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba390cca9997cf469ae88a4895532df556f0b947ce7b34ef7e1992ee236a2e16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0ccd6eaff0fa7ac496aadc31afffd6e7fee2776e9cfd1652af85fcec268d895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b2deed2650220f1d607cbb667b26879d3a8373146244feb486c882032f344b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a565cb3800c1a50f784c327719158de2f6f29d20a33670478b97c1c360e0f6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a565cb3800c1a50f784c327719158de2f6f29d20a33670478b97c1c360e0f6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29a2b9cc76fc45b6874d85b2ab92165a02804807837501ac60190c232bab928d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29a2b9cc76fc45b6874d85b2ab92165a02804807837501ac60190c232bab928d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fb3311ccf2a924605464d657193627798dad26a232fab0a2091fb2c387f3bd0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb3311ccf2a924605464d657193627798dad26a232fab0a2091fb2c387f3bd0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:54Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:54 crc kubenswrapper[4696]: E1202 22:42:54.800784 4696 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:42:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:42:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:54Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:42:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:42:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b680025d-da08-4b46-a4a4-b21ac19e4f7b\\\",\\\"systemUUID\\\":\\\"be4c1bf2-c508-4b46-be45-7efaea566193\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:54Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.805248 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.805342 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.805368 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.805406 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.805432 4696 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:54Z","lastTransitionTime":"2025-12-02T22:42:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.807430 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"158d80a5-8d90-46bb-83e7-ab3263f6d28f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3112751f86a62634d02f281d467d3eeb27878e27c245871210cca37eaeb6e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce744ea765d5bbcb0a9926f439a0256774c16863344f23ddfb0074b4ccbdb4ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://1108d710330b056e2228205b2012df070ffbd660701ddd577eafbf0e99c3bf9f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d92ce9d53d9367503c76c5fbb45f340a5ddd93f3c8e1738ac434fe7b0210d1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6268352b995d99d383ab500b5ccd80b0612cf33716a752aeb1d92cb775a1971e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"le observer\\\\nW1202 22:42:37.965183 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 22:42:37.965319 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 22:42:37.967400 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2369937012/tls.crt::/tmp/serving-cert-2369937012/tls.key\\\\\\\"\\\\nI1202 22:42:38.329833 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 22:42:38.331704 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 22:42:38.331724 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 22:42:38.331766 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 22:42:38.331782 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 22:42:38.336692 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 22:42:38.336734 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:42:38.336761 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:42:38.336769 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 22:42:38.336775 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 22:42:38.336779 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 22:42:38.336783 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 22:42:38.337216 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 22:42:38.338194 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://666f81c81a7cad67c01b2aebf2d05f162c33feea27f64c12f92761d6d2a6056e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055bef75732eec4908f0baf8a52cd3921fab03cf88e36d2e0d442344fb1af0c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://055bef75732eec4908f0baf8a52cd3921fab03cf88e36d2e0d442344fb1af0c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:54Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:54 crc kubenswrapper[4696]: E1202 22:42:54.824413 4696 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:42:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:42:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:54Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:42:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:42:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b680025d-da08-4b46-a4a4-b21ac19e4f7b\\\",\\\"systemUUID\\\":\\\"be4c1bf2-c508-4b46-be45-7efaea566193\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:54Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.826671 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready 
status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:54Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.829341 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.829376 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.829390 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:54 crc 
kubenswrapper[4696]: I1202 22:42:54.829410 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.829423 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:54Z","lastTransitionTime":"2025-12-02T22:42:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.844461 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f57qk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf8ea236-9be9-4eb2-904a-103c4c279f28\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3106b29a0eb0a17827fd331aefd6c00b0c58f4ec0d0ca778bd3f4842730d5f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7
f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vxprw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f57qk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:54Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:54 crc kubenswrapper[4696]: E1202 22:42:54.847639 4696 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:42:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:42:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:42:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:42:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b680025d-da08-4b46-a4a4-b21ac19e4f7b\\\",\\\"systemUUID\\\":\\\"be4c1bf2-c508-4b46-be45-7efaea566193\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:54Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:54 crc kubenswrapper[4696]: E1202 22:42:54.847892 4696 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.850669 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.850726 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.850753 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.850774 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.850787 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:54Z","lastTransitionTime":"2025-12-02T22:42:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.915934 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ttxcl" event={"ID":"ca48946e-a7e0-4729-8b02-b223a96990c6","Type":"ContainerStarted","Data":"f616dc720b045fbbf45cfa46fe123db0a8ce5b92d20fdc38adf7f3e6da808f49"} Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.953255 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.953280 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.953289 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.953302 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:54 crc kubenswrapper[4696]: I1202 22:42:54.953312 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:54Z","lastTransitionTime":"2025-12-02T22:42:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:55 crc kubenswrapper[4696]: I1202 22:42:55.056635 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:55 crc kubenswrapper[4696]: I1202 22:42:55.056770 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:55 crc kubenswrapper[4696]: I1202 22:42:55.056801 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:55 crc kubenswrapper[4696]: I1202 22:42:55.056840 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:55 crc kubenswrapper[4696]: I1202 22:42:55.056866 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:55Z","lastTransitionTime":"2025-12-02T22:42:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:55 crc kubenswrapper[4696]: I1202 22:42:55.160812 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:55 crc kubenswrapper[4696]: I1202 22:42:55.160881 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:55 crc kubenswrapper[4696]: I1202 22:42:55.160903 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:55 crc kubenswrapper[4696]: I1202 22:42:55.160938 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:55 crc kubenswrapper[4696]: I1202 22:42:55.160964 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:55Z","lastTransitionTime":"2025-12-02T22:42:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:55 crc kubenswrapper[4696]: I1202 22:42:55.229623 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ad00195c-ef4c-4d9b-941c-d01ebc498593-metrics-certs\") pod \"network-metrics-daemon-q9bfc\" (UID: \"ad00195c-ef4c-4d9b-941c-d01ebc498593\") " pod="openshift-multus/network-metrics-daemon-q9bfc" Dec 02 22:42:55 crc kubenswrapper[4696]: E1202 22:42:55.229910 4696 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 02 22:42:55 crc kubenswrapper[4696]: E1202 22:42:55.230022 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ad00195c-ef4c-4d9b-941c-d01ebc498593-metrics-certs podName:ad00195c-ef4c-4d9b-941c-d01ebc498593 nodeName:}" failed. No retries permitted until 2025-12-02 22:42:56.229995193 +0000 UTC m=+39.110675224 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ad00195c-ef4c-4d9b-941c-d01ebc498593-metrics-certs") pod "network-metrics-daemon-q9bfc" (UID: "ad00195c-ef4c-4d9b-941c-d01ebc498593") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 02 22:42:55 crc kubenswrapper[4696]: I1202 22:42:55.264999 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:55 crc kubenswrapper[4696]: I1202 22:42:55.265056 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:55 crc kubenswrapper[4696]: I1202 22:42:55.265073 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:55 crc kubenswrapper[4696]: I1202 22:42:55.265100 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:55 crc kubenswrapper[4696]: I1202 22:42:55.265119 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:55Z","lastTransitionTime":"2025-12-02T22:42:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:55 crc kubenswrapper[4696]: I1202 22:42:55.368037 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:55 crc kubenswrapper[4696]: I1202 22:42:55.368114 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:55 crc kubenswrapper[4696]: I1202 22:42:55.368136 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:55 crc kubenswrapper[4696]: I1202 22:42:55.368165 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:55 crc kubenswrapper[4696]: I1202 22:42:55.368185 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:55Z","lastTransitionTime":"2025-12-02T22:42:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:55 crc kubenswrapper[4696]: I1202 22:42:55.470607 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:55 crc kubenswrapper[4696]: I1202 22:42:55.470662 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:55 crc kubenswrapper[4696]: I1202 22:42:55.470675 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:55 crc kubenswrapper[4696]: I1202 22:42:55.470699 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:55 crc kubenswrapper[4696]: I1202 22:42:55.470713 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:55Z","lastTransitionTime":"2025-12-02T22:42:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:55 crc kubenswrapper[4696]: I1202 22:42:55.573381 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:55 crc kubenswrapper[4696]: I1202 22:42:55.573428 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:55 crc kubenswrapper[4696]: I1202 22:42:55.573439 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:55 crc kubenswrapper[4696]: I1202 22:42:55.573462 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:55 crc kubenswrapper[4696]: I1202 22:42:55.573476 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:55Z","lastTransitionTime":"2025-12-02T22:42:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:55 crc kubenswrapper[4696]: I1202 22:42:55.676622 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:55 crc kubenswrapper[4696]: I1202 22:42:55.676669 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:55 crc kubenswrapper[4696]: I1202 22:42:55.676679 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:55 crc kubenswrapper[4696]: I1202 22:42:55.676698 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:55 crc kubenswrapper[4696]: I1202 22:42:55.676707 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:55Z","lastTransitionTime":"2025-12-02T22:42:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:55 crc kubenswrapper[4696]: I1202 22:42:55.779713 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:55 crc kubenswrapper[4696]: I1202 22:42:55.779823 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:55 crc kubenswrapper[4696]: I1202 22:42:55.779846 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:55 crc kubenswrapper[4696]: I1202 22:42:55.779870 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:55 crc kubenswrapper[4696]: I1202 22:42:55.779885 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:55Z","lastTransitionTime":"2025-12-02T22:42:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:55 crc kubenswrapper[4696]: I1202 22:42:55.883443 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:55 crc kubenswrapper[4696]: I1202 22:42:55.883495 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:55 crc kubenswrapper[4696]: I1202 22:42:55.883506 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:55 crc kubenswrapper[4696]: I1202 22:42:55.883531 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:55 crc kubenswrapper[4696]: I1202 22:42:55.883546 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:55Z","lastTransitionTime":"2025-12-02T22:42:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:55 crc kubenswrapper[4696]: I1202 22:42:55.922895 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ttxcl" event={"ID":"ca48946e-a7e0-4729-8b02-b223a96990c6","Type":"ContainerStarted","Data":"6e10c7f45d8003857b8a736f70eb3125f01996bacbcace4ad08bc703fbdeb62f"} Dec 02 22:42:55 crc kubenswrapper[4696]: I1202 22:42:55.924068 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ttxcl" event={"ID":"ca48946e-a7e0-4729-8b02-b223a96990c6","Type":"ContainerStarted","Data":"55cb8b25a5af0f7f16b7a91ab00cbe24ed8a8b81477e992de74699237caf7dab"} Dec 02 22:42:55 crc kubenswrapper[4696]: I1202 22:42:55.924993 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qb2zq_c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b/ovnkube-controller/1.log" Dec 02 22:42:55 crc kubenswrapper[4696]: I1202 22:42:55.925607 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qb2zq_c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b/ovnkube-controller/0.log" Dec 02 22:42:55 crc kubenswrapper[4696]: I1202 22:42:55.928111 4696 generic.go:334] "Generic (PLEG): container finished" podID="c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b" containerID="4fb85a5789d7eb5dd8ac73a8a47b7241b37df3fae93b8ea2f241113883535f3b" exitCode=1 Dec 02 22:42:55 crc kubenswrapper[4696]: I1202 22:42:55.928178 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" event={"ID":"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b","Type":"ContainerDied","Data":"4fb85a5789d7eb5dd8ac73a8a47b7241b37df3fae93b8ea2f241113883535f3b"} Dec 02 22:42:55 crc kubenswrapper[4696]: I1202 22:42:55.928227 4696 scope.go:117] "RemoveContainer" containerID="bcbeb403dee928a5b37ed7662c4999e027b9b9fed6b8cb704c8e41b5c68c585c" Dec 02 22:42:55 crc kubenswrapper[4696]: I1202 22:42:55.929481 
4696 scope.go:117] "RemoveContainer" containerID="4fb85a5789d7eb5dd8ac73a8a47b7241b37df3fae93b8ea2f241113883535f3b" Dec 02 22:42:55 crc kubenswrapper[4696]: E1202 22:42:55.929804 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-qb2zq_openshift-ovn-kubernetes(c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b)\"" pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" podUID="c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b" Dec 02 22:42:55 crc kubenswrapper[4696]: I1202 22:42:55.950928 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce2fa3867298545942c4a2d2e68899c61a4f99a64b0974e75ef86851074ddfce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\
\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4db552c4bc9b395f977cb4c1b1b7fcbb1c7e9f326d81a7903d88eab180bcfb3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:55Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:55 crc kubenswrapper[4696]: I1202 22:42:55.975345 4696 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27fb05354e4e9a6ad09946fad7dbabf07456fe56a9a66578fda0eb71dde2d199\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:55Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:55 crc kubenswrapper[4696]: I1202 22:42:55.987434 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:55 crc kubenswrapper[4696]: I1202 22:42:55.987506 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:55 crc kubenswrapper[4696]: I1202 22:42:55.987525 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:55 crc kubenswrapper[4696]: I1202 22:42:55.987561 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:55 crc kubenswrapper[4696]: I1202 22:42:55.987583 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:55Z","lastTransitionTime":"2025-12-02T22:42:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:56 crc kubenswrapper[4696]: I1202 22:42:56.000980 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wthxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86a37d2a-37c5-4fbd-b10b-f5e4706772f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5ddd341a3a0b7b1d959ce57bb4d38153a4ca3ae5ec40ded309e0697db2be28d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xqg5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wthxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:55Z 
is after 2025-08-24T17:21:41Z" Dec 02 22:42:56 crc kubenswrapper[4696]: I1202 22:42:56.022791 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ttxcl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca48946e-a7e0-4729-8b02-b223a96990c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55cb8b25a5af0f7f16b7a91ab00cbe24ed8a8b81477e992de74699237caf7dab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},
{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkjwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e10c7f45d8003857b8a736f70eb3125f01996bacbcace4ad08bc703fbdeb62f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkjwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ttxcl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:56Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:56 crc kubenswrapper[4696]: I1202 22:42:56.026534 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 
22:42:56 crc kubenswrapper[4696]: I1202 22:42:56.048907 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1db9cf23af7fc9ff1373908e11df0bea7d6e03cd3ed18947cc583fc09a79a05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:56Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:56 crc kubenswrapper[4696]: I1202 22:42:56.072647 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"158d80a5-8d90-46bb-83e7-ab3263f6d28f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3112751f86a62634d02f281d467d3eeb27878e27c245871210cca37eaeb6e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce744ea765d5bbcb0a9926f439a0256774c16863344f23ddfb0074b4ccbdb4ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://1108d710330b056e2228205b2012df070ffbd660701ddd577eafbf0e99c3bf9f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d92ce9d53d9367503c76c5fbb45f340a5ddd93f3c8e1738ac434fe7b0210d1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6268352b995d99d383ab500b5ccd80b0612cf33716a752aeb1d92cb775a1971e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"le observer\\\\nW1202 22:42:37.965183 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 22:42:37.965319 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 22:42:37.967400 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2369937012/tls.crt::/tmp/serving-cert-2369937012/tls.key\\\\\\\"\\\\nI1202 22:42:38.329833 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 22:42:38.331704 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 22:42:38.331724 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 22:42:38.331766 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 22:42:38.331782 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 22:42:38.336692 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 22:42:38.336734 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:42:38.336761 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:42:38.336769 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 22:42:38.336775 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 22:42:38.336779 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 22:42:38.336783 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 22:42:38.337216 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 22:42:38.338194 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://666f81c81a7cad67c01b2aebf2d05f162c33feea27f64c12f92761d6d2a6056e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055bef75732eec4908f0baf8a52cd3921fab03cf88e36d2e0d442344fb1af0c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://055bef75732eec4908f0baf8a52cd3921fab03cf88e36d2e0d442344fb1af0c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:56Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:56 crc kubenswrapper[4696]: I1202 22:42:56.090869 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:56 crc kubenswrapper[4696]: I1202 22:42:56.090912 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:56 crc kubenswrapper[4696]: I1202 22:42:56.090923 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:56 crc kubenswrapper[4696]: I1202 22:42:56.090943 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:56 crc kubenswrapper[4696]: I1202 22:42:56.090955 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:56Z","lastTransitionTime":"2025-12-02T22:42:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:56 crc kubenswrapper[4696]: I1202 22:42:56.093440 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:56Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:56 crc kubenswrapper[4696]: I1202 22:42:56.109284 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f57qk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf8ea236-9be9-4eb2-904a-103c4c279f28\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3106b29a0eb0a17827fd331aefd6c00b0c58f4ec0d0ca778bd3f4842730d5f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vxprw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f57qk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:56Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:56 crc kubenswrapper[4696]: I1202 22:42:56.142415 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3172221-5c4c-4409-b50f-5cd21d8b5030\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f25e7ebcca1d904cd0940a660fd97c6b4a3540fd63700abd0ad973e9f9c8f764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08fc65f2c8162f27bc40cd12b5801ca21c778313271f2cb5dd251ca4312842d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba390cca9997cf469ae88a4895532df556f0b947ce7b34ef7e1992ee236a2e16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd
/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0ccd6eaff0fa7ac496aadc31afffd6e7fee2776e9cfd1652af85fcec268d895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b2deed2650220f1d607cbb667b26879d3a8373146244feb486c882032f344b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://4a565cb3800c1a50f784c327719158de2f6f29d20a33670478b97c1c360e0f6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a565cb3800c1a50f784c327719158de2f6f29d20a33670478b97c1c360e0f6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29a2b9cc76fc45b6874d85b2ab92165a02804807837501ac60190c232bab928d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29a2b9cc76fc45b6874d85b2ab92165a02804807837501ac60190c232bab928d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fb3311ccf2a924605464d657193627798dad26a232fab0a2091fb2c387f3bd0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb3311ccf2a924605464d657193627798dad26a232fab0a2091fb2c387f3bd0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:56Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:56 crc kubenswrapper[4696]: I1202 22:42:56.157566 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:56Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:56 crc kubenswrapper[4696]: I1202 22:42:56.177272 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36feb7e9f6a6bad97e87fe54ff9b661be62f6f4962352d55bcc905b80320a48d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85fbce2a7ea0550f53c1216216022e60a4e2eae0bb1d5c88254ace025fbd4503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64b0c2025c377ab19b3853017c94cb501f472f2485093afc60749f7739ba25d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22ce112f515d4ef59dca4c21bdc8623e88a8a0df222fb2ed0990a2f68e3f75f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52f3d5529aafac7c27c963a50b0318e99d9a9ca35e142d4c273431105c495e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32717a82edb718f5e690fcecd578f48279d134a5dc12f00824fa25f7ec727156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fb85a5789d7eb5dd8ac73a8a47b7241b37df3fae93b8ea2f241113883535f3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bcbeb403dee928a5b37ed7662c4999e027b9b9fed6b8cb704c8e41b5c68c585c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T22:42:52Z\\\",\\\"message\\\":\\\" from k8s.io/client-go/informers/factory.go:160\\\\nI1202 22:42:50.554281 6010 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 22:42:50.554443 6010 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1202 22:42:50.554709 6010 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI1202 22:42:50.556928 6010 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1202 22:42:50.557016 6010 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1202 22:42:50.557094 6010 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1202 22:42:50.557153 6010 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1202 22:42:50.557100 6010 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1202 22:42:50.557332 6010 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1202 22:42:50.557365 6010 handler.go:208] Removed *v1.Node event handler 7\\\\nI1202 22:42:50.557430 6010 handler.go:208] Removed *v1.Node event handler 2\\\\nI1202 22:42:50.557553 6010 factory.go:656] Stopping 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":
\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23b80358b654eec7dc1057ef35a6c09f88ccd7aca7a05336a3f29e7036bd55f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://492cfa3fb8646c09da06d92ec8a303562da558d4d944eaf68ca95de6f5b94da8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\"
:true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://492cfa3fb8646c09da06d92ec8a303562da558d4d944eaf68ca95de6f5b94da8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qb2zq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:56Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:56 crc kubenswrapper[4696]: I1202 22:42:56.194218 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:56 crc kubenswrapper[4696]: I1202 22:42:56.194316 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:56 crc kubenswrapper[4696]: I1202 22:42:56.194340 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:56 crc kubenswrapper[4696]: I1202 22:42:56.194375 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:56 crc kubenswrapper[4696]: I1202 22:42:56.194399 4696 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:56Z","lastTransitionTime":"2025-12-02T22:42:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:42:56 crc kubenswrapper[4696]: I1202 22:42:56.196274 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q9bfc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad00195c-ef4c-4d9b-941c-d01ebc498593\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zz6sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zz6sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:54Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q9bfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:56Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:56 crc 
kubenswrapper[4696]: I1202 22:42:56.212069 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:56Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:56 crc kubenswrapper[4696]: I1202 22:42:56.230335 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"05940584-6baa-49e1-8959-610d50b6eeb5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b361514e74dd950d7c3e1da82304b7c75603829ce7bf03c1fbccab1607d65800\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67bf899df90f18f36310994af135552aa9758f0460117d62d00fdc1de067ce79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d15263c6c1f8208a998cbacdba376a2eb3b2e16134e8fc40f7d56d36daae5ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2fa27b4eeb8b50da5d49c905eb98070362e998c3e3e4aceac7cc09aaef5b5ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:56Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:56 crc kubenswrapper[4696]: I1202 22:42:56.240852 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ad00195c-ef4c-4d9b-941c-d01ebc498593-metrics-certs\") pod \"network-metrics-daemon-q9bfc\" (UID: \"ad00195c-ef4c-4d9b-941c-d01ebc498593\") " pod="openshift-multus/network-metrics-daemon-q9bfc" Dec 02 22:42:56 crc kubenswrapper[4696]: E1202 22:42:56.241153 4696 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 02 22:42:56 crc kubenswrapper[4696]: E1202 22:42:56.241345 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ad00195c-ef4c-4d9b-941c-d01ebc498593-metrics-certs podName:ad00195c-ef4c-4d9b-941c-d01ebc498593 nodeName:}" failed. No retries permitted until 2025-12-02 22:42:58.241302703 +0000 UTC m=+41.121982874 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ad00195c-ef4c-4d9b-941c-d01ebc498593-metrics-certs") pod "network-metrics-daemon-q9bfc" (UID: "ad00195c-ef4c-4d9b-941c-d01ebc498593") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 02 22:42:56 crc kubenswrapper[4696]: I1202 22:42:56.257073 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sbjst" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a0d758e-6da6-4382-99f1-dd295b63eb98\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bfe4ce91e801752a3eae0d5cfdb5d0e7a4997952987a3a32c10c0292db170f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\
\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://495f007b6af2d409e3ed6cf70dfa90aec5a38c34361c9da9c4fd71d7732cdd76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://495f007b6af2d409e3ed6cf70dfa90aec5a38c34361c9da9c4fd71d7732cdd76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5b3b6699e2f0b9a8d0f32c1f97c9b3e16d9f5479ad2f355633c25897dbe1bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5b3b6699e2f0b9a8d0f32c1f97c9b3e16d9f5479ad2f355633c25897dbe1bb1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d4a10437554cfb08f90a48d457ef157309ed0ab379c5b26deb62d3e8b73291b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d4a10437554cfb08f90a48d457ef157309ed0ab379c5b26deb62d3e8b73291b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPa
th\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76b52b0824f65df456ead0ca134ada0a6c3120382a3cce18f0a5dfdcc98fa9ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76b52b0824f65df456ead0ca134ada0a6c3120382a3cce18f0a5dfdcc98fa9ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d38fb242bd536b1359095c21c4df58b5a4c64058cfe986b38a581ae0c40a29d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714
c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d38fb242bd536b1359095c21c4df58b5a4c64058cfe986b38a581ae0c40a29d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f17767f87136d0916230872e59379331bb9e7f88f3648115905c6c0172f257cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f17767f87136d0916230872e59379331bb9e7f88f3648115905c6c0172f257cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net
.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sbjst\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:56Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:56 crc kubenswrapper[4696]: I1202 22:42:56.273844 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-chq65" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53353260-c7c9-435c-91eb-3d5a1b441c4a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2d1f154b9c9139bf991adf86bd99e7ff14510c24ae93b87c1c184437a087f17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7tfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa016430a0a36d0628e0e5b0c4baf2fa724c8881
16e5d70517c8ffc4be1c37a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7tfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-chq65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:56Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:56 crc kubenswrapper[4696]: I1202 22:42:56.290122 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rgk7n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07db09ff-2489-4357-af38-aca9655ac1d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9928cbc6ed1d9bb5d0280a8d3264d4490c601fc7d09d6f293f1403615cc4cde5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdd8x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rgk7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:56Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:56 crc kubenswrapper[4696]: I1202 22:42:56.297684 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:56 crc kubenswrapper[4696]: I1202 22:42:56.297793 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:56 crc kubenswrapper[4696]: I1202 22:42:56.297814 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:56 crc kubenswrapper[4696]: I1202 22:42:56.297858 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:56 crc kubenswrapper[4696]: I1202 22:42:56.297898 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:56Z","lastTransitionTime":"2025-12-02T22:42:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:56 crc kubenswrapper[4696]: I1202 22:42:56.308201 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:56Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:56 crc kubenswrapper[4696]: I1202 22:42:56.328500 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:56Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:56 crc kubenswrapper[4696]: I1202 22:42:56.352906 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36feb7e9f6a6bad97e87fe54ff9b661be62f6f4962352d55bcc905b80320a48d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85fbce2a7ea0550f53c1216216022e60a4e2eae0bb1d5c88254ace025fbd4503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64b0c2025c377ab19b3853017c94cb501f472f2485093afc60749f7739ba25d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22ce112f515d4ef59dca4c21bdc8623e88a8a0df222fb2ed0990a2f68e3f75f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52f3d5529aafac7c27c963a50b0318e99d9a9ca35e142d4c273431105c495e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32717a82edb718f5e690fcecd578f48279d134a5dc12f00824fa25f7ec727156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fb85a5789d7eb5dd8ac73a8a47b7241b37df3fae93b8ea2f241113883535f3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bcbeb403dee928a5b37ed7662c4999e027b9b9fed6b8cb704c8e41b5c68c585c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T22:42:52Z\\\",\\\"message\\\":\\\" from k8s.io/client-go/informers/factory.go:160\\\\nI1202 22:42:50.554281 6010 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 22:42:50.554443 6010 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1202 22:42:50.554709 6010 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI1202 22:42:50.556928 6010 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1202 22:42:50.557016 6010 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1202 22:42:50.557094 6010 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1202 22:42:50.557153 6010 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1202 22:42:50.557100 6010 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1202 22:42:50.557332 6010 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1202 22:42:50.557365 6010 handler.go:208] Removed *v1.Node event handler 7\\\\nI1202 22:42:50.557430 6010 handler.go:208] Removed *v1.Node event handler 2\\\\nI1202 22:42:50.557553 6010 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fb85a5789d7eb5dd8ac73a8a47b7241b37df3fae93b8ea2f241113883535f3b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T22:42:55Z\\\",\\\"message\\\":\\\"workPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1202 22:42:55.757361 6138 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 22:42:55.757937 6138 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1202 22:42:55.757959 6138 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1202 22:42:55.757978 6138 handler.go:208] Removed *v1.Node event 
handler 2\\\\nI1202 22:42:55.757986 6138 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1202 22:42:55.757997 6138 handler.go:208] Removed *v1.Node event handler 7\\\\nI1202 22:42:55.758042 6138 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1202 22:42:55.758053 6138 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1202 22:42:55.758077 6138 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1202 22:42:55.758079 6138 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1202 22:42:55.758091 6138 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1202 22:42:55.758120 6138 factory.go:656] Stopping watch factory\\\\nI1202 22:42:55.758126 6138 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1202 22:42:55.758143 6138 ovnkube.go:599] Stopped ovnkube\\\\nI1202 2\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"nam
e\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23b80358b654eec7dc1057ef35a6c09f88ccd7aca7a05336a3f29e7036bd55f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servicea
ccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://492cfa3fb8646c09da06d92ec8a303562da558d4d944eaf68ca95de6f5b94da8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://492cfa3fb8646c09da06d92ec8a303562da558d4d944eaf68ca95de6f5b94da8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qb2zq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:56Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:56 crc kubenswrapper[4696]: I1202 22:42:56.369868 4696 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/network-metrics-daemon-q9bfc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad00195c-ef4c-4d9b-941c-d01ebc498593\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zz6sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zz6sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:54Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q9bfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:56Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:56 crc 
kubenswrapper[4696]: I1202 22:42:56.388121 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05940584-6baa-49e1-8959-610d50b6eeb5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b361514e74dd950d7c3e1da82304b7c75603829ce7bf03c1fbccab1607d65800\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67bf899df90f18f36310994af135552aa9758f0460117d62d00fdc1de067ce79\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d15263c6c1f8208a998cbacdba376a2eb3b2e16134e8fc40f7d56d36daae5ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2fa27b4eeb8b50da5d49c905eb98070362e998c3e3e4aceac7cc09aaef5b5ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:56Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:56 crc kubenswrapper[4696]: I1202 22:42:56.400735 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:56 crc kubenswrapper[4696]: I1202 22:42:56.400802 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:56 crc kubenswrapper[4696]: I1202 22:42:56.400815 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:56 crc kubenswrapper[4696]: I1202 22:42:56.400833 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:56 crc kubenswrapper[4696]: I1202 22:42:56.400847 4696 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:56Z","lastTransitionTime":"2025-12-02T22:42:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:42:56 crc kubenswrapper[4696]: I1202 22:42:56.412528 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sbjst" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a0d758e-6da6-4382-99f1-dd295b63eb98\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bfe4ce91e801752a3eae0d5cfdb5d0e7a4997952987a3a32c10c0292db170f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://495f007b6af2d409e3ed6cf70dfa90aec5a38c34361c9da9c4fd71d7732cdd76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://495f007b6af2d409e3ed6cf70dfa90aec5a38c34361c9da9c4fd71d7732cdd76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5b3b6699e2f0b9a8d0f32c1f97c9b3e16d9f5479ad2f355633c25897dbe1bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5b3b6699e2f0b9a8d0f32c1f97c9b3e16d9f5479ad2f355633c25897dbe1bb1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d4a10437554cfb08f90a48d457ef157309ed0ab379c5b26deb62d3e8b73291b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d4a10437554cfb08f90a48d457ef157309ed0ab379c5b26deb62d3e8b73291b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:43Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76b52b0824f65df456ead0ca134ada0a6c3120382a3cce18f0a5dfdcc98fa9ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76b52b0824f65df456ead0ca134ada0a6c3120382a3cce18f0a5dfdcc98fa9ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d38fb242bd536b1359095c21c4df58b5a4c64058cfe986b38a581ae0c40a29d8\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d38fb242bd536b1359095c21c4df58b5a4c64058cfe986b38a581ae0c40a29d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f17767f87136d0916230872e59379331bb9e7f88f3648115905c6c0172f257cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f17767f87136d0916230872e59379331bb9e7f88f3648115905c6c0172f257cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sbjst\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:56Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:56 crc kubenswrapper[4696]: I1202 22:42:56.431491 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:42:56 crc kubenswrapper[4696]: I1202 22:42:56.431528 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:42:56 crc kubenswrapper[4696]: I1202 22:42:56.431506 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:42:56 crc kubenswrapper[4696]: E1202 22:42:56.431633 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 22:42:56 crc kubenswrapper[4696]: I1202 22:42:56.431532 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q9bfc" Dec 02 22:42:56 crc kubenswrapper[4696]: E1202 22:42:56.432241 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q9bfc" podUID="ad00195c-ef4c-4d9b-941c-d01ebc498593" Dec 02 22:42:56 crc kubenswrapper[4696]: E1202 22:42:56.432342 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 22:42:56 crc kubenswrapper[4696]: I1202 22:42:56.432367 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-chq65" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53353260-c7c9-435c-91eb-3d5a1b441c4a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2d1f154b9c9139bf991adf86bd99e7ff14510c24ae93b87c1c184437a087f17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mou
ntPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7tfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa016430a0a36d0628e0e5b0c4baf2fa724c888116e5d70517c8ffc4be1c37a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7tfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-chq65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:56Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:56 crc kubenswrapper[4696]: E1202 22:42:56.432108 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 22:42:56 crc kubenswrapper[4696]: I1202 22:42:56.452211 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rgk7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07db09ff-2489-4357-af38-aca9655ac1d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9928cbc6ed1d9bb5d0280a8d3264d4490c601fc7d09d6f293f1403615cc4cde5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"202
5-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdd8x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rgk7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:56Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:56 crc kubenswrapper[4696]: I1202 22:42:56.475039 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1db9cf23af7fc9ff1373908e11df0bea7d6e03cd3ed18947cc583fc09a79a05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:56Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:56 crc kubenswrapper[4696]: I1202 22:42:56.495011 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce2fa3867298545942c4a2d2e68899c61a4f99a64b0974e75ef86851074ddfce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://4db552c4bc9b395f977cb4c1b1b7fcbb1c7e9f326d81a7903d88eab180bcfb3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:56Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:56 crc kubenswrapper[4696]: I1202 22:42:56.504041 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:56 crc kubenswrapper[4696]: I1202 22:42:56.504078 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:56 crc kubenswrapper[4696]: I1202 22:42:56.504089 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:56 crc kubenswrapper[4696]: I1202 22:42:56.504110 4696 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:56 crc kubenswrapper[4696]: I1202 22:42:56.504123 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:56Z","lastTransitionTime":"2025-12-02T22:42:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:42:56 crc kubenswrapper[4696]: I1202 22:42:56.512432 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27fb05354e4e9a6ad09946fad7dbabf07456fe56a9a66578fda0eb71dde2d199\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\
\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:56Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:56 crc kubenswrapper[4696]: I1202 22:42:56.533734 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wthxr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"86a37d2a-37c5-4fbd-b10b-f5e4706772f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5ddd341a3a0b7b1d959ce57bb4d38153a4ca3ae5ec40ded309e0697db2be28d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xqg5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wthxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:56Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:56 crc kubenswrapper[4696]: I1202 22:42:56.552736 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ttxcl" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca48946e-a7e0-4729-8b02-b223a96990c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55cb8b25a5af0f7f16b7a91ab00cbe24ed8a8b81477e992de74699237caf7dab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkjwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e10c7f45
d8003857b8a736f70eb3125f01996bacbcace4ad08bc703fbdeb62f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkjwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ttxcl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:56Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:56 crc kubenswrapper[4696]: I1202 22:42:56.591487 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3172221-5c4c-4409-b50f-5cd21d8b5030\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f25e7ebcca1d904cd0940a660fd97c6b4a3540fd63700abd0ad973e9f9c8f764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08fc65f2c8162f27bc40cd12b5801ca21c778313271f2cb5dd251ca4312842d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba390cca9997cf469ae88a4895532df556f0b947ce7b34ef7e1992ee236a2e16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0ccd6eaff0fa7ac496aadc31afffd6e7fee2776e9cfd1652af85fcec268d895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b2deed2650220f1d607cbb667b26879d3a8373146244feb486c882032f344b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a565cb3800c1a50f784c327719158de2f6f29d20a33670478b97c1c360e0f6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a565cb3800c1a50f784c327719158de2f6f29d20a33670478b97c1c360e0f6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-02T22:42:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29a2b9cc76fc45b6874d85b2ab92165a02804807837501ac60190c232bab928d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29a2b9cc76fc45b6874d85b2ab92165a02804807837501ac60190c232bab928d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fb3311ccf2a924605464d657193627798dad26a232fab0a2091fb2c387f3bd0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb3311ccf2a924605464d657193627798dad26a232fab0a2091fb2c387f3bd0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:56Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:56 crc kubenswrapper[4696]: I1202 22:42:56.607456 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:56 crc kubenswrapper[4696]: I1202 22:42:56.607517 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:56 crc kubenswrapper[4696]: I1202 22:42:56.607535 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:56 crc kubenswrapper[4696]: I1202 22:42:56.607561 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:56 crc kubenswrapper[4696]: I1202 22:42:56.607580 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:56Z","lastTransitionTime":"2025-12-02T22:42:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:56 crc kubenswrapper[4696]: I1202 22:42:56.620441 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"158d80a5-8d90-46bb-83e7-ab3263f6d28f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3112751f86a62634d02f281d467d3eeb27878e27c245871210cca37eaeb6e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce744ea765d5bbcb0a9926f439a0256774c16863344f23ddfb0074b4ccbdb4ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1108d710330b056e2228205b2012df070ffbd660701ddd577eafbf0e99c3bf9f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d92ce9d53d9367503c76c5fbb45f340a5ddd93f3c8e1738ac434fe7b0210d1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6268352b995d99d383ab500b5ccd80b0612cf33716a752aeb1d92cb775a1971e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"le observer\\\\nW1202 22:42:37.965183 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 22:42:37.965319 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 22:42:37.967400 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2369937012/tls.crt::/tmp/serving-cert-2369937012/tls.key\\\\\\\"\\\\nI1202 22:42:38.329833 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 22:42:38.331704 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 22:42:38.331724 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 22:42:38.331766 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 22:42:38.331782 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 22:42:38.336692 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 22:42:38.336734 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:42:38.336761 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:42:38.336769 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 22:42:38.336775 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 22:42:38.336779 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 22:42:38.336783 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 22:42:38.337216 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 22:42:38.338194 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://666f81c81a7cad67c01b2aebf2d05f162c33feea27f64c12f92761d6d2a6056e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055bef75732eec4908f0baf8a52cd3921fab03cf88e36d2e0d442344fb1af0c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://055bef75732eec4908f0baf8a52cd3921fab03cf88e36d2e0d442344fb1af0c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:56Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:56 crc kubenswrapper[4696]: I1202 22:42:56.639023 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:56Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:56 crc kubenswrapper[4696]: I1202 22:42:56.655179 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f57qk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf8ea236-9be9-4eb2-904a-103c4c279f28\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3106b29a0eb0a17827fd331aefd6c00b0c58f4ec0d0ca778bd3f4842730d5f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vxprw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f57qk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:56Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:56 crc kubenswrapper[4696]: I1202 22:42:56.710183 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:56 crc kubenswrapper[4696]: I1202 22:42:56.710208 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:56 crc kubenswrapper[4696]: I1202 22:42:56.710219 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:56 crc kubenswrapper[4696]: I1202 22:42:56.710237 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:56 crc kubenswrapper[4696]: I1202 22:42:56.710248 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:56Z","lastTransitionTime":"2025-12-02T22:42:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:56 crc kubenswrapper[4696]: I1202 22:42:56.813403 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:56 crc kubenswrapper[4696]: I1202 22:42:56.813468 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:56 crc kubenswrapper[4696]: I1202 22:42:56.813485 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:56 crc kubenswrapper[4696]: I1202 22:42:56.813510 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:56 crc kubenswrapper[4696]: I1202 22:42:56.813530 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:56Z","lastTransitionTime":"2025-12-02T22:42:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:56 crc kubenswrapper[4696]: I1202 22:42:56.916697 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:56 crc kubenswrapper[4696]: I1202 22:42:56.917130 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:56 crc kubenswrapper[4696]: I1202 22:42:56.917319 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:56 crc kubenswrapper[4696]: I1202 22:42:56.917445 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:56 crc kubenswrapper[4696]: I1202 22:42:56.917584 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:56Z","lastTransitionTime":"2025-12-02T22:42:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:56 crc kubenswrapper[4696]: I1202 22:42:56.933994 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qb2zq_c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b/ovnkube-controller/1.log" Dec 02 22:42:57 crc kubenswrapper[4696]: I1202 22:42:57.020459 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:57 crc kubenswrapper[4696]: I1202 22:42:57.020511 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:57 crc kubenswrapper[4696]: I1202 22:42:57.020521 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:57 crc kubenswrapper[4696]: I1202 22:42:57.020540 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:57 crc kubenswrapper[4696]: I1202 22:42:57.020556 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:57Z","lastTransitionTime":"2025-12-02T22:42:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:57 crc kubenswrapper[4696]: I1202 22:42:57.124105 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:57 crc kubenswrapper[4696]: I1202 22:42:57.124194 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:57 crc kubenswrapper[4696]: I1202 22:42:57.124214 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:57 crc kubenswrapper[4696]: I1202 22:42:57.124251 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:57 crc kubenswrapper[4696]: I1202 22:42:57.124276 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:57Z","lastTransitionTime":"2025-12-02T22:42:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:57 crc kubenswrapper[4696]: I1202 22:42:57.227148 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:57 crc kubenswrapper[4696]: I1202 22:42:57.227180 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:57 crc kubenswrapper[4696]: I1202 22:42:57.227189 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:57 crc kubenswrapper[4696]: I1202 22:42:57.227205 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:57 crc kubenswrapper[4696]: I1202 22:42:57.227215 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:57Z","lastTransitionTime":"2025-12-02T22:42:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:57 crc kubenswrapper[4696]: I1202 22:42:57.331280 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:57 crc kubenswrapper[4696]: I1202 22:42:57.331401 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:57 crc kubenswrapper[4696]: I1202 22:42:57.331423 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:57 crc kubenswrapper[4696]: I1202 22:42:57.331450 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:57 crc kubenswrapper[4696]: I1202 22:42:57.331464 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:57Z","lastTransitionTime":"2025-12-02T22:42:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:57 crc kubenswrapper[4696]: I1202 22:42:57.435547 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:57 crc kubenswrapper[4696]: I1202 22:42:57.435664 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:57 crc kubenswrapper[4696]: I1202 22:42:57.435687 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:57 crc kubenswrapper[4696]: I1202 22:42:57.435719 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:57 crc kubenswrapper[4696]: I1202 22:42:57.435773 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:57Z","lastTransitionTime":"2025-12-02T22:42:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:57 crc kubenswrapper[4696]: I1202 22:42:57.474964 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36feb7e9f6a6bad97e87fe54ff9b661be62f6f4962352d55bcc905b80320a48d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85fbce2a7ea0550f53c1216216022e60a4e2eae0bb1d5c88254ace025fbd4503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64b0c2025c377ab19b3853017c94cb501f472f2485093afc60749f7739ba25d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22ce112f515d4ef59dca4c21bdc8623e88a8a0df222fb2ed0990a2f68e3f75f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52f3d5529aafac7c27c963a50b0318e99d9a9ca35e142d4c273431105c495e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32717a82edb718f5e690fcecd578f48279d134a5dc12f00824fa25f7ec727156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fb85a5789d7eb5dd8ac73a8a47b7241b37df3fae93b8ea2f241113883535f3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bcbeb403dee928a5b37ed7662c4999e027b9b9fed6b8cb704c8e41b5c68c585c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T22:42:52Z\\\",\\\"message\\\":\\\" from k8s.io/client-go/informers/factory.go:160\\\\nI1202 22:42:50.554281 6010 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 22:42:50.554443 6010 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1202 22:42:50.554709 6010 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI1202 22:42:50.556928 6010 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1202 22:42:50.557016 6010 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1202 22:42:50.557094 6010 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1202 22:42:50.557153 6010 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1202 22:42:50.557100 6010 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1202 22:42:50.557332 6010 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1202 22:42:50.557365 6010 handler.go:208] Removed *v1.Node event handler 7\\\\nI1202 22:42:50.557430 6010 handler.go:208] Removed *v1.Node event handler 2\\\\nI1202 22:42:50.557553 6010 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fb85a5789d7eb5dd8ac73a8a47b7241b37df3fae93b8ea2f241113883535f3b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T22:42:55Z\\\",\\\"message\\\":\\\"workPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1202 22:42:55.757361 6138 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 22:42:55.757937 6138 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1202 22:42:55.757959 6138 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1202 22:42:55.757978 6138 handler.go:208] Removed *v1.Node event 
handler 2\\\\nI1202 22:42:55.757986 6138 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1202 22:42:55.757997 6138 handler.go:208] Removed *v1.Node event handler 7\\\\nI1202 22:42:55.758042 6138 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1202 22:42:55.758053 6138 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1202 22:42:55.758077 6138 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1202 22:42:55.758079 6138 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1202 22:42:55.758091 6138 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1202 22:42:55.758120 6138 factory.go:656] Stopping watch factory\\\\nI1202 22:42:55.758126 6138 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1202 22:42:55.758143 6138 ovnkube.go:599] Stopped ovnkube\\\\nI1202 2\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"nam
e\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23b80358b654eec7dc1057ef35a6c09f88ccd7aca7a05336a3f29e7036bd55f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servicea
ccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://492cfa3fb8646c09da06d92ec8a303562da558d4d944eaf68ca95de6f5b94da8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://492cfa3fb8646c09da06d92ec8a303562da558d4d944eaf68ca95de6f5b94da8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qb2zq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:57Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:57 crc kubenswrapper[4696]: I1202 22:42:57.493158 4696 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/network-metrics-daemon-q9bfc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad00195c-ef4c-4d9b-941c-d01ebc498593\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zz6sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zz6sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:54Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q9bfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:57Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:57 crc 
kubenswrapper[4696]: I1202 22:42:57.516475 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:57Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:57 crc kubenswrapper[4696]: I1202 22:42:57.539847 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:57Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:57 crc kubenswrapper[4696]: I1202 22:42:57.539990 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:57 crc kubenswrapper[4696]: I1202 22:42:57.540061 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:57 crc kubenswrapper[4696]: I1202 22:42:57.540081 4696 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:57 crc kubenswrapper[4696]: I1202 22:42:57.540113 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:57 crc kubenswrapper[4696]: I1202 22:42:57.540136 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:57Z","lastTransitionTime":"2025-12-02T22:42:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:42:57 crc kubenswrapper[4696]: I1202 22:42:57.568465 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sbjst" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a0d758e-6da6-4382-99f1-dd295b63eb98\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bfe4ce91e801752a3ea
e0d5cfdb5d0e7a4997952987a3a32c10c0292db170f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://495f007b6af2d409e3ed6cf70dfa90aec5a38c34361c9da9c4fd71d7732cdd76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://495f007b6af2d409e3ed6cf70dfa90aec5a38c34361c9da9c4fd71d7732cdd76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5b3b6699e2f0b9a8d0f32c1f97c9b3e16d9f5479ad2f355633c25897dbe1bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5b3b6699e2f0b9a8d0f32c1f97c9b3e16d9f5479ad2f355633c25897dbe1bb1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d4a10437554cfb08f90a48d457ef157309ed0ab379c5b26deb62d3e8b73291b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64
d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d4a10437554cfb08f90a48d457ef157309ed0ab379c5b26deb62d3e8b73291b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76b52b0824f65df456ead0ca134ada0a6c3120382a3cce18f0a5dfdcc98fa9ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76b52b0824f65df456ead0ca134ada0a6c3120382a3cce18f0a5dfdcc98fa9ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d38fb242bd536b1359095c21c4df58b5a4c64058cfe986b38a581ae0c40a29d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d38fb242bd536b1359095c21c4df58b5a4c64058cfe986b38a581ae0c40a29d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f17767f87136d0916230872e59379331bb9e7f88f3648115905c6c0172f257cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f17767f87136d0916230872e59379331bb9e7f88f3648115905c6c0172f257cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sbjst\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:57Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:57 crc kubenswrapper[4696]: I1202 22:42:57.588001 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-chq65" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53353260-c7c9-435c-91eb-3d5a1b441c4a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2d1f154b9c9139bf991adf86bd99e7ff14510c24ae93b87c1c184437a087f17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7tfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa016430a0a36d0628e0e5b0c4baf2fa724c8881
16e5d70517c8ffc4be1c37a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7tfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-chq65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:57Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:57 crc kubenswrapper[4696]: I1202 22:42:57.605188 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rgk7n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07db09ff-2489-4357-af38-aca9655ac1d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9928cbc6ed1d9bb5d0280a8d3264d4490c601fc7d09d6f293f1403615cc4cde5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdd8x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rgk7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:57Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:57 crc kubenswrapper[4696]: I1202 22:42:57.628205 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05940584-6baa-49e1-8959-610d50b6eeb5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b361514e74dd950d7c3e1da82304b7c75603829ce7bf03c1fbccab1607d65800\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67bf899df90f18f36310994af135552aa9758f0460117d62d00fdc1de067ce79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d15263c6c1f8208a998cbacdba376a2eb3b2e16134e8fc40f7d56d36daae5ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2fa27b4eeb8b50da5d49c905eb98070362e998c3e3e4aceac7cc09aaef5b5ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:57Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:57 crc kubenswrapper[4696]: I1202 22:42:57.642950 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:57 crc kubenswrapper[4696]: I1202 22:42:57.643023 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 02 22:42:57 crc kubenswrapper[4696]: I1202 22:42:57.643041 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:57 crc kubenswrapper[4696]: I1202 22:42:57.643074 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:57 crc kubenswrapper[4696]: I1202 22:42:57.643097 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:57Z","lastTransitionTime":"2025-12-02T22:42:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:42:57 crc kubenswrapper[4696]: I1202 22:42:57.648374 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27fb05354e4e9a6ad09946fad7dbabf07456fe56a9a66578fda0eb71dde2d199\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T22:42:57Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:57 crc kubenswrapper[4696]: I1202 22:42:57.671576 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wthxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86a37d2a-37c5-4fbd-b10b-f5e4706772f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5ddd341a3a0b7b1d959ce57bb4d38153a4ca3ae5ec40ded309e0697db2be28d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xqg5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wthxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T22:42:57Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:57 crc kubenswrapper[4696]: I1202 22:42:57.691525 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ttxcl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca48946e-a7e0-4729-8b02-b223a96990c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55cb8b25a5af0f7f16b7a91ab00cbe24ed8a8b81477e992de74699237caf7dab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkjwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e10c7f45d8003857b8a736f70eb3125f01996bacbcace4ad08bc703fbdeb62f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkjwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ttxcl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:57Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:57 crc kubenswrapper[4696]: I1202 22:42:57.712469 4696 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1db9cf23af7fc9ff1373908e11df0bea7d6e03cd3ed18947cc583fc09a79a05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:57Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:57 crc kubenswrapper[4696]: I1202 22:42:57.730967 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce2fa3867298545942c4a2d2e68899c61a4f99a64b0974e75ef86851074ddfce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\
\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4db552c4bc9b395f977cb4c1b1b7fcbb1c7e9f326d81a7903d88eab180bcfb3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:57Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:57 crc kubenswrapper[4696]: I1202 22:42:57.746012 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:57 crc kubenswrapper[4696]: I1202 22:42:57.746070 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:57 crc kubenswrapper[4696]: I1202 22:42:57.746088 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Dec 02 22:42:57 crc kubenswrapper[4696]: I1202 22:42:57.746112 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:57 crc kubenswrapper[4696]: I1202 22:42:57.746131 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:57Z","lastTransitionTime":"2025-12-02T22:42:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:42:57 crc kubenswrapper[4696]: I1202 22:42:57.750591 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:57Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:57 crc kubenswrapper[4696]: I1202 22:42:57.810175 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f57qk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf8ea236-9be9-4eb2-904a-103c4c279f28\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3106b29a0eb0a17827fd331aefd6c00b0c58f4ec0d0ca778bd3f4842730d5f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vxprw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f57qk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:57Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:57 crc kubenswrapper[4696]: I1202 22:42:57.851390 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:57 crc kubenswrapper[4696]: I1202 22:42:57.851505 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:57 crc kubenswrapper[4696]: I1202 22:42:57.851528 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:57 crc kubenswrapper[4696]: I1202 22:42:57.851596 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:57 crc kubenswrapper[4696]: I1202 22:42:57.851622 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:57Z","lastTransitionTime":"2025-12-02T22:42:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:57 crc kubenswrapper[4696]: I1202 22:42:57.852176 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3172221-5c4c-4409-b50f-5cd21d8b5030\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f25e7ebcca1d904cd0940a660fd97c6b4a3540fd63700abd0ad973e9f9c8f764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08fc65f2c8162f27bc40cd12b5801ca21c778313271f2cb5dd251ca4312842d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba390cca9997cf469ae88a4895532df556f0b947ce7b34ef7e1992ee236a2e16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0ccd6eaff0fa7ac496aadc31afffd6e7fee2776e9cfd1652af85fcec268d895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b2deed2650220f1d607cbb667b26879d3a8373146244feb486c882032f344b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a565cb3800c1a50f784c327719158de2f6f29d20a33670478b97c1c360e0f6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a565cb3800c1a50f784c327719158de2f6f29d20a33670478b97c1c360e0f6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29a2b9cc76fc45b6874d85b2ab92165a02804807837501ac60190c232bab928d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29a2b9cc76fc45b6874d85b2ab92165a02804807837501ac60190c232bab928d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fb3311ccf2a924605464d657193627798dad26a232fab0a2091fb2c387f3bd0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb3311ccf2a924605464d657193627798dad26a232fab0a2091fb2c387f3bd0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:57Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:57 crc kubenswrapper[4696]: I1202 22:42:57.877714 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"158d80a5-8d90-46bb-83e7-ab3263f6d28f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3112751f86a62634d02f281d467d3eeb27878e27c245871210cca37eaeb6e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce744ea765d5bbcb0a9926f439a0256774c16863344f23ddfb0074b4ccbdb4ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1108d710330b056e2228205b2012df070ffbd660701ddd577eafbf0e99c3bf9f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d92ce9d53d9367503c76c5fbb45f340a5ddd93f3c8e1738ac434fe7b0210d1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6268352b995d99d383ab500b5ccd80b0612cf33716a752aeb1d92cb775a1971e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T22:42:38Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW1202 22:42:37.965183 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 22:42:37.965319 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 22:42:37.967400 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2369937012/tls.crt::/tmp/serving-cert-2369937012/tls.key\\\\\\\"\\\\nI1202 22:42:38.329833 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 22:42:38.331704 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 22:42:38.331724 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 22:42:38.331766 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 22:42:38.331782 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 22:42:38.336692 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 22:42:38.336734 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:42:38.336761 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:42:38.336769 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 22:42:38.336775 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 22:42:38.336779 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 22:42:38.336783 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 22:42:38.337216 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1202 22:42:38.338194 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://666f81c81a7cad67c01b2aebf2d05f162c33feea27f64c12f92761d6d2a6056e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055bef75732eec4908f0baf8a52cd3921fab03cf88e36d2e0d442344fb1af0c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://055bef75732eec4908f0baf8a52cd3921
fab03cf88e36d2e0d442344fb1af0c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:42:57Z is after 2025-08-24T17:21:41Z" Dec 02 22:42:57 crc kubenswrapper[4696]: I1202 22:42:57.954803 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:57 crc kubenswrapper[4696]: I1202 22:42:57.954864 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:57 crc kubenswrapper[4696]: I1202 22:42:57.954883 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:57 crc kubenswrapper[4696]: I1202 22:42:57.954909 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:57 crc kubenswrapper[4696]: I1202 22:42:57.954928 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:57Z","lastTransitionTime":"2025-12-02T22:42:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:58 crc kubenswrapper[4696]: I1202 22:42:58.058626 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:58 crc kubenswrapper[4696]: I1202 22:42:58.058686 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:58 crc kubenswrapper[4696]: I1202 22:42:58.058699 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:58 crc kubenswrapper[4696]: I1202 22:42:58.058718 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:58 crc kubenswrapper[4696]: I1202 22:42:58.058734 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:58Z","lastTransitionTime":"2025-12-02T22:42:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:58 crc kubenswrapper[4696]: I1202 22:42:58.163079 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:58 crc kubenswrapper[4696]: I1202 22:42:58.163133 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:58 crc kubenswrapper[4696]: I1202 22:42:58.163149 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:58 crc kubenswrapper[4696]: I1202 22:42:58.163175 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:58 crc kubenswrapper[4696]: I1202 22:42:58.163192 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:58Z","lastTransitionTime":"2025-12-02T22:42:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:58 crc kubenswrapper[4696]: I1202 22:42:58.265068 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ad00195c-ef4c-4d9b-941c-d01ebc498593-metrics-certs\") pod \"network-metrics-daemon-q9bfc\" (UID: \"ad00195c-ef4c-4d9b-941c-d01ebc498593\") " pod="openshift-multus/network-metrics-daemon-q9bfc" Dec 02 22:42:58 crc kubenswrapper[4696]: E1202 22:42:58.265412 4696 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 02 22:42:58 crc kubenswrapper[4696]: E1202 22:42:58.265501 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ad00195c-ef4c-4d9b-941c-d01ebc498593-metrics-certs podName:ad00195c-ef4c-4d9b-941c-d01ebc498593 nodeName:}" failed. No retries permitted until 2025-12-02 22:43:02.26547006 +0000 UTC m=+45.146150261 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ad00195c-ef4c-4d9b-941c-d01ebc498593-metrics-certs") pod "network-metrics-daemon-q9bfc" (UID: "ad00195c-ef4c-4d9b-941c-d01ebc498593") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 02 22:42:58 crc kubenswrapper[4696]: I1202 22:42:58.267309 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:58 crc kubenswrapper[4696]: I1202 22:42:58.267375 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:58 crc kubenswrapper[4696]: I1202 22:42:58.267392 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:58 crc kubenswrapper[4696]: I1202 22:42:58.267420 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:58 crc kubenswrapper[4696]: I1202 22:42:58.267438 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:58Z","lastTransitionTime":"2025-12-02T22:42:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:58 crc kubenswrapper[4696]: I1202 22:42:58.370301 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:58 crc kubenswrapper[4696]: I1202 22:42:58.370370 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:58 crc kubenswrapper[4696]: I1202 22:42:58.370389 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:58 crc kubenswrapper[4696]: I1202 22:42:58.370418 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:58 crc kubenswrapper[4696]: I1202 22:42:58.370441 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:58Z","lastTransitionTime":"2025-12-02T22:42:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:42:58 crc kubenswrapper[4696]: I1202 22:42:58.431606 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:42:58 crc kubenswrapper[4696]: I1202 22:42:58.431662 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:42:58 crc kubenswrapper[4696]: I1202 22:42:58.431632 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q9bfc" Dec 02 22:42:58 crc kubenswrapper[4696]: I1202 22:42:58.431620 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:42:58 crc kubenswrapper[4696]: E1202 22:42:58.431868 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 22:42:58 crc kubenswrapper[4696]: E1202 22:42:58.432043 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 22:42:58 crc kubenswrapper[4696]: E1202 22:42:58.432180 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q9bfc" podUID="ad00195c-ef4c-4d9b-941c-d01ebc498593" Dec 02 22:42:58 crc kubenswrapper[4696]: E1202 22:42:58.432295 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 22:42:58 crc kubenswrapper[4696]: I1202 22:42:58.474886 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:58 crc kubenswrapper[4696]: I1202 22:42:58.474959 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:58 crc kubenswrapper[4696]: I1202 22:42:58.474979 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:58 crc kubenswrapper[4696]: I1202 22:42:58.475009 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:58 crc kubenswrapper[4696]: I1202 22:42:58.475030 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:58Z","lastTransitionTime":"2025-12-02T22:42:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:58 crc kubenswrapper[4696]: I1202 22:42:58.578829 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:58 crc kubenswrapper[4696]: I1202 22:42:58.578900 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:58 crc kubenswrapper[4696]: I1202 22:42:58.578923 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:58 crc kubenswrapper[4696]: I1202 22:42:58.578967 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:58 crc kubenswrapper[4696]: I1202 22:42:58.579176 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:58Z","lastTransitionTime":"2025-12-02T22:42:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:58 crc kubenswrapper[4696]: I1202 22:42:58.682093 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:58 crc kubenswrapper[4696]: I1202 22:42:58.682199 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:58 crc kubenswrapper[4696]: I1202 22:42:58.682258 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:58 crc kubenswrapper[4696]: I1202 22:42:58.682328 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:58 crc kubenswrapper[4696]: I1202 22:42:58.682367 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:58Z","lastTransitionTime":"2025-12-02T22:42:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:58 crc kubenswrapper[4696]: I1202 22:42:58.785976 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:58 crc kubenswrapper[4696]: I1202 22:42:58.786044 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:58 crc kubenswrapper[4696]: I1202 22:42:58.786066 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:58 crc kubenswrapper[4696]: I1202 22:42:58.786094 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:58 crc kubenswrapper[4696]: I1202 22:42:58.786115 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:58Z","lastTransitionTime":"2025-12-02T22:42:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:58 crc kubenswrapper[4696]: I1202 22:42:58.890106 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:58 crc kubenswrapper[4696]: I1202 22:42:58.890200 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:58 crc kubenswrapper[4696]: I1202 22:42:58.890225 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:58 crc kubenswrapper[4696]: I1202 22:42:58.890260 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:58 crc kubenswrapper[4696]: I1202 22:42:58.890284 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:58Z","lastTransitionTime":"2025-12-02T22:42:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:58 crc kubenswrapper[4696]: I1202 22:42:58.994144 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:58 crc kubenswrapper[4696]: I1202 22:42:58.994418 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:58 crc kubenswrapper[4696]: I1202 22:42:58.994465 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:58 crc kubenswrapper[4696]: I1202 22:42:58.994498 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:58 crc kubenswrapper[4696]: I1202 22:42:58.994522 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:58Z","lastTransitionTime":"2025-12-02T22:42:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:59 crc kubenswrapper[4696]: I1202 22:42:59.098173 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:59 crc kubenswrapper[4696]: I1202 22:42:59.098244 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:59 crc kubenswrapper[4696]: I1202 22:42:59.098263 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:59 crc kubenswrapper[4696]: I1202 22:42:59.098294 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:59 crc kubenswrapper[4696]: I1202 22:42:59.098317 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:59Z","lastTransitionTime":"2025-12-02T22:42:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:59 crc kubenswrapper[4696]: I1202 22:42:59.201443 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:59 crc kubenswrapper[4696]: I1202 22:42:59.201518 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:59 crc kubenswrapper[4696]: I1202 22:42:59.201538 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:59 crc kubenswrapper[4696]: I1202 22:42:59.201565 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:59 crc kubenswrapper[4696]: I1202 22:42:59.201583 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:59Z","lastTransitionTime":"2025-12-02T22:42:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:59 crc kubenswrapper[4696]: I1202 22:42:59.304584 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:59 crc kubenswrapper[4696]: I1202 22:42:59.304647 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:59 crc kubenswrapper[4696]: I1202 22:42:59.304671 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:59 crc kubenswrapper[4696]: I1202 22:42:59.304697 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:59 crc kubenswrapper[4696]: I1202 22:42:59.304716 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:59Z","lastTransitionTime":"2025-12-02T22:42:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:59 crc kubenswrapper[4696]: I1202 22:42:59.407965 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:59 crc kubenswrapper[4696]: I1202 22:42:59.408042 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:59 crc kubenswrapper[4696]: I1202 22:42:59.408066 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:59 crc kubenswrapper[4696]: I1202 22:42:59.408101 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:59 crc kubenswrapper[4696]: I1202 22:42:59.408122 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:59Z","lastTransitionTime":"2025-12-02T22:42:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:59 crc kubenswrapper[4696]: I1202 22:42:59.511487 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:59 crc kubenswrapper[4696]: I1202 22:42:59.511568 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:59 crc kubenswrapper[4696]: I1202 22:42:59.511588 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:59 crc kubenswrapper[4696]: I1202 22:42:59.511617 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:59 crc kubenswrapper[4696]: I1202 22:42:59.511636 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:59Z","lastTransitionTime":"2025-12-02T22:42:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:59 crc kubenswrapper[4696]: I1202 22:42:59.615215 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:59 crc kubenswrapper[4696]: I1202 22:42:59.615307 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:59 crc kubenswrapper[4696]: I1202 22:42:59.615328 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:59 crc kubenswrapper[4696]: I1202 22:42:59.615357 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:59 crc kubenswrapper[4696]: I1202 22:42:59.615378 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:59Z","lastTransitionTime":"2025-12-02T22:42:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:59 crc kubenswrapper[4696]: I1202 22:42:59.719108 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:59 crc kubenswrapper[4696]: I1202 22:42:59.719202 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:59 crc kubenswrapper[4696]: I1202 22:42:59.719220 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:59 crc kubenswrapper[4696]: I1202 22:42:59.719248 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:59 crc kubenswrapper[4696]: I1202 22:42:59.719268 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:59Z","lastTransitionTime":"2025-12-02T22:42:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:59 crc kubenswrapper[4696]: I1202 22:42:59.823282 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:59 crc kubenswrapper[4696]: I1202 22:42:59.823358 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:59 crc kubenswrapper[4696]: I1202 22:42:59.823376 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:59 crc kubenswrapper[4696]: I1202 22:42:59.823407 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:59 crc kubenswrapper[4696]: I1202 22:42:59.823426 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:59Z","lastTransitionTime":"2025-12-02T22:42:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:42:59 crc kubenswrapper[4696]: I1202 22:42:59.927140 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:42:59 crc kubenswrapper[4696]: I1202 22:42:59.927199 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:42:59 crc kubenswrapper[4696]: I1202 22:42:59.927211 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:42:59 crc kubenswrapper[4696]: I1202 22:42:59.927233 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:42:59 crc kubenswrapper[4696]: I1202 22:42:59.927274 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:42:59Z","lastTransitionTime":"2025-12-02T22:42:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:00 crc kubenswrapper[4696]: I1202 22:43:00.030966 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:00 crc kubenswrapper[4696]: I1202 22:43:00.031046 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:00 crc kubenswrapper[4696]: I1202 22:43:00.031071 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:00 crc kubenswrapper[4696]: I1202 22:43:00.031139 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:00 crc kubenswrapper[4696]: I1202 22:43:00.031166 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:00Z","lastTransitionTime":"2025-12-02T22:43:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:00 crc kubenswrapper[4696]: I1202 22:43:00.134826 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:00 crc kubenswrapper[4696]: I1202 22:43:00.134876 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:00 crc kubenswrapper[4696]: I1202 22:43:00.134887 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:00 crc kubenswrapper[4696]: I1202 22:43:00.134905 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:00 crc kubenswrapper[4696]: I1202 22:43:00.134917 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:00Z","lastTransitionTime":"2025-12-02T22:43:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:00 crc kubenswrapper[4696]: I1202 22:43:00.239155 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:00 crc kubenswrapper[4696]: I1202 22:43:00.239229 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:00 crc kubenswrapper[4696]: I1202 22:43:00.239252 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:00 crc kubenswrapper[4696]: I1202 22:43:00.239298 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:00 crc kubenswrapper[4696]: I1202 22:43:00.239319 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:00Z","lastTransitionTime":"2025-12-02T22:43:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:00 crc kubenswrapper[4696]: I1202 22:43:00.344450 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:00 crc kubenswrapper[4696]: I1202 22:43:00.344519 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:00 crc kubenswrapper[4696]: I1202 22:43:00.344555 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:00 crc kubenswrapper[4696]: I1202 22:43:00.344572 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:00 crc kubenswrapper[4696]: I1202 22:43:00.344584 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:00Z","lastTransitionTime":"2025-12-02T22:43:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:43:00 crc kubenswrapper[4696]: I1202 22:43:00.431432 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q9bfc" Dec 02 22:43:00 crc kubenswrapper[4696]: I1202 22:43:00.431524 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:43:00 crc kubenswrapper[4696]: I1202 22:43:00.431574 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:43:00 crc kubenswrapper[4696]: I1202 22:43:00.431532 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:43:00 crc kubenswrapper[4696]: E1202 22:43:00.431800 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q9bfc" podUID="ad00195c-ef4c-4d9b-941c-d01ebc498593" Dec 02 22:43:00 crc kubenswrapper[4696]: E1202 22:43:00.431892 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 22:43:00 crc kubenswrapper[4696]: E1202 22:43:00.432062 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 22:43:00 crc kubenswrapper[4696]: E1202 22:43:00.432130 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 22:43:00 crc kubenswrapper[4696]: I1202 22:43:00.449879 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:00 crc kubenswrapper[4696]: I1202 22:43:00.450110 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:00 crc kubenswrapper[4696]: I1202 22:43:00.450246 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:00 crc kubenswrapper[4696]: I1202 22:43:00.450392 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:00 crc kubenswrapper[4696]: I1202 22:43:00.450525 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:00Z","lastTransitionTime":"2025-12-02T22:43:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:00 crc kubenswrapper[4696]: I1202 22:43:00.554404 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:00 crc kubenswrapper[4696]: I1202 22:43:00.554466 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:00 crc kubenswrapper[4696]: I1202 22:43:00.554479 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:00 crc kubenswrapper[4696]: I1202 22:43:00.554501 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:00 crc kubenswrapper[4696]: I1202 22:43:00.554514 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:00Z","lastTransitionTime":"2025-12-02T22:43:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:00 crc kubenswrapper[4696]: I1202 22:43:00.657567 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:00 crc kubenswrapper[4696]: I1202 22:43:00.657631 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:00 crc kubenswrapper[4696]: I1202 22:43:00.657648 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:00 crc kubenswrapper[4696]: I1202 22:43:00.657673 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:00 crc kubenswrapper[4696]: I1202 22:43:00.657691 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:00Z","lastTransitionTime":"2025-12-02T22:43:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:00 crc kubenswrapper[4696]: I1202 22:43:00.761119 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:00 crc kubenswrapper[4696]: I1202 22:43:00.761369 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:00 crc kubenswrapper[4696]: I1202 22:43:00.761498 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:00 crc kubenswrapper[4696]: I1202 22:43:00.761603 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:00 crc kubenswrapper[4696]: I1202 22:43:00.761705 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:00Z","lastTransitionTime":"2025-12-02T22:43:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:00 crc kubenswrapper[4696]: I1202 22:43:00.865086 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:00 crc kubenswrapper[4696]: I1202 22:43:00.865139 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:00 crc kubenswrapper[4696]: I1202 22:43:00.865153 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:00 crc kubenswrapper[4696]: I1202 22:43:00.865175 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:00 crc kubenswrapper[4696]: I1202 22:43:00.865188 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:00Z","lastTransitionTime":"2025-12-02T22:43:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:00 crc kubenswrapper[4696]: I1202 22:43:00.968151 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:00 crc kubenswrapper[4696]: I1202 22:43:00.968210 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:00 crc kubenswrapper[4696]: I1202 22:43:00.968222 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:00 crc kubenswrapper[4696]: I1202 22:43:00.968242 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:00 crc kubenswrapper[4696]: I1202 22:43:00.968253 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:00Z","lastTransitionTime":"2025-12-02T22:43:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:01 crc kubenswrapper[4696]: I1202 22:43:01.071870 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:01 crc kubenswrapper[4696]: I1202 22:43:01.071958 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:01 crc kubenswrapper[4696]: I1202 22:43:01.071990 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:01 crc kubenswrapper[4696]: I1202 22:43:01.072026 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:01 crc kubenswrapper[4696]: I1202 22:43:01.072050 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:01Z","lastTransitionTime":"2025-12-02T22:43:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:01 crc kubenswrapper[4696]: I1202 22:43:01.176656 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:01 crc kubenswrapper[4696]: I1202 22:43:01.176716 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:01 crc kubenswrapper[4696]: I1202 22:43:01.176731 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:01 crc kubenswrapper[4696]: I1202 22:43:01.176780 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:01 crc kubenswrapper[4696]: I1202 22:43:01.176797 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:01Z","lastTransitionTime":"2025-12-02T22:43:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:01 crc kubenswrapper[4696]: I1202 22:43:01.280211 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:01 crc kubenswrapper[4696]: I1202 22:43:01.280293 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:01 crc kubenswrapper[4696]: I1202 22:43:01.280314 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:01 crc kubenswrapper[4696]: I1202 22:43:01.280351 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:01 crc kubenswrapper[4696]: I1202 22:43:01.280372 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:01Z","lastTransitionTime":"2025-12-02T22:43:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:01 crc kubenswrapper[4696]: I1202 22:43:01.384035 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:01 crc kubenswrapper[4696]: I1202 22:43:01.384132 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:01 crc kubenswrapper[4696]: I1202 22:43:01.384161 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:01 crc kubenswrapper[4696]: I1202 22:43:01.384197 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:01 crc kubenswrapper[4696]: I1202 22:43:01.384226 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:01Z","lastTransitionTime":"2025-12-02T22:43:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:01 crc kubenswrapper[4696]: I1202 22:43:01.488215 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:01 crc kubenswrapper[4696]: I1202 22:43:01.488278 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:01 crc kubenswrapper[4696]: I1202 22:43:01.488295 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:01 crc kubenswrapper[4696]: I1202 22:43:01.488317 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:01 crc kubenswrapper[4696]: I1202 22:43:01.488336 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:01Z","lastTransitionTime":"2025-12-02T22:43:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:01 crc kubenswrapper[4696]: I1202 22:43:01.591787 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:01 crc kubenswrapper[4696]: I1202 22:43:01.591852 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:01 crc kubenswrapper[4696]: I1202 22:43:01.591869 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:01 crc kubenswrapper[4696]: I1202 22:43:01.591900 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:01 crc kubenswrapper[4696]: I1202 22:43:01.591920 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:01Z","lastTransitionTime":"2025-12-02T22:43:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:01 crc kubenswrapper[4696]: I1202 22:43:01.695326 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:01 crc kubenswrapper[4696]: I1202 22:43:01.695380 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:01 crc kubenswrapper[4696]: I1202 22:43:01.695393 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:01 crc kubenswrapper[4696]: I1202 22:43:01.695412 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:01 crc kubenswrapper[4696]: I1202 22:43:01.695426 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:01Z","lastTransitionTime":"2025-12-02T22:43:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:01 crc kubenswrapper[4696]: I1202 22:43:01.799301 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:01 crc kubenswrapper[4696]: I1202 22:43:01.799380 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:01 crc kubenswrapper[4696]: I1202 22:43:01.799400 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:01 crc kubenswrapper[4696]: I1202 22:43:01.799429 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:01 crc kubenswrapper[4696]: I1202 22:43:01.799452 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:01Z","lastTransitionTime":"2025-12-02T22:43:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:01 crc kubenswrapper[4696]: I1202 22:43:01.903126 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:01 crc kubenswrapper[4696]: I1202 22:43:01.903193 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:01 crc kubenswrapper[4696]: I1202 22:43:01.903212 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:01 crc kubenswrapper[4696]: I1202 22:43:01.903240 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:01 crc kubenswrapper[4696]: I1202 22:43:01.903260 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:01Z","lastTransitionTime":"2025-12-02T22:43:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:02 crc kubenswrapper[4696]: I1202 22:43:02.007613 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:02 crc kubenswrapper[4696]: I1202 22:43:02.007717 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:02 crc kubenswrapper[4696]: I1202 22:43:02.007764 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:02 crc kubenswrapper[4696]: I1202 22:43:02.007797 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:02 crc kubenswrapper[4696]: I1202 22:43:02.007825 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:02Z","lastTransitionTime":"2025-12-02T22:43:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:02 crc kubenswrapper[4696]: I1202 22:43:02.111196 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:02 crc kubenswrapper[4696]: I1202 22:43:02.111278 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:02 crc kubenswrapper[4696]: I1202 22:43:02.111295 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:02 crc kubenswrapper[4696]: I1202 22:43:02.111323 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:02 crc kubenswrapper[4696]: I1202 22:43:02.111342 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:02Z","lastTransitionTime":"2025-12-02T22:43:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Dec 02 22:43:02 crc kubenswrapper[4696]: I1202 22:43:02.215365 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 22:43:02 crc kubenswrapper[4696]: I1202 22:43:02.215446 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 22:43:02 crc kubenswrapper[4696]: I1202 22:43:02.215467 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 22:43:02 crc kubenswrapper[4696]: I1202 22:43:02.215498 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 22:43:02 crc kubenswrapper[4696]: I1202 22:43:02.215520 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:02Z","lastTransitionTime":"2025-12-02T22:43:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 22:43:02 crc kubenswrapper[4696]: I1202 22:43:02.316195 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ad00195c-ef4c-4d9b-941c-d01ebc498593-metrics-certs\") pod \"network-metrics-daemon-q9bfc\" (UID: \"ad00195c-ef4c-4d9b-941c-d01ebc498593\") " pod="openshift-multus/network-metrics-daemon-q9bfc"
Dec 02 22:43:02 crc kubenswrapper[4696]: E1202 22:43:02.316402 4696 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Dec 02 22:43:02 crc kubenswrapper[4696]: E1202 22:43:02.316540 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ad00195c-ef4c-4d9b-941c-d01ebc498593-metrics-certs podName:ad00195c-ef4c-4d9b-941c-d01ebc498593 nodeName:}" failed. No retries permitted until 2025-12-02 22:43:10.316515212 +0000 UTC m=+53.197195213 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ad00195c-ef4c-4d9b-941c-d01ebc498593-metrics-certs") pod "network-metrics-daemon-q9bfc" (UID: "ad00195c-ef4c-4d9b-941c-d01ebc498593") : object "openshift-multus"/"metrics-daemon-secret" not registered
Dec 02 22:43:02 crc kubenswrapper[4696]: I1202 22:43:02.318527 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 22:43:02 crc kubenswrapper[4696]: I1202 22:43:02.318585 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 22:43:02 crc kubenswrapper[4696]: I1202 22:43:02.318603 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 22:43:02 crc kubenswrapper[4696]: I1202 22:43:02.318634 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 22:43:02 crc kubenswrapper[4696]: I1202 22:43:02.318651 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:02Z","lastTransitionTime":"2025-12-02T22:43:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 22:43:02 crc kubenswrapper[4696]: I1202 22:43:02.422216 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 22:43:02 crc kubenswrapper[4696]: I1202 22:43:02.422276 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 22:43:02 crc kubenswrapper[4696]: I1202 22:43:02.422293 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 22:43:02 crc kubenswrapper[4696]: I1202 22:43:02.422321 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 22:43:02 crc kubenswrapper[4696]: I1202 22:43:02.422339 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:02Z","lastTransitionTime":"2025-12-02T22:43:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 22:43:02 crc kubenswrapper[4696]: I1202 22:43:02.431446 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 02 22:43:02 crc kubenswrapper[4696]: I1202 22:43:02.431485 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 02 22:43:02 crc kubenswrapper[4696]: I1202 22:43:02.431553 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 02 22:43:02 crc kubenswrapper[4696]: I1202 22:43:02.431582 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q9bfc"
Dec 02 22:43:02 crc kubenswrapper[4696]: E1202 22:43:02.431618 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 02 22:43:02 crc kubenswrapper[4696]: E1202 22:43:02.431863 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q9bfc" podUID="ad00195c-ef4c-4d9b-941c-d01ebc498593"
Dec 02 22:43:02 crc kubenswrapper[4696]: E1202 22:43:02.431900 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 02 22:43:02 crc kubenswrapper[4696]: E1202 22:43:02.432070 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 02 22:43:02 crc kubenswrapper[4696]: I1202 22:43:02.524501 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 22:43:02 crc kubenswrapper[4696]: I1202 22:43:02.524561 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 22:43:02 crc kubenswrapper[4696]: I1202 22:43:02.524574 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 22:43:02 crc kubenswrapper[4696]: I1202 22:43:02.524595 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 22:43:02 crc kubenswrapper[4696]: I1202 22:43:02.524611 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:02Z","lastTransitionTime":"2025-12-02T22:43:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 22:43:02 crc kubenswrapper[4696]: I1202 22:43:02.627039 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 22:43:02 crc kubenswrapper[4696]: I1202 22:43:02.627101 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 22:43:02 crc kubenswrapper[4696]: I1202 22:43:02.627113 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 22:43:02 crc kubenswrapper[4696]: I1202 22:43:02.627130 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 22:43:02 crc kubenswrapper[4696]: I1202 22:43:02.627141 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:02Z","lastTransitionTime":"2025-12-02T22:43:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 22:43:02 crc kubenswrapper[4696]: I1202 22:43:02.730447 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 22:43:02 crc kubenswrapper[4696]: I1202 22:43:02.730524 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 22:43:02 crc kubenswrapper[4696]: I1202 22:43:02.730547 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 22:43:02 crc kubenswrapper[4696]: I1202 22:43:02.730595 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 22:43:02 crc kubenswrapper[4696]: I1202 22:43:02.730614 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:02Z","lastTransitionTime":"2025-12-02T22:43:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 22:43:02 crc kubenswrapper[4696]: I1202 22:43:02.833769 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 22:43:02 crc kubenswrapper[4696]: I1202 22:43:02.833816 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 22:43:02 crc kubenswrapper[4696]: I1202 22:43:02.833827 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 22:43:02 crc kubenswrapper[4696]: I1202 22:43:02.833845 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 22:43:02 crc kubenswrapper[4696]: I1202 22:43:02.833855 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:02Z","lastTransitionTime":"2025-12-02T22:43:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 22:43:02 crc kubenswrapper[4696]: I1202 22:43:02.941989 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 22:43:02 crc kubenswrapper[4696]: I1202 22:43:02.942061 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 22:43:02 crc kubenswrapper[4696]: I1202 22:43:02.942080 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 22:43:02 crc kubenswrapper[4696]: I1202 22:43:02.942108 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 22:43:02 crc kubenswrapper[4696]: I1202 22:43:02.942128 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:02Z","lastTransitionTime":"2025-12-02T22:43:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 22:43:03 crc kubenswrapper[4696]: I1202 22:43:03.045610 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 22:43:03 crc kubenswrapper[4696]: I1202 22:43:03.045698 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 22:43:03 crc kubenswrapper[4696]: I1202 22:43:03.045721 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 22:43:03 crc kubenswrapper[4696]: I1202 22:43:03.045793 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 22:43:03 crc kubenswrapper[4696]: I1202 22:43:03.045820 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:03Z","lastTransitionTime":"2025-12-02T22:43:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 22:43:03 crc kubenswrapper[4696]: I1202 22:43:03.148502 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 22:43:03 crc kubenswrapper[4696]: I1202 22:43:03.148584 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 22:43:03 crc kubenswrapper[4696]: I1202 22:43:03.148608 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 22:43:03 crc kubenswrapper[4696]: I1202 22:43:03.148638 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 22:43:03 crc kubenswrapper[4696]: I1202 22:43:03.148664 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:03Z","lastTransitionTime":"2025-12-02T22:43:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 22:43:03 crc kubenswrapper[4696]: I1202 22:43:03.251668 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 22:43:03 crc kubenswrapper[4696]: I1202 22:43:03.251789 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 22:43:03 crc kubenswrapper[4696]: I1202 22:43:03.251833 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 22:43:03 crc kubenswrapper[4696]: I1202 22:43:03.251857 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 22:43:03 crc kubenswrapper[4696]: I1202 22:43:03.251871 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:03Z","lastTransitionTime":"2025-12-02T22:43:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 22:43:03 crc kubenswrapper[4696]: I1202 22:43:03.355560 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 22:43:03 crc kubenswrapper[4696]: I1202 22:43:03.355624 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 22:43:03 crc kubenswrapper[4696]: I1202 22:43:03.355641 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 22:43:03 crc kubenswrapper[4696]: I1202 22:43:03.355663 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 22:43:03 crc kubenswrapper[4696]: I1202 22:43:03.355678 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:03Z","lastTransitionTime":"2025-12-02T22:43:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 22:43:03 crc kubenswrapper[4696]: I1202 22:43:03.459401 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 22:43:03 crc kubenswrapper[4696]: I1202 22:43:03.459473 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 22:43:03 crc kubenswrapper[4696]: I1202 22:43:03.459492 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 22:43:03 crc kubenswrapper[4696]: I1202 22:43:03.459520 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 22:43:03 crc kubenswrapper[4696]: I1202 22:43:03.459540 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:03Z","lastTransitionTime":"2025-12-02T22:43:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 22:43:03 crc kubenswrapper[4696]: I1202 22:43:03.562304 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 22:43:03 crc kubenswrapper[4696]: I1202 22:43:03.562367 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 22:43:03 crc kubenswrapper[4696]: I1202 22:43:03.562390 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 22:43:03 crc kubenswrapper[4696]: I1202 22:43:03.562419 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 22:43:03 crc kubenswrapper[4696]: I1202 22:43:03.562440 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:03Z","lastTransitionTime":"2025-12-02T22:43:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 22:43:03 crc kubenswrapper[4696]: I1202 22:43:03.665426 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 22:43:03 crc kubenswrapper[4696]: I1202 22:43:03.665497 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 22:43:03 crc kubenswrapper[4696]: I1202 22:43:03.665517 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 22:43:03 crc kubenswrapper[4696]: I1202 22:43:03.665547 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 22:43:03 crc kubenswrapper[4696]: I1202 22:43:03.665564 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:03Z","lastTransitionTime":"2025-12-02T22:43:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 22:43:03 crc kubenswrapper[4696]: I1202 22:43:03.769047 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 22:43:03 crc kubenswrapper[4696]: I1202 22:43:03.769109 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 22:43:03 crc kubenswrapper[4696]: I1202 22:43:03.769123 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 22:43:03 crc kubenswrapper[4696]: I1202 22:43:03.769143 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 22:43:03 crc kubenswrapper[4696]: I1202 22:43:03.769155 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:03Z","lastTransitionTime":"2025-12-02T22:43:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 22:43:03 crc kubenswrapper[4696]: I1202 22:43:03.872660 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 22:43:03 crc kubenswrapper[4696]: I1202 22:43:03.872715 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 22:43:03 crc kubenswrapper[4696]: I1202 22:43:03.872727 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 22:43:03 crc kubenswrapper[4696]: I1202 22:43:03.872769 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 22:43:03 crc kubenswrapper[4696]: I1202 22:43:03.872783 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:03Z","lastTransitionTime":"2025-12-02T22:43:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 22:43:03 crc kubenswrapper[4696]: I1202 22:43:03.980171 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 22:43:03 crc kubenswrapper[4696]: I1202 22:43:03.980262 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 22:43:03 crc kubenswrapper[4696]: I1202 22:43:03.980292 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 22:43:03 crc kubenswrapper[4696]: I1202 22:43:03.980326 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 22:43:03 crc kubenswrapper[4696]: I1202 22:43:03.980348 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:03Z","lastTransitionTime":"2025-12-02T22:43:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 22:43:04 crc kubenswrapper[4696]: I1202 22:43:04.084193 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 22:43:04 crc kubenswrapper[4696]: I1202 22:43:04.084258 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 22:43:04 crc kubenswrapper[4696]: I1202 22:43:04.084282 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 22:43:04 crc kubenswrapper[4696]: I1202 22:43:04.084307 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 22:43:04 crc kubenswrapper[4696]: I1202 22:43:04.084326 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:04Z","lastTransitionTime":"2025-12-02T22:43:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 22:43:04 crc kubenswrapper[4696]: I1202 22:43:04.187254 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 22:43:04 crc kubenswrapper[4696]: I1202 22:43:04.187321 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 22:43:04 crc kubenswrapper[4696]: I1202 22:43:04.187348 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 22:43:04 crc kubenswrapper[4696]: I1202 22:43:04.187374 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 22:43:04 crc kubenswrapper[4696]: I1202 22:43:04.187391 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:04Z","lastTransitionTime":"2025-12-02T22:43:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 22:43:04 crc kubenswrapper[4696]: I1202 22:43:04.291105 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 22:43:04 crc kubenswrapper[4696]: I1202 22:43:04.291167 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 22:43:04 crc kubenswrapper[4696]: I1202 22:43:04.291180 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 22:43:04 crc kubenswrapper[4696]: I1202 22:43:04.291198 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 22:43:04 crc kubenswrapper[4696]: I1202 22:43:04.291214 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:04Z","lastTransitionTime":"2025-12-02T22:43:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 22:43:04 crc kubenswrapper[4696]: I1202 22:43:04.394788 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 22:43:04 crc kubenswrapper[4696]: I1202 22:43:04.394873 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 22:43:04 crc kubenswrapper[4696]: I1202 22:43:04.394897 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 22:43:04 crc kubenswrapper[4696]: I1202 22:43:04.394931 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 22:43:04 crc kubenswrapper[4696]: I1202 22:43:04.394954 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:04Z","lastTransitionTime":"2025-12-02T22:43:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 22:43:04 crc kubenswrapper[4696]: I1202 22:43:04.430884 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 02 22:43:04 crc kubenswrapper[4696]: I1202 22:43:04.430970 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 02 22:43:04 crc kubenswrapper[4696]: I1202 22:43:04.430904 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q9bfc"
Dec 02 22:43:04 crc kubenswrapper[4696]: I1202 22:43:04.431026 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 02 22:43:04 crc kubenswrapper[4696]: E1202 22:43:04.431110 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 02 22:43:04 crc kubenswrapper[4696]: E1202 22:43:04.431254 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q9bfc" podUID="ad00195c-ef4c-4d9b-941c-d01ebc498593"
Dec 02 22:43:04 crc kubenswrapper[4696]: E1202 22:43:04.431435 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 02 22:43:04 crc kubenswrapper[4696]: E1202 22:43:04.431500 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 02 22:43:04 crc kubenswrapper[4696]: I1202 22:43:04.497807 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 22:43:04 crc kubenswrapper[4696]: I1202 22:43:04.497884 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 22:43:04 crc kubenswrapper[4696]: I1202 22:43:04.497943 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 22:43:04 crc kubenswrapper[4696]: I1202 22:43:04.497970 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 22:43:04 crc kubenswrapper[4696]: I1202 22:43:04.497990 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:04Z","lastTransitionTime":"2025-12-02T22:43:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 22:43:04 crc kubenswrapper[4696]: I1202 22:43:04.600996 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 22:43:04 crc kubenswrapper[4696]: I1202 22:43:04.601074 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 22:43:04 crc kubenswrapper[4696]: I1202 22:43:04.601091 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 22:43:04 crc kubenswrapper[4696]: I1202 22:43:04.601118 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 22:43:04 crc kubenswrapper[4696]: I1202 22:43:04.601138 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:04Z","lastTransitionTime":"2025-12-02T22:43:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 22:43:04 crc kubenswrapper[4696]: I1202 22:43:04.704563 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 22:43:04 crc kubenswrapper[4696]: I1202 22:43:04.704627 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 22:43:04 crc kubenswrapper[4696]: I1202 22:43:04.704638 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 22:43:04 crc kubenswrapper[4696]: I1202 22:43:04.704687 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 22:43:04 crc kubenswrapper[4696]: I1202 22:43:04.704706 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:04Z","lastTransitionTime":"2025-12-02T22:43:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 22:43:04 crc kubenswrapper[4696]: I1202 22:43:04.768307 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Dec 02 22:43:04 crc kubenswrapper[4696]: I1202 22:43:04.789605 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"]
Dec 02 22:43:04 crc kubenswrapper[4696]: I1202 22:43:04.792293 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05940584-6baa-49e1-8959-610d50b6eeb5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b361514e74dd950d7c3e1da82304b7c75603829ce7bf03c1fbccab1607d65800\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restar
tCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67bf899df90f18f36310994af135552aa9758f0460117d62d00fdc1de067ce79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d15263c6c1f8208a998cbacdba376a2eb3b2e16134e8fc40f7d56d36daae5ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\
\\"}]},{\\\"containerID\\\":\\\"cri-o://a2fa27b4eeb8b50da5d49c905eb98070362e998c3e3e4aceac7cc09aaef5b5ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:04Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:04 crc kubenswrapper[4696]: I1202 22:43:04.808334 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:04 crc kubenswrapper[4696]: I1202 22:43:04.808374 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:04 crc kubenswrapper[4696]: I1202 22:43:04.808393 4696 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:04 crc kubenswrapper[4696]: I1202 22:43:04.808417 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:04 crc kubenswrapper[4696]: I1202 22:43:04.808437 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:04Z","lastTransitionTime":"2025-12-02T22:43:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:43:04 crc kubenswrapper[4696]: I1202 22:43:04.816312 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sbjst" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a0d758e-6da6-4382-99f1-dd295b63eb98\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bfe4ce91e801752a3eae0d5cfdb5d0e7a4997952987a3a32c10c0292db
170f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://495f007b6af2d409e3ed6cf70dfa90aec5a38c34361c9da9c4fd71d7732cdd76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://495f007b6af2d409e3ed6cf70dfa90aec5a38c34361c9da9c4fd71d7732cdd76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/s
ecrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5b3b6699e2f0b9a8d0f32c1f97c9b3e16d9f5479ad2f355633c25897dbe1bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5b3b6699e2f0b9a8d0f32c1f97c9b3e16d9f5479ad2f355633c25897dbe1bb1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d4a10437554cfb08f90a48d457ef157309ed0ab379c5b26deb62d3e8b73291b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d4a10437554cfb08f90a48d457ef157309ed0ab379c5b26deb62d3e8b73291b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76b52b0824f65df456ead0ca134ada0a6c3120382a3cce18f0a5dfdcc98fa9ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76b52b0824f65df456ead0ca134ada0a6c3120382a3cce18f0a5dfdcc98fa9ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d38fb242bd536b1359095c21c4df58b5a4c64058cfe986b38a581ae0c40a29d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d38fb242bd536b1359095c21c4df58b5a4c64058cfe986b38a581ae0c40a29d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f17767f87136d0916230872e59379331bb9e7f88f3648115905c6c0172f257cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"sta
rted\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f17767f87136d0916230872e59379331bb9e7f88f3648115905c6c0172f257cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sbjst\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:04Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:04 crc kubenswrapper[4696]: I1202 22:43:04.834135 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-chq65" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53353260-c7c9-435c-91eb-3d5a1b441c4a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2d1f154b9c9139bf991adf86bd99e7ff14510c24ae93b87c1c184437a087f17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7tfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa016430a0a36d0628e0e5b0c4baf2fa724c8881
16e5d70517c8ffc4be1c37a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7tfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-chq65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:04Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:04 crc kubenswrapper[4696]: I1202 22:43:04.850304 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rgk7n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07db09ff-2489-4357-af38-aca9655ac1d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9928cbc6ed1d9bb5d0280a8d3264d4490c601fc7d09d6f293f1403615cc4cde5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdd8x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rgk7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:04Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:04 crc kubenswrapper[4696]: I1202 22:43:04.868197 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1db9cf23af7fc9ff1373908e11df0bea7d6e03cd3ed18947cc583fc09a79a05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:04Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:04 crc kubenswrapper[4696]: I1202 22:43:04.889224 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce2fa3867298545942c4a2d2e68899c61a4f99a64b0974e75ef86851074ddfce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4db552c4bc9b395f977cb4c1b1b7fcbb1c7e9f326d81a7903d88eab180bcfb3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:04Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:04 crc kubenswrapper[4696]: I1202 22:43:04.907789 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27fb05354e4e9a6ad09946fad7dbabf07456fe56a9a66578fda0eb71dde2d199\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T22:43:04Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:04 crc kubenswrapper[4696]: I1202 22:43:04.912841 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:04 crc kubenswrapper[4696]: I1202 22:43:04.912874 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:04 crc kubenswrapper[4696]: I1202 22:43:04.912888 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:04 crc kubenswrapper[4696]: I1202 22:43:04.912907 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:04 crc kubenswrapper[4696]: I1202 22:43:04.912919 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:04Z","lastTransitionTime":"2025-12-02T22:43:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:04 crc kubenswrapper[4696]: I1202 22:43:04.934414 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wthxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86a37d2a-37c5-4fbd-b10b-f5e4706772f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5ddd341a3a0b7b1d959ce57bb4d38153a4ca3ae5ec40ded309e0697db2be28d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xqg5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wthxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:04Z 
is after 2025-08-24T17:21:41Z" Dec 02 22:43:04 crc kubenswrapper[4696]: I1202 22:43:04.952638 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ttxcl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca48946e-a7e0-4729-8b02-b223a96990c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55cb8b25a5af0f7f16b7a91ab00cbe24ed8a8b81477e992de74699237caf7dab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},
{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkjwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e10c7f45d8003857b8a736f70eb3125f01996bacbcace4ad08bc703fbdeb62f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkjwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ttxcl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:04Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:04 crc kubenswrapper[4696]: I1202 22:43:04.989763 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3172221-5c4c-4409-b50f-5cd21d8b5030\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f25e7ebcca1d904cd0940a660fd97c6b4a3540fd63700abd0ad973e9f9c8f764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08fc65f2c8162f27bc40cd12b5801ca21c778313271f2cb5dd251ca4312842d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba390cca9997cf469ae88a4895532df556f0b947ce7b34ef7e1992ee236a2e16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0ccd6eaff0fa7ac496aadc31afffd6e7fee2776e9cfd1652af85fcec268d895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b2deed2650220f1d607cbb667b26879d3a8373146244feb486c882032f344b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a565cb3800c1a50f784c327719158de2f6f29d20a33670478b97c1c360e0f6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a565cb3800c1a50f784c327719158de2f6f29d20a33670478b97c1c360e0f6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-02T22:42:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29a2b9cc76fc45b6874d85b2ab92165a02804807837501ac60190c232bab928d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29a2b9cc76fc45b6874d85b2ab92165a02804807837501ac60190c232bab928d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fb3311ccf2a924605464d657193627798dad26a232fab0a2091fb2c387f3bd0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb3311ccf2a924605464d657193627798dad26a232fab0a2091fb2c387f3bd0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:04Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:05 crc kubenswrapper[4696]: I1202 22:43:05.006654 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"158d80a5-8d90-46bb-83e7-ab3263f6d28f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3112751f86a62634d02f281d467d3eeb27878e27c245871210cca37eaeb6e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce744ea765d5bbcb0a9926f439a0256774c16863344f23ddfb0074b4ccbdb4ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1108d710330b056e2228205b2012df070ffbd660701ddd577eafbf0e99c3bf9f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d92ce9d53d9367503c76c5fbb45f340a5ddd93f3c8e1738ac434fe7b0210d1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6268352b995d99d383ab500b5ccd80b0612cf33716a752aeb1d92cb775a1971e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"le observer\\\\nW1202 22:42:37.965183 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 22:42:37.965319 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 22:42:37.967400 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2369937012/tls.crt::/tmp/serving-cert-2369937012/tls.key\\\\\\\"\\\\nI1202 22:42:38.329833 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 22:42:38.331704 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 22:42:38.331724 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 22:42:38.331766 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 22:42:38.331782 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 22:42:38.336692 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 22:42:38.336734 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:42:38.336761 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:42:38.336769 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 22:42:38.336775 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 22:42:38.336779 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 22:42:38.336783 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 22:42:38.337216 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 22:42:38.338194 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://666f81c81a7cad67c01b2aebf2d05f162c33feea27f64c12f92761d6d2a6056e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055bef75732eec4908f0baf8a52cd3921fab03cf88e36d2e0d442344fb1af0c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://055bef75732eec4908f0baf8a52cd3921fab03cf88e36d2e0d442344fb1af0c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:05Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:05 crc kubenswrapper[4696]: I1202 22:43:05.017085 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:05 crc kubenswrapper[4696]: I1202 22:43:05.017133 4696 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:05 crc kubenswrapper[4696]: I1202 22:43:05.017148 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:05 crc kubenswrapper[4696]: I1202 22:43:05.017169 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:05 crc kubenswrapper[4696]: I1202 22:43:05.017185 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:05Z","lastTransitionTime":"2025-12-02T22:43:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:43:05 crc kubenswrapper[4696]: I1202 22:43:05.020929 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:05Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:05 crc kubenswrapper[4696]: I1202 22:43:05.036000 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f57qk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf8ea236-9be9-4eb2-904a-103c4c279f28\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3106b29a0eb0a17827fd331aefd6c00b0c58f4ec0d0ca778bd3f4842730d5f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vxprw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f57qk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:05Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:05 crc kubenswrapper[4696]: I1202 22:43:05.051269 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:05Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:05 crc kubenswrapper[4696]: I1202 22:43:05.059148 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:05 crc kubenswrapper[4696]: I1202 22:43:05.059209 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:05 crc kubenswrapper[4696]: I1202 22:43:05.059228 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:05 crc kubenswrapper[4696]: I1202 22:43:05.059262 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:05 crc kubenswrapper[4696]: I1202 22:43:05.059283 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:05Z","lastTransitionTime":"2025-12-02T22:43:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:43:05 crc kubenswrapper[4696]: I1202 22:43:05.067973 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:05Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:05 crc kubenswrapper[4696]: E1202 22:43:05.084577 4696 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:43:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:43:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:43:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:43:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b680025d-da08-4b46-a4a4-b21ac19e4f7b\\\",\\\"systemUUID\\\":\\\"be4c1bf2-c508-4b46-be45-7efaea566193\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:05Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:05 crc kubenswrapper[4696]: I1202 22:43:05.090513 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:05 crc kubenswrapper[4696]: I1202 22:43:05.090588 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:05 crc kubenswrapper[4696]: I1202 22:43:05.090615 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:05 crc kubenswrapper[4696]: I1202 22:43:05.090654 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:05 crc kubenswrapper[4696]: I1202 22:43:05.090679 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:05Z","lastTransitionTime":"2025-12-02T22:43:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:05 crc kubenswrapper[4696]: I1202 22:43:05.091436 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36feb7e9f6a6bad97e87fe54ff9b661be62f6f4962352d55bcc905b80320a48d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85fbce2a7ea0550f53c1216216022e60a4e2eae0bb1d5c88254ace025fbd4503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64b0c2025c377ab19b3853017c94cb501f472f2485093afc60749f7739ba25d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22ce112f515d4ef59dca4c21bdc8623e88a8a0df222fb2ed0990a2f68e3f75f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52f3d5529aafac7c27c963a50b0318e99d9a9ca35e142d4c273431105c495e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32717a82edb718f5e690fcecd578f48279d134a5dc12f00824fa25f7ec727156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fb85a5789d7eb5dd8ac73a8a47b7241b37df3fae93b8ea2f241113883535f3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bcbeb403dee928a5b37ed7662c4999e027b9b9fed6b8cb704c8e41b5c68c585c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T22:42:52Z\\\",\\\"message\\\":\\\" from k8s.io/client-go/informers/factory.go:160\\\\nI1202 22:42:50.554281 6010 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 22:42:50.554443 6010 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1202 22:42:50.554709 6010 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI1202 22:42:50.556928 6010 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1202 22:42:50.557016 6010 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1202 22:42:50.557094 6010 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1202 22:42:50.557153 6010 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1202 22:42:50.557100 6010 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1202 22:42:50.557332 6010 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1202 22:42:50.557365 6010 handler.go:208] Removed *v1.Node event handler 7\\\\nI1202 22:42:50.557430 6010 handler.go:208] Removed *v1.Node event handler 2\\\\nI1202 22:42:50.557553 6010 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fb85a5789d7eb5dd8ac73a8a47b7241b37df3fae93b8ea2f241113883535f3b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T22:42:55Z\\\",\\\"message\\\":\\\"workPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1202 22:42:55.757361 6138 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 22:42:55.757937 6138 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1202 22:42:55.757959 6138 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1202 22:42:55.757978 6138 handler.go:208] Removed *v1.Node event 
handler 2\\\\nI1202 22:42:55.757986 6138 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1202 22:42:55.757997 6138 handler.go:208] Removed *v1.Node event handler 7\\\\nI1202 22:42:55.758042 6138 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1202 22:42:55.758053 6138 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1202 22:42:55.758077 6138 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1202 22:42:55.758079 6138 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1202 22:42:55.758091 6138 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1202 22:42:55.758120 6138 factory.go:656] Stopping watch factory\\\\nI1202 22:42:55.758126 6138 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1202 22:42:55.758143 6138 ovnkube.go:599] Stopped ovnkube\\\\nI1202 2\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"nam
e\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23b80358b654eec7dc1057ef35a6c09f88ccd7aca7a05336a3f29e7036bd55f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servicea
ccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://492cfa3fb8646c09da06d92ec8a303562da558d4d944eaf68ca95de6f5b94da8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://492cfa3fb8646c09da06d92ec8a303562da558d4d944eaf68ca95de6f5b94da8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qb2zq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:05Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:05 crc kubenswrapper[4696]: I1202 22:43:05.107088 4696 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/network-metrics-daemon-q9bfc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad00195c-ef4c-4d9b-941c-d01ebc498593\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zz6sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zz6sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:54Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q9bfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:05Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:05 crc 
kubenswrapper[4696]: E1202 22:43:05.115091 4696 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:43:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:43:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:43:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:43:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b680025d-da08-4b46-a4a4-b21ac19e4f7b\\\",\\\"systemUUID\\\":\\\"be4c1bf2-c508-4b46-be45-7efaea566193\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:05Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:05 crc kubenswrapper[4696]: I1202 22:43:05.120837 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:05 crc kubenswrapper[4696]: I1202 22:43:05.120934 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:05 crc kubenswrapper[4696]: I1202 22:43:05.120952 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:05 crc kubenswrapper[4696]: I1202 22:43:05.120981 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:05 crc kubenswrapper[4696]: I1202 22:43:05.121001 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:05Z","lastTransitionTime":"2025-12-02T22:43:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:05 crc kubenswrapper[4696]: E1202 22:43:05.142466 4696 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:43:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:43:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:43:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:43:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b680025d-da08-4b46-a4a4-b21ac19e4f7b\\\",\\\"systemUUID\\\":\\\"be4c1bf2-c508-4b46-be45-7efaea566193\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:05Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:05 crc kubenswrapper[4696]: I1202 22:43:05.147690 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:05 crc kubenswrapper[4696]: I1202 22:43:05.147852 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:05 crc kubenswrapper[4696]: I1202 22:43:05.147874 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:05 crc kubenswrapper[4696]: I1202 22:43:05.147901 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:05 crc kubenswrapper[4696]: I1202 22:43:05.147923 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:05Z","lastTransitionTime":"2025-12-02T22:43:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:05 crc kubenswrapper[4696]: E1202 22:43:05.170145 4696 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:43:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:43:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:43:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:43:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b680025d-da08-4b46-a4a4-b21ac19e4f7b\\\",\\\"systemUUID\\\":\\\"be4c1bf2-c508-4b46-be45-7efaea566193\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:05Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:05 crc kubenswrapper[4696]: I1202 22:43:05.174992 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:05 crc kubenswrapper[4696]: I1202 22:43:05.175053 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:05 crc kubenswrapper[4696]: I1202 22:43:05.175072 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:05 crc kubenswrapper[4696]: I1202 22:43:05.175100 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:05 crc kubenswrapper[4696]: I1202 22:43:05.175120 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:05Z","lastTransitionTime":"2025-12-02T22:43:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:05 crc kubenswrapper[4696]: E1202 22:43:05.196959 4696 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:43:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:43:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:43:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:43:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b680025d-da08-4b46-a4a4-b21ac19e4f7b\\\",\\\"systemUUID\\\":\\\"be4c1bf2-c508-4b46-be45-7efaea566193\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:05Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:05 crc kubenswrapper[4696]: E1202 22:43:05.197187 4696 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 02 22:43:05 crc kubenswrapper[4696]: I1202 22:43:05.199516 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:05 crc kubenswrapper[4696]: I1202 22:43:05.199580 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:05 crc kubenswrapper[4696]: I1202 22:43:05.199604 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:05 crc kubenswrapper[4696]: I1202 22:43:05.199634 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:05 crc kubenswrapper[4696]: I1202 22:43:05.199653 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:05Z","lastTransitionTime":"2025-12-02T22:43:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:05 crc kubenswrapper[4696]: I1202 22:43:05.303979 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:05 crc kubenswrapper[4696]: I1202 22:43:05.304245 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:05 crc kubenswrapper[4696]: I1202 22:43:05.304271 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:05 crc kubenswrapper[4696]: I1202 22:43:05.304309 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:05 crc kubenswrapper[4696]: I1202 22:43:05.304339 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:05Z","lastTransitionTime":"2025-12-02T22:43:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:05 crc kubenswrapper[4696]: I1202 22:43:05.407909 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:05 crc kubenswrapper[4696]: I1202 22:43:05.407987 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:05 crc kubenswrapper[4696]: I1202 22:43:05.408008 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:05 crc kubenswrapper[4696]: I1202 22:43:05.408039 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:05 crc kubenswrapper[4696]: I1202 22:43:05.408061 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:05Z","lastTransitionTime":"2025-12-02T22:43:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:05 crc kubenswrapper[4696]: I1202 22:43:05.511806 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:05 crc kubenswrapper[4696]: I1202 22:43:05.511905 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:05 crc kubenswrapper[4696]: I1202 22:43:05.511925 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:05 crc kubenswrapper[4696]: I1202 22:43:05.511955 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:05 crc kubenswrapper[4696]: I1202 22:43:05.511977 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:05Z","lastTransitionTime":"2025-12-02T22:43:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:05 crc kubenswrapper[4696]: I1202 22:43:05.615797 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:05 crc kubenswrapper[4696]: I1202 22:43:05.615868 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:05 crc kubenswrapper[4696]: I1202 22:43:05.615887 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:05 crc kubenswrapper[4696]: I1202 22:43:05.615913 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:05 crc kubenswrapper[4696]: I1202 22:43:05.615932 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:05Z","lastTransitionTime":"2025-12-02T22:43:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:05 crc kubenswrapper[4696]: I1202 22:43:05.719021 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:05 crc kubenswrapper[4696]: I1202 22:43:05.719086 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:05 crc kubenswrapper[4696]: I1202 22:43:05.719105 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:05 crc kubenswrapper[4696]: I1202 22:43:05.719132 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:05 crc kubenswrapper[4696]: I1202 22:43:05.719152 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:05Z","lastTransitionTime":"2025-12-02T22:43:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:05 crc kubenswrapper[4696]: I1202 22:43:05.822550 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:05 crc kubenswrapper[4696]: I1202 22:43:05.822623 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:05 crc kubenswrapper[4696]: I1202 22:43:05.822641 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:05 crc kubenswrapper[4696]: I1202 22:43:05.822667 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:05 crc kubenswrapper[4696]: I1202 22:43:05.822688 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:05Z","lastTransitionTime":"2025-12-02T22:43:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:05 crc kubenswrapper[4696]: I1202 22:43:05.926132 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:05 crc kubenswrapper[4696]: I1202 22:43:05.926205 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:05 crc kubenswrapper[4696]: I1202 22:43:05.926226 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:05 crc kubenswrapper[4696]: I1202 22:43:05.926258 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:05 crc kubenswrapper[4696]: I1202 22:43:05.926277 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:05Z","lastTransitionTime":"2025-12-02T22:43:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:06 crc kubenswrapper[4696]: I1202 22:43:06.037224 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:06 crc kubenswrapper[4696]: I1202 22:43:06.037301 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:06 crc kubenswrapper[4696]: I1202 22:43:06.037323 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:06 crc kubenswrapper[4696]: I1202 22:43:06.037354 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:06 crc kubenswrapper[4696]: I1202 22:43:06.037370 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:06Z","lastTransitionTime":"2025-12-02T22:43:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:06 crc kubenswrapper[4696]: I1202 22:43:06.140597 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:06 crc kubenswrapper[4696]: I1202 22:43:06.140689 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:06 crc kubenswrapper[4696]: I1202 22:43:06.140714 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:06 crc kubenswrapper[4696]: I1202 22:43:06.140784 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:06 crc kubenswrapper[4696]: I1202 22:43:06.140805 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:06Z","lastTransitionTime":"2025-12-02T22:43:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:06 crc kubenswrapper[4696]: I1202 22:43:06.244244 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:06 crc kubenswrapper[4696]: I1202 22:43:06.244332 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:06 crc kubenswrapper[4696]: I1202 22:43:06.244351 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:06 crc kubenswrapper[4696]: I1202 22:43:06.244380 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:06 crc kubenswrapper[4696]: I1202 22:43:06.244405 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:06Z","lastTransitionTime":"2025-12-02T22:43:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:06 crc kubenswrapper[4696]: I1202 22:43:06.349095 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:06 crc kubenswrapper[4696]: I1202 22:43:06.349208 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:06 crc kubenswrapper[4696]: I1202 22:43:06.349233 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:06 crc kubenswrapper[4696]: I1202 22:43:06.349268 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:06 crc kubenswrapper[4696]: I1202 22:43:06.349294 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:06Z","lastTransitionTime":"2025-12-02T22:43:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:43:06 crc kubenswrapper[4696]: I1202 22:43:06.431407 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:43:06 crc kubenswrapper[4696]: I1202 22:43:06.431514 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:43:06 crc kubenswrapper[4696]: I1202 22:43:06.431534 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:43:06 crc kubenswrapper[4696]: E1202 22:43:06.431683 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 22:43:06 crc kubenswrapper[4696]: I1202 22:43:06.431867 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q9bfc" Dec 02 22:43:06 crc kubenswrapper[4696]: E1202 22:43:06.432073 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 22:43:06 crc kubenswrapper[4696]: E1202 22:43:06.432179 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-q9bfc" podUID="ad00195c-ef4c-4d9b-941c-d01ebc498593" Dec 02 22:43:06 crc kubenswrapper[4696]: E1202 22:43:06.432455 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 22:43:06 crc kubenswrapper[4696]: I1202 22:43:06.452412 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:06 crc kubenswrapper[4696]: I1202 22:43:06.452543 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:06 crc kubenswrapper[4696]: I1202 22:43:06.452581 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:06 crc kubenswrapper[4696]: I1202 22:43:06.452613 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:06 crc kubenswrapper[4696]: I1202 22:43:06.452635 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:06Z","lastTransitionTime":"2025-12-02T22:43:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:06 crc kubenswrapper[4696]: I1202 22:43:06.556452 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:06 crc kubenswrapper[4696]: I1202 22:43:06.556520 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:06 crc kubenswrapper[4696]: I1202 22:43:06.556540 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:06 crc kubenswrapper[4696]: I1202 22:43:06.556569 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:06 crc kubenswrapper[4696]: I1202 22:43:06.556588 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:06Z","lastTransitionTime":"2025-12-02T22:43:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:06 crc kubenswrapper[4696]: I1202 22:43:06.659978 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:06 crc kubenswrapper[4696]: I1202 22:43:06.660103 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:06 crc kubenswrapper[4696]: I1202 22:43:06.660123 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:06 crc kubenswrapper[4696]: I1202 22:43:06.660152 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:06 crc kubenswrapper[4696]: I1202 22:43:06.660173 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:06Z","lastTransitionTime":"2025-12-02T22:43:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:06 crc kubenswrapper[4696]: I1202 22:43:06.764251 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:06 crc kubenswrapper[4696]: I1202 22:43:06.764320 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:06 crc kubenswrapper[4696]: I1202 22:43:06.764338 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:06 crc kubenswrapper[4696]: I1202 22:43:06.764363 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:06 crc kubenswrapper[4696]: I1202 22:43:06.764381 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:06Z","lastTransitionTime":"2025-12-02T22:43:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:06 crc kubenswrapper[4696]: I1202 22:43:06.868178 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:06 crc kubenswrapper[4696]: I1202 22:43:06.868284 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:06 crc kubenswrapper[4696]: I1202 22:43:06.868301 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:06 crc kubenswrapper[4696]: I1202 22:43:06.868329 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:06 crc kubenswrapper[4696]: I1202 22:43:06.868348 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:06Z","lastTransitionTime":"2025-12-02T22:43:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:06 crc kubenswrapper[4696]: I1202 22:43:06.972201 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:06 crc kubenswrapper[4696]: I1202 22:43:06.972278 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:06 crc kubenswrapper[4696]: I1202 22:43:06.972302 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:06 crc kubenswrapper[4696]: I1202 22:43:06.972341 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:06 crc kubenswrapper[4696]: I1202 22:43:06.972366 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:06Z","lastTransitionTime":"2025-12-02T22:43:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:07 crc kubenswrapper[4696]: I1202 22:43:07.075498 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:07 crc kubenswrapper[4696]: I1202 22:43:07.075568 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:07 crc kubenswrapper[4696]: I1202 22:43:07.075593 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:07 crc kubenswrapper[4696]: I1202 22:43:07.075628 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:07 crc kubenswrapper[4696]: I1202 22:43:07.075652 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:07Z","lastTransitionTime":"2025-12-02T22:43:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:07 crc kubenswrapper[4696]: I1202 22:43:07.179260 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:07 crc kubenswrapper[4696]: I1202 22:43:07.179349 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:07 crc kubenswrapper[4696]: I1202 22:43:07.179370 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:07 crc kubenswrapper[4696]: I1202 22:43:07.179403 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:07 crc kubenswrapper[4696]: I1202 22:43:07.179424 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:07Z","lastTransitionTime":"2025-12-02T22:43:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:07 crc kubenswrapper[4696]: I1202 22:43:07.282572 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:07 crc kubenswrapper[4696]: I1202 22:43:07.282641 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:07 crc kubenswrapper[4696]: I1202 22:43:07.282653 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:07 crc kubenswrapper[4696]: I1202 22:43:07.282674 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:07 crc kubenswrapper[4696]: I1202 22:43:07.282688 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:07Z","lastTransitionTime":"2025-12-02T22:43:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:07 crc kubenswrapper[4696]: I1202 22:43:07.385934 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:07 crc kubenswrapper[4696]: I1202 22:43:07.386009 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:07 crc kubenswrapper[4696]: I1202 22:43:07.386028 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:07 crc kubenswrapper[4696]: I1202 22:43:07.386058 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:07 crc kubenswrapper[4696]: I1202 22:43:07.386085 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:07Z","lastTransitionTime":"2025-12-02T22:43:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:07 crc kubenswrapper[4696]: I1202 22:43:07.452264 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q9bfc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad00195c-ef4c-4d9b-941c-d01ebc498593\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zz6sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zz6sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:54Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q9bfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:07Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:07 crc 
kubenswrapper[4696]: I1202 22:43:07.474371 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:07Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:07 crc kubenswrapper[4696]: I1202 22:43:07.489769 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:07 crc kubenswrapper[4696]: I1202 22:43:07.489841 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:07 crc kubenswrapper[4696]: I1202 22:43:07.489856 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:07 crc kubenswrapper[4696]: I1202 22:43:07.489878 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:07 crc kubenswrapper[4696]: I1202 22:43:07.489895 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:07Z","lastTransitionTime":"2025-12-02T22:43:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:43:07 crc kubenswrapper[4696]: I1202 22:43:07.497254 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:07Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:07 crc kubenswrapper[4696]: I1202 22:43:07.526566 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36feb7e9f6a6bad97e87fe54ff9b661be62f6f4962352d55bcc905b80320a48d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85fbce2a7ea0550f53c1216216022e60a4e2eae0bb1d5c88254ace025fbd4503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64b0c2025c377ab19b3853017c94cb501f472f2485093afc60749f7739ba25d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22ce112f515d4ef59dca4c21bdc8623e88a8a0df222fb2ed0990a2f68e3f75f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52f3d5529aafac7c27c963a50b0318e99d9a9ca35e142d4c273431105c495e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32717a82edb718f5e690fcecd578f48279d134a5dc12f00824fa25f7ec727156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fb85a5789d7eb5dd8ac73a8a47b7241b37df3fae93b8ea2f241113883535f3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bcbeb403dee928a5b37ed7662c4999e027b9b9fed6b8cb704c8e41b5c68c585c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T22:42:52Z\\\",\\\"message\\\":\\\" from k8s.io/client-go/informers/factory.go:160\\\\nI1202 22:42:50.554281 6010 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 22:42:50.554443 6010 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1202 22:42:50.554709 6010 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI1202 22:42:50.556928 6010 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1202 22:42:50.557016 6010 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1202 22:42:50.557094 6010 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1202 22:42:50.557153 6010 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1202 22:42:50.557100 6010 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1202 22:42:50.557332 6010 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1202 22:42:50.557365 6010 handler.go:208] Removed *v1.Node event handler 7\\\\nI1202 22:42:50.557430 6010 handler.go:208] Removed *v1.Node event handler 2\\\\nI1202 22:42:50.557553 6010 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fb85a5789d7eb5dd8ac73a8a47b7241b37df3fae93b8ea2f241113883535f3b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T22:42:55Z\\\",\\\"message\\\":\\\"workPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1202 22:42:55.757361 6138 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 22:42:55.757937 6138 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1202 22:42:55.757959 6138 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1202 22:42:55.757978 6138 handler.go:208] Removed *v1.Node event 
handler 2\\\\nI1202 22:42:55.757986 6138 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1202 22:42:55.757997 6138 handler.go:208] Removed *v1.Node event handler 7\\\\nI1202 22:42:55.758042 6138 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1202 22:42:55.758053 6138 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1202 22:42:55.758077 6138 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1202 22:42:55.758079 6138 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1202 22:42:55.758091 6138 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1202 22:42:55.758120 6138 factory.go:656] Stopping watch factory\\\\nI1202 22:42:55.758126 6138 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1202 22:42:55.758143 6138 ovnkube.go:599] Stopped ovnkube\\\\nI1202 2\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"nam
e\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23b80358b654eec7dc1057ef35a6c09f88ccd7aca7a05336a3f29e7036bd55f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servicea
ccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://492cfa3fb8646c09da06d92ec8a303562da558d4d944eaf68ca95de6f5b94da8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://492cfa3fb8646c09da06d92ec8a303562da558d4d944eaf68ca95de6f5b94da8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qb2zq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:07Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:07 crc kubenswrapper[4696]: I1202 22:43:07.543359 4696 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-chq65" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53353260-c7c9-435c-91eb-3d5a1b441c4a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2d1f154b9c9139bf991adf86bd99e7ff14510c24ae93b87c1c184437a087f17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7tfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa016430a0a36d0628e0e5b0c4baf2fa724c888116e5d70517c8ffc4be1c37a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7tfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-chq65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:07Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:07 crc kubenswrapper[4696]: I1202 22:43:07.557931 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rgk7n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07db09ff-2489-4357-af38-aca9655ac1d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9928cbc6ed1d9bb5d0280a8d3264d4490c601fc7d09d6f293f1403615cc4cde5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdd8x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rgk7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:07Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:07 crc kubenswrapper[4696]: I1202 22:43:07.573589 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05940584-6baa-49e1-8959-610d50b6eeb5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b361514e74dd950d7c3e1da82304b7c75603829ce7bf03c1fbccab1607d65800\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67bf899df90f18f36310994af135552aa9758f0460117d62d00fdc1de067ce79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d15263c6c1f8208a998cbacdba376a2eb3b2e16134e8fc40f7d56d36daae5ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2fa27b4eeb8b50da5d49c905eb98070362e998c3e3e4aceac7cc09aaef5b5ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:07Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:07 crc kubenswrapper[4696]: I1202 22:43:07.592658 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:07 crc kubenswrapper[4696]: I1202 22:43:07.592732 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 02 22:43:07 crc kubenswrapper[4696]: I1202 22:43:07.592766 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:07 crc kubenswrapper[4696]: I1202 22:43:07.592791 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:07 crc kubenswrapper[4696]: I1202 22:43:07.592809 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:07Z","lastTransitionTime":"2025-12-02T22:43:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:43:07 crc kubenswrapper[4696]: I1202 22:43:07.595553 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sbjst" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a0d758e-6da6-4382-99f1-dd295b63eb98\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bfe4ce91e801752a3eae0d5cfdb5d0e7a4997952987a3a32c10c0292db170f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://495f007b6af2d409e3ed6cf70dfa90aec5a38c34361c9da9c4fd71d7732cdd76\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://495f007b6af2d409e3ed6cf70dfa90aec5a38c34361c9da9c4fd71d7732cdd76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5b3b6699e2f0b9a8d0f32c1f97c9b3e16d9f5479ad2f355633c25897dbe1bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5b3b6699e2f0b9a8d0f32c1f97c9b3e16d9f5479ad2f355633c25897dbe1bb1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:42Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d4a10437554cfb08f90a48d457ef157309ed0ab379c5b26deb62d3e8b73291b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d4a10437554cfb08f90a48d457ef157309ed0ab379c5b26deb62d3e8b73291b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76b52
b0824f65df456ead0ca134ada0a6c3120382a3cce18f0a5dfdcc98fa9ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76b52b0824f65df456ead0ca134ada0a6c3120382a3cce18f0a5dfdcc98fa9ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d38fb242bd536b1359095c21c4df58b5a4c64058cfe986b38a581ae0c40a29d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d38fb242bd536b1359095c21c4df58b5a4c64058cfe986b38a581ae0c40a29d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:45Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f17767f87136d0916230872e59379331bb9e7f88f3648115905c6c0172f257cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f17767f87136d0916230872e59379331bb9e7f88f3648115905c6c0172f257cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sbjst\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:07Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:07 crc kubenswrapper[4696]: I1202 22:43:07.613203 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wthxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86a37d2a-37c5-4fbd-b10b-f5e4706772f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5ddd341a3a0b7b1d959ce57bb4d38153a4ca3ae5ec40ded309e0697db2be28d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-
12-02T22:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xqg5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wthxr\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:07Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:07 crc kubenswrapper[4696]: I1202 22:43:07.628908 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ttxcl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca48946e-a7e0-4729-8b02-b223a96990c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55cb8b25a5af0f7f16b7a91ab00cbe24ed8a8b81477e992de74699237caf7dab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkjwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e10c7f45d8003857b8a736f70eb3125f01996bacbcace4ad08bc703fbdeb62f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkjwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ttxcl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-12-02T22:43:07Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:07 crc kubenswrapper[4696]: I1202 22:43:07.648081 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1db9cf23af7fc9ff1373908e11df0bea7d6e03cd3ed18947cc583fc09a79a05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for 
pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:07Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:07 crc kubenswrapper[4696]: I1202 22:43:07.664647 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce2fa3867298545942c4a2d2e68899c61a4f99a64b0974e75ef86851074ddfce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkub
e-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4db552c4bc9b395f977cb4c1b1b7fcbb1c7e9f326d81a7903d88eab180bcfb3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:07Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:07 crc kubenswrapper[4696]: I1202 22:43:07.680783 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27fb05354e4e9a6ad09946fad7dbabf07456fe56a9a66578fda0eb71dde2d199\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T22:43:07Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:07 crc kubenswrapper[4696]: I1202 22:43:07.697428 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:07 crc kubenswrapper[4696]: I1202 22:43:07.697489 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:07 crc kubenswrapper[4696]: I1202 22:43:07.697508 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:07 crc kubenswrapper[4696]: I1202 22:43:07.697537 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:07 crc kubenswrapper[4696]: I1202 22:43:07.697556 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:07Z","lastTransitionTime":"2025-12-02T22:43:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:07 crc kubenswrapper[4696]: I1202 22:43:07.697867 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:07Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:07 crc kubenswrapper[4696]: I1202 22:43:07.717181 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f57qk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf8ea236-9be9-4eb2-904a-103c4c279f28\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3106b29a0eb0a17827fd331aefd6c00b0c58f4ec0d0ca778bd3f4842730d5f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vxprw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f57qk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:07Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:07 crc kubenswrapper[4696]: I1202 22:43:07.748978 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3172221-5c4c-4409-b50f-5cd21d8b5030\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f25e7ebcca1d904cd0940a660fd97c6b4a3540fd63700abd0ad973e9f9c8f764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08fc65f2c8162f27bc40cd12b5801ca21c778313271f2cb5dd251ca4312842d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba390cca9997cf469ae88a4895532df556f0b947ce7b34ef7e1992ee236a2e16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd
/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0ccd6eaff0fa7ac496aadc31afffd6e7fee2776e9cfd1652af85fcec268d895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b2deed2650220f1d607cbb667b26879d3a8373146244feb486c882032f344b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://4a565cb3800c1a50f784c327719158de2f6f29d20a33670478b97c1c360e0f6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a565cb3800c1a50f784c327719158de2f6f29d20a33670478b97c1c360e0f6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29a2b9cc76fc45b6874d85b2ab92165a02804807837501ac60190c232bab928d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29a2b9cc76fc45b6874d85b2ab92165a02804807837501ac60190c232bab928d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fb3311ccf2a924605464d657193627798dad26a232fab0a2091fb2c387f3bd0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb3311ccf2a924605464d657193627798dad26a232fab0a2091fb2c387f3bd0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:07Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:07 crc kubenswrapper[4696]: I1202 22:43:07.771856 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"158d80a5-8d90-46bb-83e7-ab3263f6d28f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3112751f86a62634d02f281d467d3eeb27878e27c245871210cca37eaeb6e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce744ea765d5bbcb0a9926f439a0256774c16863344f23ddfb0074b4ccbdb4ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1108d710330b056e2228205b2012df070ffbd660701ddd577eafbf0e99c3bf9f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d92ce9d53d9367503c76c5fbb45f340a5ddd93f3c8e1738ac434fe7b0210d1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6268352b995d99d383ab500b5ccd80b0612cf33716a752aeb1d92cb775a1971e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T22:42:38Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW1202 22:42:37.965183 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 22:42:37.965319 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 22:42:37.967400 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2369937012/tls.crt::/tmp/serving-cert-2369937012/tls.key\\\\\\\"\\\\nI1202 22:42:38.329833 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 22:42:38.331704 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 22:42:38.331724 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 22:42:38.331766 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 22:42:38.331782 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 22:42:38.336692 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 22:42:38.336734 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:42:38.336761 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:42:38.336769 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 22:42:38.336775 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 22:42:38.336779 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 22:42:38.336783 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 22:42:38.337216 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1202 22:42:38.338194 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://666f81c81a7cad67c01b2aebf2d05f162c33feea27f64c12f92761d6d2a6056e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055bef75732eec4908f0baf8a52cd3921fab03cf88e36d2e0d442344fb1af0c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://055bef75732eec4908f0baf8a52cd3921
fab03cf88e36d2e0d442344fb1af0c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:07Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:07 crc kubenswrapper[4696]: I1202 22:43:07.783970 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"102b6bc9-3f5d-462f-9814-67699eea05ab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6abcdac2e562f88dfe0b9c02e2a2efb68c3b4d5438abfa08606b27dde0da76b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95c2f64bb4118ce11d7f91dd949ab6e377708029cb0b3c93bd650ac70ea2b650\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e41b58b3df7b93de109c3904416a2943c8dca8cc7e9edc628333d4da461412d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7321d8133c89c6a58d6e42e8df0ba1e4bbdf9cc06fd01054a25afd64934d855a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://7321d8133c89c6a58d6e42e8df0ba1e4bbdf9cc06fd01054a25afd64934d855a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:07Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:07 crc kubenswrapper[4696]: I1202 22:43:07.800604 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:07 crc kubenswrapper[4696]: I1202 22:43:07.800666 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:07 crc kubenswrapper[4696]: I1202 22:43:07.800688 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:07 crc kubenswrapper[4696]: I1202 22:43:07.800718 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:07 crc kubenswrapper[4696]: I1202 22:43:07.800764 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:07Z","lastTransitionTime":"2025-12-02T22:43:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:07 crc kubenswrapper[4696]: I1202 22:43:07.905045 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:07 crc kubenswrapper[4696]: I1202 22:43:07.905114 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:07 crc kubenswrapper[4696]: I1202 22:43:07.905143 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:07 crc kubenswrapper[4696]: I1202 22:43:07.905180 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:07 crc kubenswrapper[4696]: I1202 22:43:07.905210 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:07Z","lastTransitionTime":"2025-12-02T22:43:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:08 crc kubenswrapper[4696]: I1202 22:43:08.008015 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:08 crc kubenswrapper[4696]: I1202 22:43:08.008067 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:08 crc kubenswrapper[4696]: I1202 22:43:08.008086 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:08 crc kubenswrapper[4696]: I1202 22:43:08.008115 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:08 crc kubenswrapper[4696]: I1202 22:43:08.008132 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:08Z","lastTransitionTime":"2025-12-02T22:43:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:08 crc kubenswrapper[4696]: I1202 22:43:08.111687 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:08 crc kubenswrapper[4696]: I1202 22:43:08.111813 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:08 crc kubenswrapper[4696]: I1202 22:43:08.111842 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:08 crc kubenswrapper[4696]: I1202 22:43:08.111882 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:08 crc kubenswrapper[4696]: I1202 22:43:08.111909 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:08Z","lastTransitionTime":"2025-12-02T22:43:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:08 crc kubenswrapper[4696]: I1202 22:43:08.215773 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:08 crc kubenswrapper[4696]: I1202 22:43:08.216078 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:08 crc kubenswrapper[4696]: I1202 22:43:08.216219 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:08 crc kubenswrapper[4696]: I1202 22:43:08.216361 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:08 crc kubenswrapper[4696]: I1202 22:43:08.216486 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:08Z","lastTransitionTime":"2025-12-02T22:43:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:08 crc kubenswrapper[4696]: I1202 22:43:08.320512 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:08 crc kubenswrapper[4696]: I1202 22:43:08.320584 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:08 crc kubenswrapper[4696]: I1202 22:43:08.320601 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:08 crc kubenswrapper[4696]: I1202 22:43:08.320631 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:08 crc kubenswrapper[4696]: I1202 22:43:08.320650 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:08Z","lastTransitionTime":"2025-12-02T22:43:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:08 crc kubenswrapper[4696]: I1202 22:43:08.425309 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:08 crc kubenswrapper[4696]: I1202 22:43:08.425664 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:08 crc kubenswrapper[4696]: I1202 22:43:08.425899 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:08 crc kubenswrapper[4696]: I1202 22:43:08.426103 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:08 crc kubenswrapper[4696]: I1202 22:43:08.426299 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:08Z","lastTransitionTime":"2025-12-02T22:43:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:43:08 crc kubenswrapper[4696]: I1202 22:43:08.431888 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:43:08 crc kubenswrapper[4696]: I1202 22:43:08.431998 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:43:08 crc kubenswrapper[4696]: I1202 22:43:08.431903 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:43:08 crc kubenswrapper[4696]: I1202 22:43:08.431902 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-q9bfc" Dec 02 22:43:08 crc kubenswrapper[4696]: E1202 22:43:08.432102 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 22:43:08 crc kubenswrapper[4696]: E1202 22:43:08.432233 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 22:43:08 crc kubenswrapper[4696]: E1202 22:43:08.432423 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q9bfc" podUID="ad00195c-ef4c-4d9b-941c-d01ebc498593" Dec 02 22:43:08 crc kubenswrapper[4696]: E1202 22:43:08.432611 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 22:43:08 crc kubenswrapper[4696]: I1202 22:43:08.530412 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:08 crc kubenswrapper[4696]: I1202 22:43:08.530483 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:08 crc kubenswrapper[4696]: I1202 22:43:08.530502 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:08 crc kubenswrapper[4696]: I1202 22:43:08.530531 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:08 crc kubenswrapper[4696]: I1202 22:43:08.530555 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:08Z","lastTransitionTime":"2025-12-02T22:43:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:08 crc kubenswrapper[4696]: I1202 22:43:08.633950 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:08 crc kubenswrapper[4696]: I1202 22:43:08.634937 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:08 crc kubenswrapper[4696]: I1202 22:43:08.634980 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:08 crc kubenswrapper[4696]: I1202 22:43:08.635015 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:08 crc kubenswrapper[4696]: I1202 22:43:08.635039 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:08Z","lastTransitionTime":"2025-12-02T22:43:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:08 crc kubenswrapper[4696]: I1202 22:43:08.738945 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:08 crc kubenswrapper[4696]: I1202 22:43:08.739398 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:08 crc kubenswrapper[4696]: I1202 22:43:08.739593 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:08 crc kubenswrapper[4696]: I1202 22:43:08.739786 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:08 crc kubenswrapper[4696]: I1202 22:43:08.739932 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:08Z","lastTransitionTime":"2025-12-02T22:43:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:08 crc kubenswrapper[4696]: I1202 22:43:08.843693 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:08 crc kubenswrapper[4696]: I1202 22:43:08.843794 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:08 crc kubenswrapper[4696]: I1202 22:43:08.843814 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:08 crc kubenswrapper[4696]: I1202 22:43:08.843849 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:08 crc kubenswrapper[4696]: I1202 22:43:08.843869 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:08Z","lastTransitionTime":"2025-12-02T22:43:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:08 crc kubenswrapper[4696]: I1202 22:43:08.947501 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:08 crc kubenswrapper[4696]: I1202 22:43:08.947555 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:08 crc kubenswrapper[4696]: I1202 22:43:08.947581 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:08 crc kubenswrapper[4696]: I1202 22:43:08.947611 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:08 crc kubenswrapper[4696]: I1202 22:43:08.947632 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:08Z","lastTransitionTime":"2025-12-02T22:43:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:09 crc kubenswrapper[4696]: I1202 22:43:09.050884 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:09 crc kubenswrapper[4696]: I1202 22:43:09.051182 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:09 crc kubenswrapper[4696]: I1202 22:43:09.051457 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:09 crc kubenswrapper[4696]: I1202 22:43:09.051635 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:09 crc kubenswrapper[4696]: I1202 22:43:09.051811 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:09Z","lastTransitionTime":"2025-12-02T22:43:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:09 crc kubenswrapper[4696]: I1202 22:43:09.154949 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:09 crc kubenswrapper[4696]: I1202 22:43:09.155015 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:09 crc kubenswrapper[4696]: I1202 22:43:09.155038 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:09 crc kubenswrapper[4696]: I1202 22:43:09.155071 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:09 crc kubenswrapper[4696]: I1202 22:43:09.155106 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:09Z","lastTransitionTime":"2025-12-02T22:43:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:09 crc kubenswrapper[4696]: I1202 22:43:09.258873 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:09 crc kubenswrapper[4696]: I1202 22:43:09.258925 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:09 crc kubenswrapper[4696]: I1202 22:43:09.258940 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:09 crc kubenswrapper[4696]: I1202 22:43:09.258962 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:09 crc kubenswrapper[4696]: I1202 22:43:09.258974 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:09Z","lastTransitionTime":"2025-12-02T22:43:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:09 crc kubenswrapper[4696]: I1202 22:43:09.362453 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:09 crc kubenswrapper[4696]: I1202 22:43:09.362825 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:09 crc kubenswrapper[4696]: I1202 22:43:09.363159 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:09 crc kubenswrapper[4696]: I1202 22:43:09.363350 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:09 crc kubenswrapper[4696]: I1202 22:43:09.363566 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:09Z","lastTransitionTime":"2025-12-02T22:43:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:09 crc kubenswrapper[4696]: I1202 22:43:09.466258 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:09 crc kubenswrapper[4696]: I1202 22:43:09.466332 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:09 crc kubenswrapper[4696]: I1202 22:43:09.466354 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:09 crc kubenswrapper[4696]: I1202 22:43:09.466383 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:09 crc kubenswrapper[4696]: I1202 22:43:09.466402 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:09Z","lastTransitionTime":"2025-12-02T22:43:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:09 crc kubenswrapper[4696]: I1202 22:43:09.570196 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:09 crc kubenswrapper[4696]: I1202 22:43:09.570610 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:09 crc kubenswrapper[4696]: I1202 22:43:09.570833 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:09 crc kubenswrapper[4696]: I1202 22:43:09.570996 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:09 crc kubenswrapper[4696]: I1202 22:43:09.571141 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:09Z","lastTransitionTime":"2025-12-02T22:43:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:09 crc kubenswrapper[4696]: I1202 22:43:09.682114 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:09 crc kubenswrapper[4696]: I1202 22:43:09.682193 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:09 crc kubenswrapper[4696]: I1202 22:43:09.682214 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:09 crc kubenswrapper[4696]: I1202 22:43:09.682246 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:09 crc kubenswrapper[4696]: I1202 22:43:09.682269 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:09Z","lastTransitionTime":"2025-12-02T22:43:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:09 crc kubenswrapper[4696]: I1202 22:43:09.785956 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:09 crc kubenswrapper[4696]: I1202 22:43:09.786036 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:09 crc kubenswrapper[4696]: I1202 22:43:09.786060 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:09 crc kubenswrapper[4696]: I1202 22:43:09.786095 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:09 crc kubenswrapper[4696]: I1202 22:43:09.786122 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:09Z","lastTransitionTime":"2025-12-02T22:43:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:09 crc kubenswrapper[4696]: I1202 22:43:09.889889 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:09 crc kubenswrapper[4696]: I1202 22:43:09.889986 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:09 crc kubenswrapper[4696]: I1202 22:43:09.890012 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:09 crc kubenswrapper[4696]: I1202 22:43:09.890041 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:09 crc kubenswrapper[4696]: I1202 22:43:09.890060 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:09Z","lastTransitionTime":"2025-12-02T22:43:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:09 crc kubenswrapper[4696]: I1202 22:43:09.993320 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:09 crc kubenswrapper[4696]: I1202 22:43:09.993387 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:09 crc kubenswrapper[4696]: I1202 22:43:09.993415 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:09 crc kubenswrapper[4696]: I1202 22:43:09.993449 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:09 crc kubenswrapper[4696]: I1202 22:43:09.993471 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:09Z","lastTransitionTime":"2025-12-02T22:43:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:10 crc kubenswrapper[4696]: I1202 22:43:10.098657 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:10 crc kubenswrapper[4696]: I1202 22:43:10.098724 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:10 crc kubenswrapper[4696]: I1202 22:43:10.098778 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:10 crc kubenswrapper[4696]: I1202 22:43:10.098807 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:10 crc kubenswrapper[4696]: I1202 22:43:10.098828 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:10Z","lastTransitionTime":"2025-12-02T22:43:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:10 crc kubenswrapper[4696]: I1202 22:43:10.202503 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:10 crc kubenswrapper[4696]: I1202 22:43:10.202558 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:10 crc kubenswrapper[4696]: I1202 22:43:10.202576 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:10 crc kubenswrapper[4696]: I1202 22:43:10.202600 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:10 crc kubenswrapper[4696]: I1202 22:43:10.202617 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:10Z","lastTransitionTime":"2025-12-02T22:43:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:10 crc kubenswrapper[4696]: I1202 22:43:10.305794 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:10 crc kubenswrapper[4696]: I1202 22:43:10.305874 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:10 crc kubenswrapper[4696]: I1202 22:43:10.305898 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:10 crc kubenswrapper[4696]: I1202 22:43:10.305931 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:10 crc kubenswrapper[4696]: I1202 22:43:10.305954 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:10Z","lastTransitionTime":"2025-12-02T22:43:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:43:10 crc kubenswrapper[4696]: I1202 22:43:10.313195 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 22:43:10 crc kubenswrapper[4696]: E1202 22:43:10.313545 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-02 22:43:42.313518394 +0000 UTC m=+85.194198435 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:43:10 crc kubenswrapper[4696]: I1202 22:43:10.408887 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:10 crc kubenswrapper[4696]: I1202 22:43:10.408975 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:10 crc kubenswrapper[4696]: I1202 22:43:10.409007 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:10 crc kubenswrapper[4696]: I1202 22:43:10.409046 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:10 crc kubenswrapper[4696]: I1202 22:43:10.409079 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:10Z","lastTransitionTime":"2025-12-02T22:43:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:10 crc kubenswrapper[4696]: I1202 22:43:10.414201 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:43:10 crc kubenswrapper[4696]: I1202 22:43:10.414247 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:43:10 crc kubenswrapper[4696]: I1202 22:43:10.414274 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:43:10 crc kubenswrapper[4696]: I1202 22:43:10.414299 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ad00195c-ef4c-4d9b-941c-d01ebc498593-metrics-certs\") pod \"network-metrics-daemon-q9bfc\" (UID: \"ad00195c-ef4c-4d9b-941c-d01ebc498593\") " pod="openshift-multus/network-metrics-daemon-q9bfc" Dec 02 22:43:10 crc kubenswrapper[4696]: I1202 22:43:10.414327 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") 
pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:43:10 crc kubenswrapper[4696]: E1202 22:43:10.414406 4696 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 02 22:43:10 crc kubenswrapper[4696]: E1202 22:43:10.414431 4696 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 02 22:43:10 crc kubenswrapper[4696]: E1202 22:43:10.414434 4696 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 02 22:43:10 crc kubenswrapper[4696]: E1202 22:43:10.414465 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 22:43:42.414445657 +0000 UTC m=+85.295125658 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 02 22:43:10 crc kubenswrapper[4696]: E1202 22:43:10.414469 4696 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 02 22:43:10 crc kubenswrapper[4696]: E1202 22:43:10.414514 4696 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 22:43:10 crc kubenswrapper[4696]: E1202 22:43:10.414538 4696 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 22:43:10 crc kubenswrapper[4696]: E1202 22:43:10.414486 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ad00195c-ef4c-4d9b-941c-d01ebc498593-metrics-certs podName:ad00195c-ef4c-4d9b-941c-d01ebc498593 nodeName:}" failed. No retries permitted until 2025-12-02 22:43:26.414476708 +0000 UTC m=+69.295156709 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ad00195c-ef4c-4d9b-941c-d01ebc498593-metrics-certs") pod "network-metrics-daemon-q9bfc" (UID: "ad00195c-ef4c-4d9b-941c-d01ebc498593") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 02 22:43:10 crc kubenswrapper[4696]: E1202 22:43:10.414595 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 22:43:42.414585381 +0000 UTC m=+85.295265392 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 02 22:43:10 crc kubenswrapper[4696]: E1202 22:43:10.414615 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-02 22:43:42.414605342 +0000 UTC m=+85.295285343 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 22:43:10 crc kubenswrapper[4696]: E1202 22:43:10.415142 4696 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 02 22:43:10 crc kubenswrapper[4696]: E1202 22:43:10.415161 4696 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 22:43:10 crc kubenswrapper[4696]: E1202 22:43:10.415173 4696 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 22:43:10 crc kubenswrapper[4696]: E1202 22:43:10.415218 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-02 22:43:42.415208039 +0000 UTC m=+85.295888040 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 22:43:10 crc kubenswrapper[4696]: I1202 22:43:10.431625 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:43:10 crc kubenswrapper[4696]: I1202 22:43:10.431732 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:43:10 crc kubenswrapper[4696]: I1202 22:43:10.431805 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q9bfc" Dec 02 22:43:10 crc kubenswrapper[4696]: E1202 22:43:10.431759 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 22:43:10 crc kubenswrapper[4696]: E1202 22:43:10.431908 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-q9bfc" podUID="ad00195c-ef4c-4d9b-941c-d01ebc498593" Dec 02 22:43:10 crc kubenswrapper[4696]: I1202 22:43:10.431734 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:43:10 crc kubenswrapper[4696]: E1202 22:43:10.432006 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 22:43:10 crc kubenswrapper[4696]: E1202 22:43:10.432138 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 22:43:10 crc kubenswrapper[4696]: I1202 22:43:10.433266 4696 scope.go:117] "RemoveContainer" containerID="4fb85a5789d7eb5dd8ac73a8a47b7241b37df3fae93b8ea2f241113883535f3b" Dec 02 22:43:10 crc kubenswrapper[4696]: I1202 22:43:10.470108 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3172221-5c4c-4409-b50f-5cd21d8b5030\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f25e7ebcca1d904cd0940a660fd97c6b4a3540fd63700abd0ad973e9f9c8f764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kube
rnetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08fc65f2c8162f27bc40cd12b5801ca21c778313271f2cb5dd251ca4312842d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba390cca9997cf469ae88a4895532df556f0b947ce7b34ef7e1992ee236a2e16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e
0ccd6eaff0fa7ac496aadc31afffd6e7fee2776e9cfd1652af85fcec268d895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b2deed2650220f1d607cbb667b26879d3a8373146244feb486c882032f344b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a565cb3800c1a50f784c327719158de2f6f29d20a33670478b97c1c360e0f6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c687744
1ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a565cb3800c1a50f784c327719158de2f6f29d20a33670478b97c1c360e0f6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29a2b9cc76fc45b6874d85b2ab92165a02804807837501ac60190c232bab928d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29a2b9cc76fc45b6874d85b2ab92165a02804807837501ac60190c232bab928d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fb3311ccf2a924605464d657193627798dad26a232fab0a2091fb2c387f3bd0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb3311ccf2a924605464d657193627798dad26a232fab0a2091fb2c387f3bd0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:10Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:10 crc kubenswrapper[4696]: I1202 22:43:10.493943 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"158d80a5-8d90-46bb-83e7-ab3263f6d28f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3112751f86a62634d02f281d467d3eeb27878e27c245871210cca37eaeb6e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce744ea765d5bbcb0a9926f439a0256774c16863344f23ddfb0074b4ccbdb4ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1108d710330b056e2228205b2012df070ffbd660701ddd577eafbf0e99c3bf9f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d92ce9d53d9367503c76c5fbb45f340a5ddd93f3c8e1738ac434fe7b0210d1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6268352b995d99d383ab500b5ccd80b0612cf33716a752aeb1d92cb775a1971e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T22:42:38Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW1202 22:42:37.965183 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 22:42:37.965319 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 22:42:37.967400 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2369937012/tls.crt::/tmp/serving-cert-2369937012/tls.key\\\\\\\"\\\\nI1202 22:42:38.329833 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 22:42:38.331704 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 22:42:38.331724 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 22:42:38.331766 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 22:42:38.331782 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 22:42:38.336692 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 22:42:38.336734 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:42:38.336761 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:42:38.336769 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 22:42:38.336775 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 22:42:38.336779 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 22:42:38.336783 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 22:42:38.337216 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1202 22:42:38.338194 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://666f81c81a7cad67c01b2aebf2d05f162c33feea27f64c12f92761d6d2a6056e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055bef75732eec4908f0baf8a52cd3921fab03cf88e36d2e0d442344fb1af0c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://055bef75732eec4908f0baf8a52cd3921
fab03cf88e36d2e0d442344fb1af0c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:10Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:10 crc kubenswrapper[4696]: I1202 22:43:10.512984 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:10 crc kubenswrapper[4696]: I1202 22:43:10.513035 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:10 crc kubenswrapper[4696]: I1202 22:43:10.513070 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:10 crc kubenswrapper[4696]: I1202 22:43:10.513096 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:10 crc kubenswrapper[4696]: I1202 22:43:10.513114 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:10Z","lastTransitionTime":"2025-12-02T22:43:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:10 crc kubenswrapper[4696]: I1202 22:43:10.513507 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"102b6bc9-3f5d-462f-9814-67699eea05ab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6abcdac2e562f88dfe0b9c02e2a2efb68c3b4d5438abfa08606b27dde0da76b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95c2f64bb4118ce11d7f91dd949ab6
e377708029cb0b3c93bd650ac70ea2b650\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e41b58b3df7b93de109c3904416a2943c8dca8cc7e9edc628333d4da461412d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7321d8133c89c6a58d6e42e8df0ba1e4bbdf9cc06fd01054a25afd64934d855a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7321d8133c89c6a58d6e42e8df0ba1e4bbdf9cc06fd01054a25afd64934d855a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:10Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:10 crc kubenswrapper[4696]: I1202 22:43:10.535664 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with 
unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:10Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:10 crc kubenswrapper[4696]: I1202 22:43:10.550660 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f57qk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf8ea236-9be9-4eb2-904a-103c4c279f28\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3106b29a0eb0a17827fd331aefd6c00b0c58f4ec0d0ca778bd3f4842730d5f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vxprw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f57qk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:10Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:10 crc kubenswrapper[4696]: I1202 22:43:10.571041 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:10Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:10 crc kubenswrapper[4696]: I1202 22:43:10.590375 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:10Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:10 crc kubenswrapper[4696]: I1202 22:43:10.613345 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36feb7e9f6a6bad97e87fe54ff9b661be62f6f4962352d55bcc905b80320a48d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85fbce2a7ea0550f53c1216216022e60a4e2eae0bb1d5c88254ace025fbd4503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64b0c2025c377ab19b3853017c94cb501f472f2485093afc60749f7739ba25d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22ce112f515d4ef59dca4c21bdc8623e88a8a0df222fb2ed0990a2f68e3f75f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52f3d5529aafac7c27c963a50b0318e99d9a9ca35e142d4c273431105c495e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32717a82edb718f5e690fcecd578f48279d134a5dc12f00824fa25f7ec727156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fb85a5789d7eb5dd8ac73a8a47b7241b37df3fae93b8ea2f241113883535f3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fb85a5789d7eb5dd8ac73a8a47b7241b37df3fae93b8ea2f241113883535f3b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T22:42:55Z\\\",\\\"message\\\":\\\"workPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1202 22:42:55.757361 6138 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 22:42:55.757937 6138 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1202 
22:42:55.757959 6138 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1202 22:42:55.757978 6138 handler.go:208] Removed *v1.Node event handler 2\\\\nI1202 22:42:55.757986 6138 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1202 22:42:55.757997 6138 handler.go:208] Removed *v1.Node event handler 7\\\\nI1202 22:42:55.758042 6138 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1202 22:42:55.758053 6138 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1202 22:42:55.758077 6138 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1202 22:42:55.758079 6138 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1202 22:42:55.758091 6138 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1202 22:42:55.758120 6138 factory.go:656] Stopping watch factory\\\\nI1202 22:42:55.758126 6138 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1202 22:42:55.758143 6138 ovnkube.go:599] Stopped ovnkube\\\\nI1202 2\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:53Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qb2zq_openshift-ovn-kubernetes(c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23b80358b654eec7dc1057ef35a6c09f88ccd7aca7a05336a3f29e7036bd55f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://492cfa3fb8646c09da06d92ec8a303562da558d4d944eaf68ca95de6f5b94da8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://492cfa3fb8646c09da
06d92ec8a303562da558d4d944eaf68ca95de6f5b94da8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qb2zq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:10Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:10 crc kubenswrapper[4696]: I1202 22:43:10.616690 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:10 crc kubenswrapper[4696]: I1202 22:43:10.616798 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:10 crc kubenswrapper[4696]: I1202 22:43:10.616822 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:10 crc kubenswrapper[4696]: I1202 22:43:10.616855 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:10 crc kubenswrapper[4696]: I1202 22:43:10.616882 4696 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:10Z","lastTransitionTime":"2025-12-02T22:43:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:43:10 crc kubenswrapper[4696]: I1202 22:43:10.630479 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q9bfc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad00195c-ef4c-4d9b-941c-d01ebc498593\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zz6sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zz6sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:54Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q9bfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:10Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:10 crc 
kubenswrapper[4696]: I1202 22:43:10.652759 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05940584-6baa-49e1-8959-610d50b6eeb5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b361514e74dd950d7c3e1da82304b7c75603829ce7bf03c1fbccab1607d65800\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67bf899df90f18f36310994af135552aa9758f0460117d62d00fdc1de067ce79\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d15263c6c1f8208a998cbacdba376a2eb3b2e16134e8fc40f7d56d36daae5ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2fa27b4eeb8b50da5d49c905eb98070362e998c3e3e4aceac7cc09aaef5b5ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:10Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:10 crc kubenswrapper[4696]: I1202 22:43:10.675993 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sbjst" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a0d758e-6da6-4382-99f1-dd295b63eb98\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bfe4ce91e801752a3eae0d5cfdb5d0e7a4997952987a3a32c10c0292db170f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://495f007b6af2d409e3ed6cf70dfa90aec5a38c34361c9da9c4fd71d7732cdd76\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://495f007b6af2d409e3ed6cf70dfa90aec5a38c34361c9da9c4fd71d7732cdd76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5b3b6699e2f0b9a8d0f32c1f97c9b3e16d9f5479ad2f355633c25897dbe1bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5b3b6699e2f0b9a8d0f32c1f97c9b3e16d9f5479ad2f355633c25897dbe1bb1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:42Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d4a10437554cfb08f90a48d457ef157309ed0ab379c5b26deb62d3e8b73291b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d4a10437554cfb08f90a48d457ef157309ed0ab379c5b26deb62d3e8b73291b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76b52
b0824f65df456ead0ca134ada0a6c3120382a3cce18f0a5dfdcc98fa9ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76b52b0824f65df456ead0ca134ada0a6c3120382a3cce18f0a5dfdcc98fa9ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d38fb242bd536b1359095c21c4df58b5a4c64058cfe986b38a581ae0c40a29d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d38fb242bd536b1359095c21c4df58b5a4c64058cfe986b38a581ae0c40a29d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:45Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f17767f87136d0916230872e59379331bb9e7f88f3648115905c6c0172f257cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f17767f87136d0916230872e59379331bb9e7f88f3648115905c6c0172f257cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sbjst\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:10Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:10 crc kubenswrapper[4696]: I1202 22:43:10.693899 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-chq65" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53353260-c7c9-435c-91eb-3d5a1b441c4a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2d1f154b9c9139bf991adf86bd99e7ff14510c24ae93b87c1c184437a087f17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7tfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa016430a0a36d0628e0e5b0c4baf2fa724c888116e5d70517c8ffc4be1c37a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7tfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-chq65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:10Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:10 crc kubenswrapper[4696]: 
I1202 22:43:10.710863 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rgk7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07db09ff-2489-4357-af38-aca9655ac1d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9928cbc6ed1d9bb5d0280a8d3264d4490c601fc7d09d6f293f1403615cc4cde5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdd8x\\\",\\\"readOnly
\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rgk7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:10Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:10 crc kubenswrapper[4696]: I1202 22:43:10.720628 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:10 crc kubenswrapper[4696]: I1202 22:43:10.720680 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:10 crc kubenswrapper[4696]: I1202 22:43:10.720694 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:10 crc kubenswrapper[4696]: I1202 22:43:10.720716 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:10 crc kubenswrapper[4696]: I1202 22:43:10.720732 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:10Z","lastTransitionTime":"2025-12-02T22:43:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:10 crc kubenswrapper[4696]: I1202 22:43:10.732219 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1db9cf23af7fc9ff1373908e11df0bea7d6e03cd3ed18947cc583fc09a79a05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:10Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:10 crc kubenswrapper[4696]: I1202 22:43:10.754262 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce2fa3867298545942c4a2d2e68899c61a4f99a64b0974e75ef86851074ddfce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-id
entity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4db552c4bc9b395f977cb4c1b1b7fcbb1c7e9f326d81a7903d88eab180bcfb3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:10Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:10 crc kubenswrapper[4696]: I1202 22:43:10.776090 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27fb05354e4e9a6ad09946fad7dbabf07456fe56a9a66578fda0eb71dde2d199\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T22:43:10Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:10 crc kubenswrapper[4696]: I1202 22:43:10.799056 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wthxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86a37d2a-37c5-4fbd-b10b-f5e4706772f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5ddd341a3a0b7b1d959ce57bb4d38153a4ca3ae5ec40ded309e0697db2be28d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xqg5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wthxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T22:43:10Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:10 crc kubenswrapper[4696]: I1202 22:43:10.822453 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ttxcl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca48946e-a7e0-4729-8b02-b223a96990c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55cb8b25a5af0f7f16b7a91ab00cbe24ed8a8b81477e992de74699237caf7dab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkjwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e10c7f45d8003857b8a736f70eb3125f01996bacbcace4ad08bc703fbdeb62f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkjwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ttxcl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:10Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:10 crc kubenswrapper[4696]: I1202 22:43:10.824485 4696 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:10 crc kubenswrapper[4696]: I1202 22:43:10.824575 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:10 crc kubenswrapper[4696]: I1202 22:43:10.824599 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:10 crc kubenswrapper[4696]: I1202 22:43:10.824632 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:10 crc kubenswrapper[4696]: I1202 22:43:10.824659 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:10Z","lastTransitionTime":"2025-12-02T22:43:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:10 crc kubenswrapper[4696]: I1202 22:43:10.928303 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:10 crc kubenswrapper[4696]: I1202 22:43:10.928388 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:10 crc kubenswrapper[4696]: I1202 22:43:10.928408 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:10 crc kubenswrapper[4696]: I1202 22:43:10.928442 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:10 crc kubenswrapper[4696]: I1202 22:43:10.928466 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:10Z","lastTransitionTime":"2025-12-02T22:43:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:11 crc kubenswrapper[4696]: I1202 22:43:11.031408 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:11 crc kubenswrapper[4696]: I1202 22:43:11.031473 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:11 crc kubenswrapper[4696]: I1202 22:43:11.031492 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:11 crc kubenswrapper[4696]: I1202 22:43:11.031517 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:11 crc kubenswrapper[4696]: I1202 22:43:11.031535 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:11Z","lastTransitionTime":"2025-12-02T22:43:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:11 crc kubenswrapper[4696]: I1202 22:43:11.135649 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:11 crc kubenswrapper[4696]: I1202 22:43:11.135772 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:11 crc kubenswrapper[4696]: I1202 22:43:11.135794 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:11 crc kubenswrapper[4696]: I1202 22:43:11.135818 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:11 crc kubenswrapper[4696]: I1202 22:43:11.135835 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:11Z","lastTransitionTime":"2025-12-02T22:43:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:11 crc kubenswrapper[4696]: I1202 22:43:11.239635 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:11 crc kubenswrapper[4696]: I1202 22:43:11.239724 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:11 crc kubenswrapper[4696]: I1202 22:43:11.239785 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:11 crc kubenswrapper[4696]: I1202 22:43:11.239818 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:11 crc kubenswrapper[4696]: I1202 22:43:11.239840 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:11Z","lastTransitionTime":"2025-12-02T22:43:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:11 crc kubenswrapper[4696]: I1202 22:43:11.343467 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:11 crc kubenswrapper[4696]: I1202 22:43:11.343553 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:11 crc kubenswrapper[4696]: I1202 22:43:11.343578 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:11 crc kubenswrapper[4696]: I1202 22:43:11.343610 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:11 crc kubenswrapper[4696]: I1202 22:43:11.343631 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:11Z","lastTransitionTime":"2025-12-02T22:43:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:11 crc kubenswrapper[4696]: I1202 22:43:11.447100 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:11 crc kubenswrapper[4696]: I1202 22:43:11.447186 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:11 crc kubenswrapper[4696]: I1202 22:43:11.447211 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:11 crc kubenswrapper[4696]: I1202 22:43:11.447248 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:11 crc kubenswrapper[4696]: I1202 22:43:11.447271 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:11Z","lastTransitionTime":"2025-12-02T22:43:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:11 crc kubenswrapper[4696]: I1202 22:43:11.551494 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:11 crc kubenswrapper[4696]: I1202 22:43:11.551570 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:11 crc kubenswrapper[4696]: I1202 22:43:11.551587 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:11 crc kubenswrapper[4696]: I1202 22:43:11.551615 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:11 crc kubenswrapper[4696]: I1202 22:43:11.551635 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:11Z","lastTransitionTime":"2025-12-02T22:43:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:11 crc kubenswrapper[4696]: I1202 22:43:11.655652 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:11 crc kubenswrapper[4696]: I1202 22:43:11.655727 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:11 crc kubenswrapper[4696]: I1202 22:43:11.655782 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:11 crc kubenswrapper[4696]: I1202 22:43:11.655816 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:11 crc kubenswrapper[4696]: I1202 22:43:11.655837 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:11Z","lastTransitionTime":"2025-12-02T22:43:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:11 crc kubenswrapper[4696]: I1202 22:43:11.759915 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:11 crc kubenswrapper[4696]: I1202 22:43:11.759995 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:11 crc kubenswrapper[4696]: I1202 22:43:11.760016 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:11 crc kubenswrapper[4696]: I1202 22:43:11.760045 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:11 crc kubenswrapper[4696]: I1202 22:43:11.760063 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:11Z","lastTransitionTime":"2025-12-02T22:43:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:11 crc kubenswrapper[4696]: I1202 22:43:11.862887 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:11 crc kubenswrapper[4696]: I1202 22:43:11.862973 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:11 crc kubenswrapper[4696]: I1202 22:43:11.862990 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:11 crc kubenswrapper[4696]: I1202 22:43:11.863017 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:11 crc kubenswrapper[4696]: I1202 22:43:11.863039 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:11Z","lastTransitionTime":"2025-12-02T22:43:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:11 crc kubenswrapper[4696]: I1202 22:43:11.965959 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:11 crc kubenswrapper[4696]: I1202 22:43:11.966034 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:11 crc kubenswrapper[4696]: I1202 22:43:11.966059 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:11 crc kubenswrapper[4696]: I1202 22:43:11.966085 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:11 crc kubenswrapper[4696]: I1202 22:43:11.966105 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:11Z","lastTransitionTime":"2025-12-02T22:43:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:12 crc kubenswrapper[4696]: I1202 22:43:12.070299 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:12 crc kubenswrapper[4696]: I1202 22:43:12.071002 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:12 crc kubenswrapper[4696]: I1202 22:43:12.071036 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:12 crc kubenswrapper[4696]: I1202 22:43:12.071075 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:12 crc kubenswrapper[4696]: I1202 22:43:12.071104 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:12Z","lastTransitionTime":"2025-12-02T22:43:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:12 crc kubenswrapper[4696]: I1202 22:43:12.175185 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:12 crc kubenswrapper[4696]: I1202 22:43:12.175554 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:12 crc kubenswrapper[4696]: I1202 22:43:12.175579 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:12 crc kubenswrapper[4696]: I1202 22:43:12.175610 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:12 crc kubenswrapper[4696]: I1202 22:43:12.175634 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:12Z","lastTransitionTime":"2025-12-02T22:43:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:12 crc kubenswrapper[4696]: I1202 22:43:12.279364 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:12 crc kubenswrapper[4696]: I1202 22:43:12.279436 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:12 crc kubenswrapper[4696]: I1202 22:43:12.279453 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:12 crc kubenswrapper[4696]: I1202 22:43:12.279482 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:12 crc kubenswrapper[4696]: I1202 22:43:12.279504 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:12Z","lastTransitionTime":"2025-12-02T22:43:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:12 crc kubenswrapper[4696]: I1202 22:43:12.382349 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:12 crc kubenswrapper[4696]: I1202 22:43:12.382425 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:12 crc kubenswrapper[4696]: I1202 22:43:12.382450 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:12 crc kubenswrapper[4696]: I1202 22:43:12.382477 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:12 crc kubenswrapper[4696]: I1202 22:43:12.382495 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:12Z","lastTransitionTime":"2025-12-02T22:43:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:43:12 crc kubenswrapper[4696]: I1202 22:43:12.432008 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q9bfc" Dec 02 22:43:12 crc kubenswrapper[4696]: E1202 22:43:12.432155 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q9bfc" podUID="ad00195c-ef4c-4d9b-941c-d01ebc498593" Dec 02 22:43:12 crc kubenswrapper[4696]: I1202 22:43:12.432669 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:43:12 crc kubenswrapper[4696]: E1202 22:43:12.432787 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 22:43:12 crc kubenswrapper[4696]: I1202 22:43:12.432841 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:43:12 crc kubenswrapper[4696]: E1202 22:43:12.432899 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 22:43:12 crc kubenswrapper[4696]: I1202 22:43:12.432944 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:43:12 crc kubenswrapper[4696]: E1202 22:43:12.433007 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 22:43:12 crc kubenswrapper[4696]: I1202 22:43:12.486151 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:12 crc kubenswrapper[4696]: I1202 22:43:12.486198 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:12 crc kubenswrapper[4696]: I1202 22:43:12.486210 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:12 crc kubenswrapper[4696]: I1202 22:43:12.486233 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:12 crc kubenswrapper[4696]: I1202 22:43:12.486283 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:12Z","lastTransitionTime":"2025-12-02T22:43:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:12 crc kubenswrapper[4696]: I1202 22:43:12.589067 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:12 crc kubenswrapper[4696]: I1202 22:43:12.589114 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:12 crc kubenswrapper[4696]: I1202 22:43:12.589123 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:12 crc kubenswrapper[4696]: I1202 22:43:12.589141 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:12 crc kubenswrapper[4696]: I1202 22:43:12.589153 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:12Z","lastTransitionTime":"2025-12-02T22:43:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:12 crc kubenswrapper[4696]: I1202 22:43:12.692801 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:12 crc kubenswrapper[4696]: I1202 22:43:12.692878 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:12 crc kubenswrapper[4696]: I1202 22:43:12.692896 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:12 crc kubenswrapper[4696]: I1202 22:43:12.692924 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:12 crc kubenswrapper[4696]: I1202 22:43:12.692944 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:12Z","lastTransitionTime":"2025-12-02T22:43:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:12 crc kubenswrapper[4696]: I1202 22:43:12.795814 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:12 crc kubenswrapper[4696]: I1202 22:43:12.795865 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:12 crc kubenswrapper[4696]: I1202 22:43:12.795876 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:12 crc kubenswrapper[4696]: I1202 22:43:12.795896 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:12 crc kubenswrapper[4696]: I1202 22:43:12.795909 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:12Z","lastTransitionTime":"2025-12-02T22:43:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:12 crc kubenswrapper[4696]: I1202 22:43:12.898115 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:12 crc kubenswrapper[4696]: I1202 22:43:12.898159 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:12 crc kubenswrapper[4696]: I1202 22:43:12.898171 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:12 crc kubenswrapper[4696]: I1202 22:43:12.898188 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:12 crc kubenswrapper[4696]: I1202 22:43:12.898199 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:12Z","lastTransitionTime":"2025-12-02T22:43:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:13 crc kubenswrapper[4696]: I1202 22:43:13.001197 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:13 crc kubenswrapper[4696]: I1202 22:43:13.001250 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:13 crc kubenswrapper[4696]: I1202 22:43:13.001262 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:13 crc kubenswrapper[4696]: I1202 22:43:13.001283 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:13 crc kubenswrapper[4696]: I1202 22:43:13.001295 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:13Z","lastTransitionTime":"2025-12-02T22:43:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:13 crc kubenswrapper[4696]: I1202 22:43:13.004466 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qb2zq_c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b/ovnkube-controller/1.log" Dec 02 22:43:13 crc kubenswrapper[4696]: I1202 22:43:13.007807 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" event={"ID":"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b","Type":"ContainerStarted","Data":"1282d2e6c1634035b4869e4f9a9689f9e5f599a52a82e02e77a81c6df3e13b0e"} Dec 02 22:43:13 crc kubenswrapper[4696]: I1202 22:43:13.008333 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" Dec 02 22:43:13 crc kubenswrapper[4696]: I1202 22:43:13.028214 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:13Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:13 crc kubenswrapper[4696]: I1202 22:43:13.044825 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:13Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:13 crc kubenswrapper[4696]: I1202 22:43:13.099161 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36feb7e9f6a6bad97e87fe54ff9b661be62f6f4962352d55bcc905b80320a48d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85fbce2a7ea0550f53c1216216022e60a4e2eae0bb1d5c88254ace025fbd4503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64b0c2025c377ab19b3853017c94cb501f472f2485093afc60749f7739ba25d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22ce112f515d4ef59dca4c21bdc8623e88a8a0df222fb2ed0990a2f68e3f75f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52f3d5529aafac7c27c963a50b0318e99d9a9ca35e142d4c273431105c495e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32717a82edb718f5e690fcecd578f48279d134a5dc12f00824fa25f7ec727156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1282d2e6c1634035b4869e4f9a9689f9e5f599a52a82e02e77a81c6df3e13b0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fb85a5789d7eb5dd8ac73a8a47b7241b37df3fae93b8ea2f241113883535f3b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T22:42:55Z\\\",\\\"message\\\":\\\"workPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1202 22:42:55.757361 6138 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 22:42:55.757937 6138 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1202 
22:42:55.757959 6138 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1202 22:42:55.757978 6138 handler.go:208] Removed *v1.Node event handler 2\\\\nI1202 22:42:55.757986 6138 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1202 22:42:55.757997 6138 handler.go:208] Removed *v1.Node event handler 7\\\\nI1202 22:42:55.758042 6138 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1202 22:42:55.758053 6138 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1202 22:42:55.758077 6138 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1202 22:42:55.758079 6138 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1202 22:42:55.758091 6138 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1202 22:42:55.758120 6138 factory.go:656] Stopping watch factory\\\\nI1202 22:42:55.758126 6138 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1202 22:42:55.758143 6138 ovnkube.go:599] Stopped ovnkube\\\\nI1202 
2\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:53Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:43:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\"
:\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23b80358b654eec7dc1057ef35a6c09f88ccd7aca7a05336a3f29e7036bd55f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://492cfa3fb8646c09da06d92ec8a303562da558d4d944eaf68ca95de6f5b94da8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\
":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://492cfa3fb8646c09da06d92ec8a303562da558d4d944eaf68ca95de6f5b94da8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qb2zq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:13Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:13 crc kubenswrapper[4696]: I1202 22:43:13.104266 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:13 crc kubenswrapper[4696]: I1202 22:43:13.104328 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:13 crc kubenswrapper[4696]: I1202 22:43:13.104338 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:13 crc kubenswrapper[4696]: I1202 22:43:13.104357 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:13 crc kubenswrapper[4696]: I1202 22:43:13.104369 4696 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:13Z","lastTransitionTime":"2025-12-02T22:43:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:43:13 crc kubenswrapper[4696]: I1202 22:43:13.116353 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q9bfc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad00195c-ef4c-4d9b-941c-d01ebc498593\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zz6sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zz6sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:54Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q9bfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:13Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:13 crc 
kubenswrapper[4696]: I1202 22:43:13.133922 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05940584-6baa-49e1-8959-610d50b6eeb5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b361514e74dd950d7c3e1da82304b7c75603829ce7bf03c1fbccab1607d65800\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67bf899df90f18f36310994af135552aa9758f0460117d62d00fdc1de067ce79\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d15263c6c1f8208a998cbacdba376a2eb3b2e16134e8fc40f7d56d36daae5ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2fa27b4eeb8b50da5d49c905eb98070362e998c3e3e4aceac7cc09aaef5b5ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:13Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:13 crc kubenswrapper[4696]: I1202 22:43:13.155712 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sbjst" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a0d758e-6da6-4382-99f1-dd295b63eb98\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bfe4ce91e801752a3eae0d5cfdb5d0e7a4997952987a3a32c10c0292db170f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://495f007b6af2d409e3ed6cf70dfa90aec5a38c34361c9da9c4fd71d7732cdd76\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://495f007b6af2d409e3ed6cf70dfa90aec5a38c34361c9da9c4fd71d7732cdd76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5b3b6699e2f0b9a8d0f32c1f97c9b3e16d9f5479ad2f355633c25897dbe1bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5b3b6699e2f0b9a8d0f32c1f97c9b3e16d9f5479ad2f355633c25897dbe1bb1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:42Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d4a10437554cfb08f90a48d457ef157309ed0ab379c5b26deb62d3e8b73291b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d4a10437554cfb08f90a48d457ef157309ed0ab379c5b26deb62d3e8b73291b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76b52
b0824f65df456ead0ca134ada0a6c3120382a3cce18f0a5dfdcc98fa9ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76b52b0824f65df456ead0ca134ada0a6c3120382a3cce18f0a5dfdcc98fa9ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d38fb242bd536b1359095c21c4df58b5a4c64058cfe986b38a581ae0c40a29d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d38fb242bd536b1359095c21c4df58b5a4c64058cfe986b38a581ae0c40a29d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:45Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f17767f87136d0916230872e59379331bb9e7f88f3648115905c6c0172f257cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f17767f87136d0916230872e59379331bb9e7f88f3648115905c6c0172f257cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sbjst\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:13Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:13 crc kubenswrapper[4696]: I1202 22:43:13.171043 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-chq65" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53353260-c7c9-435c-91eb-3d5a1b441c4a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2d1f154b9c9139bf991adf86bd99e7ff14510c24ae93b87c1c184437a087f17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7tfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa016430a0a36d0628e0e5b0c4baf2fa724c888116e5d70517c8ffc4be1c37a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7tfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-chq65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:13Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:13 crc kubenswrapper[4696]: 
I1202 22:43:13.184800 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rgk7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07db09ff-2489-4357-af38-aca9655ac1d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9928cbc6ed1d9bb5d0280a8d3264d4490c601fc7d09d6f293f1403615cc4cde5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdd8x\\\",\\\"readOnly
\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rgk7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:13Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:13 crc kubenswrapper[4696]: I1202 22:43:13.202723 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1db9cf23af7fc9ff1373908e11df0bea7d6e03cd3ed18947cc583fc09a79a05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{}
,\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:13Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:13 crc kubenswrapper[4696]: I1202 22:43:13.207137 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:13 crc kubenswrapper[4696]: I1202 22:43:13.207193 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:13 crc kubenswrapper[4696]: I1202 22:43:13.207209 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:13 crc kubenswrapper[4696]: I1202 22:43:13.207229 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:13 crc kubenswrapper[4696]: I1202 22:43:13.207241 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:13Z","lastTransitionTime":"2025-12-02T22:43:13Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:43:13 crc kubenswrapper[4696]: I1202 22:43:13.219217 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce2fa3867298545942c4a2d2e68899c61a4f99a64b0974e75ef86851074ddfce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4db552c4bc9b395f977cb4c1b1b7fcbb1c7e9f326d81a7903d88eab180bcfb3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:13Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:13 crc kubenswrapper[4696]: I1202 22:43:13.235801 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27fb05354e4e9a6ad09946fad7dbabf07456fe56a9a66578fda0eb71dde2d199\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T22:43:13Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:13 crc kubenswrapper[4696]: I1202 22:43:13.265243 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wthxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86a37d2a-37c5-4fbd-b10b-f5e4706772f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5ddd341a3a0b7b1d959ce57bb4d38153a4ca3ae5ec40ded309e0697db2be28d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xqg5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wthxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T22:43:13Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:13 crc kubenswrapper[4696]: I1202 22:43:13.284276 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ttxcl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca48946e-a7e0-4729-8b02-b223a96990c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55cb8b25a5af0f7f16b7a91ab00cbe24ed8a8b81477e992de74699237caf7dab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkjwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e10c7f45d8003857b8a736f70eb3125f01996bacbcace4ad08bc703fbdeb62f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkjwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ttxcl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:13Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:13 crc kubenswrapper[4696]: I1202 22:43:13.310913 4696 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:13 crc kubenswrapper[4696]: I1202 22:43:13.310988 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:13 crc kubenswrapper[4696]: I1202 22:43:13.311007 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:13 crc kubenswrapper[4696]: I1202 22:43:13.311036 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:13 crc kubenswrapper[4696]: I1202 22:43:13.310876 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3172221-5c4c-4409-b50f-5cd21d8b5030\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f25e7ebcca1d904cd0940a660fd97c6b4a3540fd63700abd0ad973e9f9c8f764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\
\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08fc65f2c8162f27bc40cd12b5801ca21c778313271f2cb5dd251ca4312842d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba390cca9997cf469ae88a4895532df556f0b947ce7b34ef7e1992ee236a2e16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\
"2025-12-02T22:42:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0ccd6eaff0fa7ac496aadc31afffd6e7fee2776e9cfd1652af85fcec268d895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b2deed2650220f1d607cbb667b26879d3a8373146244feb486c882032f344b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\
"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a565cb3800c1a50f784c327719158de2f6f29d20a33670478b97c1c360e0f6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a565cb3800c1a50f784c327719158de2f6f29d20a33670478b97c1c360e0f6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29a2b9cc76fc45b6874d85b2ab92165a02804807837501ac60190c232bab928d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29a2b9cc76fc45b6874d85b2ab92165a02804807837501ac60190c232bab928d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fb3311ccf2a924605464d657193627798dad26a232fab0a2091fb2c387f3bd0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb3311ccf2a924605464d657193627798dad26a232fab0a2091fb2c387f3bd0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:13Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:13 crc kubenswrapper[4696]: I1202 22:43:13.311064 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:13Z","lastTransitionTime":"2025-12-02T22:43:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:13 crc kubenswrapper[4696]: I1202 22:43:13.331165 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"158d80a5-8d90-46bb-83e7-ab3263f6d28f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3112751f86a62634d02f281d467d3eeb27878e27c245871210cca37eaeb6e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce744ea765d5bbcb0a9926f439a0256774c16863344f23ddfb0074b4ccbdb4ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1108d710330b056e2228205b2012df070ffbd660701ddd577eafbf0e99c3bf9f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d92ce9d53d9367503c76c5fbb45f340a5ddd93f3c8e1738ac434fe7b0210d1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6268352b995d99d383ab500b5ccd80b0612cf33716a752aeb1d92cb775a1971e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"le observer\\\\nW1202 22:42:37.965183 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 22:42:37.965319 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 22:42:37.967400 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2369937012/tls.crt::/tmp/serving-cert-2369937012/tls.key\\\\\\\"\\\\nI1202 22:42:38.329833 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 22:42:38.331704 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 22:42:38.331724 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 22:42:38.331766 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 22:42:38.331782 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 22:42:38.336692 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 22:42:38.336734 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:42:38.336761 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:42:38.336769 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 22:42:38.336775 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 22:42:38.336779 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 22:42:38.336783 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 22:42:38.337216 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 22:42:38.338194 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://666f81c81a7cad67c01b2aebf2d05f162c33feea27f64c12f92761d6d2a6056e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055bef75732eec4908f0baf8a52cd3921fab03cf88e36d2e0d442344fb1af0c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://055bef75732eec4908f0baf8a52cd3921fab03cf88e36d2e0d442344fb1af0c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:13Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:13 crc kubenswrapper[4696]: I1202 22:43:13.352323 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"102b6bc9-3f5d-462f-9814-67699eea05ab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6abcdac2e562f88dfe0b9c02e2a2efb68c3b4d5438abfa08606b27dde0da76b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95c2f64bb4118ce11d7f91dd949ab6e377708029cb0b3c93bd650ac70ea2b650\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e41b58b3df7b93de109c3904416a2943c8dca8cc7e9edc628333d4da461412d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7321d8133c89c6a58d6e42e8df0ba1e4bbdf9cc06fd01054a25afd64934d855a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://7321d8133c89c6a58d6e42e8df0ba1e4bbdf9cc06fd01054a25afd64934d855a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:13Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:13 crc kubenswrapper[4696]: I1202 22:43:13.371897 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:13Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:13 crc kubenswrapper[4696]: I1202 22:43:13.385820 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f57qk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf8ea236-9be9-4eb2-904a-103c4c279f28\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3106b29a0eb0a17827fd331aefd6c00b0c58f4ec0d0ca778bd3f4842730d5f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vxprw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f57qk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:13Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:13 crc kubenswrapper[4696]: I1202 22:43:13.416469 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:13 crc kubenswrapper[4696]: I1202 22:43:13.416545 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:13 crc kubenswrapper[4696]: I1202 22:43:13.416562 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:13 crc kubenswrapper[4696]: I1202 22:43:13.416591 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:13 crc kubenswrapper[4696]: I1202 22:43:13.416611 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:13Z","lastTransitionTime":"2025-12-02T22:43:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:13 crc kubenswrapper[4696]: I1202 22:43:13.520524 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:13 crc kubenswrapper[4696]: I1202 22:43:13.520617 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:13 crc kubenswrapper[4696]: I1202 22:43:13.520648 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:13 crc kubenswrapper[4696]: I1202 22:43:13.520686 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:13 crc kubenswrapper[4696]: I1202 22:43:13.520711 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:13Z","lastTransitionTime":"2025-12-02T22:43:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:13 crc kubenswrapper[4696]: I1202 22:43:13.624960 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:13 crc kubenswrapper[4696]: I1202 22:43:13.625047 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:13 crc kubenswrapper[4696]: I1202 22:43:13.625079 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:13 crc kubenswrapper[4696]: I1202 22:43:13.625097 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:13 crc kubenswrapper[4696]: I1202 22:43:13.625108 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:13Z","lastTransitionTime":"2025-12-02T22:43:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:13 crc kubenswrapper[4696]: I1202 22:43:13.729148 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:13 crc kubenswrapper[4696]: I1202 22:43:13.729219 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:13 crc kubenswrapper[4696]: I1202 22:43:13.729237 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:13 crc kubenswrapper[4696]: I1202 22:43:13.729259 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:13 crc kubenswrapper[4696]: I1202 22:43:13.729274 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:13Z","lastTransitionTime":"2025-12-02T22:43:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:13 crc kubenswrapper[4696]: I1202 22:43:13.832515 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:13 crc kubenswrapper[4696]: I1202 22:43:13.832584 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:13 crc kubenswrapper[4696]: I1202 22:43:13.832603 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:13 crc kubenswrapper[4696]: I1202 22:43:13.832628 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:13 crc kubenswrapper[4696]: I1202 22:43:13.832647 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:13Z","lastTransitionTime":"2025-12-02T22:43:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:13 crc kubenswrapper[4696]: I1202 22:43:13.936696 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:13 crc kubenswrapper[4696]: I1202 22:43:13.936796 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:13 crc kubenswrapper[4696]: I1202 22:43:13.936816 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:13 crc kubenswrapper[4696]: I1202 22:43:13.936844 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:13 crc kubenswrapper[4696]: I1202 22:43:13.936865 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:13Z","lastTransitionTime":"2025-12-02T22:43:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:14 crc kubenswrapper[4696]: I1202 22:43:14.013773 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qb2zq_c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b/ovnkube-controller/2.log" Dec 02 22:43:14 crc kubenswrapper[4696]: I1202 22:43:14.014561 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qb2zq_c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b/ovnkube-controller/1.log" Dec 02 22:43:14 crc kubenswrapper[4696]: I1202 22:43:14.017898 4696 generic.go:334] "Generic (PLEG): container finished" podID="c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b" containerID="1282d2e6c1634035b4869e4f9a9689f9e5f599a52a82e02e77a81c6df3e13b0e" exitCode=1 Dec 02 22:43:14 crc kubenswrapper[4696]: I1202 22:43:14.017946 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" event={"ID":"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b","Type":"ContainerDied","Data":"1282d2e6c1634035b4869e4f9a9689f9e5f599a52a82e02e77a81c6df3e13b0e"} Dec 02 22:43:14 crc kubenswrapper[4696]: I1202 22:43:14.018001 4696 scope.go:117] "RemoveContainer" containerID="4fb85a5789d7eb5dd8ac73a8a47b7241b37df3fae93b8ea2f241113883535f3b" Dec 02 22:43:14 crc kubenswrapper[4696]: I1202 22:43:14.018828 4696 scope.go:117] "RemoveContainer" containerID="1282d2e6c1634035b4869e4f9a9689f9e5f599a52a82e02e77a81c6df3e13b0e" Dec 02 22:43:14 crc kubenswrapper[4696]: E1202 22:43:14.019043 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-qb2zq_openshift-ovn-kubernetes(c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b)\"" pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" podUID="c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b" Dec 02 22:43:14 crc kubenswrapper[4696]: I1202 22:43:14.040448 4696 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:14 crc kubenswrapper[4696]: I1202 22:43:14.040510 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:14 crc kubenswrapper[4696]: I1202 22:43:14.040523 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:14 crc kubenswrapper[4696]: I1202 22:43:14.040550 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:14 crc kubenswrapper[4696]: I1202 22:43:14.040567 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:14Z","lastTransitionTime":"2025-12-02T22:43:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:14 crc kubenswrapper[4696]: I1202 22:43:14.040549 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-chq65" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53353260-c7c9-435c-91eb-3d5a1b441c4a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2d1f154b9c9139bf991adf86bd99e7ff14510c24ae93b87c1c184437a087f17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7tfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa016430a0a36d0628e0e5b0c4baf2fa724c888116e5d70517c8ffc4be1c37a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7tfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-chq65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:14Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:14 crc kubenswrapper[4696]: I1202 22:43:14.055584 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rgk7n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07db09ff-2489-4357-af38-aca9655ac1d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9928cbc6ed1d9bb5d0280a8d3264d4490c601fc7d09d6f293f1403615cc4cde5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdd8x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rgk7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:14Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:14 crc kubenswrapper[4696]: I1202 22:43:14.070727 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05940584-6baa-49e1-8959-610d50b6eeb5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b361514e74dd950d7c3e1da82304b7c75603829ce7bf03c1fbccab1607d65800\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67bf899df90f18f36310994af135552aa9758f0460117d62d00fdc1de067ce79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d15263c6c1f8208a998cbacdba376a2eb3b2e16134e8fc40f7d56d36daae5ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2fa27b4eeb8b50da5d49c905eb98070362e998c3e3e4aceac7cc09aaef5b5ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:14Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:14 crc kubenswrapper[4696]: I1202 22:43:14.092249 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sbjst" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a0d758e-6da6-4382-99f1-dd295b63eb98\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bfe4ce91e801752a3eae0d5cfdb5d0e7a4997952987a3a32c10c0292db170f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://495f007b6af2d409e3ed6cf70dfa90aec5a38c34361c9da9c4fd71d7732cdd76\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://495f007b6af2d409e3ed6cf70dfa90aec5a38c34361c9da9c4fd71d7732cdd76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5b3b6699e2f0b9a8d0f32c1f97c9b3e16d9f5479ad2f355633c25897dbe1bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5b3b6699e2f0b9a8d0f32c1f97c9b3e16d9f5479ad2f355633c25897dbe1bb1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:42Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d4a10437554cfb08f90a48d457ef157309ed0ab379c5b26deb62d3e8b73291b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d4a10437554cfb08f90a48d457ef157309ed0ab379c5b26deb62d3e8b73291b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76b52
b0824f65df456ead0ca134ada0a6c3120382a3cce18f0a5dfdcc98fa9ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76b52b0824f65df456ead0ca134ada0a6c3120382a3cce18f0a5dfdcc98fa9ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d38fb242bd536b1359095c21c4df58b5a4c64058cfe986b38a581ae0c40a29d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d38fb242bd536b1359095c21c4df58b5a4c64058cfe986b38a581ae0c40a29d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:45Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f17767f87136d0916230872e59379331bb9e7f88f3648115905c6c0172f257cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f17767f87136d0916230872e59379331bb9e7f88f3648115905c6c0172f257cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sbjst\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:14Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:14 crc kubenswrapper[4696]: I1202 22:43:14.105920 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wthxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86a37d2a-37c5-4fbd-b10b-f5e4706772f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5ddd341a3a0b7b1d959ce57bb4d38153a4ca3ae5ec40ded309e0697db2be28d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-
12-02T22:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xqg5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wthxr\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:14Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:14 crc kubenswrapper[4696]: I1202 22:43:14.117223 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ttxcl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca48946e-a7e0-4729-8b02-b223a96990c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55cb8b25a5af0f7f16b7a91ab00cbe24ed8a8b81477e992de74699237caf7dab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkjwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e10c7f45d8003857b8a736f70eb3125f01996bacbcace4ad08bc703fbdeb62f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkjwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ttxcl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-12-02T22:43:14Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:14 crc kubenswrapper[4696]: I1202 22:43:14.132511 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1db9cf23af7fc9ff1373908e11df0bea7d6e03cd3ed18947cc583fc09a79a05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for 
pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:14Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:14 crc kubenswrapper[4696]: I1202 22:43:14.143128 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:14 crc kubenswrapper[4696]: I1202 22:43:14.143166 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:14 crc kubenswrapper[4696]: I1202 22:43:14.143176 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:14 crc kubenswrapper[4696]: I1202 22:43:14.143193 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:14 crc kubenswrapper[4696]: I1202 22:43:14.143203 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:14Z","lastTransitionTime":"2025-12-02T22:43:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:14 crc kubenswrapper[4696]: I1202 22:43:14.145474 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce2fa3867298545942c4a2d2e68899c61a4f99a64b0974e75ef86851074ddfce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4db552c4bc9b395f977cb4c1b1b7fcbb1c7e9f326d81a7903d88eab180bcfb3b\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:14Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:14 crc kubenswrapper[4696]: I1202 22:43:14.158193 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27fb05354e4e9a6ad09946fad7dbabf07456fe56a9a66578fda0eb71dde2d199\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T22:43:14Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:14 crc kubenswrapper[4696]: I1202 22:43:14.171410 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:14Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:14 crc kubenswrapper[4696]: I1202 22:43:14.181832 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f57qk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf8ea236-9be9-4eb2-904a-103c4c279f28\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3106b29a0eb0a17827fd331aefd6c00b0c58f4ec0d0ca778bd3f4842730d5f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vxprw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f57qk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:14Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:14 crc kubenswrapper[4696]: I1202 22:43:14.201601 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3172221-5c4c-4409-b50f-5cd21d8b5030\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f25e7ebcca1d904cd0940a660fd97c6b4a3540fd63700abd0ad973e9f9c8f764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08fc65f2c8162f27bc40cd12b5801ca21c778313271f2cb5dd251ca4312842d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba390cca9997cf469ae88a4895532df556f0b947ce7b34ef7e1992ee236a2e16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd
/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0ccd6eaff0fa7ac496aadc31afffd6e7fee2776e9cfd1652af85fcec268d895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b2deed2650220f1d607cbb667b26879d3a8373146244feb486c882032f344b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://4a565cb3800c1a50f784c327719158de2f6f29d20a33670478b97c1c360e0f6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a565cb3800c1a50f784c327719158de2f6f29d20a33670478b97c1c360e0f6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29a2b9cc76fc45b6874d85b2ab92165a02804807837501ac60190c232bab928d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29a2b9cc76fc45b6874d85b2ab92165a02804807837501ac60190c232bab928d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fb3311ccf2a924605464d657193627798dad26a232fab0a2091fb2c387f3bd0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb3311ccf2a924605464d657193627798dad26a232fab0a2091fb2c387f3bd0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:14Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:14 crc kubenswrapper[4696]: I1202 22:43:14.219218 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"158d80a5-8d90-46bb-83e7-ab3263f6d28f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3112751f86a62634d02f281d467d3eeb27878e27c245871210cca37eaeb6e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce744ea765d5bbcb0a9926f439a0256774c16863344f23ddfb0074b4ccbdb4ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1108d710330b056e2228205b2012df070ffbd660701ddd577eafbf0e99c3bf9f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d92ce9d53d9367503c76c5fbb45f340a5ddd93f3c8e1738ac434fe7b0210d1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6268352b995d99d383ab500b5ccd80b0612cf33716a752aeb1d92cb775a1971e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T22:42:38Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW1202 22:42:37.965183 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 22:42:37.965319 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 22:42:37.967400 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2369937012/tls.crt::/tmp/serving-cert-2369937012/tls.key\\\\\\\"\\\\nI1202 22:42:38.329833 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 22:42:38.331704 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 22:42:38.331724 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 22:42:38.331766 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 22:42:38.331782 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 22:42:38.336692 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 22:42:38.336734 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:42:38.336761 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:42:38.336769 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 22:42:38.336775 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 22:42:38.336779 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 22:42:38.336783 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 22:42:38.337216 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1202 22:42:38.338194 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://666f81c81a7cad67c01b2aebf2d05f162c33feea27f64c12f92761d6d2a6056e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055bef75732eec4908f0baf8a52cd3921fab03cf88e36d2e0d442344fb1af0c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://055bef75732eec4908f0baf8a52cd3921
fab03cf88e36d2e0d442344fb1af0c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:14Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:14 crc kubenswrapper[4696]: I1202 22:43:14.237474 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"102b6bc9-3f5d-462f-9814-67699eea05ab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6abcdac2e562f88dfe0b9c02e2a2efb68c3b4d5438abfa08606b27dde0da76b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95c2f64bb4118ce11d7f91dd949ab6e377708029cb0b3c93bd650ac70ea2b650\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e41b58b3df7b93de109c3904416a2943c8dca8cc7e9edc628333d4da461412d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7321d8133c89c6a58d6e42e8df0ba1e4bbdf9cc06fd01054a25afd64934d855a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://7321d8133c89c6a58d6e42e8df0ba1e4bbdf9cc06fd01054a25afd64934d855a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:14Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:14 crc kubenswrapper[4696]: I1202 22:43:14.247012 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:14 crc kubenswrapper[4696]: I1202 22:43:14.247052 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:14 crc kubenswrapper[4696]: I1202 22:43:14.247065 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:14 crc kubenswrapper[4696]: I1202 22:43:14.247084 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:14 crc kubenswrapper[4696]: I1202 22:43:14.247101 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:14Z","lastTransitionTime":"2025-12-02T22:43:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:14 crc kubenswrapper[4696]: I1202 22:43:14.253329 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q9bfc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad00195c-ef4c-4d9b-941c-d01ebc498593\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zz6sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zz6sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:54Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q9bfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:14Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:14 crc 
kubenswrapper[4696]: I1202 22:43:14.267567 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:14Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:14 crc kubenswrapper[4696]: I1202 22:43:14.281808 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:14Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:14 crc kubenswrapper[4696]: I1202 22:43:14.309269 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36feb7e9f6a6bad97e87fe54ff9b661be62f6f4962352d55bcc905b80320a48d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85fbce2a7ea0550f53c1216216022e60a4e2eae0bb1d5c88254ace025fbd4503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64b0c2025c377ab19b3853017c94cb501f472f2485093afc60749f7739ba25d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22ce112f515d4ef59dca4c21bdc8623e88a8a0df222fb2ed0990a2f68e3f75f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52f3d5529aafac7c27c963a50b0318e99d9a9ca35e142d4c273431105c495e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32717a82edb718f5e690fcecd578f48279d134a5dc12f00824fa25f7ec727156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1282d2e6c1634035b4869e4f9a9689f9e5f599a52a82e02e77a81c6df3e13b0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fb85a5789d7eb5dd8ac73a8a47b7241b37df3fae93b8ea2f241113883535f3b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T22:42:55Z\\\",\\\"message\\\":\\\"workPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1202 22:42:55.757361 6138 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 22:42:55.757937 6138 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1202 
22:42:55.757959 6138 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1202 22:42:55.757978 6138 handler.go:208] Removed *v1.Node event handler 2\\\\nI1202 22:42:55.757986 6138 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1202 22:42:55.757997 6138 handler.go:208] Removed *v1.Node event handler 7\\\\nI1202 22:42:55.758042 6138 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1202 22:42:55.758053 6138 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1202 22:42:55.758077 6138 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1202 22:42:55.758079 6138 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1202 22:42:55.758091 6138 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1202 22:42:55.758120 6138 factory.go:656] Stopping watch factory\\\\nI1202 22:42:55.758126 6138 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1202 22:42:55.758143 6138 ovnkube.go:599] Stopped ovnkube\\\\nI1202 2\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:53Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1282d2e6c1634035b4869e4f9a9689f9e5f599a52a82e02e77a81c6df3e13b0e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T22:43:13Z\\\",\\\"message\\\":\\\"t handler 2 for removal\\\\nI1202 22:43:13.297137 6337 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1202 22:43:13.297142 6337 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1202 22:43:13.297185 6337 handler.go:208] Removed *v1.Node event handler 2\\\\nI1202 22:43:13.297274 6337 handler.go:208] Removed *v1.Node event handler 7\\\\nI1202 22:43:13.297609 6337 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1202 22:43:13.297650 6337 handler.go:190] Sending *v1.EgressFirewall event handler 9 for 
removal\\\\nI1202 22:43:13.297716 6337 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1202 22:43:13.297725 6337 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1202 22:43:13.297731 6337 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1202 22:43:13.297772 6337 factory.go:656] Stopping watch factory\\\\nI1202 22:43:13.297795 6337 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1202 22:43:13.297796 6337 ovnkube.go:599] Stopped ovnkube\\\\nI1202 22:43:13.297811 6337 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1202 22:43:13.297829 6337 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1202 22:43:13.297821 6337 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nF1202 22:43:13.297933 6337 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:43:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\
"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23b80358b654eec7dc1057ef35a6c09f88ccd7aca7a05336a3f29e7036bd55f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-
api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://492cfa3fb8646c09da06d92ec8a303562da558d4d944eaf68ca95de6f5b94da8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://492cfa3fb8646c09da06d92ec8a303562da558d4d944eaf68ca95de6f5b94da8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qb2zq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:14Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:14 crc kubenswrapper[4696]: I1202 22:43:14.350429 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 02 22:43:14 crc kubenswrapper[4696]: I1202 22:43:14.350502 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:14 crc kubenswrapper[4696]: I1202 22:43:14.350513 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:14 crc kubenswrapper[4696]: I1202 22:43:14.350534 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:14 crc kubenswrapper[4696]: I1202 22:43:14.350547 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:14Z","lastTransitionTime":"2025-12-02T22:43:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:43:14 crc kubenswrapper[4696]: I1202 22:43:14.431497 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:43:14 crc kubenswrapper[4696]: I1202 22:43:14.431509 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:43:14 crc kubenswrapper[4696]: I1202 22:43:14.431849 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:43:14 crc kubenswrapper[4696]: I1202 22:43:14.431971 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-q9bfc" Dec 02 22:43:14 crc kubenswrapper[4696]: E1202 22:43:14.432480 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 22:43:14 crc kubenswrapper[4696]: E1202 22:43:14.432618 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q9bfc" podUID="ad00195c-ef4c-4d9b-941c-d01ebc498593" Dec 02 22:43:14 crc kubenswrapper[4696]: E1202 22:43:14.432848 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 22:43:14 crc kubenswrapper[4696]: E1202 22:43:14.433031 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 22:43:14 crc kubenswrapper[4696]: I1202 22:43:14.453730 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:14 crc kubenswrapper[4696]: I1202 22:43:14.453822 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:14 crc kubenswrapper[4696]: I1202 22:43:14.453841 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:14 crc kubenswrapper[4696]: I1202 22:43:14.453869 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:14 crc kubenswrapper[4696]: I1202 22:43:14.453889 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:14Z","lastTransitionTime":"2025-12-02T22:43:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:14 crc kubenswrapper[4696]: I1202 22:43:14.559918 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:14 crc kubenswrapper[4696]: I1202 22:43:14.559998 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:14 crc kubenswrapper[4696]: I1202 22:43:14.560017 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:14 crc kubenswrapper[4696]: I1202 22:43:14.560060 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:14 crc kubenswrapper[4696]: I1202 22:43:14.560100 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:14Z","lastTransitionTime":"2025-12-02T22:43:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:14 crc kubenswrapper[4696]: I1202 22:43:14.664651 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:14 crc kubenswrapper[4696]: I1202 22:43:14.665150 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:14 crc kubenswrapper[4696]: I1202 22:43:14.665353 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:14 crc kubenswrapper[4696]: I1202 22:43:14.665504 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:14 crc kubenswrapper[4696]: I1202 22:43:14.665635 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:14Z","lastTransitionTime":"2025-12-02T22:43:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:14 crc kubenswrapper[4696]: I1202 22:43:14.768779 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:14 crc kubenswrapper[4696]: I1202 22:43:14.768853 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:14 crc kubenswrapper[4696]: I1202 22:43:14.768872 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:14 crc kubenswrapper[4696]: I1202 22:43:14.768899 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:14 crc kubenswrapper[4696]: I1202 22:43:14.768915 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:14Z","lastTransitionTime":"2025-12-02T22:43:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:14 crc kubenswrapper[4696]: I1202 22:43:14.872619 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:14 crc kubenswrapper[4696]: I1202 22:43:14.874084 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:14 crc kubenswrapper[4696]: I1202 22:43:14.874231 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:14 crc kubenswrapper[4696]: I1202 22:43:14.874393 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:14 crc kubenswrapper[4696]: I1202 22:43:14.874606 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:14Z","lastTransitionTime":"2025-12-02T22:43:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:14 crc kubenswrapper[4696]: I1202 22:43:14.978630 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:14 crc kubenswrapper[4696]: I1202 22:43:14.978718 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:14 crc kubenswrapper[4696]: I1202 22:43:14.978737 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:14 crc kubenswrapper[4696]: I1202 22:43:14.978800 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:14 crc kubenswrapper[4696]: I1202 22:43:14.978835 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:14Z","lastTransitionTime":"2025-12-02T22:43:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:15 crc kubenswrapper[4696]: I1202 22:43:15.025718 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qb2zq_c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b/ovnkube-controller/2.log" Dec 02 22:43:15 crc kubenswrapper[4696]: I1202 22:43:15.032827 4696 scope.go:117] "RemoveContainer" containerID="1282d2e6c1634035b4869e4f9a9689f9e5f599a52a82e02e77a81c6df3e13b0e" Dec 02 22:43:15 crc kubenswrapper[4696]: E1202 22:43:15.033368 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-qb2zq_openshift-ovn-kubernetes(c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b)\"" pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" podUID="c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b" Dec 02 22:43:15 crc kubenswrapper[4696]: I1202 22:43:15.052675 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q9bfc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad00195c-ef4c-4d9b-941c-d01ebc498593\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zz6sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zz6sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:54Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q9bfc\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:15Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:15 crc kubenswrapper[4696]: I1202 22:43:15.077377 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:15Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:15 crc kubenswrapper[4696]: I1202 22:43:15.082681 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:15 crc kubenswrapper[4696]: I1202 22:43:15.082822 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:15 crc kubenswrapper[4696]: I1202 22:43:15.082845 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:15 crc kubenswrapper[4696]: I1202 22:43:15.082871 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:15 crc kubenswrapper[4696]: I1202 22:43:15.082889 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:15Z","lastTransitionTime":"2025-12-02T22:43:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:43:15 crc kubenswrapper[4696]: I1202 22:43:15.102835 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:15Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:15 crc kubenswrapper[4696]: I1202 22:43:15.140058 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36feb7e9f6a6bad97e87fe54ff9b661be62f6f4962352d55bcc905b80320a48d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85fbce2a7ea0550f53c1216216022e60a4e2eae0bb1d5c88254ace025fbd4503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64b0c2025c377ab19b3853017c94cb501f472f2485093afc60749f7739ba25d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22ce112f515d4ef59dca4c21bdc8623e88a8a0df222fb2ed0990a2f68e3f75f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52f3d5529aafac7c27c963a50b0318e99d9a9ca35e142d4c273431105c495e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32717a82edb718f5e690fcecd578f48279d134a5dc12f00824fa25f7ec727156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1282d2e6c1634035b4869e4f9a9689f9e5f599a52a82e02e77a81c6df3e13b0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1282d2e6c1634035b4869e4f9a9689f9e5f599a52a82e02e77a81c6df3e13b0e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T22:43:13Z\\\",\\\"message\\\":\\\"t handler 2 for removal\\\\nI1202 22:43:13.297137 6337 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1202 22:43:13.297142 6337 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1202 22:43:13.297185 6337 handler.go:208] Removed *v1.Node event handler 2\\\\nI1202 22:43:13.297274 6337 handler.go:208] Removed *v1.Node event 
handler 7\\\\nI1202 22:43:13.297609 6337 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1202 22:43:13.297650 6337 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1202 22:43:13.297716 6337 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1202 22:43:13.297725 6337 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1202 22:43:13.297731 6337 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1202 22:43:13.297772 6337 factory.go:656] Stopping watch factory\\\\nI1202 22:43:13.297795 6337 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1202 22:43:13.297796 6337 ovnkube.go:599] Stopped ovnkube\\\\nI1202 22:43:13.297811 6337 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1202 22:43:13.297829 6337 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1202 22:43:13.297821 6337 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nF1202 22:43:13.297933 6337 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:43:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qb2zq_openshift-ovn-kubernetes(c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23b80358b654eec7dc1057ef35a6c09f88ccd7aca7a05336a3f29e7036bd55f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://492cfa3fb8646c09da06d92ec8a303562da558d4d944eaf68ca95de6f5b94da8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://492cfa3fb8646c09da
06d92ec8a303562da558d4d944eaf68ca95de6f5b94da8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qb2zq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:15Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:15 crc kubenswrapper[4696]: I1202 22:43:15.158852 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-chq65" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53353260-c7c9-435c-91eb-3d5a1b441c4a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2d1f154b9c9139bf991adf86bd99e7ff14510c24ae93b87c1c184437a087f17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7tfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa016430a0a36d0628e0e5b0c4baf2fa724c8881
16e5d70517c8ffc4be1c37a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7tfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-chq65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:15Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:15 crc kubenswrapper[4696]: I1202 22:43:15.178372 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rgk7n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07db09ff-2489-4357-af38-aca9655ac1d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9928cbc6ed1d9bb5d0280a8d3264d4490c601fc7d09d6f293f1403615cc4cde5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdd8x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rgk7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:15Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:15 crc kubenswrapper[4696]: I1202 22:43:15.186263 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:15 crc kubenswrapper[4696]: I1202 22:43:15.186384 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:15 crc kubenswrapper[4696]: I1202 22:43:15.186409 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:15 crc kubenswrapper[4696]: I1202 22:43:15.186441 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:15 crc kubenswrapper[4696]: I1202 22:43:15.186467 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:15Z","lastTransitionTime":"2025-12-02T22:43:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:15 crc kubenswrapper[4696]: I1202 22:43:15.201458 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05940584-6baa-49e1-8959-610d50b6eeb5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b361514e74dd950d7c3e1da82304b7c75603829ce7bf03c1fbccab1607d65800\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67bf899df90
f18f36310994af135552aa9758f0460117d62d00fdc1de067ce79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d15263c6c1f8208a998cbacdba376a2eb3b2e16134e8fc40f7d56d36daae5ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2fa27b4eeb8b50da5d49c905eb98070362e998c3e3e4aceac7cc09aaef5b5ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:15Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:15 crc kubenswrapper[4696]: I1202 22:43:15.229067 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sbjst" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a0d758e-6da6-4382-99f1-dd295b63eb98\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bfe4ce91e801752a3eae0d5cfdb5d0e7a4997952987a3a32c10c0292db170f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://495f007b6af2d409e3ed6cf70dfa90aec5a38c34361c9da9c4fd71d7732cdd76\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://495f007b6af2d409e3ed6cf70dfa90aec5a38c34361c9da9c4fd71d7732cdd76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5b3b6699e2f0b9a8d0f32c1f97c9b3e16d9f5479ad2f355633c25897dbe1bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5b3b6699e2f0b9a8d0f32c1f97c9b3e16d9f5479ad2f355633c25897dbe1bb1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:42Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d4a10437554cfb08f90a48d457ef157309ed0ab379c5b26deb62d3e8b73291b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d4a10437554cfb08f90a48d457ef157309ed0ab379c5b26deb62d3e8b73291b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76b52
b0824f65df456ead0ca134ada0a6c3120382a3cce18f0a5dfdcc98fa9ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76b52b0824f65df456ead0ca134ada0a6c3120382a3cce18f0a5dfdcc98fa9ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d38fb242bd536b1359095c21c4df58b5a4c64058cfe986b38a581ae0c40a29d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d38fb242bd536b1359095c21c4df58b5a4c64058cfe986b38a581ae0c40a29d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:45Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f17767f87136d0916230872e59379331bb9e7f88f3648115905c6c0172f257cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f17767f87136d0916230872e59379331bb9e7f88f3648115905c6c0172f257cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sbjst\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:15Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:15 crc kubenswrapper[4696]: I1202 22:43:15.252006 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wthxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86a37d2a-37c5-4fbd-b10b-f5e4706772f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5ddd341a3a0b7b1d959ce57bb4d38153a4ca3ae5ec40ded309e0697db2be28d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-
12-02T22:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xqg5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wthxr\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:15Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:15 crc kubenswrapper[4696]: I1202 22:43:15.273235 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ttxcl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca48946e-a7e0-4729-8b02-b223a96990c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55cb8b25a5af0f7f16b7a91ab00cbe24ed8a8b81477e992de74699237caf7dab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkjwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e10c7f45d8003857b8a736f70eb3125f01996bacbcace4ad08bc703fbdeb62f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkjwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ttxcl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-12-02T22:43:15Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:15 crc kubenswrapper[4696]: I1202 22:43:15.289641 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:15 crc kubenswrapper[4696]: I1202 22:43:15.289695 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:15 crc kubenswrapper[4696]: I1202 22:43:15.289720 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:15 crc kubenswrapper[4696]: I1202 22:43:15.289778 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:15 crc kubenswrapper[4696]: I1202 22:43:15.289800 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:15Z","lastTransitionTime":"2025-12-02T22:43:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:15 crc kubenswrapper[4696]: I1202 22:43:15.295989 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1db9cf23af7fc9ff1373908e11df0bea7d6e03cd3ed18947cc583fc09a79a05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:15Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:15 crc kubenswrapper[4696]: I1202 22:43:15.315018 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce2fa3867298545942c4a2d2e68899c61a4f99a64b0974e75ef86851074ddfce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-id
entity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4db552c4bc9b395f977cb4c1b1b7fcbb1c7e9f326d81a7903d88eab180bcfb3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:15Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:15 crc kubenswrapper[4696]: I1202 22:43:15.334074 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27fb05354e4e9a6ad09946fad7dbabf07456fe56a9a66578fda0eb71dde2d199\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T22:43:15Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:15 crc kubenswrapper[4696]: I1202 22:43:15.352249 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:15Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:15 crc kubenswrapper[4696]: I1202 22:43:15.369498 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f57qk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf8ea236-9be9-4eb2-904a-103c4c279f28\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3106b29a0eb0a17827fd331aefd6c00b0c58f4ec0d0ca778bd3f4842730d5f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vxprw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f57qk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:15Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:15 crc kubenswrapper[4696]: I1202 22:43:15.392976 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:15 crc kubenswrapper[4696]: I1202 22:43:15.393113 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:15 crc kubenswrapper[4696]: I1202 22:43:15.393140 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:15 crc kubenswrapper[4696]: I1202 22:43:15.393176 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:15 crc kubenswrapper[4696]: I1202 22:43:15.393202 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:15Z","lastTransitionTime":"2025-12-02T22:43:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:15 crc kubenswrapper[4696]: I1202 22:43:15.408625 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3172221-5c4c-4409-b50f-5cd21d8b5030\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f25e7ebcca1d904cd0940a660fd97c6b4a3540fd63700abd0ad973e9f9c8f764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08fc65f2c8162f27bc40cd12b5801ca21c778313271f2cb5dd251ca4312842d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba390cca9997cf469ae88a4895532df556f0b947ce7b34ef7e1992ee236a2e16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0ccd6eaff0fa7ac496aadc31afffd6e7fee2776e9cfd1652af85fcec268d895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b2deed2650220f1d607cbb667b26879d3a8373146244feb486c882032f344b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a565cb3800c1a50f784c327719158de2f6f29d20a33670478b97c1c360e0f6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a565cb3800c1a50f784c327719158de2f6f29d20a33670478b97c1c360e0f6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29a2b9cc76fc45b6874d85b2ab92165a02804807837501ac60190c232bab928d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29a2b9cc76fc45b6874d85b2ab92165a02804807837501ac60190c232bab928d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fb3311ccf2a924605464d657193627798dad26a232fab0a2091fb2c387f3bd0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb3311ccf2a924605464d657193627798dad26a232fab0a2091fb2c387f3bd0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:15Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:15 crc kubenswrapper[4696]: I1202 22:43:15.431999 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"158d80a5-8d90-46bb-83e7-ab3263f6d28f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3112751f86a62634d02f281d467d3eeb27878e27c245871210cca37eaeb6e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce744ea765d5bbcb0a9926f439a0256774c16863344f23ddfb0074b4ccbdb4ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1108d710330b056e2228205b2012df070ffbd660701ddd577eafbf0e99c3bf9f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d92ce9d53d9367503c76c5fbb45f340a5ddd93f3c8e1738ac434fe7b0210d1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6268352b995d99d383ab500b5ccd80b0612cf33716a752aeb1d92cb775a1971e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T22:42:38Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW1202 22:42:37.965183 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 22:42:37.965319 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 22:42:37.967400 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2369937012/tls.crt::/tmp/serving-cert-2369937012/tls.key\\\\\\\"\\\\nI1202 22:42:38.329833 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 22:42:38.331704 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 22:42:38.331724 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 22:42:38.331766 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 22:42:38.331782 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 22:42:38.336692 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 22:42:38.336734 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:42:38.336761 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:42:38.336769 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 22:42:38.336775 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 22:42:38.336779 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 22:42:38.336783 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 22:42:38.337216 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1202 22:42:38.338194 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://666f81c81a7cad67c01b2aebf2d05f162c33feea27f64c12f92761d6d2a6056e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055bef75732eec4908f0baf8a52cd3921fab03cf88e36d2e0d442344fb1af0c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://055bef75732eec4908f0baf8a52cd3921
fab03cf88e36d2e0d442344fb1af0c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:15Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:15 crc kubenswrapper[4696]: I1202 22:43:15.451459 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"102b6bc9-3f5d-462f-9814-67699eea05ab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6abcdac2e562f88dfe0b9c02e2a2efb68c3b4d5438abfa08606b27dde0da76b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95c2f64bb4118ce11d7f91dd949ab6e377708029cb0b3c93bd650ac70ea2b650\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e41b58b3df7b93de109c3904416a2943c8dca8cc7e9edc628333d4da461412d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7321d8133c89c6a58d6e42e8df0ba1e4bbdf9cc06fd01054a25afd64934d855a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://7321d8133c89c6a58d6e42e8df0ba1e4bbdf9cc06fd01054a25afd64934d855a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:15Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:15 crc kubenswrapper[4696]: I1202 22:43:15.495728 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:15 crc kubenswrapper[4696]: I1202 22:43:15.495794 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:15 crc kubenswrapper[4696]: I1202 22:43:15.495808 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:15 crc kubenswrapper[4696]: I1202 22:43:15.495829 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:15 crc kubenswrapper[4696]: I1202 22:43:15.495842 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:15Z","lastTransitionTime":"2025-12-02T22:43:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:15 crc kubenswrapper[4696]: I1202 22:43:15.598630 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:15 crc kubenswrapper[4696]: I1202 22:43:15.598824 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:15 crc kubenswrapper[4696]: I1202 22:43:15.598858 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:15 crc kubenswrapper[4696]: I1202 22:43:15.598890 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:15 crc kubenswrapper[4696]: I1202 22:43:15.598910 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:15Z","lastTransitionTime":"2025-12-02T22:43:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:15 crc kubenswrapper[4696]: I1202 22:43:15.600605 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:15 crc kubenswrapper[4696]: I1202 22:43:15.600656 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:15 crc kubenswrapper[4696]: I1202 22:43:15.600672 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:15 crc kubenswrapper[4696]: I1202 22:43:15.600694 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:15 crc kubenswrapper[4696]: I1202 22:43:15.600710 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:15Z","lastTransitionTime":"2025-12-02T22:43:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:15 crc kubenswrapper[4696]: E1202 22:43:15.625031 4696 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:43:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:43:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:43:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:43:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b680025d-da08-4b46-a4a4-b21ac19e4f7b\\\",\\\"systemUUID\\\":\\\"be4c1bf2-c508-4b46-be45-7efaea566193\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:15Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:15 crc kubenswrapper[4696]: I1202 22:43:15.634626 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:15 crc kubenswrapper[4696]: I1202 22:43:15.634930 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:15 crc kubenswrapper[4696]: I1202 22:43:15.635086 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:15 crc kubenswrapper[4696]: I1202 22:43:15.635393 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:15 crc kubenswrapper[4696]: I1202 22:43:15.635542 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:15Z","lastTransitionTime":"2025-12-02T22:43:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:15 crc kubenswrapper[4696]: E1202 22:43:15.657121 4696 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:43:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:43:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:43:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:43:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b680025d-da08-4b46-a4a4-b21ac19e4f7b\\\",\\\"systemUUID\\\":\\\"be4c1bf2-c508-4b46-be45-7efaea566193\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:15Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:15 crc kubenswrapper[4696]: I1202 22:43:15.662220 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:15 crc kubenswrapper[4696]: I1202 22:43:15.662259 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:15 crc kubenswrapper[4696]: I1202 22:43:15.662273 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:15 crc kubenswrapper[4696]: I1202 22:43:15.662296 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:15 crc kubenswrapper[4696]: I1202 22:43:15.662310 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:15Z","lastTransitionTime":"2025-12-02T22:43:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:15 crc kubenswrapper[4696]: E1202 22:43:15.685939 4696 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:43:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:43:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:43:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:43:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b680025d-da08-4b46-a4a4-b21ac19e4f7b\\\",\\\"systemUUID\\\":\\\"be4c1bf2-c508-4b46-be45-7efaea566193\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:15Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:15 crc kubenswrapper[4696]: I1202 22:43:15.691634 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:15 crc kubenswrapper[4696]: I1202 22:43:15.691677 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:15 crc kubenswrapper[4696]: I1202 22:43:15.691694 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:15 crc kubenswrapper[4696]: I1202 22:43:15.691722 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:15 crc kubenswrapper[4696]: I1202 22:43:15.691811 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:15Z","lastTransitionTime":"2025-12-02T22:43:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:15 crc kubenswrapper[4696]: E1202 22:43:15.708078 4696 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:43:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:43:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:43:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:43:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b680025d-da08-4b46-a4a4-b21ac19e4f7b\\\",\\\"systemUUID\\\":\\\"be4c1bf2-c508-4b46-be45-7efaea566193\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:15Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:15 crc kubenswrapper[4696]: I1202 22:43:15.713605 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:15 crc kubenswrapper[4696]: I1202 22:43:15.713655 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:15 crc kubenswrapper[4696]: I1202 22:43:15.713669 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:15 crc kubenswrapper[4696]: I1202 22:43:15.713693 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:15 crc kubenswrapper[4696]: I1202 22:43:15.713712 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:15Z","lastTransitionTime":"2025-12-02T22:43:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:15 crc kubenswrapper[4696]: E1202 22:43:15.734960 4696 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:43:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:43:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:43:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:43:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b680025d-da08-4b46-a4a4-b21ac19e4f7b\\\",\\\"systemUUID\\\":\\\"be4c1bf2-c508-4b46-be45-7efaea566193\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:15Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:15 crc kubenswrapper[4696]: E1202 22:43:15.735689 4696 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 02 22:43:15 crc kubenswrapper[4696]: I1202 22:43:15.738852 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:15 crc kubenswrapper[4696]: I1202 22:43:15.738895 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:15 crc kubenswrapper[4696]: I1202 22:43:15.738907 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:15 crc kubenswrapper[4696]: I1202 22:43:15.738929 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:15 crc kubenswrapper[4696]: I1202 22:43:15.738945 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:15Z","lastTransitionTime":"2025-12-02T22:43:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:15 crc kubenswrapper[4696]: I1202 22:43:15.844686 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:15 crc kubenswrapper[4696]: I1202 22:43:15.844800 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:15 crc kubenswrapper[4696]: I1202 22:43:15.844821 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:15 crc kubenswrapper[4696]: I1202 22:43:15.844850 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:15 crc kubenswrapper[4696]: I1202 22:43:15.844869 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:15Z","lastTransitionTime":"2025-12-02T22:43:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:15 crc kubenswrapper[4696]: I1202 22:43:15.948974 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:15 crc kubenswrapper[4696]: I1202 22:43:15.949042 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:15 crc kubenswrapper[4696]: I1202 22:43:15.949060 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:15 crc kubenswrapper[4696]: I1202 22:43:15.949091 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:15 crc kubenswrapper[4696]: I1202 22:43:15.949112 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:15Z","lastTransitionTime":"2025-12-02T22:43:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:16 crc kubenswrapper[4696]: I1202 22:43:16.052940 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:16 crc kubenswrapper[4696]: I1202 22:43:16.053008 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:16 crc kubenswrapper[4696]: I1202 22:43:16.053029 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:16 crc kubenswrapper[4696]: I1202 22:43:16.053059 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:16 crc kubenswrapper[4696]: I1202 22:43:16.053081 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:16Z","lastTransitionTime":"2025-12-02T22:43:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:16 crc kubenswrapper[4696]: I1202 22:43:16.157143 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:16 crc kubenswrapper[4696]: I1202 22:43:16.157225 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:16 crc kubenswrapper[4696]: I1202 22:43:16.157245 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:16 crc kubenswrapper[4696]: I1202 22:43:16.157276 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:16 crc kubenswrapper[4696]: I1202 22:43:16.157296 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:16Z","lastTransitionTime":"2025-12-02T22:43:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:16 crc kubenswrapper[4696]: I1202 22:43:16.261232 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:16 crc kubenswrapper[4696]: I1202 22:43:16.261318 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:16 crc kubenswrapper[4696]: I1202 22:43:16.261339 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:16 crc kubenswrapper[4696]: I1202 22:43:16.261371 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:16 crc kubenswrapper[4696]: I1202 22:43:16.261398 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:16Z","lastTransitionTime":"2025-12-02T22:43:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:16 crc kubenswrapper[4696]: I1202 22:43:16.365618 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:16 crc kubenswrapper[4696]: I1202 22:43:16.365696 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:16 crc kubenswrapper[4696]: I1202 22:43:16.365788 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:16 crc kubenswrapper[4696]: I1202 22:43:16.365827 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:16 crc kubenswrapper[4696]: I1202 22:43:16.365849 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:16Z","lastTransitionTime":"2025-12-02T22:43:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:43:16 crc kubenswrapper[4696]: I1202 22:43:16.431417 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:43:16 crc kubenswrapper[4696]: I1202 22:43:16.431471 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:43:16 crc kubenswrapper[4696]: I1202 22:43:16.431493 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:43:16 crc kubenswrapper[4696]: I1202 22:43:16.431567 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-q9bfc" Dec 02 22:43:16 crc kubenswrapper[4696]: E1202 22:43:16.431646 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 22:43:16 crc kubenswrapper[4696]: E1202 22:43:16.431726 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 22:43:16 crc kubenswrapper[4696]: E1202 22:43:16.431862 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q9bfc" podUID="ad00195c-ef4c-4d9b-941c-d01ebc498593" Dec 02 22:43:16 crc kubenswrapper[4696]: E1202 22:43:16.432013 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 22:43:16 crc kubenswrapper[4696]: I1202 22:43:16.469290 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:16 crc kubenswrapper[4696]: I1202 22:43:16.469393 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:16 crc kubenswrapper[4696]: I1202 22:43:16.469420 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:16 crc kubenswrapper[4696]: I1202 22:43:16.469454 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:16 crc kubenswrapper[4696]: I1202 22:43:16.469474 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:16Z","lastTransitionTime":"2025-12-02T22:43:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:16 crc kubenswrapper[4696]: I1202 22:43:16.572945 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:16 crc kubenswrapper[4696]: I1202 22:43:16.573016 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:16 crc kubenswrapper[4696]: I1202 22:43:16.573038 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:16 crc kubenswrapper[4696]: I1202 22:43:16.573063 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:16 crc kubenswrapper[4696]: I1202 22:43:16.573084 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:16Z","lastTransitionTime":"2025-12-02T22:43:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:16 crc kubenswrapper[4696]: I1202 22:43:16.676590 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:16 crc kubenswrapper[4696]: I1202 22:43:16.676661 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:16 crc kubenswrapper[4696]: I1202 22:43:16.676682 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:16 crc kubenswrapper[4696]: I1202 22:43:16.676709 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:16 crc kubenswrapper[4696]: I1202 22:43:16.676726 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:16Z","lastTransitionTime":"2025-12-02T22:43:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:16 crc kubenswrapper[4696]: I1202 22:43:16.780401 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:16 crc kubenswrapper[4696]: I1202 22:43:16.780471 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:16 crc kubenswrapper[4696]: I1202 22:43:16.780489 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:16 crc kubenswrapper[4696]: I1202 22:43:16.780522 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:16 crc kubenswrapper[4696]: I1202 22:43:16.780545 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:16Z","lastTransitionTime":"2025-12-02T22:43:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:16 crc kubenswrapper[4696]: I1202 22:43:16.885034 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:16 crc kubenswrapper[4696]: I1202 22:43:16.885095 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:16 crc kubenswrapper[4696]: I1202 22:43:16.885113 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:16 crc kubenswrapper[4696]: I1202 22:43:16.885140 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:16 crc kubenswrapper[4696]: I1202 22:43:16.885160 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:16Z","lastTransitionTime":"2025-12-02T22:43:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:16 crc kubenswrapper[4696]: I1202 22:43:16.987810 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:16 crc kubenswrapper[4696]: I1202 22:43:16.987859 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:16 crc kubenswrapper[4696]: I1202 22:43:16.987871 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:16 crc kubenswrapper[4696]: I1202 22:43:16.987890 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:16 crc kubenswrapper[4696]: I1202 22:43:16.987902 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:16Z","lastTransitionTime":"2025-12-02T22:43:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:17 crc kubenswrapper[4696]: I1202 22:43:17.091436 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:17 crc kubenswrapper[4696]: I1202 22:43:17.091540 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:17 crc kubenswrapper[4696]: I1202 22:43:17.091561 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:17 crc kubenswrapper[4696]: I1202 22:43:17.091586 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:17 crc kubenswrapper[4696]: I1202 22:43:17.091604 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:17Z","lastTransitionTime":"2025-12-02T22:43:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:17 crc kubenswrapper[4696]: I1202 22:43:17.195160 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:17 crc kubenswrapper[4696]: I1202 22:43:17.195281 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:17 crc kubenswrapper[4696]: I1202 22:43:17.195303 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:17 crc kubenswrapper[4696]: I1202 22:43:17.195336 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:17 crc kubenswrapper[4696]: I1202 22:43:17.195359 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:17Z","lastTransitionTime":"2025-12-02T22:43:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:17 crc kubenswrapper[4696]: I1202 22:43:17.299174 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:17 crc kubenswrapper[4696]: I1202 22:43:17.299237 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:17 crc kubenswrapper[4696]: I1202 22:43:17.299255 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:17 crc kubenswrapper[4696]: I1202 22:43:17.299286 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:17 crc kubenswrapper[4696]: I1202 22:43:17.299302 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:17Z","lastTransitionTime":"2025-12-02T22:43:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:17 crc kubenswrapper[4696]: I1202 22:43:17.402420 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:17 crc kubenswrapper[4696]: I1202 22:43:17.402795 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:17 crc kubenswrapper[4696]: I1202 22:43:17.402862 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:17 crc kubenswrapper[4696]: I1202 22:43:17.402939 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:17 crc kubenswrapper[4696]: I1202 22:43:17.403031 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:17Z","lastTransitionTime":"2025-12-02T22:43:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:17 crc kubenswrapper[4696]: I1202 22:43:17.455591 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1db9cf23af7fc9ff1373908e11df0bea7d6e03cd3ed18947cc583fc09a79a05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:17Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:17 crc kubenswrapper[4696]: I1202 22:43:17.476951 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce2fa3867298545942c4a2d2e68899c61a4f99a64b0974e75ef86851074ddfce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-id
entity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4db552c4bc9b395f977cb4c1b1b7fcbb1c7e9f326d81a7903d88eab180bcfb3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:17Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:17 crc kubenswrapper[4696]: I1202 22:43:17.494241 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27fb05354e4e9a6ad09946fad7dbabf07456fe56a9a66578fda0eb71dde2d199\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T22:43:17Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:17 crc kubenswrapper[4696]: I1202 22:43:17.507623 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:17 crc kubenswrapper[4696]: I1202 22:43:17.507689 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:17 crc kubenswrapper[4696]: I1202 22:43:17.507711 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:17 crc kubenswrapper[4696]: I1202 22:43:17.507768 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:17 crc kubenswrapper[4696]: I1202 22:43:17.507791 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:17Z","lastTransitionTime":"2025-12-02T22:43:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:17 crc kubenswrapper[4696]: I1202 22:43:17.512913 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wthxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86a37d2a-37c5-4fbd-b10b-f5e4706772f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5ddd341a3a0b7b1d959ce57bb4d38153a4ca3ae5ec40ded309e0697db2be28d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xqg5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wthxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:17Z 
is after 2025-08-24T17:21:41Z" Dec 02 22:43:17 crc kubenswrapper[4696]: I1202 22:43:17.527359 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ttxcl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca48946e-a7e0-4729-8b02-b223a96990c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55cb8b25a5af0f7f16b7a91ab00cbe24ed8a8b81477e992de74699237caf7dab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},
{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkjwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e10c7f45d8003857b8a736f70eb3125f01996bacbcace4ad08bc703fbdeb62f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkjwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ttxcl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:17Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:17 crc kubenswrapper[4696]: I1202 22:43:17.558591 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3172221-5c4c-4409-b50f-5cd21d8b5030\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f25e7ebcca1d904cd0940a660fd97c6b4a3540fd63700abd0ad973e9f9c8f764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08fc65f2c8162f27bc40cd12b5801ca21c778313271f2cb5dd251ca4312842d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba390cca9997cf469ae88a4895532df556f0b947ce7b34ef7e1992ee236a2e16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0ccd6eaff0fa7ac496aadc31afffd6e7fee2776e9cfd1652af85fcec268d895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b2deed2650220f1d607cbb667b26879d3a8373146244feb486c882032f344b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a565cb3800c1a50f784c327719158de2f6f29d20a33670478b97c1c360e0f6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a565cb3800c1a50f784c327719158de2f6f29d20a33670478b97c1c360e0f6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-02T22:42:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29a2b9cc76fc45b6874d85b2ab92165a02804807837501ac60190c232bab928d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29a2b9cc76fc45b6874d85b2ab92165a02804807837501ac60190c232bab928d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fb3311ccf2a924605464d657193627798dad26a232fab0a2091fb2c387f3bd0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb3311ccf2a924605464d657193627798dad26a232fab0a2091fb2c387f3bd0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:17Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:17 crc kubenswrapper[4696]: I1202 22:43:17.585068 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"158d80a5-8d90-46bb-83e7-ab3263f6d28f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3112751f86a62634d02f281d467d3eeb27878e27c245871210cca37eaeb6e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce744ea765d5bbcb0a9926f439a0256774c16863344f23ddfb0074b4ccbdb4ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1108d710330b056e2228205b2012df070ffbd660701ddd577eafbf0e99c3bf9f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d92ce9d53d9367503c76c5fbb45f340a5ddd93f3c8e1738ac434fe7b0210d1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6268352b995d99d383ab500b5ccd80b0612cf33716a752aeb1d92cb775a1971e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"le observer\\\\nW1202 22:42:37.965183 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 22:42:37.965319 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 22:42:37.967400 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2369937012/tls.crt::/tmp/serving-cert-2369937012/tls.key\\\\\\\"\\\\nI1202 22:42:38.329833 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 22:42:38.331704 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 22:42:38.331724 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 22:42:38.331766 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 22:42:38.331782 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 22:42:38.336692 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 22:42:38.336734 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:42:38.336761 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:42:38.336769 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 22:42:38.336775 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 22:42:38.336779 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 22:42:38.336783 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 22:42:38.337216 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 22:42:38.338194 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://666f81c81a7cad67c01b2aebf2d05f162c33feea27f64c12f92761d6d2a6056e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055bef75732eec4908f0baf8a52cd3921fab03cf88e36d2e0d442344fb1af0c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://055bef75732eec4908f0baf8a52cd3921fab03cf88e36d2e0d442344fb1af0c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:17Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:17 crc kubenswrapper[4696]: I1202 22:43:17.601689 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"102b6bc9-3f5d-462f-9814-67699eea05ab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6abcdac2e562f88dfe0b9c02e2a2efb68c3b4d5438abfa08606b27dde0da76b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95c2f64bb4118ce11d7f91dd949ab6e377708029cb0b3c93bd650ac70ea2b650\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e41b58b3df7b93de109c3904416a2943c8dca8cc7e9edc628333d4da461412d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7321d8133c89c6a58d6e42e8df0ba1e4bbdf9cc06fd01054a25afd64934d855a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://7321d8133c89c6a58d6e42e8df0ba1e4bbdf9cc06fd01054a25afd64934d855a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:17Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:17 crc kubenswrapper[4696]: I1202 22:43:17.611059 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:17 crc kubenswrapper[4696]: I1202 22:43:17.611292 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:17 crc kubenswrapper[4696]: I1202 22:43:17.611453 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:17 crc kubenswrapper[4696]: I1202 22:43:17.611604 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:17 crc kubenswrapper[4696]: I1202 22:43:17.611776 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:17Z","lastTransitionTime":"2025-12-02T22:43:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:17 crc kubenswrapper[4696]: I1202 22:43:17.621871 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:17Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:17 crc kubenswrapper[4696]: I1202 22:43:17.635916 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f57qk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf8ea236-9be9-4eb2-904a-103c4c279f28\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3106b29a0eb0a17827fd331aefd6c00b0c58f4ec0d0ca778bd3f4842730d5f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vxprw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f57qk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:17Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:17 crc kubenswrapper[4696]: I1202 22:43:17.653014 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:17Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:17 crc kubenswrapper[4696]: I1202 22:43:17.670388 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:17Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:17 crc kubenswrapper[4696]: I1202 22:43:17.701929 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36feb7e9f6a6bad97e87fe54ff9b661be62f6f4962352d55bcc905b80320a48d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85fbce2a7ea0550f53c1216216022e60a4e2eae0bb1d5c88254ace025fbd4503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64b0c2025c377ab19b3853017c94cb501f472f2485093afc60749f7739ba25d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22ce112f515d4ef59dca4c21bdc8623e88a8a0df222fb2ed0990a2f68e3f75f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52f3d5529aafac7c27c963a50b0318e99d9a9ca35e142d4c273431105c495e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32717a82edb718f5e690fcecd578f48279d134a5dc12f00824fa25f7ec727156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1282d2e6c1634035b4869e4f9a9689f9e5f599a52a82e02e77a81c6df3e13b0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1282d2e6c1634035b4869e4f9a9689f9e5f599a52a82e02e77a81c6df3e13b0e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T22:43:13Z\\\",\\\"message\\\":\\\"t handler 2 for removal\\\\nI1202 22:43:13.297137 6337 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1202 22:43:13.297142 6337 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1202 22:43:13.297185 6337 handler.go:208] Removed *v1.Node event handler 2\\\\nI1202 22:43:13.297274 6337 handler.go:208] Removed *v1.Node event 
handler 7\\\\nI1202 22:43:13.297609 6337 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1202 22:43:13.297650 6337 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1202 22:43:13.297716 6337 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1202 22:43:13.297725 6337 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1202 22:43:13.297731 6337 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1202 22:43:13.297772 6337 factory.go:656] Stopping watch factory\\\\nI1202 22:43:13.297795 6337 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1202 22:43:13.297796 6337 ovnkube.go:599] Stopped ovnkube\\\\nI1202 22:43:13.297811 6337 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1202 22:43:13.297829 6337 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1202 22:43:13.297821 6337 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nF1202 22:43:13.297933 6337 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:43:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qb2zq_openshift-ovn-kubernetes(c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23b80358b654eec7dc1057ef35a6c09f88ccd7aca7a05336a3f29e7036bd55f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://492cfa3fb8646c09da06d92ec8a303562da558d4d944eaf68ca95de6f5b94da8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://492cfa3fb8646c09da
06d92ec8a303562da558d4d944eaf68ca95de6f5b94da8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qb2zq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:17Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:17 crc kubenswrapper[4696]: I1202 22:43:17.714289 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:17 crc kubenswrapper[4696]: I1202 22:43:17.714347 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:17 crc kubenswrapper[4696]: I1202 22:43:17.714359 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:17 crc kubenswrapper[4696]: I1202 22:43:17.714376 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:17 crc kubenswrapper[4696]: I1202 22:43:17.714388 4696 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:17Z","lastTransitionTime":"2025-12-02T22:43:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:43:17 crc kubenswrapper[4696]: I1202 22:43:17.721156 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q9bfc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad00195c-ef4c-4d9b-941c-d01ebc498593\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zz6sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zz6sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:54Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q9bfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:17Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:17 crc 
kubenswrapper[4696]: I1202 22:43:17.736027 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05940584-6baa-49e1-8959-610d50b6eeb5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b361514e74dd950d7c3e1da82304b7c75603829ce7bf03c1fbccab1607d65800\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67bf899df90f18f36310994af135552aa9758f0460117d62d00fdc1de067ce79\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d15263c6c1f8208a998cbacdba376a2eb3b2e16134e8fc40f7d56d36daae5ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2fa27b4eeb8b50da5d49c905eb98070362e998c3e3e4aceac7cc09aaef5b5ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:17Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:17 crc kubenswrapper[4696]: I1202 22:43:17.759365 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sbjst" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a0d758e-6da6-4382-99f1-dd295b63eb98\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bfe4ce91e801752a3eae0d5cfdb5d0e7a4997952987a3a32c10c0292db170f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://495f007b6af2d409e3ed6cf70dfa90aec5a38c34361c9da9c4fd71d7732cdd76\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://495f007b6af2d409e3ed6cf70dfa90aec5a38c34361c9da9c4fd71d7732cdd76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5b3b6699e2f0b9a8d0f32c1f97c9b3e16d9f5479ad2f355633c25897dbe1bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5b3b6699e2f0b9a8d0f32c1f97c9b3e16d9f5479ad2f355633c25897dbe1bb1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:42Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d4a10437554cfb08f90a48d457ef157309ed0ab379c5b26deb62d3e8b73291b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d4a10437554cfb08f90a48d457ef157309ed0ab379c5b26deb62d3e8b73291b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76b52
b0824f65df456ead0ca134ada0a6c3120382a3cce18f0a5dfdcc98fa9ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76b52b0824f65df456ead0ca134ada0a6c3120382a3cce18f0a5dfdcc98fa9ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d38fb242bd536b1359095c21c4df58b5a4c64058cfe986b38a581ae0c40a29d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d38fb242bd536b1359095c21c4df58b5a4c64058cfe986b38a581ae0c40a29d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:45Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f17767f87136d0916230872e59379331bb9e7f88f3648115905c6c0172f257cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f17767f87136d0916230872e59379331bb9e7f88f3648115905c6c0172f257cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sbjst\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:17Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:17 crc kubenswrapper[4696]: I1202 22:43:17.780052 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-chq65" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53353260-c7c9-435c-91eb-3d5a1b441c4a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2d1f154b9c9139bf991adf86bd99e7ff14510c24ae93b87c1c184437a087f17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7tfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa016430a0a36d0628e0e5b0c4baf2fa724c888116e5d70517c8ffc4be1c37a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7tfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-chq65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:17Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:17 crc kubenswrapper[4696]: 
I1202 22:43:17.796102 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rgk7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07db09ff-2489-4357-af38-aca9655ac1d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9928cbc6ed1d9bb5d0280a8d3264d4490c601fc7d09d6f293f1403615cc4cde5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdd8x\\\",\\\"readOnly
\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rgk7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:17Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:17 crc kubenswrapper[4696]: I1202 22:43:17.816392 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:17 crc kubenswrapper[4696]: I1202 22:43:17.816605 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:17 crc kubenswrapper[4696]: I1202 22:43:17.816809 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:17 crc kubenswrapper[4696]: I1202 22:43:17.816993 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:17 crc kubenswrapper[4696]: I1202 22:43:17.817157 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:17Z","lastTransitionTime":"2025-12-02T22:43:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:17 crc kubenswrapper[4696]: I1202 22:43:17.920712 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:17 crc kubenswrapper[4696]: I1202 22:43:17.920821 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:17 crc kubenswrapper[4696]: I1202 22:43:17.920847 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:17 crc kubenswrapper[4696]: I1202 22:43:17.920877 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:17 crc kubenswrapper[4696]: I1202 22:43:17.920899 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:17Z","lastTransitionTime":"2025-12-02T22:43:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:18 crc kubenswrapper[4696]: I1202 22:43:18.025958 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:18 crc kubenswrapper[4696]: I1202 22:43:18.026020 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:18 crc kubenswrapper[4696]: I1202 22:43:18.026033 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:18 crc kubenswrapper[4696]: I1202 22:43:18.026055 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:18 crc kubenswrapper[4696]: I1202 22:43:18.026069 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:18Z","lastTransitionTime":"2025-12-02T22:43:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:18 crc kubenswrapper[4696]: I1202 22:43:18.128934 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:18 crc kubenswrapper[4696]: I1202 22:43:18.129003 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:18 crc kubenswrapper[4696]: I1202 22:43:18.129023 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:18 crc kubenswrapper[4696]: I1202 22:43:18.129053 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:18 crc kubenswrapper[4696]: I1202 22:43:18.129074 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:18Z","lastTransitionTime":"2025-12-02T22:43:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:18 crc kubenswrapper[4696]: I1202 22:43:18.232005 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:18 crc kubenswrapper[4696]: I1202 22:43:18.232070 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:18 crc kubenswrapper[4696]: I1202 22:43:18.232085 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:18 crc kubenswrapper[4696]: I1202 22:43:18.232107 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:18 crc kubenswrapper[4696]: I1202 22:43:18.232127 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:18Z","lastTransitionTime":"2025-12-02T22:43:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:18 crc kubenswrapper[4696]: I1202 22:43:18.335507 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:18 crc kubenswrapper[4696]: I1202 22:43:18.335552 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:18 crc kubenswrapper[4696]: I1202 22:43:18.335564 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:18 crc kubenswrapper[4696]: I1202 22:43:18.335582 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:18 crc kubenswrapper[4696]: I1202 22:43:18.335592 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:18Z","lastTransitionTime":"2025-12-02T22:43:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:43:18 crc kubenswrapper[4696]: I1202 22:43:18.430721 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q9bfc" Dec 02 22:43:18 crc kubenswrapper[4696]: I1202 22:43:18.430831 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:43:18 crc kubenswrapper[4696]: E1202 22:43:18.430975 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-q9bfc" podUID="ad00195c-ef4c-4d9b-941c-d01ebc498593" Dec 02 22:43:18 crc kubenswrapper[4696]: I1202 22:43:18.431014 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:43:18 crc kubenswrapper[4696]: I1202 22:43:18.431051 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:43:18 crc kubenswrapper[4696]: E1202 22:43:18.431105 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 22:43:18 crc kubenswrapper[4696]: E1202 22:43:18.431356 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 22:43:18 crc kubenswrapper[4696]: E1202 22:43:18.431448 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 22:43:18 crc kubenswrapper[4696]: I1202 22:43:18.438823 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:18 crc kubenswrapper[4696]: I1202 22:43:18.438963 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:18 crc kubenswrapper[4696]: I1202 22:43:18.438987 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:18 crc kubenswrapper[4696]: I1202 22:43:18.439014 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:18 crc kubenswrapper[4696]: I1202 22:43:18.439033 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:18Z","lastTransitionTime":"2025-12-02T22:43:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:18 crc kubenswrapper[4696]: I1202 22:43:18.542477 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:18 crc kubenswrapper[4696]: I1202 22:43:18.542520 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:18 crc kubenswrapper[4696]: I1202 22:43:18.542548 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:18 crc kubenswrapper[4696]: I1202 22:43:18.542569 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:18 crc kubenswrapper[4696]: I1202 22:43:18.542585 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:18Z","lastTransitionTime":"2025-12-02T22:43:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:18 crc kubenswrapper[4696]: I1202 22:43:18.646961 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:18 crc kubenswrapper[4696]: I1202 22:43:18.647034 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:18 crc kubenswrapper[4696]: I1202 22:43:18.647052 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:18 crc kubenswrapper[4696]: I1202 22:43:18.647082 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:18 crc kubenswrapper[4696]: I1202 22:43:18.647102 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:18Z","lastTransitionTime":"2025-12-02T22:43:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:18 crc kubenswrapper[4696]: I1202 22:43:18.750195 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:18 crc kubenswrapper[4696]: I1202 22:43:18.750251 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:18 crc kubenswrapper[4696]: I1202 22:43:18.750262 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:18 crc kubenswrapper[4696]: I1202 22:43:18.750283 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:18 crc kubenswrapper[4696]: I1202 22:43:18.750299 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:18Z","lastTransitionTime":"2025-12-02T22:43:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:18 crc kubenswrapper[4696]: I1202 22:43:18.855153 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:18 crc kubenswrapper[4696]: I1202 22:43:18.855263 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:18 crc kubenswrapper[4696]: I1202 22:43:18.855294 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:18 crc kubenswrapper[4696]: I1202 22:43:18.855336 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:18 crc kubenswrapper[4696]: I1202 22:43:18.855376 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:18Z","lastTransitionTime":"2025-12-02T22:43:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:18 crc kubenswrapper[4696]: I1202 22:43:18.958511 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:18 crc kubenswrapper[4696]: I1202 22:43:18.958669 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:18 crc kubenswrapper[4696]: I1202 22:43:18.958694 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:18 crc kubenswrapper[4696]: I1202 22:43:18.958723 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:18 crc kubenswrapper[4696]: I1202 22:43:18.958783 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:18Z","lastTransitionTime":"2025-12-02T22:43:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:19 crc kubenswrapper[4696]: I1202 22:43:19.061916 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:19 crc kubenswrapper[4696]: I1202 22:43:19.061984 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:19 crc kubenswrapper[4696]: I1202 22:43:19.062002 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:19 crc kubenswrapper[4696]: I1202 22:43:19.062030 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:19 crc kubenswrapper[4696]: I1202 22:43:19.062050 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:19Z","lastTransitionTime":"2025-12-02T22:43:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:19 crc kubenswrapper[4696]: I1202 22:43:19.165154 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:19 crc kubenswrapper[4696]: I1202 22:43:19.165256 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:19 crc kubenswrapper[4696]: I1202 22:43:19.165280 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:19 crc kubenswrapper[4696]: I1202 22:43:19.165313 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:19 crc kubenswrapper[4696]: I1202 22:43:19.165337 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:19Z","lastTransitionTime":"2025-12-02T22:43:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:19 crc kubenswrapper[4696]: I1202 22:43:19.269136 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:19 crc kubenswrapper[4696]: I1202 22:43:19.269231 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:19 crc kubenswrapper[4696]: I1202 22:43:19.269256 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:19 crc kubenswrapper[4696]: I1202 22:43:19.269290 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:19 crc kubenswrapper[4696]: I1202 22:43:19.269315 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:19Z","lastTransitionTime":"2025-12-02T22:43:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:19 crc kubenswrapper[4696]: I1202 22:43:19.373282 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:19 crc kubenswrapper[4696]: I1202 22:43:19.373320 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:19 crc kubenswrapper[4696]: I1202 22:43:19.373328 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:19 crc kubenswrapper[4696]: I1202 22:43:19.373342 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:19 crc kubenswrapper[4696]: I1202 22:43:19.373353 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:19Z","lastTransitionTime":"2025-12-02T22:43:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:19 crc kubenswrapper[4696]: I1202 22:43:19.475783 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:19 crc kubenswrapper[4696]: I1202 22:43:19.475881 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:19 crc kubenswrapper[4696]: I1202 22:43:19.475900 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:19 crc kubenswrapper[4696]: I1202 22:43:19.475927 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:19 crc kubenswrapper[4696]: I1202 22:43:19.475946 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:19Z","lastTransitionTime":"2025-12-02T22:43:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:19 crc kubenswrapper[4696]: I1202 22:43:19.578986 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:19 crc kubenswrapper[4696]: I1202 22:43:19.579056 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:19 crc kubenswrapper[4696]: I1202 22:43:19.579076 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:19 crc kubenswrapper[4696]: I1202 22:43:19.579107 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:19 crc kubenswrapper[4696]: I1202 22:43:19.579134 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:19Z","lastTransitionTime":"2025-12-02T22:43:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:19 crc kubenswrapper[4696]: I1202 22:43:19.682869 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:19 crc kubenswrapper[4696]: I1202 22:43:19.682949 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:19 crc kubenswrapper[4696]: I1202 22:43:19.682962 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:19 crc kubenswrapper[4696]: I1202 22:43:19.682989 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:19 crc kubenswrapper[4696]: I1202 22:43:19.683002 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:19Z","lastTransitionTime":"2025-12-02T22:43:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:19 crc kubenswrapper[4696]: I1202 22:43:19.787007 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:19 crc kubenswrapper[4696]: I1202 22:43:19.787075 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:19 crc kubenswrapper[4696]: I1202 22:43:19.787092 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:19 crc kubenswrapper[4696]: I1202 22:43:19.787119 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:19 crc kubenswrapper[4696]: I1202 22:43:19.787138 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:19Z","lastTransitionTime":"2025-12-02T22:43:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:19 crc kubenswrapper[4696]: I1202 22:43:19.890782 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:19 crc kubenswrapper[4696]: I1202 22:43:19.890849 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:19 crc kubenswrapper[4696]: I1202 22:43:19.890867 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:19 crc kubenswrapper[4696]: I1202 22:43:19.890896 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:19 crc kubenswrapper[4696]: I1202 22:43:19.890914 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:19Z","lastTransitionTime":"2025-12-02T22:43:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:19 crc kubenswrapper[4696]: I1202 22:43:19.994032 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:19 crc kubenswrapper[4696]: I1202 22:43:19.994109 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:19 crc kubenswrapper[4696]: I1202 22:43:19.994130 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:19 crc kubenswrapper[4696]: I1202 22:43:19.994159 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:19 crc kubenswrapper[4696]: I1202 22:43:19.994180 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:19Z","lastTransitionTime":"2025-12-02T22:43:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:20 crc kubenswrapper[4696]: I1202 22:43:20.098081 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:20 crc kubenswrapper[4696]: I1202 22:43:20.098146 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:20 crc kubenswrapper[4696]: I1202 22:43:20.098167 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:20 crc kubenswrapper[4696]: I1202 22:43:20.098197 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:20 crc kubenswrapper[4696]: I1202 22:43:20.098216 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:20Z","lastTransitionTime":"2025-12-02T22:43:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:20 crc kubenswrapper[4696]: I1202 22:43:20.200922 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:20 crc kubenswrapper[4696]: I1202 22:43:20.200975 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:20 crc kubenswrapper[4696]: I1202 22:43:20.200991 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:20 crc kubenswrapper[4696]: I1202 22:43:20.201015 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:20 crc kubenswrapper[4696]: I1202 22:43:20.201033 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:20Z","lastTransitionTime":"2025-12-02T22:43:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:20 crc kubenswrapper[4696]: I1202 22:43:20.304614 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:20 crc kubenswrapper[4696]: I1202 22:43:20.304703 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:20 crc kubenswrapper[4696]: I1202 22:43:20.304715 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:20 crc kubenswrapper[4696]: I1202 22:43:20.304816 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:20 crc kubenswrapper[4696]: I1202 22:43:20.305047 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:20Z","lastTransitionTime":"2025-12-02T22:43:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:20 crc kubenswrapper[4696]: I1202 22:43:20.407868 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:20 crc kubenswrapper[4696]: I1202 22:43:20.407910 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:20 crc kubenswrapper[4696]: I1202 22:43:20.407923 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:20 crc kubenswrapper[4696]: I1202 22:43:20.407945 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:20 crc kubenswrapper[4696]: I1202 22:43:20.407958 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:20Z","lastTransitionTime":"2025-12-02T22:43:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:43:20 crc kubenswrapper[4696]: I1202 22:43:20.431169 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:43:20 crc kubenswrapper[4696]: I1202 22:43:20.431234 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:43:20 crc kubenswrapper[4696]: I1202 22:43:20.431300 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q9bfc" Dec 02 22:43:20 crc kubenswrapper[4696]: I1202 22:43:20.431376 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:43:20 crc kubenswrapper[4696]: E1202 22:43:20.431416 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 22:43:20 crc kubenswrapper[4696]: E1202 22:43:20.431594 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q9bfc" podUID="ad00195c-ef4c-4d9b-941c-d01ebc498593" Dec 02 22:43:20 crc kubenswrapper[4696]: E1202 22:43:20.431678 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 22:43:20 crc kubenswrapper[4696]: E1202 22:43:20.431922 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 22:43:20 crc kubenswrapper[4696]: I1202 22:43:20.510705 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:20 crc kubenswrapper[4696]: I1202 22:43:20.510779 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:20 crc kubenswrapper[4696]: I1202 22:43:20.510791 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:20 crc kubenswrapper[4696]: I1202 22:43:20.510810 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:20 crc kubenswrapper[4696]: I1202 22:43:20.510824 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:20Z","lastTransitionTime":"2025-12-02T22:43:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:20 crc kubenswrapper[4696]: I1202 22:43:20.614791 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:20 crc kubenswrapper[4696]: I1202 22:43:20.614858 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:20 crc kubenswrapper[4696]: I1202 22:43:20.614882 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:20 crc kubenswrapper[4696]: I1202 22:43:20.614914 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:20 crc kubenswrapper[4696]: I1202 22:43:20.614936 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:20Z","lastTransitionTime":"2025-12-02T22:43:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:20 crc kubenswrapper[4696]: I1202 22:43:20.717709 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:20 crc kubenswrapper[4696]: I1202 22:43:20.717784 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:20 crc kubenswrapper[4696]: I1202 22:43:20.717798 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:20 crc kubenswrapper[4696]: I1202 22:43:20.717820 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:20 crc kubenswrapper[4696]: I1202 22:43:20.717834 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:20Z","lastTransitionTime":"2025-12-02T22:43:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:20 crc kubenswrapper[4696]: I1202 22:43:20.822381 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:20 crc kubenswrapper[4696]: I1202 22:43:20.822449 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:20 crc kubenswrapper[4696]: I1202 22:43:20.822462 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:20 crc kubenswrapper[4696]: I1202 22:43:20.822485 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:20 crc kubenswrapper[4696]: I1202 22:43:20.822506 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:20Z","lastTransitionTime":"2025-12-02T22:43:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:20 crc kubenswrapper[4696]: I1202 22:43:20.925402 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:20 crc kubenswrapper[4696]: I1202 22:43:20.925459 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:20 crc kubenswrapper[4696]: I1202 22:43:20.925471 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:20 crc kubenswrapper[4696]: I1202 22:43:20.925490 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:20 crc kubenswrapper[4696]: I1202 22:43:20.925502 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:20Z","lastTransitionTime":"2025-12-02T22:43:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:21 crc kubenswrapper[4696]: I1202 22:43:21.028508 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:21 crc kubenswrapper[4696]: I1202 22:43:21.028555 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:21 crc kubenswrapper[4696]: I1202 22:43:21.028565 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:21 crc kubenswrapper[4696]: I1202 22:43:21.028584 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:21 crc kubenswrapper[4696]: I1202 22:43:21.028595 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:21Z","lastTransitionTime":"2025-12-02T22:43:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:21 crc kubenswrapper[4696]: I1202 22:43:21.131762 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:21 crc kubenswrapper[4696]: I1202 22:43:21.131884 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:21 crc kubenswrapper[4696]: I1202 22:43:21.131904 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:21 crc kubenswrapper[4696]: I1202 22:43:21.131934 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:21 crc kubenswrapper[4696]: I1202 22:43:21.131952 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:21Z","lastTransitionTime":"2025-12-02T22:43:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:21 crc kubenswrapper[4696]: I1202 22:43:21.235203 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:21 crc kubenswrapper[4696]: I1202 22:43:21.235248 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:21 crc kubenswrapper[4696]: I1202 22:43:21.235258 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:21 crc kubenswrapper[4696]: I1202 22:43:21.235278 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:21 crc kubenswrapper[4696]: I1202 22:43:21.235287 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:21Z","lastTransitionTime":"2025-12-02T22:43:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:21 crc kubenswrapper[4696]: I1202 22:43:21.338303 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:21 crc kubenswrapper[4696]: I1202 22:43:21.338346 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:21 crc kubenswrapper[4696]: I1202 22:43:21.338357 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:21 crc kubenswrapper[4696]: I1202 22:43:21.338374 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:21 crc kubenswrapper[4696]: I1202 22:43:21.338388 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:21Z","lastTransitionTime":"2025-12-02T22:43:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:21 crc kubenswrapper[4696]: I1202 22:43:21.440810 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:21 crc kubenswrapper[4696]: I1202 22:43:21.440856 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:21 crc kubenswrapper[4696]: I1202 22:43:21.440873 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:21 crc kubenswrapper[4696]: I1202 22:43:21.440887 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:21 crc kubenswrapper[4696]: I1202 22:43:21.440900 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:21Z","lastTransitionTime":"2025-12-02T22:43:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:21 crc kubenswrapper[4696]: I1202 22:43:21.543531 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:21 crc kubenswrapper[4696]: I1202 22:43:21.543574 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:21 crc kubenswrapper[4696]: I1202 22:43:21.543590 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:21 crc kubenswrapper[4696]: I1202 22:43:21.543611 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:21 crc kubenswrapper[4696]: I1202 22:43:21.543631 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:21Z","lastTransitionTime":"2025-12-02T22:43:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:21 crc kubenswrapper[4696]: I1202 22:43:21.646647 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:21 crc kubenswrapper[4696]: I1202 22:43:21.646702 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:21 crc kubenswrapper[4696]: I1202 22:43:21.646713 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:21 crc kubenswrapper[4696]: I1202 22:43:21.646733 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:21 crc kubenswrapper[4696]: I1202 22:43:21.646766 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:21Z","lastTransitionTime":"2025-12-02T22:43:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:21 crc kubenswrapper[4696]: I1202 22:43:21.749298 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:21 crc kubenswrapper[4696]: I1202 22:43:21.749375 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:21 crc kubenswrapper[4696]: I1202 22:43:21.749385 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:21 crc kubenswrapper[4696]: I1202 22:43:21.749405 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:21 crc kubenswrapper[4696]: I1202 22:43:21.749419 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:21Z","lastTransitionTime":"2025-12-02T22:43:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:21 crc kubenswrapper[4696]: I1202 22:43:21.852392 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:21 crc kubenswrapper[4696]: I1202 22:43:21.852435 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:21 crc kubenswrapper[4696]: I1202 22:43:21.852447 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:21 crc kubenswrapper[4696]: I1202 22:43:21.852467 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:21 crc kubenswrapper[4696]: I1202 22:43:21.852479 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:21Z","lastTransitionTime":"2025-12-02T22:43:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:21 crc kubenswrapper[4696]: I1202 22:43:21.955459 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:21 crc kubenswrapper[4696]: I1202 22:43:21.955585 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:21 crc kubenswrapper[4696]: I1202 22:43:21.955598 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:21 crc kubenswrapper[4696]: I1202 22:43:21.955617 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:21 crc kubenswrapper[4696]: I1202 22:43:21.955628 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:21Z","lastTransitionTime":"2025-12-02T22:43:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:22 crc kubenswrapper[4696]: I1202 22:43:22.059663 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:22 crc kubenswrapper[4696]: I1202 22:43:22.060022 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:22 crc kubenswrapper[4696]: I1202 22:43:22.060075 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:22 crc kubenswrapper[4696]: I1202 22:43:22.060102 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:22 crc kubenswrapper[4696]: I1202 22:43:22.060405 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:22Z","lastTransitionTime":"2025-12-02T22:43:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:22 crc kubenswrapper[4696]: I1202 22:43:22.163340 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:22 crc kubenswrapper[4696]: I1202 22:43:22.163403 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:22 crc kubenswrapper[4696]: I1202 22:43:22.163417 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:22 crc kubenswrapper[4696]: I1202 22:43:22.163440 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:22 crc kubenswrapper[4696]: I1202 22:43:22.163453 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:22Z","lastTransitionTime":"2025-12-02T22:43:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:22 crc kubenswrapper[4696]: I1202 22:43:22.266268 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:22 crc kubenswrapper[4696]: I1202 22:43:22.266330 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:22 crc kubenswrapper[4696]: I1202 22:43:22.266352 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:22 crc kubenswrapper[4696]: I1202 22:43:22.266378 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:22 crc kubenswrapper[4696]: I1202 22:43:22.266395 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:22Z","lastTransitionTime":"2025-12-02T22:43:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:22 crc kubenswrapper[4696]: I1202 22:43:22.370056 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:22 crc kubenswrapper[4696]: I1202 22:43:22.370132 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:22 crc kubenswrapper[4696]: I1202 22:43:22.370150 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:22 crc kubenswrapper[4696]: I1202 22:43:22.370184 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:22 crc kubenswrapper[4696]: I1202 22:43:22.370206 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:22Z","lastTransitionTime":"2025-12-02T22:43:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:43:22 crc kubenswrapper[4696]: I1202 22:43:22.430651 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:43:22 crc kubenswrapper[4696]: I1202 22:43:22.430730 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q9bfc" Dec 02 22:43:22 crc kubenswrapper[4696]: I1202 22:43:22.430889 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:43:22 crc kubenswrapper[4696]: I1202 22:43:22.431018 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:43:22 crc kubenswrapper[4696]: E1202 22:43:22.431000 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 22:43:22 crc kubenswrapper[4696]: E1202 22:43:22.431217 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 22:43:22 crc kubenswrapper[4696]: E1202 22:43:22.431370 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q9bfc" podUID="ad00195c-ef4c-4d9b-941c-d01ebc498593" Dec 02 22:43:22 crc kubenswrapper[4696]: E1202 22:43:22.431576 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 22:43:22 crc kubenswrapper[4696]: I1202 22:43:22.473560 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:22 crc kubenswrapper[4696]: I1202 22:43:22.473900 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:22 crc kubenswrapper[4696]: I1202 22:43:22.473978 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:22 crc kubenswrapper[4696]: I1202 22:43:22.474060 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:22 crc kubenswrapper[4696]: I1202 22:43:22.474147 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:22Z","lastTransitionTime":"2025-12-02T22:43:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:22 crc kubenswrapper[4696]: I1202 22:43:22.577492 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:22 crc kubenswrapper[4696]: I1202 22:43:22.577566 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:22 crc kubenswrapper[4696]: I1202 22:43:22.577583 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:22 crc kubenswrapper[4696]: I1202 22:43:22.577608 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:22 crc kubenswrapper[4696]: I1202 22:43:22.577624 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:22Z","lastTransitionTime":"2025-12-02T22:43:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:22 crc kubenswrapper[4696]: I1202 22:43:22.680458 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:22 crc kubenswrapper[4696]: I1202 22:43:22.680529 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:22 crc kubenswrapper[4696]: I1202 22:43:22.680544 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:22 crc kubenswrapper[4696]: I1202 22:43:22.680567 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:22 crc kubenswrapper[4696]: I1202 22:43:22.680582 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:22Z","lastTransitionTime":"2025-12-02T22:43:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:22 crc kubenswrapper[4696]: I1202 22:43:22.783917 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:22 crc kubenswrapper[4696]: I1202 22:43:22.783966 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:22 crc kubenswrapper[4696]: I1202 22:43:22.783975 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:22 crc kubenswrapper[4696]: I1202 22:43:22.783993 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:22 crc kubenswrapper[4696]: I1202 22:43:22.784005 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:22Z","lastTransitionTime":"2025-12-02T22:43:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:22 crc kubenswrapper[4696]: I1202 22:43:22.886613 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:22 crc kubenswrapper[4696]: I1202 22:43:22.886956 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:22 crc kubenswrapper[4696]: I1202 22:43:22.887044 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:22 crc kubenswrapper[4696]: I1202 22:43:22.887118 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:22 crc kubenswrapper[4696]: I1202 22:43:22.887200 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:22Z","lastTransitionTime":"2025-12-02T22:43:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:22 crc kubenswrapper[4696]: I1202 22:43:22.990577 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:22 crc kubenswrapper[4696]: I1202 22:43:22.990631 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:22 crc kubenswrapper[4696]: I1202 22:43:22.990643 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:22 crc kubenswrapper[4696]: I1202 22:43:22.990664 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:22 crc kubenswrapper[4696]: I1202 22:43:22.990677 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:22Z","lastTransitionTime":"2025-12-02T22:43:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:23 crc kubenswrapper[4696]: I1202 22:43:23.093180 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:23 crc kubenswrapper[4696]: I1202 22:43:23.093227 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:23 crc kubenswrapper[4696]: I1202 22:43:23.093237 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:23 crc kubenswrapper[4696]: I1202 22:43:23.093253 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:23 crc kubenswrapper[4696]: I1202 22:43:23.093264 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:23Z","lastTransitionTime":"2025-12-02T22:43:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:23 crc kubenswrapper[4696]: I1202 22:43:23.197053 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:23 crc kubenswrapper[4696]: I1202 22:43:23.197128 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:23 crc kubenswrapper[4696]: I1202 22:43:23.197142 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:23 crc kubenswrapper[4696]: I1202 22:43:23.197164 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:23 crc kubenswrapper[4696]: I1202 22:43:23.197199 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:23Z","lastTransitionTime":"2025-12-02T22:43:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:23 crc kubenswrapper[4696]: I1202 22:43:23.300008 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:23 crc kubenswrapper[4696]: I1202 22:43:23.300049 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:23 crc kubenswrapper[4696]: I1202 22:43:23.300060 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:23 crc kubenswrapper[4696]: I1202 22:43:23.300076 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:23 crc kubenswrapper[4696]: I1202 22:43:23.300086 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:23Z","lastTransitionTime":"2025-12-02T22:43:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:23 crc kubenswrapper[4696]: I1202 22:43:23.403154 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:23 crc kubenswrapper[4696]: I1202 22:43:23.403253 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:23 crc kubenswrapper[4696]: I1202 22:43:23.403275 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:23 crc kubenswrapper[4696]: I1202 22:43:23.403296 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:23 crc kubenswrapper[4696]: I1202 22:43:23.403311 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:23Z","lastTransitionTime":"2025-12-02T22:43:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:23 crc kubenswrapper[4696]: I1202 22:43:23.506006 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:23 crc kubenswrapper[4696]: I1202 22:43:23.506060 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:23 crc kubenswrapper[4696]: I1202 22:43:23.506071 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:23 crc kubenswrapper[4696]: I1202 22:43:23.506092 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:23 crc kubenswrapper[4696]: I1202 22:43:23.506108 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:23Z","lastTransitionTime":"2025-12-02T22:43:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:23 crc kubenswrapper[4696]: I1202 22:43:23.609321 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:23 crc kubenswrapper[4696]: I1202 22:43:23.609421 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:23 crc kubenswrapper[4696]: I1202 22:43:23.609439 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:23 crc kubenswrapper[4696]: I1202 22:43:23.609469 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:23 crc kubenswrapper[4696]: I1202 22:43:23.609491 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:23Z","lastTransitionTime":"2025-12-02T22:43:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:23 crc kubenswrapper[4696]: I1202 22:43:23.712030 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:23 crc kubenswrapper[4696]: I1202 22:43:23.712077 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:23 crc kubenswrapper[4696]: I1202 22:43:23.712088 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:23 crc kubenswrapper[4696]: I1202 22:43:23.712107 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:23 crc kubenswrapper[4696]: I1202 22:43:23.712119 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:23Z","lastTransitionTime":"2025-12-02T22:43:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:23 crc kubenswrapper[4696]: I1202 22:43:23.815113 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:23 crc kubenswrapper[4696]: I1202 22:43:23.815160 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:23 crc kubenswrapper[4696]: I1202 22:43:23.815172 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:23 crc kubenswrapper[4696]: I1202 22:43:23.815189 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:23 crc kubenswrapper[4696]: I1202 22:43:23.815201 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:23Z","lastTransitionTime":"2025-12-02T22:43:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:23 crc kubenswrapper[4696]: I1202 22:43:23.918219 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:23 crc kubenswrapper[4696]: I1202 22:43:23.918275 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:23 crc kubenswrapper[4696]: I1202 22:43:23.918294 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:23 crc kubenswrapper[4696]: I1202 22:43:23.918320 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:23 crc kubenswrapper[4696]: I1202 22:43:23.918337 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:23Z","lastTransitionTime":"2025-12-02T22:43:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:24 crc kubenswrapper[4696]: I1202 22:43:24.021381 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:24 crc kubenswrapper[4696]: I1202 22:43:24.021447 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:24 crc kubenswrapper[4696]: I1202 22:43:24.021462 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:24 crc kubenswrapper[4696]: I1202 22:43:24.021481 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:24 crc kubenswrapper[4696]: I1202 22:43:24.021495 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:24Z","lastTransitionTime":"2025-12-02T22:43:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:24 crc kubenswrapper[4696]: I1202 22:43:24.123500 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:24 crc kubenswrapper[4696]: I1202 22:43:24.123569 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:24 crc kubenswrapper[4696]: I1202 22:43:24.123593 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:24 crc kubenswrapper[4696]: I1202 22:43:24.123617 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:24 crc kubenswrapper[4696]: I1202 22:43:24.123635 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:24Z","lastTransitionTime":"2025-12-02T22:43:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:24 crc kubenswrapper[4696]: I1202 22:43:24.226687 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:24 crc kubenswrapper[4696]: I1202 22:43:24.226736 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:24 crc kubenswrapper[4696]: I1202 22:43:24.226770 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:24 crc kubenswrapper[4696]: I1202 22:43:24.226787 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:24 crc kubenswrapper[4696]: I1202 22:43:24.226796 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:24Z","lastTransitionTime":"2025-12-02T22:43:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:24 crc kubenswrapper[4696]: I1202 22:43:24.329478 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:24 crc kubenswrapper[4696]: I1202 22:43:24.329548 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:24 crc kubenswrapper[4696]: I1202 22:43:24.329563 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:24 crc kubenswrapper[4696]: I1202 22:43:24.329587 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:24 crc kubenswrapper[4696]: I1202 22:43:24.329601 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:24Z","lastTransitionTime":"2025-12-02T22:43:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:43:24 crc kubenswrapper[4696]: I1202 22:43:24.430643 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:43:24 crc kubenswrapper[4696]: E1202 22:43:24.430841 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 22:43:24 crc kubenswrapper[4696]: I1202 22:43:24.430962 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q9bfc" Dec 02 22:43:24 crc kubenswrapper[4696]: E1202 22:43:24.431146 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q9bfc" podUID="ad00195c-ef4c-4d9b-941c-d01ebc498593" Dec 02 22:43:24 crc kubenswrapper[4696]: I1202 22:43:24.430982 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:43:24 crc kubenswrapper[4696]: E1202 22:43:24.431346 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 22:43:24 crc kubenswrapper[4696]: I1202 22:43:24.431571 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:43:24 crc kubenswrapper[4696]: E1202 22:43:24.431703 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 22:43:24 crc kubenswrapper[4696]: I1202 22:43:24.432791 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:24 crc kubenswrapper[4696]: I1202 22:43:24.432831 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:24 crc kubenswrapper[4696]: I1202 22:43:24.432841 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:24 crc kubenswrapper[4696]: I1202 22:43:24.432862 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:24 crc kubenswrapper[4696]: I1202 22:43:24.432874 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:24Z","lastTransitionTime":"2025-12-02T22:43:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:24 crc kubenswrapper[4696]: I1202 22:43:24.534984 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:24 crc kubenswrapper[4696]: I1202 22:43:24.535035 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:24 crc kubenswrapper[4696]: I1202 22:43:24.535047 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:24 crc kubenswrapper[4696]: I1202 22:43:24.535066 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:24 crc kubenswrapper[4696]: I1202 22:43:24.535080 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:24Z","lastTransitionTime":"2025-12-02T22:43:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:24 crc kubenswrapper[4696]: I1202 22:43:24.638378 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:24 crc kubenswrapper[4696]: I1202 22:43:24.638429 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:24 crc kubenswrapper[4696]: I1202 22:43:24.638439 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:24 crc kubenswrapper[4696]: I1202 22:43:24.638455 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:24 crc kubenswrapper[4696]: I1202 22:43:24.638465 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:24Z","lastTransitionTime":"2025-12-02T22:43:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:24 crc kubenswrapper[4696]: I1202 22:43:24.741130 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:24 crc kubenswrapper[4696]: I1202 22:43:24.741210 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:24 crc kubenswrapper[4696]: I1202 22:43:24.741229 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:24 crc kubenswrapper[4696]: I1202 22:43:24.741257 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:24 crc kubenswrapper[4696]: I1202 22:43:24.741276 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:24Z","lastTransitionTime":"2025-12-02T22:43:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:24 crc kubenswrapper[4696]: I1202 22:43:24.849096 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:24 crc kubenswrapper[4696]: I1202 22:43:24.849156 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:24 crc kubenswrapper[4696]: I1202 22:43:24.849173 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:24 crc kubenswrapper[4696]: I1202 22:43:24.849198 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:24 crc kubenswrapper[4696]: I1202 22:43:24.849215 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:24Z","lastTransitionTime":"2025-12-02T22:43:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:24 crc kubenswrapper[4696]: I1202 22:43:24.953306 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:24 crc kubenswrapper[4696]: I1202 22:43:24.953388 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:24 crc kubenswrapper[4696]: I1202 22:43:24.953400 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:24 crc kubenswrapper[4696]: I1202 22:43:24.953425 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:24 crc kubenswrapper[4696]: I1202 22:43:24.953448 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:24Z","lastTransitionTime":"2025-12-02T22:43:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:25 crc kubenswrapper[4696]: I1202 22:43:25.056259 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:25 crc kubenswrapper[4696]: I1202 22:43:25.056301 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:25 crc kubenswrapper[4696]: I1202 22:43:25.056311 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:25 crc kubenswrapper[4696]: I1202 22:43:25.056328 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:25 crc kubenswrapper[4696]: I1202 22:43:25.056340 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:25Z","lastTransitionTime":"2025-12-02T22:43:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:25 crc kubenswrapper[4696]: I1202 22:43:25.159284 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:25 crc kubenswrapper[4696]: I1202 22:43:25.159328 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:25 crc kubenswrapper[4696]: I1202 22:43:25.159346 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:25 crc kubenswrapper[4696]: I1202 22:43:25.159373 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:25 crc kubenswrapper[4696]: I1202 22:43:25.159389 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:25Z","lastTransitionTime":"2025-12-02T22:43:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:25 crc kubenswrapper[4696]: I1202 22:43:25.262686 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:25 crc kubenswrapper[4696]: I1202 22:43:25.262809 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:25 crc kubenswrapper[4696]: I1202 22:43:25.262836 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:25 crc kubenswrapper[4696]: I1202 22:43:25.262873 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:25 crc kubenswrapper[4696]: I1202 22:43:25.262915 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:25Z","lastTransitionTime":"2025-12-02T22:43:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:25 crc kubenswrapper[4696]: I1202 22:43:25.365757 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:25 crc kubenswrapper[4696]: I1202 22:43:25.365797 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:25 crc kubenswrapper[4696]: I1202 22:43:25.365807 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:25 crc kubenswrapper[4696]: I1202 22:43:25.365827 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:25 crc kubenswrapper[4696]: I1202 22:43:25.365842 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:25Z","lastTransitionTime":"2025-12-02T22:43:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:25 crc kubenswrapper[4696]: I1202 22:43:25.467425 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:25 crc kubenswrapper[4696]: I1202 22:43:25.467472 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:25 crc kubenswrapper[4696]: I1202 22:43:25.467484 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:25 crc kubenswrapper[4696]: I1202 22:43:25.467504 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:25 crc kubenswrapper[4696]: I1202 22:43:25.467517 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:25Z","lastTransitionTime":"2025-12-02T22:43:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:25 crc kubenswrapper[4696]: I1202 22:43:25.570836 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:25 crc kubenswrapper[4696]: I1202 22:43:25.570879 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:25 crc kubenswrapper[4696]: I1202 22:43:25.570887 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:25 crc kubenswrapper[4696]: I1202 22:43:25.570903 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:25 crc kubenswrapper[4696]: I1202 22:43:25.570914 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:25Z","lastTransitionTime":"2025-12-02T22:43:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:25 crc kubenswrapper[4696]: I1202 22:43:25.673724 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:25 crc kubenswrapper[4696]: I1202 22:43:25.673792 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:25 crc kubenswrapper[4696]: I1202 22:43:25.673804 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:25 crc kubenswrapper[4696]: I1202 22:43:25.673822 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:25 crc kubenswrapper[4696]: I1202 22:43:25.673835 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:25Z","lastTransitionTime":"2025-12-02T22:43:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:25 crc kubenswrapper[4696]: I1202 22:43:25.776601 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:25 crc kubenswrapper[4696]: I1202 22:43:25.776645 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:25 crc kubenswrapper[4696]: I1202 22:43:25.776660 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:25 crc kubenswrapper[4696]: I1202 22:43:25.776678 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:25 crc kubenswrapper[4696]: I1202 22:43:25.776692 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:25Z","lastTransitionTime":"2025-12-02T22:43:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:25 crc kubenswrapper[4696]: I1202 22:43:25.879461 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:25 crc kubenswrapper[4696]: I1202 22:43:25.879538 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:25 crc kubenswrapper[4696]: I1202 22:43:25.879561 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:25 crc kubenswrapper[4696]: I1202 22:43:25.879591 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:25 crc kubenswrapper[4696]: I1202 22:43:25.879611 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:25Z","lastTransitionTime":"2025-12-02T22:43:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:25 crc kubenswrapper[4696]: I1202 22:43:25.912012 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:25 crc kubenswrapper[4696]: I1202 22:43:25.912067 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:25 crc kubenswrapper[4696]: I1202 22:43:25.912081 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:25 crc kubenswrapper[4696]: I1202 22:43:25.912103 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:25 crc kubenswrapper[4696]: I1202 22:43:25.912135 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:25Z","lastTransitionTime":"2025-12-02T22:43:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:25 crc kubenswrapper[4696]: E1202 22:43:25.933956 4696 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:43:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:43:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:43:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:43:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b680025d-da08-4b46-a4a4-b21ac19e4f7b\\\",\\\"systemUUID\\\":\\\"be4c1bf2-c508-4b46-be45-7efaea566193\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:25Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:25 crc kubenswrapper[4696]: I1202 22:43:25.939453 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:25 crc kubenswrapper[4696]: I1202 22:43:25.939506 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:25 crc kubenswrapper[4696]: I1202 22:43:25.939522 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:25 crc kubenswrapper[4696]: I1202 22:43:25.939543 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:25 crc kubenswrapper[4696]: I1202 22:43:25.939558 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:25Z","lastTransitionTime":"2025-12-02T22:43:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:25 crc kubenswrapper[4696]: E1202 22:43:25.957221 4696 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:43:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:43:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:43:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:43:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b680025d-da08-4b46-a4a4-b21ac19e4f7b\\\",\\\"systemUUID\\\":\\\"be4c1bf2-c508-4b46-be45-7efaea566193\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:25Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:25 crc kubenswrapper[4696]: I1202 22:43:25.961897 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:25 crc kubenswrapper[4696]: I1202 22:43:25.962001 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:25 crc kubenswrapper[4696]: I1202 22:43:25.962030 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:25 crc kubenswrapper[4696]: I1202 22:43:25.962066 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:25 crc kubenswrapper[4696]: I1202 22:43:25.962093 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:25Z","lastTransitionTime":"2025-12-02T22:43:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:25 crc kubenswrapper[4696]: E1202 22:43:25.980190 4696 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:43:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:43:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:43:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:43:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b680025d-da08-4b46-a4a4-b21ac19e4f7b\\\",\\\"systemUUID\\\":\\\"be4c1bf2-c508-4b46-be45-7efaea566193\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:25Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:25 crc kubenswrapper[4696]: I1202 22:43:25.986202 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:25 crc kubenswrapper[4696]: I1202 22:43:25.986251 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:25 crc kubenswrapper[4696]: I1202 22:43:25.986263 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:25 crc kubenswrapper[4696]: I1202 22:43:25.986290 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:25 crc kubenswrapper[4696]: I1202 22:43:25.986304 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:25Z","lastTransitionTime":"2025-12-02T22:43:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:26 crc kubenswrapper[4696]: E1202 22:43:26.006988 4696 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:43:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:43:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:43:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:43:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b680025d-da08-4b46-a4a4-b21ac19e4f7b\\\",\\\"systemUUID\\\":\\\"be4c1bf2-c508-4b46-be45-7efaea566193\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:26Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:26 crc kubenswrapper[4696]: I1202 22:43:26.011452 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:26 crc kubenswrapper[4696]: I1202 22:43:26.011520 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:26 crc kubenswrapper[4696]: I1202 22:43:26.011538 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:26 crc kubenswrapper[4696]: I1202 22:43:26.011565 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:26 crc kubenswrapper[4696]: I1202 22:43:26.011586 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:26Z","lastTransitionTime":"2025-12-02T22:43:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:26 crc kubenswrapper[4696]: E1202 22:43:26.027991 4696 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:43:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:43:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:43:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:43:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b680025d-da08-4b46-a4a4-b21ac19e4f7b\\\",\\\"systemUUID\\\":\\\"be4c1bf2-c508-4b46-be45-7efaea566193\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:26Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:26 crc kubenswrapper[4696]: E1202 22:43:26.028109 4696 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 02 22:43:26 crc kubenswrapper[4696]: I1202 22:43:26.030105 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:26 crc kubenswrapper[4696]: I1202 22:43:26.030164 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:26 crc kubenswrapper[4696]: I1202 22:43:26.030186 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:26 crc kubenswrapper[4696]: I1202 22:43:26.030210 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:26 crc kubenswrapper[4696]: I1202 22:43:26.030230 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:26Z","lastTransitionTime":"2025-12-02T22:43:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:26 crc kubenswrapper[4696]: I1202 22:43:26.132194 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:26 crc kubenswrapper[4696]: I1202 22:43:26.132230 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:26 crc kubenswrapper[4696]: I1202 22:43:26.132241 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:26 crc kubenswrapper[4696]: I1202 22:43:26.132260 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:26 crc kubenswrapper[4696]: I1202 22:43:26.132277 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:26Z","lastTransitionTime":"2025-12-02T22:43:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:26 crc kubenswrapper[4696]: I1202 22:43:26.235189 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:26 crc kubenswrapper[4696]: I1202 22:43:26.235248 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:26 crc kubenswrapper[4696]: I1202 22:43:26.235268 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:26 crc kubenswrapper[4696]: I1202 22:43:26.235296 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:26 crc kubenswrapper[4696]: I1202 22:43:26.235310 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:26Z","lastTransitionTime":"2025-12-02T22:43:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:26 crc kubenswrapper[4696]: I1202 22:43:26.338630 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:26 crc kubenswrapper[4696]: I1202 22:43:26.338703 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:26 crc kubenswrapper[4696]: I1202 22:43:26.338718 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:26 crc kubenswrapper[4696]: I1202 22:43:26.338760 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:26 crc kubenswrapper[4696]: I1202 22:43:26.338775 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:26Z","lastTransitionTime":"2025-12-02T22:43:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:43:26 crc kubenswrapper[4696]: I1202 22:43:26.431114 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q9bfc" Dec 02 22:43:26 crc kubenswrapper[4696]: I1202 22:43:26.431178 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:43:26 crc kubenswrapper[4696]: I1202 22:43:26.431208 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:43:26 crc kubenswrapper[4696]: I1202 22:43:26.431226 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:43:26 crc kubenswrapper[4696]: E1202 22:43:26.431340 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q9bfc" podUID="ad00195c-ef4c-4d9b-941c-d01ebc498593" Dec 02 22:43:26 crc kubenswrapper[4696]: E1202 22:43:26.431626 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 22:43:26 crc kubenswrapper[4696]: E1202 22:43:26.431759 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 22:43:26 crc kubenswrapper[4696]: E1202 22:43:26.431800 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 22:43:26 crc kubenswrapper[4696]: I1202 22:43:26.450234 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:26 crc kubenswrapper[4696]: I1202 22:43:26.450270 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:26 crc kubenswrapper[4696]: I1202 22:43:26.450286 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:26 crc kubenswrapper[4696]: I1202 22:43:26.450305 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:26 crc kubenswrapper[4696]: I1202 22:43:26.450319 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:26Z","lastTransitionTime":"2025-12-02T22:43:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:26 crc kubenswrapper[4696]: I1202 22:43:26.498619 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ad00195c-ef4c-4d9b-941c-d01ebc498593-metrics-certs\") pod \"network-metrics-daemon-q9bfc\" (UID: \"ad00195c-ef4c-4d9b-941c-d01ebc498593\") " pod="openshift-multus/network-metrics-daemon-q9bfc" Dec 02 22:43:26 crc kubenswrapper[4696]: E1202 22:43:26.498830 4696 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 02 22:43:26 crc kubenswrapper[4696]: E1202 22:43:26.498906 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ad00195c-ef4c-4d9b-941c-d01ebc498593-metrics-certs podName:ad00195c-ef4c-4d9b-941c-d01ebc498593 nodeName:}" failed. No retries permitted until 2025-12-02 22:43:58.498880113 +0000 UTC m=+101.379560154 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ad00195c-ef4c-4d9b-941c-d01ebc498593-metrics-certs") pod "network-metrics-daemon-q9bfc" (UID: "ad00195c-ef4c-4d9b-941c-d01ebc498593") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 02 22:43:26 crc kubenswrapper[4696]: I1202 22:43:26.553601 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:26 crc kubenswrapper[4696]: I1202 22:43:26.553642 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:26 crc kubenswrapper[4696]: I1202 22:43:26.553657 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:26 crc kubenswrapper[4696]: I1202 22:43:26.553681 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:26 crc kubenswrapper[4696]: I1202 22:43:26.553696 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:26Z","lastTransitionTime":"2025-12-02T22:43:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:26 crc kubenswrapper[4696]: I1202 22:43:26.656799 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:26 crc kubenswrapper[4696]: I1202 22:43:26.656874 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:26 crc kubenswrapper[4696]: I1202 22:43:26.656894 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:26 crc kubenswrapper[4696]: I1202 22:43:26.656924 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:26 crc kubenswrapper[4696]: I1202 22:43:26.656946 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:26Z","lastTransitionTime":"2025-12-02T22:43:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:26 crc kubenswrapper[4696]: I1202 22:43:26.760677 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:26 crc kubenswrapper[4696]: I1202 22:43:26.760832 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:26 crc kubenswrapper[4696]: I1202 22:43:26.760869 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:26 crc kubenswrapper[4696]: I1202 22:43:26.760894 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:26 crc kubenswrapper[4696]: I1202 22:43:26.761306 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:26Z","lastTransitionTime":"2025-12-02T22:43:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:26 crc kubenswrapper[4696]: I1202 22:43:26.865139 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:26 crc kubenswrapper[4696]: I1202 22:43:26.865211 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:26 crc kubenswrapper[4696]: I1202 22:43:26.865231 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:26 crc kubenswrapper[4696]: I1202 22:43:26.865261 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:26 crc kubenswrapper[4696]: I1202 22:43:26.865314 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:26Z","lastTransitionTime":"2025-12-02T22:43:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:26 crc kubenswrapper[4696]: I1202 22:43:26.968465 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:26 crc kubenswrapper[4696]: I1202 22:43:26.968523 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:26 crc kubenswrapper[4696]: I1202 22:43:26.968536 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:26 crc kubenswrapper[4696]: I1202 22:43:26.968558 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:26 crc kubenswrapper[4696]: I1202 22:43:26.968572 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:26Z","lastTransitionTime":"2025-12-02T22:43:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:27 crc kubenswrapper[4696]: I1202 22:43:27.072255 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:27 crc kubenswrapper[4696]: I1202 22:43:27.072351 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:27 crc kubenswrapper[4696]: I1202 22:43:27.072369 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:27 crc kubenswrapper[4696]: I1202 22:43:27.072399 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:27 crc kubenswrapper[4696]: I1202 22:43:27.072416 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:27Z","lastTransitionTime":"2025-12-02T22:43:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:27 crc kubenswrapper[4696]: I1202 22:43:27.079267 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wthxr_86a37d2a-37c5-4fbd-b10b-f5e4706772f4/kube-multus/0.log" Dec 02 22:43:27 crc kubenswrapper[4696]: I1202 22:43:27.079368 4696 generic.go:334] "Generic (PLEG): container finished" podID="86a37d2a-37c5-4fbd-b10b-f5e4706772f4" containerID="c5ddd341a3a0b7b1d959ce57bb4d38153a4ca3ae5ec40ded309e0697db2be28d" exitCode=1 Dec 02 22:43:27 crc kubenswrapper[4696]: I1202 22:43:27.079450 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wthxr" event={"ID":"86a37d2a-37c5-4fbd-b10b-f5e4706772f4","Type":"ContainerDied","Data":"c5ddd341a3a0b7b1d959ce57bb4d38153a4ca3ae5ec40ded309e0697db2be28d"} Dec 02 22:43:27 crc kubenswrapper[4696]: I1202 22:43:27.080253 4696 scope.go:117] "RemoveContainer" containerID="c5ddd341a3a0b7b1d959ce57bb4d38153a4ca3ae5ec40ded309e0697db2be28d" Dec 02 22:43:27 crc kubenswrapper[4696]: I1202 22:43:27.108425 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:27Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:27 crc kubenswrapper[4696]: I1202 22:43:27.123257 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f57qk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf8ea236-9be9-4eb2-904a-103c4c279f28\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3106b29a0eb0a17827fd331aefd6c00b0c58f4ec0d0ca778bd3f4842730d5f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vxprw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f57qk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:27Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:27 crc kubenswrapper[4696]: I1202 22:43:27.160491 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3172221-5c4c-4409-b50f-5cd21d8b5030\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f25e7ebcca1d904cd0940a660fd97c6b4a3540fd63700abd0ad973e9f9c8f764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08fc65f2c8162f27bc40cd12b5801ca21c778313271f2cb5dd251ca4312842d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba390cca9997cf469ae88a4895532df556f0b947ce7b34ef7e1992ee236a2e16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd
/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0ccd6eaff0fa7ac496aadc31afffd6e7fee2776e9cfd1652af85fcec268d895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b2deed2650220f1d607cbb667b26879d3a8373146244feb486c882032f344b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://4a565cb3800c1a50f784c327719158de2f6f29d20a33670478b97c1c360e0f6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a565cb3800c1a50f784c327719158de2f6f29d20a33670478b97c1c360e0f6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29a2b9cc76fc45b6874d85b2ab92165a02804807837501ac60190c232bab928d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29a2b9cc76fc45b6874d85b2ab92165a02804807837501ac60190c232bab928d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fb3311ccf2a924605464d657193627798dad26a232fab0a2091fb2c387f3bd0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb3311ccf2a924605464d657193627798dad26a232fab0a2091fb2c387f3bd0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:27Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:27 crc kubenswrapper[4696]: I1202 22:43:27.176416 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:27 crc kubenswrapper[4696]: I1202 22:43:27.176462 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:27 crc kubenswrapper[4696]: I1202 22:43:27.176504 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:27 crc kubenswrapper[4696]: I1202 22:43:27.176524 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:27 crc kubenswrapper[4696]: I1202 
22:43:27.176537 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:27Z","lastTransitionTime":"2025-12-02T22:43:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:43:27 crc kubenswrapper[4696]: I1202 22:43:27.178900 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"158d80a5-8d90-46bb-83e7-ab3263f6d28f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3112751f86a62634d02f281d467d3eeb27878e27c245871210cca37eaeb6e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce744ea765d5bbcb0a9926f439a0256774c16863344f23ddfb0074b4ccbdb4ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1108d710330b056e2228205b2012df070ffbd660701ddd577eafbf0e99c3bf9f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs
\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d92ce9d53d9367503c76c5fbb45f340a5ddd93f3c8e1738ac434fe7b0210d1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6268352b995d99d383ab500b5ccd80b0612cf33716a752aeb1d92cb775a1971e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"le observer\\\\nW1202 22:42:37.965183 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 22:42:37.965319 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 22:42:37.967400 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2369937012/tls.crt::/tmp/serving-cert-2369937012/tls.key\\\\\\\"\\\\nI1202 22:42:38.329833 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 22:42:38.331704 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 22:42:38.331724 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 22:42:38.331766 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 22:42:38.331782 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 22:42:38.336692 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 22:42:38.336734 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 
22:42:38.336761 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:42:38.336769 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 22:42:38.336775 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 22:42:38.336779 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 22:42:38.336783 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 22:42:38.337216 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 22:42:38.338194 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://666f81c81a7cad67c01b2aebf2d05f162c33feea27f64c12f92761d6d2a6056e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"i
p\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055bef75732eec4908f0baf8a52cd3921fab03cf88e36d2e0d442344fb1af0c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://055bef75732eec4908f0baf8a52cd3921fab03cf88e36d2e0d442344fb1af0c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:27Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:27 crc kubenswrapper[4696]: I1202 22:43:27.196771 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"102b6bc9-3f5d-462f-9814-67699eea05ab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6abcdac2e562f88dfe0b9c02e2a2efb68c3b4d5438abfa08606b27dde0da76b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95c2f64bb4118ce11d7f91dd949ab6e377708029cb0b3c93bd650ac70ea2b650\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e41b58b3df7b93de109c3904416a2943c8dca8cc7e9edc628333d4da461412d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7321d8133c89c6a58d6e42e8df0ba1e4bbdf9cc06fd01054a25afd64934d855a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://7321d8133c89c6a58d6e42e8df0ba1e4bbdf9cc06fd01054a25afd64934d855a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:27Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:27 crc kubenswrapper[4696]: I1202 22:43:27.214851 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q9bfc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad00195c-ef4c-4d9b-941c-d01ebc498593\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zz6sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zz6sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:54Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q9bfc\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:27Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:27 crc kubenswrapper[4696]: I1202 22:43:27.234801 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:27Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:27 crc kubenswrapper[4696]: I1202 22:43:27.255390 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:27Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:27 crc kubenswrapper[4696]: I1202 22:43:27.279961 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:27 crc kubenswrapper[4696]: I1202 22:43:27.280034 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:27 crc kubenswrapper[4696]: I1202 22:43:27.280059 4696 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:27 crc kubenswrapper[4696]: I1202 22:43:27.280089 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:27 crc kubenswrapper[4696]: I1202 22:43:27.280109 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:27Z","lastTransitionTime":"2025-12-02T22:43:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:43:27 crc kubenswrapper[4696]: I1202 22:43:27.292067 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with 
unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36feb7e9f6a6bad97e87fe54ff9b661be62f6f4962352d55bcc905b80320a48d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85fbce2a7ea0550f53c1216216022e60a4e2eae0bb1d5c88254ace025fbd4503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-nod
e-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64b0c2025c377ab19b3853017c94cb501f472f2485093afc60749f7739ba25d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22ce112f515d4ef59dca4c21bdc8623e88a8a0df222fb2ed0990a2f68e3f75f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52f3d5529aafac7c27c963a50b0318e99d9a9ca35e142d4c273431105c495e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32717a82edb718f5e690fcecd578f48279d134a5dc12f00824fa25f7ec727156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ov
n-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1282d2e6c1634035b4869e4f9a9689f9e5f599a52a82e02e77a81c6df3e13b0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1282d2e6c1634035b4869e4f9a9689f9e5f599a52a82e02e77a81c6df3e13b0e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T22:43:13Z\\\",\\\"message\\\":\\\"t handler 2 for removal\\\\nI1202 22:43:13.297137 6337 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1202 22:43:13.297142 6337 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1202 22:43:13.297185 6337 handler.go:208] Removed *v1.Node event handler 2\\\\nI1202 22:43:13.297274 6337 handler.go:208] Removed 
*v1.Node event handler 7\\\\nI1202 22:43:13.297609 6337 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1202 22:43:13.297650 6337 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1202 22:43:13.297716 6337 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1202 22:43:13.297725 6337 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1202 22:43:13.297731 6337 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1202 22:43:13.297772 6337 factory.go:656] Stopping watch factory\\\\nI1202 22:43:13.297795 6337 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1202 22:43:13.297796 6337 ovnkube.go:599] Stopped ovnkube\\\\nI1202 22:43:13.297811 6337 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1202 22:43:13.297829 6337 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1202 22:43:13.297821 6337 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nF1202 22:43:13.297933 6337 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:43:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qb2zq_openshift-ovn-kubernetes(c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23b80358b654eec7dc1057ef35a6c09f88ccd7aca7a05336a3f29e7036bd55f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://492cfa3fb8646c09da06d92ec8a303562da558d4d944eaf68ca95de6f5b94da8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://492cfa3fb8646c09da
06d92ec8a303562da558d4d944eaf68ca95de6f5b94da8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qb2zq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:27Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:27 crc kubenswrapper[4696]: I1202 22:43:27.307436 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-chq65" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53353260-c7c9-435c-91eb-3d5a1b441c4a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2d1f154b9c9139bf991adf86bd99e7ff14510c24ae93b87c1c184437a087f17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7tfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa016430a0a36d0628e0e5b0c4baf2fa724c8881
16e5d70517c8ffc4be1c37a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7tfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-chq65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:27Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:27 crc kubenswrapper[4696]: I1202 22:43:27.321247 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rgk7n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07db09ff-2489-4357-af38-aca9655ac1d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9928cbc6ed1d9bb5d0280a8d3264d4490c601fc7d09d6f293f1403615cc4cde5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdd8x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rgk7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:27Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:27 crc kubenswrapper[4696]: I1202 22:43:27.338132 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05940584-6baa-49e1-8959-610d50b6eeb5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b361514e74dd950d7c3e1da82304b7c75603829ce7bf03c1fbccab1607d65800\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67bf899df90f18f36310994af135552aa9758f0460117d62d00fdc1de067ce79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d15263c6c1f8208a998cbacdba376a2eb3b2e16134e8fc40f7d56d36daae5ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2fa27b4eeb8b50da5d49c905eb98070362e998c3e3e4aceac7cc09aaef5b5ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:27Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:27 crc kubenswrapper[4696]: I1202 22:43:27.355765 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sbjst" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a0d758e-6da6-4382-99f1-dd295b63eb98\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bfe4ce91e801752a3eae0d5cfdb5d0e7a4997952987a3a32c10c0292db170f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://495f007b6af2d409e3ed6cf70dfa90aec5a38c34361c9da9c4fd71d7732cdd76\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://495f007b6af2d409e3ed6cf70dfa90aec5a38c34361c9da9c4fd71d7732cdd76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5b3b6699e2f0b9a8d0f32c1f97c9b3e16d9f5479ad2f355633c25897dbe1bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5b3b6699e2f0b9a8d0f32c1f97c9b3e16d9f5479ad2f355633c25897dbe1bb1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:42Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d4a10437554cfb08f90a48d457ef157309ed0ab379c5b26deb62d3e8b73291b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d4a10437554cfb08f90a48d457ef157309ed0ab379c5b26deb62d3e8b73291b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76b52
b0824f65df456ead0ca134ada0a6c3120382a3cce18f0a5dfdcc98fa9ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76b52b0824f65df456ead0ca134ada0a6c3120382a3cce18f0a5dfdcc98fa9ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d38fb242bd536b1359095c21c4df58b5a4c64058cfe986b38a581ae0c40a29d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d38fb242bd536b1359095c21c4df58b5a4c64058cfe986b38a581ae0c40a29d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:45Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f17767f87136d0916230872e59379331bb9e7f88f3648115905c6c0172f257cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f17767f87136d0916230872e59379331bb9e7f88f3648115905c6c0172f257cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sbjst\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:27Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:27 crc kubenswrapper[4696]: I1202 22:43:27.370790 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wthxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86a37d2a-37c5-4fbd-b10b-f5e4706772f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5ddd341a3a0b7b1d959ce57bb4d38153a4ca3ae5ec40ded309e0697db2be28d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5ddd341a3a0b7b1d959ce57bb4d38153a4ca3ae5ec40ded309e0697db2be28d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T22:43:26Z\\\",\\\"message\\\":\\\"2025-12-02T22:42:41+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_c372d58b-05eb-4944-ba5c-2ee7d55c7d09\\\\n2025-12-02T22:42:41+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_c372d58b-05eb-4944-ba5c-2ee7d55c7d09 to /host/opt/cni/bin/\\\\n2025-12-02T22:42:41Z [verbose] multus-daemon started\\\\n2025-12-02T22:42:41Z [verbose] Readiness Indicator file check\\\\n2025-12-02T22:43:26Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xqg5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wthxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:27Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:27 crc kubenswrapper[4696]: I1202 22:43:27.383419 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:27 crc kubenswrapper[4696]: I1202 22:43:27.383478 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:27 crc kubenswrapper[4696]: I1202 22:43:27.383498 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:27 crc kubenswrapper[4696]: I1202 22:43:27.383548 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:27 crc kubenswrapper[4696]: I1202 22:43:27.383564 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:27Z","lastTransitionTime":"2025-12-02T22:43:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:27 crc kubenswrapper[4696]: I1202 22:43:27.389132 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ttxcl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca48946e-a7e0-4729-8b02-b223a96990c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55cb8b25a5af0f7f16b7a91ab00cbe24ed8a8b81477e992de74699237caf7dab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkjwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e10c7f45d8003857b8a736f70eb3125f01996bacbcace4ad08bc703fbdeb62f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkjwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ttxcl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:27Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:27 crc kubenswrapper[4696]: I1202 22:43:27.407019 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1db9cf23af7fc9ff1373908e11df0bea7d6e03cd3ed18947cc583fc09a79a05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:27Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:27 crc kubenswrapper[4696]: I1202 22:43:27.423570 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce2fa3867298545942c4a2d2e68899c61a4f99a64b0974e75ef86851074ddfce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\
\"containerID\\\":\\\"cri-o://4db552c4bc9b395f977cb4c1b1b7fcbb1c7e9f326d81a7903d88eab180bcfb3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:27Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:27 crc kubenswrapper[4696]: I1202 22:43:27.440016 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27fb05354e4e9a6ad09946fad7dbabf07456fe56a9a66578fda0eb71dde2d199\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T22:43:27Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:27 crc kubenswrapper[4696]: I1202 22:43:27.456071 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1db9cf23af7fc9ff1373908e11df0bea7d6e03cd3ed18947cc583fc09a79a05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:27Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:27 crc kubenswrapper[4696]: I1202 22:43:27.472000 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce2fa3867298545942c4a2d2e68899c61a4f99a64b0974e75ef86851074ddfce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\
\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4db552c4bc9b395f977cb4c1b1b7fcbb1c7e9f326d81a7903d88eab180bcfb3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:27Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:27 crc kubenswrapper[4696]: I1202 22:43:27.484661 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27fb05354e4e9a6ad09946fad7dbabf07456fe56a9a66578fda0eb71dde2d199\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T22:43:27Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:27 crc kubenswrapper[4696]: I1202 22:43:27.485964 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:27 crc kubenswrapper[4696]: I1202 22:43:27.486052 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:27 crc kubenswrapper[4696]: I1202 22:43:27.486125 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:27 crc kubenswrapper[4696]: I1202 22:43:27.486190 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:27 crc kubenswrapper[4696]: I1202 22:43:27.486249 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:27Z","lastTransitionTime":"2025-12-02T22:43:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:27 crc kubenswrapper[4696]: I1202 22:43:27.502667 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wthxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86a37d2a-37c5-4fbd-b10b-f5e4706772f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5ddd341a3a0b7b1d959ce57bb4d38153a4ca3ae5ec40ded309e0697db2be28d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5ddd341a3a0b7b1d959ce57bb4d38153a4ca3ae5ec40ded309e0697db2be28d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T22:43:26Z\\\",\\\"message\\\":\\\"2025-12-02T22:42:41+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_c372d58b-05eb-4944-ba5c-2ee7d55c7d09\\\\n2025-12-02T22:42:41+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_c372d58b-05eb-4944-ba5c-2ee7d55c7d09 to /host/opt/cni/bin/\\\\n2025-12-02T22:42:41Z [verbose] multus-daemon started\\\\n2025-12-02T22:42:41Z [verbose] Readiness Indicator file check\\\\n2025-12-02T22:43:26Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xqg5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wthxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:27Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:27 crc kubenswrapper[4696]: I1202 22:43:27.519751 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ttxcl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca48946e-a7e0-4729-8b02-b223a96990c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55cb8b25a5af0f7f16b7a91ab00cbe24ed8a8b81477e992de74699237caf7dab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"na
me\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkjwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e10c7f45d8003857b8a736f70eb3125f01996bacbcace4ad08bc703fbdeb62f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkjwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ttxcl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:27Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:27 crc kubenswrapper[4696]: I1202 22:43:27.557994 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3172221-5c4c-4409-b50f-5cd21d8b5030\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f25e7ebcca1d904cd0940a660fd97c6b4a3540fd63700abd0ad973e9f9c8f764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-p
od-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08fc65f2c8162f27bc40cd12b5801ca21c778313271f2cb5dd251ca4312842d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba390cca9997cf469ae88a4895532df556f0b947ce7b34ef7e1992ee236a2e16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0ccd6eaff0fa7ac496aadc31afffd6e7fee2776e9cfd1652af85fcec268d895\\\",\\\"image\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b2deed2650220f1d607cbb667b26879d3a8373146244feb486c882032f344b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a565cb3800c1a50f784c327719158de2f6f29d20a33670478b97c1c360e0f6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a565cb3800c1a50f784c327719158de2f6f29d20a33670478b97c1c360e0f6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29a2b9cc76fc45b6874d85b2ab92165a02804807837501ac60190c232bab928d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29a2b9cc76fc45b6874d85b2ab92165a02804807837501ac60190c232bab928d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fb3311ccf2a924605464d657193627798dad26a232fab0a2091fb2c387f3bd0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb3311ccf2a924605464d657193627798d
ad26a232fab0a2091fb2c387f3bd0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:27Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:27 crc kubenswrapper[4696]: I1202 22:43:27.578501 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"158d80a5-8d90-46bb-83e7-ab3263f6d28f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3112751f86a62634d02f281d467d3eeb27878e27c245871210cca37eaeb6e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce744ea765d5bbcb0a9926f439a0256774c16863344f23ddfb0074b4ccbdb4ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1108d710330b056e2228205b2012df070ffbd660701ddd577eafbf0e99c3bf9f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d92ce9d53d9367503c76c5fbb45f340a5ddd93f3c8e1738ac434fe7b0210d1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6268352b995d99d383ab500b5ccd80b0612cf33716a752aeb1d92cb775a1971e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T22:42:38Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW1202 22:42:37.965183 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 22:42:37.965319 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 22:42:37.967400 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2369937012/tls.crt::/tmp/serving-cert-2369937012/tls.key\\\\\\\"\\\\nI1202 22:42:38.329833 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 22:42:38.331704 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 22:42:38.331724 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 22:42:38.331766 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 22:42:38.331782 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 22:42:38.336692 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 22:42:38.336734 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:42:38.336761 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:42:38.336769 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 22:42:38.336775 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 22:42:38.336779 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 22:42:38.336783 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 22:42:38.337216 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1202 22:42:38.338194 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://666f81c81a7cad67c01b2aebf2d05f162c33feea27f64c12f92761d6d2a6056e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055bef75732eec4908f0baf8a52cd3921fab03cf88e36d2e0d442344fb1af0c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://055bef75732eec4908f0baf8a52cd3921
fab03cf88e36d2e0d442344fb1af0c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:27Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:27 crc kubenswrapper[4696]: I1202 22:43:27.589441 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:27 crc kubenswrapper[4696]: I1202 22:43:27.589523 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:27 crc kubenswrapper[4696]: I1202 22:43:27.589543 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:27 crc kubenswrapper[4696]: I1202 22:43:27.589572 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:27 crc kubenswrapper[4696]: I1202 22:43:27.589594 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:27Z","lastTransitionTime":"2025-12-02T22:43:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:27 crc kubenswrapper[4696]: I1202 22:43:27.596018 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"102b6bc9-3f5d-462f-9814-67699eea05ab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6abcdac2e562f88dfe0b9c02e2a2efb68c3b4d5438abfa08606b27dde0da76b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95c2f64bb4118ce11d7f91dd949ab6
e377708029cb0b3c93bd650ac70ea2b650\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e41b58b3df7b93de109c3904416a2943c8dca8cc7e9edc628333d4da461412d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7321d8133c89c6a58d6e42e8df0ba1e4bbdf9cc06fd01054a25afd64934d855a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7321d8133c89c6a58d6e42e8df0ba1e4bbdf9cc06fd01054a25afd64934d855a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:27Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:27 crc kubenswrapper[4696]: I1202 22:43:27.616021 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with 
unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:27Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:27 crc kubenswrapper[4696]: I1202 22:43:27.631065 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f57qk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf8ea236-9be9-4eb2-904a-103c4c279f28\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3106b29a0eb0a17827fd331aefd6c00b0c58f4ec0d0ca778bd3f4842730d5f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vxprw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f57qk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:27Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:27 crc kubenswrapper[4696]: I1202 22:43:27.651218 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:27Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:27 crc kubenswrapper[4696]: I1202 22:43:27.668976 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:27Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:27 crc kubenswrapper[4696]: I1202 22:43:27.692967 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:27 crc kubenswrapper[4696]: I1202 22:43:27.693026 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:27 crc kubenswrapper[4696]: I1202 22:43:27.693037 4696 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:27 crc kubenswrapper[4696]: I1202 22:43:27.693060 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:27 crc kubenswrapper[4696]: I1202 22:43:27.693072 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:27Z","lastTransitionTime":"2025-12-02T22:43:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:43:27 crc kubenswrapper[4696]: I1202 22:43:27.705193 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with 
unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36feb7e9f6a6bad97e87fe54ff9b661be62f6f4962352d55bcc905b80320a48d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85fbce2a7ea0550f53c1216216022e60a4e2eae0bb1d5c88254ace025fbd4503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-nod
e-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64b0c2025c377ab19b3853017c94cb501f472f2485093afc60749f7739ba25d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22ce112f515d4ef59dca4c21bdc8623e88a8a0df222fb2ed0990a2f68e3f75f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52f3d5529aafac7c27c963a50b0318e99d9a9ca35e142d4c273431105c495e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32717a82edb718f5e690fcecd578f48279d134a5dc12f00824fa25f7ec727156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ov
n-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1282d2e6c1634035b4869e4f9a9689f9e5f599a52a82e02e77a81c6df3e13b0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1282d2e6c1634035b4869e4f9a9689f9e5f599a52a82e02e77a81c6df3e13b0e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T22:43:13Z\\\",\\\"message\\\":\\\"t handler 2 for removal\\\\nI1202 22:43:13.297137 6337 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1202 22:43:13.297142 6337 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1202 22:43:13.297185 6337 handler.go:208] Removed *v1.Node event handler 2\\\\nI1202 22:43:13.297274 6337 handler.go:208] Removed 
*v1.Node event handler 7\\\\nI1202 22:43:13.297609 6337 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1202 22:43:13.297650 6337 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1202 22:43:13.297716 6337 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1202 22:43:13.297725 6337 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1202 22:43:13.297731 6337 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1202 22:43:13.297772 6337 factory.go:656] Stopping watch factory\\\\nI1202 22:43:13.297795 6337 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1202 22:43:13.297796 6337 ovnkube.go:599] Stopped ovnkube\\\\nI1202 22:43:13.297811 6337 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1202 22:43:13.297829 6337 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1202 22:43:13.297821 6337 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nF1202 22:43:13.297933 6337 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:43:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qb2zq_openshift-ovn-kubernetes(c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23b80358b654eec7dc1057ef35a6c09f88ccd7aca7a05336a3f29e7036bd55f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://492cfa3fb8646c09da06d92ec8a303562da558d4d944eaf68ca95de6f5b94da8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://492cfa3fb8646c09da
06d92ec8a303562da558d4d944eaf68ca95de6f5b94da8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qb2zq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:27Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:27 crc kubenswrapper[4696]: I1202 22:43:27.727937 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q9bfc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad00195c-ef4c-4d9b-941c-d01ebc498593\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:54Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zz6sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zz6sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:54Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q9bfc\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:27Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:27 crc kubenswrapper[4696]: I1202 22:43:27.747711 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05940584-6baa-49e1-8959-610d50b6eeb5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b361514e74dd950d7c3e1da82304b7c75603829ce7bf03c1fbccab1607d65800\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67bf899df90f18f36310994af135552aa9758f0460117d62d00fdc1de067ce79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d15263c6c1f8208a998cbacdba376a2eb3b2e16134e8fc40f7d56d36daae5ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2fa27b4eeb8b50da5d49c905eb98070362e998c3e3e4aceac7cc09aaef5b5ce\\\",
\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:27Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:27 crc kubenswrapper[4696]: I1202 22:43:27.769573 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sbjst" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a0d758e-6da6-4382-99f1-dd295b63eb98\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bfe4ce91e801752a3eae0d5cfdb5d0e7a4997952987a3a32c10c0292db170f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://495f007b6af2d409e3ed6cf70dfa90aec5a38c34361c9da9c4fd71d7732cdd76\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://495f007b6af2d409e3ed6cf70dfa90aec5a38c34361c9da9c4fd71d7732cdd76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5b3b6699e2f0b9a8d0f32c1f97c9b3e16d9f5479ad2f355633c25897dbe1bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5b3b6699e2f0b9a8d0f32c1f97c9b3e16d9f5479ad2f355633c25897dbe1bb1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:42Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d4a10437554cfb08f90a48d457ef157309ed0ab379c5b26deb62d3e8b73291b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d4a10437554cfb08f90a48d457ef157309ed0ab379c5b26deb62d3e8b73291b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76b52
b0824f65df456ead0ca134ada0a6c3120382a3cce18f0a5dfdcc98fa9ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76b52b0824f65df456ead0ca134ada0a6c3120382a3cce18f0a5dfdcc98fa9ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d38fb242bd536b1359095c21c4df58b5a4c64058cfe986b38a581ae0c40a29d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d38fb242bd536b1359095c21c4df58b5a4c64058cfe986b38a581ae0c40a29d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:45Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f17767f87136d0916230872e59379331bb9e7f88f3648115905c6c0172f257cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f17767f87136d0916230872e59379331bb9e7f88f3648115905c6c0172f257cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sbjst\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:27Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:27 crc kubenswrapper[4696]: I1202 22:43:27.785368 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-chq65" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53353260-c7c9-435c-91eb-3d5a1b441c4a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2d1f154b9c9139bf991adf86bd99e7ff14510c24ae93b87c1c184437a087f17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7tfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa016430a0a36d0628e0e5b0c4baf2fa724c888116e5d70517c8ffc4be1c37a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7tfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-chq65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:27Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:27 crc kubenswrapper[4696]: 
I1202 22:43:27.795975 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:27 crc kubenswrapper[4696]: I1202 22:43:27.795998 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:27 crc kubenswrapper[4696]: I1202 22:43:27.796006 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:27 crc kubenswrapper[4696]: I1202 22:43:27.796023 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:27 crc kubenswrapper[4696]: I1202 22:43:27.796035 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:27Z","lastTransitionTime":"2025-12-02T22:43:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:27 crc kubenswrapper[4696]: I1202 22:43:27.797561 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rgk7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07db09ff-2489-4357-af38-aca9655ac1d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9928cbc6ed1d9bb5d0280a8d3264d4490c601fc7d09d6f293f1403615cc4cde5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdd8x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rgk7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:27Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:27 crc kubenswrapper[4696]: I1202 22:43:27.900665 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:27 crc kubenswrapper[4696]: I1202 22:43:27.900975 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:27 crc kubenswrapper[4696]: I1202 22:43:27.901051 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:27 crc kubenswrapper[4696]: I1202 22:43:27.901123 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:27 crc kubenswrapper[4696]: I1202 22:43:27.901204 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:27Z","lastTransitionTime":"2025-12-02T22:43:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:28 crc kubenswrapper[4696]: I1202 22:43:28.004639 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:28 crc kubenswrapper[4696]: I1202 22:43:28.004689 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:28 crc kubenswrapper[4696]: I1202 22:43:28.004701 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:28 crc kubenswrapper[4696]: I1202 22:43:28.004721 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:28 crc kubenswrapper[4696]: I1202 22:43:28.004733 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:28Z","lastTransitionTime":"2025-12-02T22:43:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:28 crc kubenswrapper[4696]: I1202 22:43:28.085965 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wthxr_86a37d2a-37c5-4fbd-b10b-f5e4706772f4/kube-multus/0.log" Dec 02 22:43:28 crc kubenswrapper[4696]: I1202 22:43:28.086029 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wthxr" event={"ID":"86a37d2a-37c5-4fbd-b10b-f5e4706772f4","Type":"ContainerStarted","Data":"32ae69a2c1cd350e1b62b8453e9d286e105623f4b9ea8ff27ce8c9d37b055e4f"} Dec 02 22:43:28 crc kubenswrapper[4696]: I1202 22:43:28.102959 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:28Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:28 crc kubenswrapper[4696]: I1202 22:43:28.108980 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:28 crc kubenswrapper[4696]: I1202 22:43:28.109024 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:28 crc kubenswrapper[4696]: I1202 22:43:28.109040 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:28 crc kubenswrapper[4696]: I1202 22:43:28.109064 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:28 crc kubenswrapper[4696]: I1202 22:43:28.109082 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:28Z","lastTransitionTime":"2025-12-02T22:43:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:43:28 crc kubenswrapper[4696]: I1202 22:43:28.121594 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:28Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:28 crc kubenswrapper[4696]: I1202 22:43:28.144771 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36feb7e9f6a6bad97e87fe54ff9b661be62f6f4962352d55bcc905b80320a48d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85fbce2a7ea0550f53c1216216022e60a4e2eae0bb1d5c88254ace025fbd4503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64b0c2025c377ab19b3853017c94cb501f472f2485093afc60749f7739ba25d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22ce112f515d4ef59dca4c21bdc8623e88a8a0df222fb2ed0990a2f68e3f75f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52f3d5529aafac7c27c963a50b0318e99d9a9ca35e142d4c273431105c495e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32717a82edb718f5e690fcecd578f48279d134a5dc12f00824fa25f7ec727156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1282d2e6c1634035b4869e4f9a9689f9e5f599a52a82e02e77a81c6df3e13b0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1282d2e6c1634035b4869e4f9a9689f9e5f599a52a82e02e77a81c6df3e13b0e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T22:43:13Z\\\",\\\"message\\\":\\\"t handler 2 for removal\\\\nI1202 22:43:13.297137 6337 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1202 22:43:13.297142 6337 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1202 22:43:13.297185 6337 handler.go:208] Removed *v1.Node event handler 2\\\\nI1202 22:43:13.297274 6337 handler.go:208] Removed *v1.Node event 
handler 7\\\\nI1202 22:43:13.297609 6337 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1202 22:43:13.297650 6337 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1202 22:43:13.297716 6337 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1202 22:43:13.297725 6337 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1202 22:43:13.297731 6337 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1202 22:43:13.297772 6337 factory.go:656] Stopping watch factory\\\\nI1202 22:43:13.297795 6337 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1202 22:43:13.297796 6337 ovnkube.go:599] Stopped ovnkube\\\\nI1202 22:43:13.297811 6337 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1202 22:43:13.297829 6337 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1202 22:43:13.297821 6337 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nF1202 22:43:13.297933 6337 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:43:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qb2zq_openshift-ovn-kubernetes(c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23b80358b654eec7dc1057ef35a6c09f88ccd7aca7a05336a3f29e7036bd55f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://492cfa3fb8646c09da06d92ec8a303562da558d4d944eaf68ca95de6f5b94da8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://492cfa3fb8646c09da
06d92ec8a303562da558d4d944eaf68ca95de6f5b94da8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qb2zq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:28Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:28 crc kubenswrapper[4696]: I1202 22:43:28.162384 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q9bfc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad00195c-ef4c-4d9b-941c-d01ebc498593\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:54Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zz6sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zz6sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:54Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q9bfc\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:28Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:28 crc kubenswrapper[4696]: I1202 22:43:28.180053 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05940584-6baa-49e1-8959-610d50b6eeb5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b361514e74dd950d7c3e1da82304b7c75603829ce7bf03c1fbccab1607d65800\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67bf899df90f18f36310994af135552aa9758f0460117d62d00fdc1de067ce79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d15263c6c1f8208a998cbacdba376a2eb3b2e16134e8fc40f7d56d36daae5ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2fa27b4eeb8b50da5d49c905eb98070362e998c3e3e4aceac7cc09aaef5b5ce\\\",
\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:28Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:28 crc kubenswrapper[4696]: I1202 22:43:28.198459 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sbjst" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a0d758e-6da6-4382-99f1-dd295b63eb98\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bfe4ce91e801752a3eae0d5cfdb5d0e7a4997952987a3a32c10c0292db170f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://495f007b6af2d409e3ed6cf70dfa90aec5a38c34361c9da9c4fd71d7732cdd76\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://495f007b6af2d409e3ed6cf70dfa90aec5a38c34361c9da9c4fd71d7732cdd76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5b3b6699e2f0b9a8d0f32c1f97c9b3e16d9f5479ad2f355633c25897dbe1bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5b3b6699e2f0b9a8d0f32c1f97c9b3e16d9f5479ad2f355633c25897dbe1bb1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:42Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d4a10437554cfb08f90a48d457ef157309ed0ab379c5b26deb62d3e8b73291b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d4a10437554cfb08f90a48d457ef157309ed0ab379c5b26deb62d3e8b73291b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76b52
b0824f65df456ead0ca134ada0a6c3120382a3cce18f0a5dfdcc98fa9ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76b52b0824f65df456ead0ca134ada0a6c3120382a3cce18f0a5dfdcc98fa9ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d38fb242bd536b1359095c21c4df58b5a4c64058cfe986b38a581ae0c40a29d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d38fb242bd536b1359095c21c4df58b5a4c64058cfe986b38a581ae0c40a29d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:45Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f17767f87136d0916230872e59379331bb9e7f88f3648115905c6c0172f257cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f17767f87136d0916230872e59379331bb9e7f88f3648115905c6c0172f257cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sbjst\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:28Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:28 crc kubenswrapper[4696]: I1202 22:43:28.212142 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:28 crc kubenswrapper[4696]: I1202 22:43:28.212183 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:28 crc kubenswrapper[4696]: I1202 22:43:28.212197 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:28 crc kubenswrapper[4696]: I1202 22:43:28.212220 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:28 crc kubenswrapper[4696]: I1202 22:43:28.212166 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-chq65" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53353260-c7c9-435c-91eb-3d5a1b441c4a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2d1f154b9c9139bf991adf86bd99e7ff14510c24ae93b87c1c184437a087f17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7tfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa016430a0a36d0628e0e5b0c4baf2fa724c8881
16e5d70517c8ffc4be1c37a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7tfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-chq65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:28Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:28 crc kubenswrapper[4696]: I1202 22:43:28.212236 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:28Z","lastTransitionTime":"2025-12-02T22:43:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:28 crc kubenswrapper[4696]: I1202 22:43:28.227455 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rgk7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07db09ff-2489-4357-af38-aca9655ac1d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9928cbc6ed1d9bb5d0280a8d3264d4490c601fc7d09d6f293f1403615cc4cde5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdd8x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rgk7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:28Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:28 crc kubenswrapper[4696]: I1202 22:43:28.245716 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1db9cf23af7fc9ff1373908e11df0bea7d6e03cd3ed18947cc583fc09a79a05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c
04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:28Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:28 crc kubenswrapper[4696]: I1202 22:43:28.264637 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce2fa3867298545942c4a2d2e68899c61a4f99a64b0974e75ef86851074ddfce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4db552c4bc9b395f977cb4c1b1b7fcbb1c7e9f326d81a7903d88eab180bcfb3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:28Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:28 crc kubenswrapper[4696]: I1202 22:43:28.282889 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27fb05354e4e9a6ad09946fad7dbabf07456fe56a9a66578fda0eb71dde2d199\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T22:43:28Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:28 crc kubenswrapper[4696]: I1202 22:43:28.301106 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wthxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86a37d2a-37c5-4fbd-b10b-f5e4706772f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32ae69a2c1cd350e1b62b8453e9d286e105623f4b9ea8ff27ce8c9d37b055e4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5ddd341a3a0b7b1d959ce57bb4d38153a4ca3ae5ec40ded309e0697db2be28d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T22:43:26Z\\\",\\\"message\\\":\\\"2025-12-02T22:42:41+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_c372d58b-05eb-4944-ba5c-2ee7d55c7d09\\\\n2025-12-02T22:42:41+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_c372d58b-05eb-4944-ba5c-2ee7d55c7d09 to /host/opt/cni/bin/\\\\n2025-12-02T22:42:41Z [verbose] multus-daemon started\\\\n2025-12-02T22:42:41Z [verbose] Readiness Indicator file check\\\\n2025-12-02T22:43:26Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:40Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:43:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xqg5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wthxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:28Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:28 crc kubenswrapper[4696]: I1202 22:43:28.314561 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:28 crc kubenswrapper[4696]: I1202 22:43:28.314630 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:28 crc kubenswrapper[4696]: I1202 22:43:28.314651 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:28 crc kubenswrapper[4696]: I1202 22:43:28.314681 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:28 crc kubenswrapper[4696]: I1202 22:43:28.314701 4696 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:28Z","lastTransitionTime":"2025-12-02T22:43:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:43:28 crc kubenswrapper[4696]: I1202 22:43:28.317917 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ttxcl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca48946e-a7e0-4729-8b02-b223a96990c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55cb8b25a5af0f7f16b7a91ab00cbe24ed8a8b81477e992de74699237caf7dab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
e-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkjwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e10c7f45d8003857b8a736f70eb3125f01996bacbcace4ad08bc703fbdeb62f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkjwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ttxcl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:28Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:28 crc kubenswrapper[4696]: I1202 22:43:28.347765 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3172221-5c4c-4409-b50f-5cd21d8b5030\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f25e7ebcca1d904cd0940a660fd97c6b4a3540fd63700abd0ad973e9f9c8f764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-p
od-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08fc65f2c8162f27bc40cd12b5801ca21c778313271f2cb5dd251ca4312842d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba390cca9997cf469ae88a4895532df556f0b947ce7b34ef7e1992ee236a2e16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0ccd6eaff0fa7ac496aadc31afffd6e7fee2776e9cfd1652af85fcec268d895\\\",\\\"image\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b2deed2650220f1d607cbb667b26879d3a8373146244feb486c882032f344b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a565cb3800c1a50f784c327719158de2f6f29d20a33670478b97c1c360e0f6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a565cb3800c1a50f784c327719158de2f6f29d20a33670478b97c1c360e0f6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29a2b9cc76fc45b6874d85b2ab92165a02804807837501ac60190c232bab928d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29a2b9cc76fc45b6874d85b2ab92165a02804807837501ac60190c232bab928d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fb3311ccf2a924605464d657193627798dad26a232fab0a2091fb2c387f3bd0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb3311ccf2a924605464d657193627798d
ad26a232fab0a2091fb2c387f3bd0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:28Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:28 crc kubenswrapper[4696]: I1202 22:43:28.368225 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"158d80a5-8d90-46bb-83e7-ab3263f6d28f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3112751f86a62634d02f281d467d3eeb27878e27c245871210cca37eaeb6e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce744ea765d5bbcb0a9926f439a0256774c16863344f23ddfb0074b4ccbdb4ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1108d710330b056e2228205b2012df070ffbd660701ddd577eafbf0e99c3bf9f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d92ce9d53d9367503c76c5fbb45f340a5ddd93f3c8e1738ac434fe7b0210d1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6268352b995d99d383ab500b5ccd80b0612cf33716a752aeb1d92cb775a1971e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T22:42:38Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW1202 22:42:37.965183 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 22:42:37.965319 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 22:42:37.967400 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2369937012/tls.crt::/tmp/serving-cert-2369937012/tls.key\\\\\\\"\\\\nI1202 22:42:38.329833 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 22:42:38.331704 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 22:42:38.331724 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 22:42:38.331766 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 22:42:38.331782 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 22:42:38.336692 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 22:42:38.336734 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:42:38.336761 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:42:38.336769 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 22:42:38.336775 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 22:42:38.336779 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 22:42:38.336783 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 22:42:38.337216 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1202 22:42:38.338194 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://666f81c81a7cad67c01b2aebf2d05f162c33feea27f64c12f92761d6d2a6056e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055bef75732eec4908f0baf8a52cd3921fab03cf88e36d2e0d442344fb1af0c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://055bef75732eec4908f0baf8a52cd3921
fab03cf88e36d2e0d442344fb1af0c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:28Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:28 crc kubenswrapper[4696]: I1202 22:43:28.385081 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"102b6bc9-3f5d-462f-9814-67699eea05ab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6abcdac2e562f88dfe0b9c02e2a2efb68c3b4d5438abfa08606b27dde0da76b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95c2f64bb4118ce11d7f91dd949ab6e377708029cb0b3c93bd650ac70ea2b650\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e41b58b3df7b93de109c3904416a2943c8dca8cc7e9edc628333d4da461412d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7321d8133c89c6a58d6e42e8df0ba1e4bbdf9cc06fd01054a25afd64934d855a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://7321d8133c89c6a58d6e42e8df0ba1e4bbdf9cc06fd01054a25afd64934d855a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:28Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:28 crc kubenswrapper[4696]: I1202 22:43:28.402396 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:28Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:28 crc kubenswrapper[4696]: I1202 22:43:28.413287 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f57qk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf8ea236-9be9-4eb2-904a-103c4c279f28\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3106b29a0eb0a17827fd331aefd6c00b0c58f4ec0d0ca778bd3f4842730d5f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vxprw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f57qk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:28Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:28 crc kubenswrapper[4696]: I1202 22:43:28.417005 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:28 crc kubenswrapper[4696]: I1202 22:43:28.417041 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:28 crc kubenswrapper[4696]: I1202 22:43:28.417055 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:28 crc kubenswrapper[4696]: I1202 22:43:28.417074 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:28 crc kubenswrapper[4696]: I1202 22:43:28.417085 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:28Z","lastTransitionTime":"2025-12-02T22:43:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:43:28 crc kubenswrapper[4696]: I1202 22:43:28.431594 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:43:28 crc kubenswrapper[4696]: I1202 22:43:28.431641 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-q9bfc" Dec 02 22:43:28 crc kubenswrapper[4696]: I1202 22:43:28.431641 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:43:28 crc kubenswrapper[4696]: E1202 22:43:28.431783 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 22:43:28 crc kubenswrapper[4696]: E1202 22:43:28.431865 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q9bfc" podUID="ad00195c-ef4c-4d9b-941c-d01ebc498593" Dec 02 22:43:28 crc kubenswrapper[4696]: I1202 22:43:28.431920 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:43:28 crc kubenswrapper[4696]: E1202 22:43:28.431978 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 22:43:28 crc kubenswrapper[4696]: E1202 22:43:28.432033 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 22:43:28 crc kubenswrapper[4696]: I1202 22:43:28.434717 4696 scope.go:117] "RemoveContainer" containerID="1282d2e6c1634035b4869e4f9a9689f9e5f599a52a82e02e77a81c6df3e13b0e" Dec 02 22:43:28 crc kubenswrapper[4696]: E1202 22:43:28.435943 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-qb2zq_openshift-ovn-kubernetes(c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b)\"" pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" podUID="c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b" Dec 02 22:43:28 crc kubenswrapper[4696]: I1202 22:43:28.519959 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:28 crc kubenswrapper[4696]: I1202 22:43:28.520009 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:28 crc kubenswrapper[4696]: I1202 22:43:28.520019 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:28 crc kubenswrapper[4696]: I1202 22:43:28.520040 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:28 crc kubenswrapper[4696]: I1202 22:43:28.520054 4696 setters.go:603] 
"Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:28Z","lastTransitionTime":"2025-12-02T22:43:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:43:28 crc kubenswrapper[4696]: I1202 22:43:28.623716 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:28 crc kubenswrapper[4696]: I1202 22:43:28.623814 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:28 crc kubenswrapper[4696]: I1202 22:43:28.623833 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:28 crc kubenswrapper[4696]: I1202 22:43:28.623858 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:28 crc kubenswrapper[4696]: I1202 22:43:28.623878 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:28Z","lastTransitionTime":"2025-12-02T22:43:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:28 crc kubenswrapper[4696]: I1202 22:43:28.726692 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:28 crc kubenswrapper[4696]: I1202 22:43:28.726822 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:28 crc kubenswrapper[4696]: I1202 22:43:28.726844 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:28 crc kubenswrapper[4696]: I1202 22:43:28.726870 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:28 crc kubenswrapper[4696]: I1202 22:43:28.726886 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:28Z","lastTransitionTime":"2025-12-02T22:43:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:28 crc kubenswrapper[4696]: I1202 22:43:28.829702 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:28 crc kubenswrapper[4696]: I1202 22:43:28.829786 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:28 crc kubenswrapper[4696]: I1202 22:43:28.829805 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:28 crc kubenswrapper[4696]: I1202 22:43:28.829829 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:28 crc kubenswrapper[4696]: I1202 22:43:28.829850 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:28Z","lastTransitionTime":"2025-12-02T22:43:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:28 crc kubenswrapper[4696]: I1202 22:43:28.954921 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:28 crc kubenswrapper[4696]: I1202 22:43:28.954972 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:28 crc kubenswrapper[4696]: I1202 22:43:28.954982 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:28 crc kubenswrapper[4696]: I1202 22:43:28.955000 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:28 crc kubenswrapper[4696]: I1202 22:43:28.955013 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:28Z","lastTransitionTime":"2025-12-02T22:43:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:29 crc kubenswrapper[4696]: I1202 22:43:29.057508 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:29 crc kubenswrapper[4696]: I1202 22:43:29.057568 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:29 crc kubenswrapper[4696]: I1202 22:43:29.057581 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:29 crc kubenswrapper[4696]: I1202 22:43:29.057602 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:29 crc kubenswrapper[4696]: I1202 22:43:29.057617 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:29Z","lastTransitionTime":"2025-12-02T22:43:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:29 crc kubenswrapper[4696]: I1202 22:43:29.160673 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:29 crc kubenswrapper[4696]: I1202 22:43:29.160783 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:29 crc kubenswrapper[4696]: I1202 22:43:29.160808 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:29 crc kubenswrapper[4696]: I1202 22:43:29.160834 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:29 crc kubenswrapper[4696]: I1202 22:43:29.160902 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:29Z","lastTransitionTime":"2025-12-02T22:43:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:29 crc kubenswrapper[4696]: I1202 22:43:29.265037 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:29 crc kubenswrapper[4696]: I1202 22:43:29.265556 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:29 crc kubenswrapper[4696]: I1202 22:43:29.265707 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:29 crc kubenswrapper[4696]: I1202 22:43:29.265902 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:29 crc kubenswrapper[4696]: I1202 22:43:29.266080 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:29Z","lastTransitionTime":"2025-12-02T22:43:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:29 crc kubenswrapper[4696]: I1202 22:43:29.369213 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:29 crc kubenswrapper[4696]: I1202 22:43:29.369259 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:29 crc kubenswrapper[4696]: I1202 22:43:29.369271 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:29 crc kubenswrapper[4696]: I1202 22:43:29.369288 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:29 crc kubenswrapper[4696]: I1202 22:43:29.369300 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:29Z","lastTransitionTime":"2025-12-02T22:43:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:29 crc kubenswrapper[4696]: I1202 22:43:29.471890 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:29 crc kubenswrapper[4696]: I1202 22:43:29.471951 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:29 crc kubenswrapper[4696]: I1202 22:43:29.471966 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:29 crc kubenswrapper[4696]: I1202 22:43:29.472029 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:29 crc kubenswrapper[4696]: I1202 22:43:29.472053 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:29Z","lastTransitionTime":"2025-12-02T22:43:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:29 crc kubenswrapper[4696]: I1202 22:43:29.574781 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:29 crc kubenswrapper[4696]: I1202 22:43:29.574855 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:29 crc kubenswrapper[4696]: I1202 22:43:29.574878 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:29 crc kubenswrapper[4696]: I1202 22:43:29.574912 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:29 crc kubenswrapper[4696]: I1202 22:43:29.574931 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:29Z","lastTransitionTime":"2025-12-02T22:43:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:29 crc kubenswrapper[4696]: I1202 22:43:29.678222 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:29 crc kubenswrapper[4696]: I1202 22:43:29.678684 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:29 crc kubenswrapper[4696]: I1202 22:43:29.678963 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:29 crc kubenswrapper[4696]: I1202 22:43:29.679140 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:29 crc kubenswrapper[4696]: I1202 22:43:29.679309 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:29Z","lastTransitionTime":"2025-12-02T22:43:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:29 crc kubenswrapper[4696]: I1202 22:43:29.782900 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:29 crc kubenswrapper[4696]: I1202 22:43:29.782955 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:29 crc kubenswrapper[4696]: I1202 22:43:29.782967 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:29 crc kubenswrapper[4696]: I1202 22:43:29.782987 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:29 crc kubenswrapper[4696]: I1202 22:43:29.783000 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:29Z","lastTransitionTime":"2025-12-02T22:43:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:29 crc kubenswrapper[4696]: I1202 22:43:29.885709 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:29 crc kubenswrapper[4696]: I1202 22:43:29.885815 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:29 crc kubenswrapper[4696]: I1202 22:43:29.885834 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:29 crc kubenswrapper[4696]: I1202 22:43:29.885861 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:29 crc kubenswrapper[4696]: I1202 22:43:29.885879 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:29Z","lastTransitionTime":"2025-12-02T22:43:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:29 crc kubenswrapper[4696]: I1202 22:43:29.988610 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:29 crc kubenswrapper[4696]: I1202 22:43:29.988659 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:29 crc kubenswrapper[4696]: I1202 22:43:29.988678 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:29 crc kubenswrapper[4696]: I1202 22:43:29.988700 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:29 crc kubenswrapper[4696]: I1202 22:43:29.988717 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:29Z","lastTransitionTime":"2025-12-02T22:43:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:30 crc kubenswrapper[4696]: I1202 22:43:30.091785 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:30 crc kubenswrapper[4696]: I1202 22:43:30.091829 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:30 crc kubenswrapper[4696]: I1202 22:43:30.091837 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:30 crc kubenswrapper[4696]: I1202 22:43:30.091853 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:30 crc kubenswrapper[4696]: I1202 22:43:30.091862 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:30Z","lastTransitionTime":"2025-12-02T22:43:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:30 crc kubenswrapper[4696]: I1202 22:43:30.195209 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:30 crc kubenswrapper[4696]: I1202 22:43:30.195265 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:30 crc kubenswrapper[4696]: I1202 22:43:30.195282 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:30 crc kubenswrapper[4696]: I1202 22:43:30.195307 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:30 crc kubenswrapper[4696]: I1202 22:43:30.195325 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:30Z","lastTransitionTime":"2025-12-02T22:43:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:30 crc kubenswrapper[4696]: I1202 22:43:30.298675 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:30 crc kubenswrapper[4696]: I1202 22:43:30.298717 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:30 crc kubenswrapper[4696]: I1202 22:43:30.298731 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:30 crc kubenswrapper[4696]: I1202 22:43:30.298772 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:30 crc kubenswrapper[4696]: I1202 22:43:30.298785 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:30Z","lastTransitionTime":"2025-12-02T22:43:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:30 crc kubenswrapper[4696]: I1202 22:43:30.401517 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:30 crc kubenswrapper[4696]: I1202 22:43:30.401601 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:30 crc kubenswrapper[4696]: I1202 22:43:30.401620 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:30 crc kubenswrapper[4696]: I1202 22:43:30.401636 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:30 crc kubenswrapper[4696]: I1202 22:43:30.401646 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:30Z","lastTransitionTime":"2025-12-02T22:43:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:43:30 crc kubenswrapper[4696]: I1202 22:43:30.431303 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:43:30 crc kubenswrapper[4696]: I1202 22:43:30.431346 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:43:30 crc kubenswrapper[4696]: I1202 22:43:30.431383 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q9bfc" Dec 02 22:43:30 crc kubenswrapper[4696]: I1202 22:43:30.431350 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:43:30 crc kubenswrapper[4696]: E1202 22:43:30.431513 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 22:43:30 crc kubenswrapper[4696]: E1202 22:43:30.431662 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 22:43:30 crc kubenswrapper[4696]: E1202 22:43:30.431832 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 22:43:30 crc kubenswrapper[4696]: E1202 22:43:30.431964 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-q9bfc" podUID="ad00195c-ef4c-4d9b-941c-d01ebc498593" Dec 02 22:43:30 crc kubenswrapper[4696]: I1202 22:43:30.505352 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:30 crc kubenswrapper[4696]: I1202 22:43:30.505450 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:30 crc kubenswrapper[4696]: I1202 22:43:30.505469 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:30 crc kubenswrapper[4696]: I1202 22:43:30.505498 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:30 crc kubenswrapper[4696]: I1202 22:43:30.505518 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:30Z","lastTransitionTime":"2025-12-02T22:43:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:30 crc kubenswrapper[4696]: I1202 22:43:30.608198 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:30 crc kubenswrapper[4696]: I1202 22:43:30.608268 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:30 crc kubenswrapper[4696]: I1202 22:43:30.608282 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:30 crc kubenswrapper[4696]: I1202 22:43:30.608303 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:30 crc kubenswrapper[4696]: I1202 22:43:30.608321 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:30Z","lastTransitionTime":"2025-12-02T22:43:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:30 crc kubenswrapper[4696]: I1202 22:43:30.717275 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:30 crc kubenswrapper[4696]: I1202 22:43:30.717319 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:30 crc kubenswrapper[4696]: I1202 22:43:30.717329 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:30 crc kubenswrapper[4696]: I1202 22:43:30.717346 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:30 crc kubenswrapper[4696]: I1202 22:43:30.717357 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:30Z","lastTransitionTime":"2025-12-02T22:43:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:30 crc kubenswrapper[4696]: I1202 22:43:30.826597 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:30 crc kubenswrapper[4696]: I1202 22:43:30.826659 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:30 crc kubenswrapper[4696]: I1202 22:43:30.826672 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:30 crc kubenswrapper[4696]: I1202 22:43:30.826692 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:30 crc kubenswrapper[4696]: I1202 22:43:30.826705 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:30Z","lastTransitionTime":"2025-12-02T22:43:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:30 crc kubenswrapper[4696]: I1202 22:43:30.930862 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:30 crc kubenswrapper[4696]: I1202 22:43:30.930990 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:30 crc kubenswrapper[4696]: I1202 22:43:30.931008 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:30 crc kubenswrapper[4696]: I1202 22:43:30.931037 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:30 crc kubenswrapper[4696]: I1202 22:43:30.931058 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:30Z","lastTransitionTime":"2025-12-02T22:43:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:31 crc kubenswrapper[4696]: I1202 22:43:31.034267 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:31 crc kubenswrapper[4696]: I1202 22:43:31.034337 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:31 crc kubenswrapper[4696]: I1202 22:43:31.034358 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:31 crc kubenswrapper[4696]: I1202 22:43:31.034385 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:31 crc kubenswrapper[4696]: I1202 22:43:31.034403 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:31Z","lastTransitionTime":"2025-12-02T22:43:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:31 crc kubenswrapper[4696]: I1202 22:43:31.137398 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:31 crc kubenswrapper[4696]: I1202 22:43:31.137473 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:31 crc kubenswrapper[4696]: I1202 22:43:31.137494 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:31 crc kubenswrapper[4696]: I1202 22:43:31.137522 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:31 crc kubenswrapper[4696]: I1202 22:43:31.137544 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:31Z","lastTransitionTime":"2025-12-02T22:43:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:31 crc kubenswrapper[4696]: I1202 22:43:31.241869 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:31 crc kubenswrapper[4696]: I1202 22:43:31.241976 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:31 crc kubenswrapper[4696]: I1202 22:43:31.242071 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:31 crc kubenswrapper[4696]: I1202 22:43:31.242185 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:31 crc kubenswrapper[4696]: I1202 22:43:31.242229 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:31Z","lastTransitionTime":"2025-12-02T22:43:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:31 crc kubenswrapper[4696]: I1202 22:43:31.345584 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:31 crc kubenswrapper[4696]: I1202 22:43:31.345678 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:31 crc kubenswrapper[4696]: I1202 22:43:31.345698 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:31 crc kubenswrapper[4696]: I1202 22:43:31.345729 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:31 crc kubenswrapper[4696]: I1202 22:43:31.345775 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:31Z","lastTransitionTime":"2025-12-02T22:43:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:31 crc kubenswrapper[4696]: I1202 22:43:31.447439 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:31 crc kubenswrapper[4696]: I1202 22:43:31.447499 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:31 crc kubenswrapper[4696]: I1202 22:43:31.447513 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:31 crc kubenswrapper[4696]: I1202 22:43:31.447536 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:31 crc kubenswrapper[4696]: I1202 22:43:31.447554 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:31Z","lastTransitionTime":"2025-12-02T22:43:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:31 crc kubenswrapper[4696]: I1202 22:43:31.550157 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:31 crc kubenswrapper[4696]: I1202 22:43:31.550229 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:31 crc kubenswrapper[4696]: I1202 22:43:31.550246 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:31 crc kubenswrapper[4696]: I1202 22:43:31.550272 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:31 crc kubenswrapper[4696]: I1202 22:43:31.550291 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:31Z","lastTransitionTime":"2025-12-02T22:43:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:31 crc kubenswrapper[4696]: I1202 22:43:31.654312 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:31 crc kubenswrapper[4696]: I1202 22:43:31.654380 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:31 crc kubenswrapper[4696]: I1202 22:43:31.654398 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:31 crc kubenswrapper[4696]: I1202 22:43:31.654425 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:31 crc kubenswrapper[4696]: I1202 22:43:31.654442 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:31Z","lastTransitionTime":"2025-12-02T22:43:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:31 crc kubenswrapper[4696]: I1202 22:43:31.758291 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:31 crc kubenswrapper[4696]: I1202 22:43:31.758336 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:31 crc kubenswrapper[4696]: I1202 22:43:31.758346 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:31 crc kubenswrapper[4696]: I1202 22:43:31.758367 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:31 crc kubenswrapper[4696]: I1202 22:43:31.758379 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:31Z","lastTransitionTime":"2025-12-02T22:43:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:31 crc kubenswrapper[4696]: I1202 22:43:31.861851 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:31 crc kubenswrapper[4696]: I1202 22:43:31.861939 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:31 crc kubenswrapper[4696]: I1202 22:43:31.861958 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:31 crc kubenswrapper[4696]: I1202 22:43:31.861991 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:31 crc kubenswrapper[4696]: I1202 22:43:31.862013 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:31Z","lastTransitionTime":"2025-12-02T22:43:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:31 crc kubenswrapper[4696]: I1202 22:43:31.965108 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:31 crc kubenswrapper[4696]: I1202 22:43:31.965172 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:31 crc kubenswrapper[4696]: I1202 22:43:31.965190 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:31 crc kubenswrapper[4696]: I1202 22:43:31.965218 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:31 crc kubenswrapper[4696]: I1202 22:43:31.965236 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:31Z","lastTransitionTime":"2025-12-02T22:43:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:32 crc kubenswrapper[4696]: I1202 22:43:32.069371 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:32 crc kubenswrapper[4696]: I1202 22:43:32.069451 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:32 crc kubenswrapper[4696]: I1202 22:43:32.069471 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:32 crc kubenswrapper[4696]: I1202 22:43:32.069503 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:32 crc kubenswrapper[4696]: I1202 22:43:32.069527 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:32Z","lastTransitionTime":"2025-12-02T22:43:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:32 crc kubenswrapper[4696]: I1202 22:43:32.173221 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:32 crc kubenswrapper[4696]: I1202 22:43:32.173327 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:32 crc kubenswrapper[4696]: I1202 22:43:32.173347 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:32 crc kubenswrapper[4696]: I1202 22:43:32.173379 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:32 crc kubenswrapper[4696]: I1202 22:43:32.173403 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:32Z","lastTransitionTime":"2025-12-02T22:43:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:32 crc kubenswrapper[4696]: I1202 22:43:32.278475 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:32 crc kubenswrapper[4696]: I1202 22:43:32.279099 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:32 crc kubenswrapper[4696]: I1202 22:43:32.279284 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:32 crc kubenswrapper[4696]: I1202 22:43:32.279432 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:32 crc kubenswrapper[4696]: I1202 22:43:32.279574 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:32Z","lastTransitionTime":"2025-12-02T22:43:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:32 crc kubenswrapper[4696]: I1202 22:43:32.383972 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:32 crc kubenswrapper[4696]: I1202 22:43:32.384018 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:32 crc kubenswrapper[4696]: I1202 22:43:32.384029 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:32 crc kubenswrapper[4696]: I1202 22:43:32.384050 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:32 crc kubenswrapper[4696]: I1202 22:43:32.384062 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:32Z","lastTransitionTime":"2025-12-02T22:43:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:43:32 crc kubenswrapper[4696]: I1202 22:43:32.431442 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:43:32 crc kubenswrapper[4696]: E1202 22:43:32.431598 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 22:43:32 crc kubenswrapper[4696]: I1202 22:43:32.431667 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:43:32 crc kubenswrapper[4696]: E1202 22:43:32.431712 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 22:43:32 crc kubenswrapper[4696]: I1202 22:43:32.431790 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:43:32 crc kubenswrapper[4696]: E1202 22:43:32.431850 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 22:43:32 crc kubenswrapper[4696]: I1202 22:43:32.431883 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-q9bfc" Dec 02 22:43:32 crc kubenswrapper[4696]: E1202 22:43:32.431930 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q9bfc" podUID="ad00195c-ef4c-4d9b-941c-d01ebc498593" Dec 02 22:43:32 crc kubenswrapper[4696]: I1202 22:43:32.486418 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:32 crc kubenswrapper[4696]: I1202 22:43:32.486482 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:32 crc kubenswrapper[4696]: I1202 22:43:32.486502 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:32 crc kubenswrapper[4696]: I1202 22:43:32.486531 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:32 crc kubenswrapper[4696]: I1202 22:43:32.486549 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:32Z","lastTransitionTime":"2025-12-02T22:43:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:32 crc kubenswrapper[4696]: I1202 22:43:32.589861 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:32 crc kubenswrapper[4696]: I1202 22:43:32.589923 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:32 crc kubenswrapper[4696]: I1202 22:43:32.589943 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:32 crc kubenswrapper[4696]: I1202 22:43:32.589972 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:32 crc kubenswrapper[4696]: I1202 22:43:32.589993 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:32Z","lastTransitionTime":"2025-12-02T22:43:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:32 crc kubenswrapper[4696]: I1202 22:43:32.693406 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:32 crc kubenswrapper[4696]: I1202 22:43:32.693472 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:32 crc kubenswrapper[4696]: I1202 22:43:32.693489 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:32 crc kubenswrapper[4696]: I1202 22:43:32.693516 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:32 crc kubenswrapper[4696]: I1202 22:43:32.693536 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:32Z","lastTransitionTime":"2025-12-02T22:43:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:32 crc kubenswrapper[4696]: I1202 22:43:32.797467 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:32 crc kubenswrapper[4696]: I1202 22:43:32.797518 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:32 crc kubenswrapper[4696]: I1202 22:43:32.797531 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:32 crc kubenswrapper[4696]: I1202 22:43:32.797554 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:32 crc kubenswrapper[4696]: I1202 22:43:32.797568 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:32Z","lastTransitionTime":"2025-12-02T22:43:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:32 crc kubenswrapper[4696]: I1202 22:43:32.900692 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:32 crc kubenswrapper[4696]: I1202 22:43:32.900794 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:32 crc kubenswrapper[4696]: I1202 22:43:32.900813 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:32 crc kubenswrapper[4696]: I1202 22:43:32.900835 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:32 crc kubenswrapper[4696]: I1202 22:43:32.900849 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:32Z","lastTransitionTime":"2025-12-02T22:43:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:33 crc kubenswrapper[4696]: I1202 22:43:33.004111 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:33 crc kubenswrapper[4696]: I1202 22:43:33.004196 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:33 crc kubenswrapper[4696]: I1202 22:43:33.004219 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:33 crc kubenswrapper[4696]: I1202 22:43:33.004246 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:33 crc kubenswrapper[4696]: I1202 22:43:33.004267 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:33Z","lastTransitionTime":"2025-12-02T22:43:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:33 crc kubenswrapper[4696]: I1202 22:43:33.112901 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:33 crc kubenswrapper[4696]: I1202 22:43:33.112951 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:33 crc kubenswrapper[4696]: I1202 22:43:33.112961 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:33 crc kubenswrapper[4696]: I1202 22:43:33.112981 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:33 crc kubenswrapper[4696]: I1202 22:43:33.112992 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:33Z","lastTransitionTime":"2025-12-02T22:43:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:33 crc kubenswrapper[4696]: I1202 22:43:33.216287 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:33 crc kubenswrapper[4696]: I1202 22:43:33.216341 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:33 crc kubenswrapper[4696]: I1202 22:43:33.216358 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:33 crc kubenswrapper[4696]: I1202 22:43:33.216383 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:33 crc kubenswrapper[4696]: I1202 22:43:33.216401 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:33Z","lastTransitionTime":"2025-12-02T22:43:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:33 crc kubenswrapper[4696]: I1202 22:43:33.320121 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:33 crc kubenswrapper[4696]: I1202 22:43:33.320192 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:33 crc kubenswrapper[4696]: I1202 22:43:33.320212 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:33 crc kubenswrapper[4696]: I1202 22:43:33.320239 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:33 crc kubenswrapper[4696]: I1202 22:43:33.320257 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:33Z","lastTransitionTime":"2025-12-02T22:43:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:33 crc kubenswrapper[4696]: I1202 22:43:33.423964 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:33 crc kubenswrapper[4696]: I1202 22:43:33.424041 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:33 crc kubenswrapper[4696]: I1202 22:43:33.424056 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:33 crc kubenswrapper[4696]: I1202 22:43:33.424081 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:33 crc kubenswrapper[4696]: I1202 22:43:33.424096 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:33Z","lastTransitionTime":"2025-12-02T22:43:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:33 crc kubenswrapper[4696]: I1202 22:43:33.527476 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:33 crc kubenswrapper[4696]: I1202 22:43:33.527567 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:33 crc kubenswrapper[4696]: I1202 22:43:33.527587 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:33 crc kubenswrapper[4696]: I1202 22:43:33.527619 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:33 crc kubenswrapper[4696]: I1202 22:43:33.527640 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:33Z","lastTransitionTime":"2025-12-02T22:43:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:33 crc kubenswrapper[4696]: I1202 22:43:33.632001 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:33 crc kubenswrapper[4696]: I1202 22:43:33.632082 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:33 crc kubenswrapper[4696]: I1202 22:43:33.632107 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:33 crc kubenswrapper[4696]: I1202 22:43:33.632141 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:33 crc kubenswrapper[4696]: I1202 22:43:33.632167 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:33Z","lastTransitionTime":"2025-12-02T22:43:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:33 crc kubenswrapper[4696]: I1202 22:43:33.734811 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:33 crc kubenswrapper[4696]: I1202 22:43:33.734874 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:33 crc kubenswrapper[4696]: I1202 22:43:33.734883 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:33 crc kubenswrapper[4696]: I1202 22:43:33.734908 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:33 crc kubenswrapper[4696]: I1202 22:43:33.734958 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:33Z","lastTransitionTime":"2025-12-02T22:43:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:33 crc kubenswrapper[4696]: I1202 22:43:33.838238 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:33 crc kubenswrapper[4696]: I1202 22:43:33.838333 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:33 crc kubenswrapper[4696]: I1202 22:43:33.838357 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:33 crc kubenswrapper[4696]: I1202 22:43:33.838388 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:33 crc kubenswrapper[4696]: I1202 22:43:33.838413 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:33Z","lastTransitionTime":"2025-12-02T22:43:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:33 crc kubenswrapper[4696]: I1202 22:43:33.941888 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:33 crc kubenswrapper[4696]: I1202 22:43:33.941949 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:33 crc kubenswrapper[4696]: I1202 22:43:33.941967 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:33 crc kubenswrapper[4696]: I1202 22:43:33.941993 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:33 crc kubenswrapper[4696]: I1202 22:43:33.942012 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:33Z","lastTransitionTime":"2025-12-02T22:43:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:34 crc kubenswrapper[4696]: I1202 22:43:34.046394 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:34 crc kubenswrapper[4696]: I1202 22:43:34.046517 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:34 crc kubenswrapper[4696]: I1202 22:43:34.046537 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:34 crc kubenswrapper[4696]: I1202 22:43:34.046566 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:34 crc kubenswrapper[4696]: I1202 22:43:34.046588 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:34Z","lastTransitionTime":"2025-12-02T22:43:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:34 crc kubenswrapper[4696]: I1202 22:43:34.150365 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:34 crc kubenswrapper[4696]: I1202 22:43:34.150421 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:34 crc kubenswrapper[4696]: I1202 22:43:34.150434 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:34 crc kubenswrapper[4696]: I1202 22:43:34.150455 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:34 crc kubenswrapper[4696]: I1202 22:43:34.150466 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:34Z","lastTransitionTime":"2025-12-02T22:43:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:34 crc kubenswrapper[4696]: I1202 22:43:34.252923 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:34 crc kubenswrapper[4696]: I1202 22:43:34.252964 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:34 crc kubenswrapper[4696]: I1202 22:43:34.252975 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:34 crc kubenswrapper[4696]: I1202 22:43:34.252992 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:34 crc kubenswrapper[4696]: I1202 22:43:34.253008 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:34Z","lastTransitionTime":"2025-12-02T22:43:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:34 crc kubenswrapper[4696]: I1202 22:43:34.356628 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:34 crc kubenswrapper[4696]: I1202 22:43:34.356705 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:34 crc kubenswrapper[4696]: I1202 22:43:34.356723 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:34 crc kubenswrapper[4696]: I1202 22:43:34.356788 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:34 crc kubenswrapper[4696]: I1202 22:43:34.356809 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:34Z","lastTransitionTime":"2025-12-02T22:43:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:43:34 crc kubenswrapper[4696]: I1202 22:43:34.431606 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q9bfc" Dec 02 22:43:34 crc kubenswrapper[4696]: I1202 22:43:34.431692 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:43:34 crc kubenswrapper[4696]: I1202 22:43:34.431716 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:43:34 crc kubenswrapper[4696]: I1202 22:43:34.431693 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:43:34 crc kubenswrapper[4696]: E1202 22:43:34.431835 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q9bfc" podUID="ad00195c-ef4c-4d9b-941c-d01ebc498593" Dec 02 22:43:34 crc kubenswrapper[4696]: E1202 22:43:34.431924 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 22:43:34 crc kubenswrapper[4696]: E1202 22:43:34.431991 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 22:43:34 crc kubenswrapper[4696]: E1202 22:43:34.432094 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 22:43:34 crc kubenswrapper[4696]: I1202 22:43:34.459997 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:34 crc kubenswrapper[4696]: I1202 22:43:34.460055 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:34 crc kubenswrapper[4696]: I1202 22:43:34.460068 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:34 crc kubenswrapper[4696]: I1202 22:43:34.460087 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:34 crc kubenswrapper[4696]: I1202 22:43:34.460098 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:34Z","lastTransitionTime":"2025-12-02T22:43:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:34 crc kubenswrapper[4696]: I1202 22:43:34.562612 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:34 crc kubenswrapper[4696]: I1202 22:43:34.562652 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:34 crc kubenswrapper[4696]: I1202 22:43:34.562663 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:34 crc kubenswrapper[4696]: I1202 22:43:34.562682 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:34 crc kubenswrapper[4696]: I1202 22:43:34.562694 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:34Z","lastTransitionTime":"2025-12-02T22:43:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:34 crc kubenswrapper[4696]: I1202 22:43:34.664615 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:34 crc kubenswrapper[4696]: I1202 22:43:34.664684 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:34 crc kubenswrapper[4696]: I1202 22:43:34.664698 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:34 crc kubenswrapper[4696]: I1202 22:43:34.664720 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:34 crc kubenswrapper[4696]: I1202 22:43:34.664754 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:34Z","lastTransitionTime":"2025-12-02T22:43:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:34 crc kubenswrapper[4696]: I1202 22:43:34.767523 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:34 crc kubenswrapper[4696]: I1202 22:43:34.767594 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:34 crc kubenswrapper[4696]: I1202 22:43:34.767615 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:34 crc kubenswrapper[4696]: I1202 22:43:34.767649 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:34 crc kubenswrapper[4696]: I1202 22:43:34.767674 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:34Z","lastTransitionTime":"2025-12-02T22:43:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:34 crc kubenswrapper[4696]: I1202 22:43:34.870677 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:34 crc kubenswrapper[4696]: I1202 22:43:34.870800 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:34 crc kubenswrapper[4696]: I1202 22:43:34.870827 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:34 crc kubenswrapper[4696]: I1202 22:43:34.870860 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:34 crc kubenswrapper[4696]: I1202 22:43:34.870881 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:34Z","lastTransitionTime":"2025-12-02T22:43:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:34 crc kubenswrapper[4696]: I1202 22:43:34.975192 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:34 crc kubenswrapper[4696]: I1202 22:43:34.975284 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:34 crc kubenswrapper[4696]: I1202 22:43:34.975301 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:34 crc kubenswrapper[4696]: I1202 22:43:34.975332 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:34 crc kubenswrapper[4696]: I1202 22:43:34.975350 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:34Z","lastTransitionTime":"2025-12-02T22:43:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:35 crc kubenswrapper[4696]: I1202 22:43:35.078987 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:35 crc kubenswrapper[4696]: I1202 22:43:35.079051 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:35 crc kubenswrapper[4696]: I1202 22:43:35.079071 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:35 crc kubenswrapper[4696]: I1202 22:43:35.079102 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:35 crc kubenswrapper[4696]: I1202 22:43:35.079120 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:35Z","lastTransitionTime":"2025-12-02T22:43:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:35 crc kubenswrapper[4696]: I1202 22:43:35.183108 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:35 crc kubenswrapper[4696]: I1202 22:43:35.183219 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:35 crc kubenswrapper[4696]: I1202 22:43:35.183240 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:35 crc kubenswrapper[4696]: I1202 22:43:35.183270 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:35 crc kubenswrapper[4696]: I1202 22:43:35.183291 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:35Z","lastTransitionTime":"2025-12-02T22:43:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:35 crc kubenswrapper[4696]: I1202 22:43:35.286975 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:35 crc kubenswrapper[4696]: I1202 22:43:35.287068 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:35 crc kubenswrapper[4696]: I1202 22:43:35.287093 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:35 crc kubenswrapper[4696]: I1202 22:43:35.287124 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:35 crc kubenswrapper[4696]: I1202 22:43:35.287147 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:35Z","lastTransitionTime":"2025-12-02T22:43:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:35 crc kubenswrapper[4696]: I1202 22:43:35.390897 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:35 crc kubenswrapper[4696]: I1202 22:43:35.390960 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:35 crc kubenswrapper[4696]: I1202 22:43:35.390982 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:35 crc kubenswrapper[4696]: I1202 22:43:35.391007 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:35 crc kubenswrapper[4696]: I1202 22:43:35.391024 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:35Z","lastTransitionTime":"2025-12-02T22:43:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:35 crc kubenswrapper[4696]: I1202 22:43:35.494693 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:35 crc kubenswrapper[4696]: I1202 22:43:35.494766 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:35 crc kubenswrapper[4696]: I1202 22:43:35.494778 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:35 crc kubenswrapper[4696]: I1202 22:43:35.494806 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:35 crc kubenswrapper[4696]: I1202 22:43:35.494817 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:35Z","lastTransitionTime":"2025-12-02T22:43:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:35 crc kubenswrapper[4696]: I1202 22:43:35.598885 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:35 crc kubenswrapper[4696]: I1202 22:43:35.598964 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:35 crc kubenswrapper[4696]: I1202 22:43:35.598983 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:35 crc kubenswrapper[4696]: I1202 22:43:35.599010 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:35 crc kubenswrapper[4696]: I1202 22:43:35.599033 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:35Z","lastTransitionTime":"2025-12-02T22:43:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:35 crc kubenswrapper[4696]: I1202 22:43:35.702384 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:35 crc kubenswrapper[4696]: I1202 22:43:35.702444 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:35 crc kubenswrapper[4696]: I1202 22:43:35.702452 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:35 crc kubenswrapper[4696]: I1202 22:43:35.702466 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:35 crc kubenswrapper[4696]: I1202 22:43:35.702475 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:35Z","lastTransitionTime":"2025-12-02T22:43:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:35 crc kubenswrapper[4696]: I1202 22:43:35.805780 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:35 crc kubenswrapper[4696]: I1202 22:43:35.805853 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:35 crc kubenswrapper[4696]: I1202 22:43:35.805873 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:35 crc kubenswrapper[4696]: I1202 22:43:35.805900 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:35 crc kubenswrapper[4696]: I1202 22:43:35.805923 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:35Z","lastTransitionTime":"2025-12-02T22:43:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:35 crc kubenswrapper[4696]: I1202 22:43:35.909438 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:35 crc kubenswrapper[4696]: I1202 22:43:35.909503 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:35 crc kubenswrapper[4696]: I1202 22:43:35.909522 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:35 crc kubenswrapper[4696]: I1202 22:43:35.909548 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:35 crc kubenswrapper[4696]: I1202 22:43:35.909568 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:35Z","lastTransitionTime":"2025-12-02T22:43:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:36 crc kubenswrapper[4696]: I1202 22:43:36.013308 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:36 crc kubenswrapper[4696]: I1202 22:43:36.013394 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:36 crc kubenswrapper[4696]: I1202 22:43:36.013412 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:36 crc kubenswrapper[4696]: I1202 22:43:36.013442 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:36 crc kubenswrapper[4696]: I1202 22:43:36.013460 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:36Z","lastTransitionTime":"2025-12-02T22:43:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:36 crc kubenswrapper[4696]: I1202 22:43:36.117985 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:36 crc kubenswrapper[4696]: I1202 22:43:36.118038 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:36 crc kubenswrapper[4696]: I1202 22:43:36.118058 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:36 crc kubenswrapper[4696]: I1202 22:43:36.118089 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:36 crc kubenswrapper[4696]: I1202 22:43:36.118111 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:36Z","lastTransitionTime":"2025-12-02T22:43:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:36 crc kubenswrapper[4696]: I1202 22:43:36.221680 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:36 crc kubenswrapper[4696]: I1202 22:43:36.221792 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:36 crc kubenswrapper[4696]: I1202 22:43:36.221812 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:36 crc kubenswrapper[4696]: I1202 22:43:36.221838 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:36 crc kubenswrapper[4696]: I1202 22:43:36.221856 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:36Z","lastTransitionTime":"2025-12-02T22:43:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:36 crc kubenswrapper[4696]: I1202 22:43:36.325254 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:36 crc kubenswrapper[4696]: I1202 22:43:36.325344 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:36 crc kubenswrapper[4696]: I1202 22:43:36.325377 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:36 crc kubenswrapper[4696]: I1202 22:43:36.325410 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:36 crc kubenswrapper[4696]: I1202 22:43:36.325428 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:36Z","lastTransitionTime":"2025-12-02T22:43:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:36 crc kubenswrapper[4696]: I1202 22:43:36.424396 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:36 crc kubenswrapper[4696]: I1202 22:43:36.424480 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:36 crc kubenswrapper[4696]: I1202 22:43:36.424508 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:36 crc kubenswrapper[4696]: I1202 22:43:36.424538 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:36 crc kubenswrapper[4696]: I1202 22:43:36.424564 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:36Z","lastTransitionTime":"2025-12-02T22:43:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:43:36 crc kubenswrapper[4696]: I1202 22:43:36.431381 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:43:36 crc kubenswrapper[4696]: I1202 22:43:36.431444 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q9bfc" Dec 02 22:43:36 crc kubenswrapper[4696]: I1202 22:43:36.431410 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:43:36 crc kubenswrapper[4696]: I1202 22:43:36.431397 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:43:36 crc kubenswrapper[4696]: E1202 22:43:36.431836 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 22:43:36 crc kubenswrapper[4696]: E1202 22:43:36.431953 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 22:43:36 crc kubenswrapper[4696]: E1202 22:43:36.432217 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 22:43:36 crc kubenswrapper[4696]: E1202 22:43:36.432338 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-q9bfc" podUID="ad00195c-ef4c-4d9b-941c-d01ebc498593" Dec 02 22:43:36 crc kubenswrapper[4696]: E1202 22:43:36.442379 4696 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:43:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:43:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:43:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:43:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b680025d-da08-4b46-a4a4-b21ac19e4f7b\\\",\\\"systemUUID\\\":\\\"be4c1bf2-c508-4b46-be45-7efaea566193\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:36Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:36 crc kubenswrapper[4696]: I1202 22:43:36.448240 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:36 crc kubenswrapper[4696]: I1202 22:43:36.448285 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:36 crc kubenswrapper[4696]: I1202 22:43:36.448297 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:36 crc kubenswrapper[4696]: I1202 22:43:36.448313 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:36 crc kubenswrapper[4696]: I1202 22:43:36.448325 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:36Z","lastTransitionTime":"2025-12-02T22:43:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:36 crc kubenswrapper[4696]: E1202 22:43:36.476386 4696 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:43:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:43:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:43:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:43:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b680025d-da08-4b46-a4a4-b21ac19e4f7b\\\",\\\"systemUUID\\\":\\\"be4c1bf2-c508-4b46-be45-7efaea566193\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:36Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:36 crc kubenswrapper[4696]: I1202 22:43:36.493430 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:36 crc kubenswrapper[4696]: I1202 22:43:36.493485 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:36 crc kubenswrapper[4696]: I1202 22:43:36.493504 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:36 crc kubenswrapper[4696]: I1202 22:43:36.493530 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:36 crc kubenswrapper[4696]: I1202 22:43:36.493548 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:36Z","lastTransitionTime":"2025-12-02T22:43:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:36 crc kubenswrapper[4696]: E1202 22:43:36.519846 4696 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:43:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:43:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:43:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:43:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b680025d-da08-4b46-a4a4-b21ac19e4f7b\\\",\\\"systemUUID\\\":\\\"be4c1bf2-c508-4b46-be45-7efaea566193\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:36Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:36 crc kubenswrapper[4696]: I1202 22:43:36.523433 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:36 crc kubenswrapper[4696]: I1202 22:43:36.523455 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:36 crc kubenswrapper[4696]: I1202 22:43:36.523463 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:36 crc kubenswrapper[4696]: I1202 22:43:36.523479 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:36 crc kubenswrapper[4696]: I1202 22:43:36.523489 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:36Z","lastTransitionTime":"2025-12-02T22:43:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:36 crc kubenswrapper[4696]: E1202 22:43:36.538512 4696 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:43:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:43:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:43:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:43:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b680025d-da08-4b46-a4a4-b21ac19e4f7b\\\",\\\"systemUUID\\\":\\\"be4c1bf2-c508-4b46-be45-7efaea566193\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:36Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:36 crc kubenswrapper[4696]: I1202 22:43:36.542235 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:36 crc kubenswrapper[4696]: I1202 22:43:36.542258 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:36 crc kubenswrapper[4696]: I1202 22:43:36.542266 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:36 crc kubenswrapper[4696]: I1202 22:43:36.542280 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:36 crc kubenswrapper[4696]: I1202 22:43:36.542289 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:36Z","lastTransitionTime":"2025-12-02T22:43:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:36 crc kubenswrapper[4696]: E1202 22:43:36.556917 4696 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:43:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:43:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:43:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:43:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b680025d-da08-4b46-a4a4-b21ac19e4f7b\\\",\\\"systemUUID\\\":\\\"be4c1bf2-c508-4b46-be45-7efaea566193\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:36Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:36 crc kubenswrapper[4696]: E1202 22:43:36.557025 4696 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 02 22:43:36 crc kubenswrapper[4696]: I1202 22:43:36.558251 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:36 crc kubenswrapper[4696]: I1202 22:43:36.558274 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:36 crc kubenswrapper[4696]: I1202 22:43:36.558282 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:36 crc kubenswrapper[4696]: I1202 22:43:36.558293 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:36 crc kubenswrapper[4696]: I1202 22:43:36.558303 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:36Z","lastTransitionTime":"2025-12-02T22:43:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:36 crc kubenswrapper[4696]: I1202 22:43:36.661354 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:36 crc kubenswrapper[4696]: I1202 22:43:36.661406 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:36 crc kubenswrapper[4696]: I1202 22:43:36.661419 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:36 crc kubenswrapper[4696]: I1202 22:43:36.661439 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:36 crc kubenswrapper[4696]: I1202 22:43:36.661452 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:36Z","lastTransitionTime":"2025-12-02T22:43:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:36 crc kubenswrapper[4696]: I1202 22:43:36.764833 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:36 crc kubenswrapper[4696]: I1202 22:43:36.764903 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:36 crc kubenswrapper[4696]: I1202 22:43:36.764924 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:36 crc kubenswrapper[4696]: I1202 22:43:36.764952 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:36 crc kubenswrapper[4696]: I1202 22:43:36.764968 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:36Z","lastTransitionTime":"2025-12-02T22:43:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:36 crc kubenswrapper[4696]: I1202 22:43:36.868229 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:36 crc kubenswrapper[4696]: I1202 22:43:36.868306 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:36 crc kubenswrapper[4696]: I1202 22:43:36.868327 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:36 crc kubenswrapper[4696]: I1202 22:43:36.868355 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:36 crc kubenswrapper[4696]: I1202 22:43:36.868374 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:36Z","lastTransitionTime":"2025-12-02T22:43:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:36 crc kubenswrapper[4696]: I1202 22:43:36.971280 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:36 crc kubenswrapper[4696]: I1202 22:43:36.971369 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:36 crc kubenswrapper[4696]: I1202 22:43:36.971389 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:36 crc kubenswrapper[4696]: I1202 22:43:36.971420 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:36 crc kubenswrapper[4696]: I1202 22:43:36.971440 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:36Z","lastTransitionTime":"2025-12-02T22:43:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:37 crc kubenswrapper[4696]: I1202 22:43:37.074889 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:37 crc kubenswrapper[4696]: I1202 22:43:37.074976 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:37 crc kubenswrapper[4696]: I1202 22:43:37.075016 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:37 crc kubenswrapper[4696]: I1202 22:43:37.075059 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:37 crc kubenswrapper[4696]: I1202 22:43:37.075084 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:37Z","lastTransitionTime":"2025-12-02T22:43:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:37 crc kubenswrapper[4696]: I1202 22:43:37.178237 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:37 crc kubenswrapper[4696]: I1202 22:43:37.178408 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:37 crc kubenswrapper[4696]: I1202 22:43:37.178436 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:37 crc kubenswrapper[4696]: I1202 22:43:37.178526 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:37 crc kubenswrapper[4696]: I1202 22:43:37.178552 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:37Z","lastTransitionTime":"2025-12-02T22:43:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:37 crc kubenswrapper[4696]: I1202 22:43:37.282564 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:37 crc kubenswrapper[4696]: I1202 22:43:37.282637 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:37 crc kubenswrapper[4696]: I1202 22:43:37.282655 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:37 crc kubenswrapper[4696]: I1202 22:43:37.282682 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:37 crc kubenswrapper[4696]: I1202 22:43:37.282701 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:37Z","lastTransitionTime":"2025-12-02T22:43:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:37 crc kubenswrapper[4696]: I1202 22:43:37.387500 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:37 crc kubenswrapper[4696]: I1202 22:43:37.387647 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:37 crc kubenswrapper[4696]: I1202 22:43:37.387672 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:37 crc kubenswrapper[4696]: I1202 22:43:37.387731 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:37 crc kubenswrapper[4696]: I1202 22:43:37.387802 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:37Z","lastTransitionTime":"2025-12-02T22:43:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:37 crc kubenswrapper[4696]: I1202 22:43:37.456579 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wthxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86a37d2a-37c5-4fbd-b10b-f5e4706772f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32ae69a2c1cd350e1b62b8453e9d286e105623f4b9ea8ff27ce8c9d37b055e4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5ddd341a3a0b7b1d959ce57bb4d38153a4ca3ae5ec40ded309e0697db2be28d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T22:43:26Z\\\",\\\"message\\\":\\\"2025-12-02T22:42:41+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_c372d58b-05eb-4944-ba5c-2ee7d55c7d09\\\\n2025-12-02T22:42:41+00:00 
[cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_c372d58b-05eb-4944-ba5c-2ee7d55c7d09 to /host/opt/cni/bin/\\\\n2025-12-02T22:42:41Z [verbose] multus-daemon started\\\\n2025-12-02T22:42:41Z [verbose] Readiness Indicator file check\\\\n2025-12-02T22:43:26Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:40Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:43:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xqg5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wthxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:37Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:37 crc kubenswrapper[4696]: I1202 22:43:37.477430 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ttxcl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca48946e-a7e0-4729-8b02-b223a96990c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55cb8b25a5af0f7f16b7a91ab00cbe24ed8a8b81477e992de74699237caf7dab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkjwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e10c7f45d8003857b8a736f70eb3125f0199
6bacbcace4ad08bc703fbdeb62f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkjwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ttxcl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:37Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:37 crc kubenswrapper[4696]: I1202 22:43:37.493253 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:37 crc kubenswrapper[4696]: I1202 22:43:37.493369 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:37 crc kubenswrapper[4696]: I1202 22:43:37.493392 4696 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:37 crc kubenswrapper[4696]: I1202 22:43:37.493427 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:37 crc kubenswrapper[4696]: I1202 22:43:37.493452 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:37Z","lastTransitionTime":"2025-12-02T22:43:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:43:37 crc kubenswrapper[4696]: I1202 22:43:37.503162 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1db9cf23af7fc9ff1373908e11df0bea7d6e03cd3ed18947cc583fc09a79a05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf8
6d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:37Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:37 crc kubenswrapper[4696]: I1202 22:43:37.525407 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce2fa3867298545942c4a2d2e68899c61a4f99a64b0974e75ef86851074ddfce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4db552c4bc9b395f977cb4c1b1b7fcbb1c7e9f326d81a7903d88eab180bcfb3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:37Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:37 crc kubenswrapper[4696]: I1202 22:43:37.544141 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27fb05354e4e9a6ad09946fad7dbabf07456fe56a9a66578fda0eb71dde2d199\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T22:43:37Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:37 crc kubenswrapper[4696]: I1202 22:43:37.566294 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:37Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:37 crc kubenswrapper[4696]: I1202 22:43:37.585694 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f57qk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf8ea236-9be9-4eb2-904a-103c4c279f28\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3106b29a0eb0a17827fd331aefd6c00b0c58f4ec0d0ca778bd3f4842730d5f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vxprw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f57qk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:37Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:37 crc kubenswrapper[4696]: I1202 22:43:37.596838 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:37 crc kubenswrapper[4696]: I1202 22:43:37.596912 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:37 crc kubenswrapper[4696]: I1202 22:43:37.596931 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:37 crc kubenswrapper[4696]: I1202 22:43:37.596961 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:37 crc kubenswrapper[4696]: I1202 22:43:37.596983 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:37Z","lastTransitionTime":"2025-12-02T22:43:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:37 crc kubenswrapper[4696]: I1202 22:43:37.621372 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3172221-5c4c-4409-b50f-5cd21d8b5030\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f25e7ebcca1d904cd0940a660fd97c6b4a3540fd63700abd0ad973e9f9c8f764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08fc65f2c8162f27bc40cd12b5801ca21c778313271f2cb5dd251ca4312842d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba390cca9997cf469ae88a4895532df556f0b947ce7b34ef7e1992ee236a2e16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0ccd6eaff0fa7ac496aadc31afffd6e7fee2776e9cfd1652af85fcec268d895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b2deed2650220f1d607cbb667b26879d3a8373146244feb486c882032f344b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a565cb3800c1a50f784c327719158de2f6f29d20a33670478b97c1c360e0f6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a565cb3800c1a50f784c327719158de2f6f29d20a33670478b97c1c360e0f6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29a2b9cc76fc45b6874d85b2ab92165a02804807837501ac60190c232bab928d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29a2b9cc76fc45b6874d85b2ab92165a02804807837501ac60190c232bab928d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fb3311ccf2a924605464d657193627798dad26a232fab0a2091fb2c387f3bd0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb3311ccf2a924605464d657193627798dad26a232fab0a2091fb2c387f3bd0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:37Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:37 crc kubenswrapper[4696]: I1202 22:43:37.645430 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"158d80a5-8d90-46bb-83e7-ab3263f6d28f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3112751f86a62634d02f281d467d3eeb27878e27c245871210cca37eaeb6e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce744ea765d5bbcb0a9926f439a0256774c16863344f23ddfb0074b4ccbdb4ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1108d710330b056e2228205b2012df070ffbd660701ddd577eafbf0e99c3bf9f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d92ce9d53d9367503c76c5fbb45f340a5ddd93f3c8e1738ac434fe7b0210d1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6268352b995d99d383ab500b5ccd80b0612cf33716a752aeb1d92cb775a1971e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T22:42:38Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW1202 22:42:37.965183 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 22:42:37.965319 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 22:42:37.967400 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2369937012/tls.crt::/tmp/serving-cert-2369937012/tls.key\\\\\\\"\\\\nI1202 22:42:38.329833 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 22:42:38.331704 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 22:42:38.331724 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 22:42:38.331766 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 22:42:38.331782 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 22:42:38.336692 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 22:42:38.336734 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:42:38.336761 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:42:38.336769 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 22:42:38.336775 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 22:42:38.336779 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 22:42:38.336783 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 22:42:38.337216 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1202 22:42:38.338194 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://666f81c81a7cad67c01b2aebf2d05f162c33feea27f64c12f92761d6d2a6056e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055bef75732eec4908f0baf8a52cd3921fab03cf88e36d2e0d442344fb1af0c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://055bef75732eec4908f0baf8a52cd3921
fab03cf88e36d2e0d442344fb1af0c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:37Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:37 crc kubenswrapper[4696]: I1202 22:43:37.666902 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"102b6bc9-3f5d-462f-9814-67699eea05ab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6abcdac2e562f88dfe0b9c02e2a2efb68c3b4d5438abfa08606b27dde0da76b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95c2f64bb4118ce11d7f91dd949ab6e377708029cb0b3c93bd650ac70ea2b650\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e41b58b3df7b93de109c3904416a2943c8dca8cc7e9edc628333d4da461412d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7321d8133c89c6a58d6e42e8df0ba1e4bbdf9cc06fd01054a25afd64934d855a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://7321d8133c89c6a58d6e42e8df0ba1e4bbdf9cc06fd01054a25afd64934d855a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:37Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:37 crc kubenswrapper[4696]: I1202 22:43:37.685730 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q9bfc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad00195c-ef4c-4d9b-941c-d01ebc498593\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zz6sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zz6sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:54Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q9bfc\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:37Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:37 crc kubenswrapper[4696]: I1202 22:43:37.700514 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:37 crc kubenswrapper[4696]: I1202 22:43:37.700572 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:37 crc kubenswrapper[4696]: I1202 22:43:37.700592 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:37 crc kubenswrapper[4696]: I1202 22:43:37.700619 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:37 crc kubenswrapper[4696]: I1202 22:43:37.700639 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:37Z","lastTransitionTime":"2025-12-02T22:43:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:37 crc kubenswrapper[4696]: I1202 22:43:37.709088 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:37Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:37 crc kubenswrapper[4696]: I1202 22:43:37.733291 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:37Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:37 crc kubenswrapper[4696]: I1202 22:43:37.763049 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36feb7e9f6a6bad97e87fe54ff9b661be62f6f4962352d55bcc905b80320a48d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85fbce2a7ea0550f53c1216216022e60a4e2eae0bb1d5c88254ace025fbd4503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64b0c2025c377ab19b3853017c94cb501f472f2485093afc60749f7739ba25d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22ce112f515d4ef59dca4c21bdc8623e88a8a0df222fb2ed0990a2f68e3f75f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52f3d5529aafac7c27c963a50b0318e99d9a9ca35e142d4c273431105c495e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32717a82edb718f5e690fcecd578f48279d134a5dc12f00824fa25f7ec727156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1282d2e6c1634035b4869e4f9a9689f9e5f599a52a82e02e77a81c6df3e13b0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1282d2e6c1634035b4869e4f9a9689f9e5f599a52a82e02e77a81c6df3e13b0e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T22:43:13Z\\\",\\\"message\\\":\\\"t handler 2 for removal\\\\nI1202 22:43:13.297137 6337 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1202 22:43:13.297142 6337 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1202 22:43:13.297185 6337 handler.go:208] Removed *v1.Node event handler 2\\\\nI1202 22:43:13.297274 6337 handler.go:208] Removed *v1.Node event 
handler 7\\\\nI1202 22:43:13.297609 6337 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1202 22:43:13.297650 6337 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1202 22:43:13.297716 6337 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1202 22:43:13.297725 6337 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1202 22:43:13.297731 6337 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1202 22:43:13.297772 6337 factory.go:656] Stopping watch factory\\\\nI1202 22:43:13.297795 6337 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1202 22:43:13.297796 6337 ovnkube.go:599] Stopped ovnkube\\\\nI1202 22:43:13.297811 6337 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1202 22:43:13.297829 6337 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1202 22:43:13.297821 6337 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nF1202 22:43:13.297933 6337 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:43:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qb2zq_openshift-ovn-kubernetes(c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23b80358b654eec7dc1057ef35a6c09f88ccd7aca7a05336a3f29e7036bd55f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://492cfa3fb8646c09da06d92ec8a303562da558d4d944eaf68ca95de6f5b94da8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://492cfa3fb8646c09da
06d92ec8a303562da558d4d944eaf68ca95de6f5b94da8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qb2zq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:37Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:37 crc kubenswrapper[4696]: I1202 22:43:37.785075 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-chq65" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53353260-c7c9-435c-91eb-3d5a1b441c4a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2d1f154b9c9139bf991adf86bd99e7ff14510c24ae93b87c1c184437a087f17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7tfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa016430a0a36d0628e0e5b0c4baf2fa724c8881
16e5d70517c8ffc4be1c37a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7tfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-chq65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:37Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:37 crc kubenswrapper[4696]: I1202 22:43:37.802533 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rgk7n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07db09ff-2489-4357-af38-aca9655ac1d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9928cbc6ed1d9bb5d0280a8d3264d4490c601fc7d09d6f293f1403615cc4cde5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdd8x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rgk7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:37Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:37 crc kubenswrapper[4696]: I1202 22:43:37.804424 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:37 crc kubenswrapper[4696]: I1202 22:43:37.804506 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:37 crc kubenswrapper[4696]: I1202 22:43:37.804529 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:37 crc kubenswrapper[4696]: I1202 22:43:37.804562 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:37 crc kubenswrapper[4696]: I1202 22:43:37.804585 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:37Z","lastTransitionTime":"2025-12-02T22:43:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:37 crc kubenswrapper[4696]: I1202 22:43:37.826780 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05940584-6baa-49e1-8959-610d50b6eeb5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b361514e74dd950d7c3e1da82304b7c75603829ce7bf03c1fbccab1607d65800\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67bf899df90
f18f36310994af135552aa9758f0460117d62d00fdc1de067ce79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d15263c6c1f8208a998cbacdba376a2eb3b2e16134e8fc40f7d56d36daae5ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2fa27b4eeb8b50da5d49c905eb98070362e998c3e3e4aceac7cc09aaef5b5ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:37Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:37 crc kubenswrapper[4696]: I1202 22:43:37.852036 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sbjst" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a0d758e-6da6-4382-99f1-dd295b63eb98\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bfe4ce91e801752a3eae0d5cfdb5d0e7a4997952987a3a32c10c0292db170f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://495f007b6af2d409e3ed6cf70dfa90aec5a38c34361c9da9c4fd71d7732cdd76\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://495f007b6af2d409e3ed6cf70dfa90aec5a38c34361c9da9c4fd71d7732cdd76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5b3b6699e2f0b9a8d0f32c1f97c9b3e16d9f5479ad2f355633c25897dbe1bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5b3b6699e2f0b9a8d0f32c1f97c9b3e16d9f5479ad2f355633c25897dbe1bb1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:42Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d4a10437554cfb08f90a48d457ef157309ed0ab379c5b26deb62d3e8b73291b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d4a10437554cfb08f90a48d457ef157309ed0ab379c5b26deb62d3e8b73291b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76b52
b0824f65df456ead0ca134ada0a6c3120382a3cce18f0a5dfdcc98fa9ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76b52b0824f65df456ead0ca134ada0a6c3120382a3cce18f0a5dfdcc98fa9ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d38fb242bd536b1359095c21c4df58b5a4c64058cfe986b38a581ae0c40a29d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d38fb242bd536b1359095c21c4df58b5a4c64058cfe986b38a581ae0c40a29d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:45Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f17767f87136d0916230872e59379331bb9e7f88f3648115905c6c0172f257cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f17767f87136d0916230872e59379331bb9e7f88f3648115905c6c0172f257cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sbjst\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:37Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:37 crc kubenswrapper[4696]: I1202 22:43:37.907869 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:37 crc kubenswrapper[4696]: I1202 22:43:37.907944 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:37 crc kubenswrapper[4696]: I1202 22:43:37.907966 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:37 crc kubenswrapper[4696]: I1202 22:43:37.907995 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:37 crc kubenswrapper[4696]: I1202 22:43:37.908018 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:37Z","lastTransitionTime":"2025-12-02T22:43:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:38 crc kubenswrapper[4696]: I1202 22:43:38.011553 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:38 crc kubenswrapper[4696]: I1202 22:43:38.011600 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:38 crc kubenswrapper[4696]: I1202 22:43:38.011610 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:38 crc kubenswrapper[4696]: I1202 22:43:38.011631 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:38 crc kubenswrapper[4696]: I1202 22:43:38.011644 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:38Z","lastTransitionTime":"2025-12-02T22:43:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:38 crc kubenswrapper[4696]: I1202 22:43:38.115011 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:38 crc kubenswrapper[4696]: I1202 22:43:38.115095 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:38 crc kubenswrapper[4696]: I1202 22:43:38.115115 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:38 crc kubenswrapper[4696]: I1202 22:43:38.115152 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:38 crc kubenswrapper[4696]: I1202 22:43:38.115174 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:38Z","lastTransitionTime":"2025-12-02T22:43:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:38 crc kubenswrapper[4696]: I1202 22:43:38.218878 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:38 crc kubenswrapper[4696]: I1202 22:43:38.218963 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:38 crc kubenswrapper[4696]: I1202 22:43:38.218989 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:38 crc kubenswrapper[4696]: I1202 22:43:38.219024 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:38 crc kubenswrapper[4696]: I1202 22:43:38.219051 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:38Z","lastTransitionTime":"2025-12-02T22:43:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:38 crc kubenswrapper[4696]: I1202 22:43:38.322581 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:38 crc kubenswrapper[4696]: I1202 22:43:38.322674 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:38 crc kubenswrapper[4696]: I1202 22:43:38.322692 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:38 crc kubenswrapper[4696]: I1202 22:43:38.322723 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:38 crc kubenswrapper[4696]: I1202 22:43:38.322790 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:38Z","lastTransitionTime":"2025-12-02T22:43:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:38 crc kubenswrapper[4696]: I1202 22:43:38.426708 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:38 crc kubenswrapper[4696]: I1202 22:43:38.426830 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:38 crc kubenswrapper[4696]: I1202 22:43:38.426856 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:38 crc kubenswrapper[4696]: I1202 22:43:38.426883 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:38 crc kubenswrapper[4696]: I1202 22:43:38.426903 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:38Z","lastTransitionTime":"2025-12-02T22:43:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:43:38 crc kubenswrapper[4696]: I1202 22:43:38.431024 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:43:38 crc kubenswrapper[4696]: I1202 22:43:38.431085 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q9bfc" Dec 02 22:43:38 crc kubenswrapper[4696]: I1202 22:43:38.431239 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:43:38 crc kubenswrapper[4696]: E1202 22:43:38.431247 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 22:43:38 crc kubenswrapper[4696]: I1202 22:43:38.431292 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:43:38 crc kubenswrapper[4696]: E1202 22:43:38.431428 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 22:43:38 crc kubenswrapper[4696]: E1202 22:43:38.431543 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 22:43:38 crc kubenswrapper[4696]: E1202 22:43:38.431650 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q9bfc" podUID="ad00195c-ef4c-4d9b-941c-d01ebc498593" Dec 02 22:43:38 crc kubenswrapper[4696]: I1202 22:43:38.531545 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:38 crc kubenswrapper[4696]: I1202 22:43:38.531616 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:38 crc kubenswrapper[4696]: I1202 22:43:38.531638 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:38 crc kubenswrapper[4696]: I1202 22:43:38.531668 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:38 crc kubenswrapper[4696]: I1202 22:43:38.531690 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:38Z","lastTransitionTime":"2025-12-02T22:43:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:38 crc kubenswrapper[4696]: I1202 22:43:38.634944 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:38 crc kubenswrapper[4696]: I1202 22:43:38.635015 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:38 crc kubenswrapper[4696]: I1202 22:43:38.635037 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:38 crc kubenswrapper[4696]: I1202 22:43:38.635074 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:38 crc kubenswrapper[4696]: I1202 22:43:38.635093 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:38Z","lastTransitionTime":"2025-12-02T22:43:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:38 crc kubenswrapper[4696]: I1202 22:43:38.739875 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:38 crc kubenswrapper[4696]: I1202 22:43:38.739994 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:38 crc kubenswrapper[4696]: I1202 22:43:38.740023 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:38 crc kubenswrapper[4696]: I1202 22:43:38.740065 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:38 crc kubenswrapper[4696]: I1202 22:43:38.740107 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:38Z","lastTransitionTime":"2025-12-02T22:43:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:38 crc kubenswrapper[4696]: I1202 22:43:38.845725 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:38 crc kubenswrapper[4696]: I1202 22:43:38.845848 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:38 crc kubenswrapper[4696]: I1202 22:43:38.845866 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:38 crc kubenswrapper[4696]: I1202 22:43:38.845896 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:38 crc kubenswrapper[4696]: I1202 22:43:38.845918 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:38Z","lastTransitionTime":"2025-12-02T22:43:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:38 crc kubenswrapper[4696]: I1202 22:43:38.949465 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:38 crc kubenswrapper[4696]: I1202 22:43:38.949539 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:38 crc kubenswrapper[4696]: I1202 22:43:38.949556 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:38 crc kubenswrapper[4696]: I1202 22:43:38.949584 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:38 crc kubenswrapper[4696]: I1202 22:43:38.949602 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:38Z","lastTransitionTime":"2025-12-02T22:43:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:39 crc kubenswrapper[4696]: I1202 22:43:39.053641 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:39 crc kubenswrapper[4696]: I1202 22:43:39.053708 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:39 crc kubenswrapper[4696]: I1202 22:43:39.053732 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:39 crc kubenswrapper[4696]: I1202 22:43:39.053788 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:39 crc kubenswrapper[4696]: I1202 22:43:39.053808 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:39Z","lastTransitionTime":"2025-12-02T22:43:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:39 crc kubenswrapper[4696]: I1202 22:43:39.156499 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:39 crc kubenswrapper[4696]: I1202 22:43:39.156550 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:39 crc kubenswrapper[4696]: I1202 22:43:39.156560 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:39 crc kubenswrapper[4696]: I1202 22:43:39.156578 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:39 crc kubenswrapper[4696]: I1202 22:43:39.156588 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:39Z","lastTransitionTime":"2025-12-02T22:43:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:39 crc kubenswrapper[4696]: I1202 22:43:39.259835 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:39 crc kubenswrapper[4696]: I1202 22:43:39.259908 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:39 crc kubenswrapper[4696]: I1202 22:43:39.259928 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:39 crc kubenswrapper[4696]: I1202 22:43:39.259963 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:39 crc kubenswrapper[4696]: I1202 22:43:39.259984 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:39Z","lastTransitionTime":"2025-12-02T22:43:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:39 crc kubenswrapper[4696]: I1202 22:43:39.364118 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:39 crc kubenswrapper[4696]: I1202 22:43:39.364547 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:39 crc kubenswrapper[4696]: I1202 22:43:39.364786 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:39 crc kubenswrapper[4696]: I1202 22:43:39.364949 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:39 crc kubenswrapper[4696]: I1202 22:43:39.365120 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:39Z","lastTransitionTime":"2025-12-02T22:43:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:39 crc kubenswrapper[4696]: I1202 22:43:39.468849 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:39 crc kubenswrapper[4696]: I1202 22:43:39.468926 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:39 crc kubenswrapper[4696]: I1202 22:43:39.468945 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:39 crc kubenswrapper[4696]: I1202 22:43:39.468974 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:39 crc kubenswrapper[4696]: I1202 22:43:39.468994 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:39Z","lastTransitionTime":"2025-12-02T22:43:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:39 crc kubenswrapper[4696]: I1202 22:43:39.571771 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:39 crc kubenswrapper[4696]: I1202 22:43:39.571854 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:39 crc kubenswrapper[4696]: I1202 22:43:39.571879 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:39 crc kubenswrapper[4696]: I1202 22:43:39.571910 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:39 crc kubenswrapper[4696]: I1202 22:43:39.571934 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:39Z","lastTransitionTime":"2025-12-02T22:43:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:39 crc kubenswrapper[4696]: I1202 22:43:39.675348 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:39 crc kubenswrapper[4696]: I1202 22:43:39.675436 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:39 crc kubenswrapper[4696]: I1202 22:43:39.675460 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:39 crc kubenswrapper[4696]: I1202 22:43:39.675492 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:39 crc kubenswrapper[4696]: I1202 22:43:39.675516 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:39Z","lastTransitionTime":"2025-12-02T22:43:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:39 crc kubenswrapper[4696]: I1202 22:43:39.778872 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:39 crc kubenswrapper[4696]: I1202 22:43:39.778953 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:39 crc kubenswrapper[4696]: I1202 22:43:39.778972 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:39 crc kubenswrapper[4696]: I1202 22:43:39.779007 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:39 crc kubenswrapper[4696]: I1202 22:43:39.779027 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:39Z","lastTransitionTime":"2025-12-02T22:43:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:39 crc kubenswrapper[4696]: I1202 22:43:39.883826 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:39 crc kubenswrapper[4696]: I1202 22:43:39.883905 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:39 crc kubenswrapper[4696]: I1202 22:43:39.883928 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:39 crc kubenswrapper[4696]: I1202 22:43:39.883965 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:39 crc kubenswrapper[4696]: I1202 22:43:39.883993 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:39Z","lastTransitionTime":"2025-12-02T22:43:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:39 crc kubenswrapper[4696]: I1202 22:43:39.987646 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:39 crc kubenswrapper[4696]: I1202 22:43:39.987707 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:39 crc kubenswrapper[4696]: I1202 22:43:39.987726 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:39 crc kubenswrapper[4696]: I1202 22:43:39.987794 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:39 crc kubenswrapper[4696]: I1202 22:43:39.987816 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:39Z","lastTransitionTime":"2025-12-02T22:43:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:40 crc kubenswrapper[4696]: I1202 22:43:40.092044 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:40 crc kubenswrapper[4696]: I1202 22:43:40.092105 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:40 crc kubenswrapper[4696]: I1202 22:43:40.092124 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:40 crc kubenswrapper[4696]: I1202 22:43:40.092150 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:40 crc kubenswrapper[4696]: I1202 22:43:40.092170 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:40Z","lastTransitionTime":"2025-12-02T22:43:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:40 crc kubenswrapper[4696]: I1202 22:43:40.195393 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:40 crc kubenswrapper[4696]: I1202 22:43:40.195475 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:40 crc kubenswrapper[4696]: I1202 22:43:40.195506 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:40 crc kubenswrapper[4696]: I1202 22:43:40.195541 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:40 crc kubenswrapper[4696]: I1202 22:43:40.195558 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:40Z","lastTransitionTime":"2025-12-02T22:43:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:40 crc kubenswrapper[4696]: I1202 22:43:40.299512 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:40 crc kubenswrapper[4696]: I1202 22:43:40.299612 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:40 crc kubenswrapper[4696]: I1202 22:43:40.299625 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:40 crc kubenswrapper[4696]: I1202 22:43:40.299665 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:40 crc kubenswrapper[4696]: I1202 22:43:40.299677 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:40Z","lastTransitionTime":"2025-12-02T22:43:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:40 crc kubenswrapper[4696]: I1202 22:43:40.402993 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:40 crc kubenswrapper[4696]: I1202 22:43:40.403089 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:40 crc kubenswrapper[4696]: I1202 22:43:40.403107 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:40 crc kubenswrapper[4696]: I1202 22:43:40.403144 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:40 crc kubenswrapper[4696]: I1202 22:43:40.403171 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:40Z","lastTransitionTime":"2025-12-02T22:43:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:43:40 crc kubenswrapper[4696]: I1202 22:43:40.431309 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:43:40 crc kubenswrapper[4696]: I1202 22:43:40.431342 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q9bfc" Dec 02 22:43:40 crc kubenswrapper[4696]: I1202 22:43:40.431458 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:43:40 crc kubenswrapper[4696]: I1202 22:43:40.431557 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:43:40 crc kubenswrapper[4696]: E1202 22:43:40.431641 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 22:43:40 crc kubenswrapper[4696]: E1202 22:43:40.432574 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 22:43:40 crc kubenswrapper[4696]: E1202 22:43:40.432782 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q9bfc" podUID="ad00195c-ef4c-4d9b-941c-d01ebc498593" Dec 02 22:43:40 crc kubenswrapper[4696]: E1202 22:43:40.433035 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 22:43:40 crc kubenswrapper[4696]: I1202 22:43:40.448765 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Dec 02 22:43:40 crc kubenswrapper[4696]: I1202 22:43:40.506159 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:40 crc kubenswrapper[4696]: I1202 22:43:40.506237 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:40 crc kubenswrapper[4696]: I1202 22:43:40.506261 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:40 crc kubenswrapper[4696]: I1202 22:43:40.506292 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:40 crc kubenswrapper[4696]: I1202 22:43:40.506314 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:40Z","lastTransitionTime":"2025-12-02T22:43:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:40 crc kubenswrapper[4696]: I1202 22:43:40.610429 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:40 crc kubenswrapper[4696]: I1202 22:43:40.610952 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:40 crc kubenswrapper[4696]: I1202 22:43:40.611127 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:40 crc kubenswrapper[4696]: I1202 22:43:40.611298 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:40 crc kubenswrapper[4696]: I1202 22:43:40.611484 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:40Z","lastTransitionTime":"2025-12-02T22:43:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:40 crc kubenswrapper[4696]: I1202 22:43:40.715839 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:40 crc kubenswrapper[4696]: I1202 22:43:40.715913 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:40 crc kubenswrapper[4696]: I1202 22:43:40.715959 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:40 crc kubenswrapper[4696]: I1202 22:43:40.715994 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:40 crc kubenswrapper[4696]: I1202 22:43:40.716018 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:40Z","lastTransitionTime":"2025-12-02T22:43:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:40 crc kubenswrapper[4696]: I1202 22:43:40.819700 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:40 crc kubenswrapper[4696]: I1202 22:43:40.819792 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:40 crc kubenswrapper[4696]: I1202 22:43:40.819829 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:40 crc kubenswrapper[4696]: I1202 22:43:40.819855 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:40 crc kubenswrapper[4696]: I1202 22:43:40.819869 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:40Z","lastTransitionTime":"2025-12-02T22:43:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:40 crc kubenswrapper[4696]: I1202 22:43:40.923243 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:40 crc kubenswrapper[4696]: I1202 22:43:40.923346 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:40 crc kubenswrapper[4696]: I1202 22:43:40.923380 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:40 crc kubenswrapper[4696]: I1202 22:43:40.923421 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:40 crc kubenswrapper[4696]: I1202 22:43:40.923448 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:40Z","lastTransitionTime":"2025-12-02T22:43:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:41 crc kubenswrapper[4696]: I1202 22:43:41.027291 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:41 crc kubenswrapper[4696]: I1202 22:43:41.027392 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:41 crc kubenswrapper[4696]: I1202 22:43:41.027413 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:41 crc kubenswrapper[4696]: I1202 22:43:41.027441 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:41 crc kubenswrapper[4696]: I1202 22:43:41.027460 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:41Z","lastTransitionTime":"2025-12-02T22:43:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:41 crc kubenswrapper[4696]: I1202 22:43:41.131329 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:41 crc kubenswrapper[4696]: I1202 22:43:41.131413 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:41 crc kubenswrapper[4696]: I1202 22:43:41.131436 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:41 crc kubenswrapper[4696]: I1202 22:43:41.131465 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:41 crc kubenswrapper[4696]: I1202 22:43:41.131486 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:41Z","lastTransitionTime":"2025-12-02T22:43:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:41 crc kubenswrapper[4696]: I1202 22:43:41.235010 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:41 crc kubenswrapper[4696]: I1202 22:43:41.235100 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:41 crc kubenswrapper[4696]: I1202 22:43:41.235124 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:41 crc kubenswrapper[4696]: I1202 22:43:41.235158 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:41 crc kubenswrapper[4696]: I1202 22:43:41.235180 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:41Z","lastTransitionTime":"2025-12-02T22:43:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:41 crc kubenswrapper[4696]: I1202 22:43:41.338862 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:41 crc kubenswrapper[4696]: I1202 22:43:41.338966 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:41 crc kubenswrapper[4696]: I1202 22:43:41.338997 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:41 crc kubenswrapper[4696]: I1202 22:43:41.339038 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:41 crc kubenswrapper[4696]: I1202 22:43:41.339064 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:41Z","lastTransitionTime":"2025-12-02T22:43:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:41 crc kubenswrapper[4696]: I1202 22:43:41.441697 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:41 crc kubenswrapper[4696]: I1202 22:43:41.441802 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:41 crc kubenswrapper[4696]: I1202 22:43:41.441838 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:41 crc kubenswrapper[4696]: I1202 22:43:41.441870 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:41 crc kubenswrapper[4696]: I1202 22:43:41.441887 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:41Z","lastTransitionTime":"2025-12-02T22:43:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:41 crc kubenswrapper[4696]: I1202 22:43:41.545222 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:41 crc kubenswrapper[4696]: I1202 22:43:41.545299 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:41 crc kubenswrapper[4696]: I1202 22:43:41.545317 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:41 crc kubenswrapper[4696]: I1202 22:43:41.545346 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:41 crc kubenswrapper[4696]: I1202 22:43:41.545367 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:41Z","lastTransitionTime":"2025-12-02T22:43:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:41 crc kubenswrapper[4696]: I1202 22:43:41.649110 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:41 crc kubenswrapper[4696]: I1202 22:43:41.649188 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:41 crc kubenswrapper[4696]: I1202 22:43:41.649208 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:41 crc kubenswrapper[4696]: I1202 22:43:41.649238 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:41 crc kubenswrapper[4696]: I1202 22:43:41.649261 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:41Z","lastTransitionTime":"2025-12-02T22:43:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:41 crc kubenswrapper[4696]: I1202 22:43:41.752645 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:41 crc kubenswrapper[4696]: I1202 22:43:41.752712 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:41 crc kubenswrapper[4696]: I1202 22:43:41.752729 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:41 crc kubenswrapper[4696]: I1202 22:43:41.752790 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:41 crc kubenswrapper[4696]: I1202 22:43:41.752809 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:41Z","lastTransitionTime":"2025-12-02T22:43:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:41 crc kubenswrapper[4696]: I1202 22:43:41.856378 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:41 crc kubenswrapper[4696]: I1202 22:43:41.856443 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:41 crc kubenswrapper[4696]: I1202 22:43:41.856461 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:41 crc kubenswrapper[4696]: I1202 22:43:41.856487 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:41 crc kubenswrapper[4696]: I1202 22:43:41.856508 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:41Z","lastTransitionTime":"2025-12-02T22:43:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:41 crc kubenswrapper[4696]: I1202 22:43:41.959868 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:41 crc kubenswrapper[4696]: I1202 22:43:41.959950 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:41 crc kubenswrapper[4696]: I1202 22:43:41.959979 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:41 crc kubenswrapper[4696]: I1202 22:43:41.960014 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:41 crc kubenswrapper[4696]: I1202 22:43:41.960039 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:41Z","lastTransitionTime":"2025-12-02T22:43:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:42 crc kubenswrapper[4696]: I1202 22:43:42.064107 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:42 crc kubenswrapper[4696]: I1202 22:43:42.064217 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:42 crc kubenswrapper[4696]: I1202 22:43:42.064247 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:42 crc kubenswrapper[4696]: I1202 22:43:42.064279 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:42 crc kubenswrapper[4696]: I1202 22:43:42.064304 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:42Z","lastTransitionTime":"2025-12-02T22:43:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:42 crc kubenswrapper[4696]: I1202 22:43:42.167588 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:42 crc kubenswrapper[4696]: I1202 22:43:42.167679 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:42 crc kubenswrapper[4696]: I1202 22:43:42.167706 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:42 crc kubenswrapper[4696]: I1202 22:43:42.167736 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:42 crc kubenswrapper[4696]: I1202 22:43:42.167791 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:42Z","lastTransitionTime":"2025-12-02T22:43:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:42 crc kubenswrapper[4696]: I1202 22:43:42.270923 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:42 crc kubenswrapper[4696]: I1202 22:43:42.270998 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:42 crc kubenswrapper[4696]: I1202 22:43:42.271027 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:42 crc kubenswrapper[4696]: I1202 22:43:42.271059 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:42 crc kubenswrapper[4696]: I1202 22:43:42.271079 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:42Z","lastTransitionTime":"2025-12-02T22:43:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:42 crc kubenswrapper[4696]: I1202 22:43:42.375083 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:42 crc kubenswrapper[4696]: I1202 22:43:42.375149 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:42 crc kubenswrapper[4696]: I1202 22:43:42.375180 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:42 crc kubenswrapper[4696]: I1202 22:43:42.375210 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:42 crc kubenswrapper[4696]: I1202 22:43:42.375235 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:42Z","lastTransitionTime":"2025-12-02T22:43:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:43:42 crc kubenswrapper[4696]: I1202 22:43:42.412809 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 22:43:42 crc kubenswrapper[4696]: E1202 22:43:42.413052 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-02 22:44:46.413019554 +0000 UTC m=+149.293699585 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:43:42 crc kubenswrapper[4696]: I1202 22:43:42.431290 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:43:42 crc kubenswrapper[4696]: I1202 22:43:42.431340 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:43:42 crc kubenswrapper[4696]: I1202 22:43:42.431379 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:43:42 crc kubenswrapper[4696]: I1202 22:43:42.431395 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q9bfc" Dec 02 22:43:42 crc kubenswrapper[4696]: E1202 22:43:42.431466 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 22:43:42 crc kubenswrapper[4696]: E1202 22:43:42.431647 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q9bfc" podUID="ad00195c-ef4c-4d9b-941c-d01ebc498593" Dec 02 22:43:42 crc kubenswrapper[4696]: E1202 22:43:42.431791 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 22:43:42 crc kubenswrapper[4696]: E1202 22:43:42.431904 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 22:43:42 crc kubenswrapper[4696]: I1202 22:43:42.478566 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:42 crc kubenswrapper[4696]: I1202 22:43:42.478635 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:42 crc kubenswrapper[4696]: I1202 22:43:42.478655 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:42 crc kubenswrapper[4696]: I1202 22:43:42.478686 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:42 crc kubenswrapper[4696]: I1202 22:43:42.478706 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:42Z","lastTransitionTime":"2025-12-02T22:43:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:42 crc kubenswrapper[4696]: I1202 22:43:42.514400 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:43:42 crc kubenswrapper[4696]: I1202 22:43:42.514468 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:43:42 crc kubenswrapper[4696]: I1202 22:43:42.514501 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:43:42 crc kubenswrapper[4696]: I1202 22:43:42.514543 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:43:42 crc kubenswrapper[4696]: E1202 22:43:42.514663 4696 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object 
"openshift-network-console"/"networking-console-plugin" not registered Dec 02 22:43:42 crc kubenswrapper[4696]: E1202 22:43:42.514810 4696 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 02 22:43:42 crc kubenswrapper[4696]: E1202 22:43:42.514892 4696 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 02 22:43:42 crc kubenswrapper[4696]: E1202 22:43:42.514939 4696 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 22:43:42 crc kubenswrapper[4696]: E1202 22:43:42.514825 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 22:44:46.514804539 +0000 UTC m=+149.395484550 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 02 22:43:42 crc kubenswrapper[4696]: E1202 22:43:42.514954 4696 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 22:43:42 crc kubenswrapper[4696]: E1202 22:43:42.514998 4696 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 02 22:43:42 crc kubenswrapper[4696]: E1202 22:43:42.515077 4696 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 22:43:42 crc kubenswrapper[4696]: E1202 22:43:42.515006 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 22:44:46.514964984 +0000 UTC m=+149.395645015 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 02 22:43:42 crc kubenswrapper[4696]: E1202 22:43:42.515102 4696 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 22:43:42 crc kubenswrapper[4696]: E1202 22:43:42.515163 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-02 22:44:46.515122108 +0000 UTC m=+149.395802119 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 22:43:42 crc kubenswrapper[4696]: E1202 22:43:42.515286 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-02 22:44:46.515254692 +0000 UTC m=+149.395934723 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 22:43:42 crc kubenswrapper[4696]: I1202 22:43:42.582533 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:42 crc kubenswrapper[4696]: I1202 22:43:42.582587 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:42 crc kubenswrapper[4696]: I1202 22:43:42.582608 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:42 crc kubenswrapper[4696]: I1202 22:43:42.582651 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:42 crc kubenswrapper[4696]: I1202 22:43:42.582672 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:42Z","lastTransitionTime":"2025-12-02T22:43:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:42 crc kubenswrapper[4696]: I1202 22:43:42.687588 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:42 crc kubenswrapper[4696]: I1202 22:43:42.687645 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:42 crc kubenswrapper[4696]: I1202 22:43:42.687659 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:42 crc kubenswrapper[4696]: I1202 22:43:42.687682 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:42 crc kubenswrapper[4696]: I1202 22:43:42.687695 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:42Z","lastTransitionTime":"2025-12-02T22:43:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:42 crc kubenswrapper[4696]: I1202 22:43:42.791460 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:42 crc kubenswrapper[4696]: I1202 22:43:42.791532 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:42 crc kubenswrapper[4696]: I1202 22:43:42.791555 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:42 crc kubenswrapper[4696]: I1202 22:43:42.791584 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:42 crc kubenswrapper[4696]: I1202 22:43:42.791609 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:42Z","lastTransitionTime":"2025-12-02T22:43:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:42 crc kubenswrapper[4696]: I1202 22:43:42.894517 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:42 crc kubenswrapper[4696]: I1202 22:43:42.894572 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:42 crc kubenswrapper[4696]: I1202 22:43:42.894588 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:42 crc kubenswrapper[4696]: I1202 22:43:42.894614 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:42 crc kubenswrapper[4696]: I1202 22:43:42.894632 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:42Z","lastTransitionTime":"2025-12-02T22:43:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:42 crc kubenswrapper[4696]: I1202 22:43:42.998133 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:42 crc kubenswrapper[4696]: I1202 22:43:42.998623 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:42 crc kubenswrapper[4696]: I1202 22:43:42.998644 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:42 crc kubenswrapper[4696]: I1202 22:43:42.998677 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:42 crc kubenswrapper[4696]: I1202 22:43:42.998701 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:42Z","lastTransitionTime":"2025-12-02T22:43:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:43 crc kubenswrapper[4696]: I1202 22:43:43.102286 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:43 crc kubenswrapper[4696]: I1202 22:43:43.102357 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:43 crc kubenswrapper[4696]: I1202 22:43:43.102375 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:43 crc kubenswrapper[4696]: I1202 22:43:43.102405 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:43 crc kubenswrapper[4696]: I1202 22:43:43.102424 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:43Z","lastTransitionTime":"2025-12-02T22:43:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:43 crc kubenswrapper[4696]: I1202 22:43:43.205601 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:43 crc kubenswrapper[4696]: I1202 22:43:43.205667 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:43 crc kubenswrapper[4696]: I1202 22:43:43.205684 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:43 crc kubenswrapper[4696]: I1202 22:43:43.205713 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:43 crc kubenswrapper[4696]: I1202 22:43:43.205731 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:43Z","lastTransitionTime":"2025-12-02T22:43:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:43 crc kubenswrapper[4696]: I1202 22:43:43.308914 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:43 crc kubenswrapper[4696]: I1202 22:43:43.308987 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:43 crc kubenswrapper[4696]: I1202 22:43:43.309004 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:43 crc kubenswrapper[4696]: I1202 22:43:43.309033 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:43 crc kubenswrapper[4696]: I1202 22:43:43.309050 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:43Z","lastTransitionTime":"2025-12-02T22:43:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:43 crc kubenswrapper[4696]: I1202 22:43:43.412546 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:43 crc kubenswrapper[4696]: I1202 22:43:43.412613 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:43 crc kubenswrapper[4696]: I1202 22:43:43.412629 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:43 crc kubenswrapper[4696]: I1202 22:43:43.412657 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:43 crc kubenswrapper[4696]: I1202 22:43:43.412675 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:43Z","lastTransitionTime":"2025-12-02T22:43:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:43 crc kubenswrapper[4696]: I1202 22:43:43.433129 4696 scope.go:117] "RemoveContainer" containerID="1282d2e6c1634035b4869e4f9a9689f9e5f599a52a82e02e77a81c6df3e13b0e" Dec 02 22:43:43 crc kubenswrapper[4696]: I1202 22:43:43.515547 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:43 crc kubenswrapper[4696]: I1202 22:43:43.515661 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:43 crc kubenswrapper[4696]: I1202 22:43:43.515678 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:43 crc kubenswrapper[4696]: I1202 22:43:43.515703 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:43 crc kubenswrapper[4696]: I1202 22:43:43.515718 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:43Z","lastTransitionTime":"2025-12-02T22:43:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:43 crc kubenswrapper[4696]: I1202 22:43:43.623619 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:43 crc kubenswrapper[4696]: I1202 22:43:43.624085 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:43 crc kubenswrapper[4696]: I1202 22:43:43.624252 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:43 crc kubenswrapper[4696]: I1202 22:43:43.624469 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:43 crc kubenswrapper[4696]: I1202 22:43:43.624651 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:43Z","lastTransitionTime":"2025-12-02T22:43:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:43 crc kubenswrapper[4696]: I1202 22:43:43.729181 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:43 crc kubenswrapper[4696]: I1202 22:43:43.729245 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:43 crc kubenswrapper[4696]: I1202 22:43:43.729254 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:43 crc kubenswrapper[4696]: I1202 22:43:43.729271 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:43 crc kubenswrapper[4696]: I1202 22:43:43.729285 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:43Z","lastTransitionTime":"2025-12-02T22:43:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:43 crc kubenswrapper[4696]: I1202 22:43:43.832563 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:43 crc kubenswrapper[4696]: I1202 22:43:43.832616 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:43 crc kubenswrapper[4696]: I1202 22:43:43.832629 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:43 crc kubenswrapper[4696]: I1202 22:43:43.832651 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:43 crc kubenswrapper[4696]: I1202 22:43:43.832665 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:43Z","lastTransitionTime":"2025-12-02T22:43:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:43 crc kubenswrapper[4696]: I1202 22:43:43.935866 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:43 crc kubenswrapper[4696]: I1202 22:43:43.935932 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:43 crc kubenswrapper[4696]: I1202 22:43:43.935950 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:43 crc kubenswrapper[4696]: I1202 22:43:43.935975 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:43 crc kubenswrapper[4696]: I1202 22:43:43.935994 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:43Z","lastTransitionTime":"2025-12-02T22:43:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:44 crc kubenswrapper[4696]: I1202 22:43:44.038877 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:44 crc kubenswrapper[4696]: I1202 22:43:44.038957 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:44 crc kubenswrapper[4696]: I1202 22:43:44.038980 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:44 crc kubenswrapper[4696]: I1202 22:43:44.039012 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:44 crc kubenswrapper[4696]: I1202 22:43:44.039034 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:44Z","lastTransitionTime":"2025-12-02T22:43:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:44 crc kubenswrapper[4696]: I1202 22:43:44.143099 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:44 crc kubenswrapper[4696]: I1202 22:43:44.143592 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:44 crc kubenswrapper[4696]: I1202 22:43:44.143805 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:44 crc kubenswrapper[4696]: I1202 22:43:44.143974 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:44 crc kubenswrapper[4696]: I1202 22:43:44.144125 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:44Z","lastTransitionTime":"2025-12-02T22:43:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:44 crc kubenswrapper[4696]: I1202 22:43:44.248331 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:44 crc kubenswrapper[4696]: I1202 22:43:44.248398 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:44 crc kubenswrapper[4696]: I1202 22:43:44.248415 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:44 crc kubenswrapper[4696]: I1202 22:43:44.248442 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:44 crc kubenswrapper[4696]: I1202 22:43:44.248460 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:44Z","lastTransitionTime":"2025-12-02T22:43:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:44 crc kubenswrapper[4696]: I1202 22:43:44.352564 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:44 crc kubenswrapper[4696]: I1202 22:43:44.352646 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:44 crc kubenswrapper[4696]: I1202 22:43:44.352671 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:44 crc kubenswrapper[4696]: I1202 22:43:44.352704 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:44 crc kubenswrapper[4696]: I1202 22:43:44.352725 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:44Z","lastTransitionTime":"2025-12-02T22:43:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:43:44 crc kubenswrapper[4696]: I1202 22:43:44.430905 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q9bfc" Dec 02 22:43:44 crc kubenswrapper[4696]: I1202 22:43:44.431111 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:43:44 crc kubenswrapper[4696]: I1202 22:43:44.431370 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:43:44 crc kubenswrapper[4696]: E1202 22:43:44.431364 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q9bfc" podUID="ad00195c-ef4c-4d9b-941c-d01ebc498593" Dec 02 22:43:44 crc kubenswrapper[4696]: I1202 22:43:44.431473 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:43:44 crc kubenswrapper[4696]: E1202 22:43:44.431564 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 22:43:44 crc kubenswrapper[4696]: E1202 22:43:44.431702 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 22:43:44 crc kubenswrapper[4696]: E1202 22:43:44.432008 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 22:43:44 crc kubenswrapper[4696]: I1202 22:43:44.456147 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:44 crc kubenswrapper[4696]: I1202 22:43:44.456440 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:44 crc kubenswrapper[4696]: I1202 22:43:44.456591 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:44 crc kubenswrapper[4696]: I1202 22:43:44.456782 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:44 crc kubenswrapper[4696]: I1202 22:43:44.456923 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:44Z","lastTransitionTime":"2025-12-02T22:43:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:44 crc kubenswrapper[4696]: I1202 22:43:44.560820 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:44 crc kubenswrapper[4696]: I1202 22:43:44.560917 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:44 crc kubenswrapper[4696]: I1202 22:43:44.560941 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:44 crc kubenswrapper[4696]: I1202 22:43:44.560980 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:44 crc kubenswrapper[4696]: I1202 22:43:44.561003 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:44Z","lastTransitionTime":"2025-12-02T22:43:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:44 crc kubenswrapper[4696]: I1202 22:43:44.665027 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:44 crc kubenswrapper[4696]: I1202 22:43:44.665083 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:44 crc kubenswrapper[4696]: I1202 22:43:44.665132 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:44 crc kubenswrapper[4696]: I1202 22:43:44.665154 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:44 crc kubenswrapper[4696]: I1202 22:43:44.665170 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:44Z","lastTransitionTime":"2025-12-02T22:43:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:44 crc kubenswrapper[4696]: I1202 22:43:44.768808 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:44 crc kubenswrapper[4696]: I1202 22:43:44.768876 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:44 crc kubenswrapper[4696]: I1202 22:43:44.768897 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:44 crc kubenswrapper[4696]: I1202 22:43:44.768929 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:44 crc kubenswrapper[4696]: I1202 22:43:44.768956 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:44Z","lastTransitionTime":"2025-12-02T22:43:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:44 crc kubenswrapper[4696]: I1202 22:43:44.873365 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:44 crc kubenswrapper[4696]: I1202 22:43:44.873429 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:44 crc kubenswrapper[4696]: I1202 22:43:44.873446 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:44 crc kubenswrapper[4696]: I1202 22:43:44.873472 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:44 crc kubenswrapper[4696]: I1202 22:43:44.873493 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:44Z","lastTransitionTime":"2025-12-02T22:43:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:44 crc kubenswrapper[4696]: I1202 22:43:44.976433 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:44 crc kubenswrapper[4696]: I1202 22:43:44.976486 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:44 crc kubenswrapper[4696]: I1202 22:43:44.976505 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:44 crc kubenswrapper[4696]: I1202 22:43:44.976532 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:44 crc kubenswrapper[4696]: I1202 22:43:44.976551 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:44Z","lastTransitionTime":"2025-12-02T22:43:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:45 crc kubenswrapper[4696]: I1202 22:43:45.079285 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:45 crc kubenswrapper[4696]: I1202 22:43:45.079343 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:45 crc kubenswrapper[4696]: I1202 22:43:45.079359 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:45 crc kubenswrapper[4696]: I1202 22:43:45.079382 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:45 crc kubenswrapper[4696]: I1202 22:43:45.079397 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:45Z","lastTransitionTime":"2025-12-02T22:43:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:45 crc kubenswrapper[4696]: I1202 22:43:45.182844 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:45 crc kubenswrapper[4696]: I1202 22:43:45.182932 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:45 crc kubenswrapper[4696]: I1202 22:43:45.182957 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:45 crc kubenswrapper[4696]: I1202 22:43:45.182990 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:45 crc kubenswrapper[4696]: I1202 22:43:45.183013 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:45Z","lastTransitionTime":"2025-12-02T22:43:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:45 crc kubenswrapper[4696]: I1202 22:43:45.286450 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:45 crc kubenswrapper[4696]: I1202 22:43:45.286521 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:45 crc kubenswrapper[4696]: I1202 22:43:45.286538 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:45 crc kubenswrapper[4696]: I1202 22:43:45.286568 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:45 crc kubenswrapper[4696]: I1202 22:43:45.286587 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:45Z","lastTransitionTime":"2025-12-02T22:43:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:45 crc kubenswrapper[4696]: I1202 22:43:45.390399 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:45 crc kubenswrapper[4696]: I1202 22:43:45.390463 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:45 crc kubenswrapper[4696]: I1202 22:43:45.390481 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:45 crc kubenswrapper[4696]: I1202 22:43:45.390510 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:45 crc kubenswrapper[4696]: I1202 22:43:45.390531 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:45Z","lastTransitionTime":"2025-12-02T22:43:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:45 crc kubenswrapper[4696]: I1202 22:43:45.493670 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:45 crc kubenswrapper[4696]: I1202 22:43:45.493728 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:45 crc kubenswrapper[4696]: I1202 22:43:45.493782 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:45 crc kubenswrapper[4696]: I1202 22:43:45.493819 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:45 crc kubenswrapper[4696]: I1202 22:43:45.493841 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:45Z","lastTransitionTime":"2025-12-02T22:43:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:45 crc kubenswrapper[4696]: I1202 22:43:45.651999 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:45 crc kubenswrapper[4696]: I1202 22:43:45.652045 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:45 crc kubenswrapper[4696]: I1202 22:43:45.652086 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:45 crc kubenswrapper[4696]: I1202 22:43:45.652114 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:45 crc kubenswrapper[4696]: I1202 22:43:45.652129 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:45Z","lastTransitionTime":"2025-12-02T22:43:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:45 crc kubenswrapper[4696]: I1202 22:43:45.754866 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:45 crc kubenswrapper[4696]: I1202 22:43:45.754910 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:45 crc kubenswrapper[4696]: I1202 22:43:45.754925 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:45 crc kubenswrapper[4696]: I1202 22:43:45.754947 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:45 crc kubenswrapper[4696]: I1202 22:43:45.754960 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:45Z","lastTransitionTime":"2025-12-02T22:43:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:45 crc kubenswrapper[4696]: I1202 22:43:45.857590 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:45 crc kubenswrapper[4696]: I1202 22:43:45.857630 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:45 crc kubenswrapper[4696]: I1202 22:43:45.857638 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:45 crc kubenswrapper[4696]: I1202 22:43:45.857653 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:45 crc kubenswrapper[4696]: I1202 22:43:45.857663 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:45Z","lastTransitionTime":"2025-12-02T22:43:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:45 crc kubenswrapper[4696]: I1202 22:43:45.966127 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:45 crc kubenswrapper[4696]: I1202 22:43:45.966182 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:45 crc kubenswrapper[4696]: I1202 22:43:45.966197 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:45 crc kubenswrapper[4696]: I1202 22:43:45.966221 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:45 crc kubenswrapper[4696]: I1202 22:43:45.966231 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:45Z","lastTransitionTime":"2025-12-02T22:43:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:46 crc kubenswrapper[4696]: I1202 22:43:46.069672 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:46 crc kubenswrapper[4696]: I1202 22:43:46.070265 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:46 crc kubenswrapper[4696]: I1202 22:43:46.070284 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:46 crc kubenswrapper[4696]: I1202 22:43:46.070312 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:46 crc kubenswrapper[4696]: I1202 22:43:46.070331 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:46Z","lastTransitionTime":"2025-12-02T22:43:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:46 crc kubenswrapper[4696]: I1202 22:43:46.164706 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qb2zq_c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b/ovnkube-controller/2.log" Dec 02 22:43:46 crc kubenswrapper[4696]: I1202 22:43:46.167984 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" event={"ID":"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b","Type":"ContainerStarted","Data":"c89495285a741bfa9e535eb22f1110da359462742bc321664ea98562028cf21d"} Dec 02 22:43:46 crc kubenswrapper[4696]: I1202 22:43:46.172627 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:46 crc kubenswrapper[4696]: I1202 22:43:46.172684 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:46 crc kubenswrapper[4696]: I1202 22:43:46.172702 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:46 crc kubenswrapper[4696]: I1202 22:43:46.172730 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:46 crc kubenswrapper[4696]: I1202 22:43:46.172799 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:46Z","lastTransitionTime":"2025-12-02T22:43:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:46 crc kubenswrapper[4696]: I1202 22:43:46.277361 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:46 crc kubenswrapper[4696]: I1202 22:43:46.277463 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:46 crc kubenswrapper[4696]: I1202 22:43:46.277496 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:46 crc kubenswrapper[4696]: I1202 22:43:46.277535 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:46 crc kubenswrapper[4696]: I1202 22:43:46.277559 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:46Z","lastTransitionTime":"2025-12-02T22:43:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:46 crc kubenswrapper[4696]: I1202 22:43:46.380685 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:46 crc kubenswrapper[4696]: I1202 22:43:46.380781 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:46 crc kubenswrapper[4696]: I1202 22:43:46.380811 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:46 crc kubenswrapper[4696]: I1202 22:43:46.380844 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:46 crc kubenswrapper[4696]: I1202 22:43:46.380863 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:46Z","lastTransitionTime":"2025-12-02T22:43:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:43:46 crc kubenswrapper[4696]: I1202 22:43:46.431606 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:43:46 crc kubenswrapper[4696]: I1202 22:43:46.431804 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:43:46 crc kubenswrapper[4696]: I1202 22:43:46.431813 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q9bfc" Dec 02 22:43:46 crc kubenswrapper[4696]: I1202 22:43:46.431679 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:43:46 crc kubenswrapper[4696]: E1202 22:43:46.431965 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 22:43:46 crc kubenswrapper[4696]: E1202 22:43:46.432152 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 22:43:46 crc kubenswrapper[4696]: E1202 22:43:46.432318 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 22:43:46 crc kubenswrapper[4696]: E1202 22:43:46.432597 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-q9bfc" podUID="ad00195c-ef4c-4d9b-941c-d01ebc498593" Dec 02 22:43:46 crc kubenswrapper[4696]: I1202 22:43:46.484669 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:46 crc kubenswrapper[4696]: I1202 22:43:46.484818 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:46 crc kubenswrapper[4696]: I1202 22:43:46.484848 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:46 crc kubenswrapper[4696]: I1202 22:43:46.484889 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:46 crc kubenswrapper[4696]: I1202 22:43:46.484913 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:46Z","lastTransitionTime":"2025-12-02T22:43:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:46 crc kubenswrapper[4696]: I1202 22:43:46.587759 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:46 crc kubenswrapper[4696]: I1202 22:43:46.587826 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:46 crc kubenswrapper[4696]: I1202 22:43:46.587844 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:46 crc kubenswrapper[4696]: I1202 22:43:46.587868 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:46 crc kubenswrapper[4696]: I1202 22:43:46.587888 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:46Z","lastTransitionTime":"2025-12-02T22:43:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:46 crc kubenswrapper[4696]: I1202 22:43:46.691043 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:46 crc kubenswrapper[4696]: I1202 22:43:46.691133 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:46 crc kubenswrapper[4696]: I1202 22:43:46.691166 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:46 crc kubenswrapper[4696]: I1202 22:43:46.691199 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:46 crc kubenswrapper[4696]: I1202 22:43:46.691225 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:46Z","lastTransitionTime":"2025-12-02T22:43:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:46 crc kubenswrapper[4696]: I1202 22:43:46.779832 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:46 crc kubenswrapper[4696]: I1202 22:43:46.779879 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:46 crc kubenswrapper[4696]: I1202 22:43:46.779889 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:46 crc kubenswrapper[4696]: I1202 22:43:46.779905 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:46 crc kubenswrapper[4696]: I1202 22:43:46.779915 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:46Z","lastTransitionTime":"2025-12-02T22:43:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:46 crc kubenswrapper[4696]: E1202 22:43:46.795943 4696 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:43:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:43:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:43:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:43:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b680025d-da08-4b46-a4a4-b21ac19e4f7b\\\",\\\"systemUUID\\\":\\\"be4c1bf2-c508-4b46-be45-7efaea566193\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:46Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:46 crc kubenswrapper[4696]: I1202 22:43:46.802144 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:46 crc kubenswrapper[4696]: I1202 22:43:46.802223 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:46 crc kubenswrapper[4696]: I1202 22:43:46.802242 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:46 crc kubenswrapper[4696]: I1202 22:43:46.802271 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:46 crc kubenswrapper[4696]: I1202 22:43:46.802291 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:46Z","lastTransitionTime":"2025-12-02T22:43:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:46 crc kubenswrapper[4696]: E1202 22:43:46.824079 4696 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:43:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:43:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:43:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:43:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b680025d-da08-4b46-a4a4-b21ac19e4f7b\\\",\\\"systemUUID\\\":\\\"be4c1bf2-c508-4b46-be45-7efaea566193\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:46Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:46 crc kubenswrapper[4696]: I1202 22:43:46.829491 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:46 crc kubenswrapper[4696]: I1202 22:43:46.829564 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:46 crc kubenswrapper[4696]: I1202 22:43:46.829610 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:46 crc kubenswrapper[4696]: I1202 22:43:46.829645 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:46 crc kubenswrapper[4696]: I1202 22:43:46.829672 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:46Z","lastTransitionTime":"2025-12-02T22:43:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:46 crc kubenswrapper[4696]: E1202 22:43:46.901469 4696 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:43:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:43:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:43:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:43:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b680025d-da08-4b46-a4a4-b21ac19e4f7b\\\",\\\"systemUUID\\\":\\\"be4c1bf2-c508-4b46-be45-7efaea566193\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:46Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:46 crc kubenswrapper[4696]: E1202 22:43:46.901798 4696 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 02 22:43:46 crc kubenswrapper[4696]: I1202 22:43:46.905018 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:46 crc kubenswrapper[4696]: I1202 22:43:46.905103 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:46 crc kubenswrapper[4696]: I1202 22:43:46.905120 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:46 crc kubenswrapper[4696]: I1202 22:43:46.905150 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:46 crc kubenswrapper[4696]: I1202 22:43:46.905168 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:46Z","lastTransitionTime":"2025-12-02T22:43:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:47 crc kubenswrapper[4696]: I1202 22:43:47.008419 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:47 crc kubenswrapper[4696]: I1202 22:43:47.008470 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:47 crc kubenswrapper[4696]: I1202 22:43:47.008489 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:47 crc kubenswrapper[4696]: I1202 22:43:47.008515 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:47 crc kubenswrapper[4696]: I1202 22:43:47.008534 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:47Z","lastTransitionTime":"2025-12-02T22:43:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:47 crc kubenswrapper[4696]: I1202 22:43:47.113080 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:47 crc kubenswrapper[4696]: I1202 22:43:47.113149 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:47 crc kubenswrapper[4696]: I1202 22:43:47.113172 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:47 crc kubenswrapper[4696]: I1202 22:43:47.113196 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:47 crc kubenswrapper[4696]: I1202 22:43:47.113210 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:47Z","lastTransitionTime":"2025-12-02T22:43:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:47 crc kubenswrapper[4696]: I1202 22:43:47.175762 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qb2zq_c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b/ovnkube-controller/3.log" Dec 02 22:43:47 crc kubenswrapper[4696]: I1202 22:43:47.176514 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qb2zq_c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b/ovnkube-controller/2.log" Dec 02 22:43:47 crc kubenswrapper[4696]: I1202 22:43:47.179867 4696 generic.go:334] "Generic (PLEG): container finished" podID="c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b" containerID="c89495285a741bfa9e535eb22f1110da359462742bc321664ea98562028cf21d" exitCode=1 Dec 02 22:43:47 crc kubenswrapper[4696]: I1202 22:43:47.179950 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" event={"ID":"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b","Type":"ContainerDied","Data":"c89495285a741bfa9e535eb22f1110da359462742bc321664ea98562028cf21d"} Dec 02 22:43:47 crc kubenswrapper[4696]: I1202 22:43:47.180017 4696 scope.go:117] "RemoveContainer" containerID="1282d2e6c1634035b4869e4f9a9689f9e5f599a52a82e02e77a81c6df3e13b0e" Dec 02 22:43:47 crc kubenswrapper[4696]: I1202 22:43:47.181579 4696 scope.go:117] "RemoveContainer" containerID="c89495285a741bfa9e535eb22f1110da359462742bc321664ea98562028cf21d" Dec 02 22:43:47 crc kubenswrapper[4696]: E1202 22:43:47.182177 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-qb2zq_openshift-ovn-kubernetes(c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b)\"" pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" podUID="c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b" Dec 02 22:43:47 crc kubenswrapper[4696]: I1202 22:43:47.212121 4696 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05940584-6baa-49e1-8959-610d50b6eeb5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b361514e74dd950d7c3e1da82304b7c75603829ce7bf03c1fbccab1607d65800\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67bf899df90f18f36310994af135552aa9758f0460117d62d00fdc1de067ce79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1
220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d15263c6c1f8208a998cbacdba376a2eb3b2e16134e8fc40f7d56d36daae5ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2fa27b4eeb8b50da5d49c905eb98070362e998c3e3e4aceac7cc09aaef5b5ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controlle
r\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:47Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:47 crc kubenswrapper[4696]: I1202 22:43:47.216136 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:47 crc kubenswrapper[4696]: I1202 22:43:47.216202 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:47 crc kubenswrapper[4696]: I1202 22:43:47.216218 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:47 crc kubenswrapper[4696]: I1202 22:43:47.216241 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:47 crc kubenswrapper[4696]: I1202 22:43:47.216255 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:47Z","lastTransitionTime":"2025-12-02T22:43:47Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:43:47 crc kubenswrapper[4696]: I1202 22:43:47.237423 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sbjst" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a0d758e-6da6-4382-99f1-dd295b63eb98\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bfe4ce91e801752a3eae0d5cfdb5d0e7a4997952987a3a32c10c0292db170f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:47Z\\\"}},\\\"volum
eMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://495f007b6af2d409e3ed6cf70dfa90aec5a38c34361c9da9c4fd71d7732cdd76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://495f007b6af2d409e3ed6cf70dfa90aec5a38c34361c9da9c4fd71d7732cdd76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5b3b6699e2f0b9a8d0f32c1f97c9b3e16d9f5479ad2f355633c25897dbe1bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a11
54afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5b3b6699e2f0b9a8d0f32c1f97c9b3e16d9f5479ad2f355633c25897dbe1bb1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d4a10437554cfb08f90a48d457ef157309ed0ab379c5b26deb62d3e8b73291b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d4a10437554cfb08f90a48d457ef157309ed0ab379c5b26deb62d3e8b73291b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"
/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76b52b0824f65df456ead0ca134ada0a6c3120382a3cce18f0a5dfdcc98fa9ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76b52b0824f65df456ead0ca134ada0a6c3120382a3cce18f0a5dfdcc98fa9ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d38fb242bd536b1359095c21c4df58b5a4c64058cfe986b38a581ae0c40a29d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810
0674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d38fb242bd536b1359095c21c4df58b5a4c64058cfe986b38a581ae0c40a29d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f17767f87136d0916230872e59379331bb9e7f88f3648115905c6c0172f257cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f17767f87136d0916230872e59379331bb9e7f88f3648115905c6c0172f257cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sbjst\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:47Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:47 crc kubenswrapper[4696]: I1202 22:43:47.255562 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-chq65" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53353260-c7c9-435c-91eb-3d5a1b441c4a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2d1f154b9c9139bf991adf86bd99e7ff14510c24ae93b87c1c184437a087f17\\\",\\\"image\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7tfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa016430a0a36d0628e0e5b0c4baf2fa724c888116e5d70517c8ffc4be1c37a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7tfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-chq65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:47Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:47 crc kubenswrapper[4696]: I1202 22:43:47.272463 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rgk7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07db09ff-2489-4357-af38-aca9655ac1d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9928cbc6ed1d9bb5d0280a8d3264d4490c601fc7d09d6f293f1403615cc4cde5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdd8x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rgk7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:47Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:47 crc kubenswrapper[4696]: I1202 22:43:47.287521 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"649a9c67-6124-4d3d-b4c4-d095f1181c27\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaa2f02ac937663ab0258468d91f90599f4d6c2d1f68e36a71c6f9424289562a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73634f725abb1c9e590f4fc31e5ea79d879b17daad24832385ceb37d9aa93224\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73634f725abb1c9e590f4fc31e5ea79d879b17daad24832385ceb37d9aa93224\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:47Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:47 crc kubenswrapper[4696]: I1202 22:43:47.303042 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce2fa3867298545942c4a2d2e68899c61a4f99a64b0974e75ef86851074ddfce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4db552c4bc9b395f977cb4c1b1b7fcbb1c7e9f326d81a7903d88eab180bcfb3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:47Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:47 crc kubenswrapper[4696]: I1202 22:43:47.318151 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27fb05354e4e9a6ad09946fad7dbabf07456fe56a9a66578fda0eb71dde2d199\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T22:43:47Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:47 crc kubenswrapper[4696]: I1202 22:43:47.319501 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:47 crc kubenswrapper[4696]: I1202 22:43:47.319559 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:47 crc kubenswrapper[4696]: I1202 22:43:47.319577 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:47 crc kubenswrapper[4696]: I1202 22:43:47.319605 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:47 crc kubenswrapper[4696]: I1202 22:43:47.319623 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:47Z","lastTransitionTime":"2025-12-02T22:43:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:47 crc kubenswrapper[4696]: I1202 22:43:47.338685 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wthxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86a37d2a-37c5-4fbd-b10b-f5e4706772f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32ae69a2c1cd350e1b62b8453e9d286e105623f4b9ea8ff27ce8c9d37b055e4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5ddd341a3a0b7b1d959ce57bb4d38153a4ca3ae5ec40ded309e0697db2be28d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T22:43:26Z\\\",\\\"message\\\":\\\"2025-12-02T22:42:41+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_c372d58b-05eb-4944-ba5c-2ee7d55c7d09\\\\n2025-12-02T22:42:41+00:00 
[cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_c372d58b-05eb-4944-ba5c-2ee7d55c7d09 to /host/opt/cni/bin/\\\\n2025-12-02T22:42:41Z [verbose] multus-daemon started\\\\n2025-12-02T22:42:41Z [verbose] Readiness Indicator file check\\\\n2025-12-02T22:43:26Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:40Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:43:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xqg5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wthxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:47Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:47 crc kubenswrapper[4696]: I1202 22:43:47.356482 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ttxcl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca48946e-a7e0-4729-8b02-b223a96990c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55cb8b25a5af0f7f16b7a91ab00cbe24ed8a8b81477e992de74699237caf7dab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkjwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e10c7f45d8003857b8a736f70eb3125f0199
6bacbcace4ad08bc703fbdeb62f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkjwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ttxcl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:47Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:47 crc kubenswrapper[4696]: I1202 22:43:47.377069 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1db9cf23af7fc9ff1373908e11df0bea7d6e03cd3ed18947cc583fc09a79a05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:47Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:47 crc kubenswrapper[4696]: I1202 22:43:47.395079 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"158d80a5-8d90-46bb-83e7-ab3263f6d28f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3112751f86a62634d02f281d467d3eeb27878e27c245871210cca37eaeb6e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\
\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce744ea765d5bbcb0a9926f439a0256774c16863344f23ddfb0074b4ccbdb4ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1108d710330b056e2228205b2012df070ffbd660701ddd577eafbf0e99c3bf9f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d92ce9d53d9367503c76c5fbb45f340a5ddd93f3c8e1738ac434fe7b0210d1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-c
rc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6268352b995d99d383ab500b5ccd80b0612cf33716a752aeb1d92cb775a1971e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"le observer\\\\nW1202 22:42:37.965183 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 22:42:37.965319 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 22:42:37.967400 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2369937012/tls.crt::/tmp/serving-cert-2369937012/tls.key\\\\\\\"\\\\nI1202 22:42:38.329833 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 22:42:38.331704 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 22:42:38.331724 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 22:42:38.331766 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 22:42:38.331782 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 22:42:38.336692 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 22:42:38.336734 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:42:38.336761 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:42:38.336769 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 22:42:38.336775 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' 
detected.\\\\nW1202 22:42:38.336779 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 22:42:38.336783 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 22:42:38.337216 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 22:42:38.338194 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://666f81c81a7cad67c01b2aebf2d05f162c33feea27f64c12f92761d6d2a6056e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055bef75732eec4908f0baf8a52cd3921fab03cf88e36d2e0d442344fb1af0c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/open
shift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://055bef75732eec4908f0baf8a52cd3921fab03cf88e36d2e0d442344fb1af0c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:47Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:47 crc kubenswrapper[4696]: I1202 22:43:47.414587 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"102b6bc9-3f5d-462f-9814-67699eea05ab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6abcdac2e562f88dfe0b9c02e2a2efb68c3b4d5438abfa08606b27dde0da76b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95c2f64bb4118ce11d7f91dd949ab6e377708029cb0b3c93bd650ac70ea2b650\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e41b58b3df7b93de109c3904416a2943c8dca8cc7e9edc628333d4da461412d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7321d8133c89c6a58d6e42e8df0ba1e4bbdf9cc06fd01054a25afd64934d855a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://7321d8133c89c6a58d6e42e8df0ba1e4bbdf9cc06fd01054a25afd64934d855a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:47Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:47 crc kubenswrapper[4696]: I1202 22:43:47.422052 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:47 crc kubenswrapper[4696]: I1202 22:43:47.422093 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:47 crc kubenswrapper[4696]: I1202 22:43:47.422107 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:47 crc kubenswrapper[4696]: I1202 22:43:47.422126 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:47 crc kubenswrapper[4696]: I1202 22:43:47.422139 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:47Z","lastTransitionTime":"2025-12-02T22:43:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:47 crc kubenswrapper[4696]: I1202 22:43:47.430967 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:47Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:47 crc kubenswrapper[4696]: I1202 22:43:47.443677 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f57qk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf8ea236-9be9-4eb2-904a-103c4c279f28\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3106b29a0eb0a17827fd331aefd6c00b0c58f4ec0d0ca778bd3f4842730d5f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vxprw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f57qk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:47Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:47 crc kubenswrapper[4696]: I1202 22:43:47.467342 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3172221-5c4c-4409-b50f-5cd21d8b5030\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f25e7ebcca1d904cd0940a660fd97c6b4a3540fd63700abd0ad973e9f9c8f764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08fc65f2c8162f27bc40cd12b5801ca21c778313271f2cb5dd251ca4312842d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba390cca9997cf469ae88a4895532df556f0b947ce7b34ef7e1992ee236a2e16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd
/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0ccd6eaff0fa7ac496aadc31afffd6e7fee2776e9cfd1652af85fcec268d895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b2deed2650220f1d607cbb667b26879d3a8373146244feb486c882032f344b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://4a565cb3800c1a50f784c327719158de2f6f29d20a33670478b97c1c360e0f6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a565cb3800c1a50f784c327719158de2f6f29d20a33670478b97c1c360e0f6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29a2b9cc76fc45b6874d85b2ab92165a02804807837501ac60190c232bab928d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29a2b9cc76fc45b6874d85b2ab92165a02804807837501ac60190c232bab928d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fb3311ccf2a924605464d657193627798dad26a232fab0a2091fb2c387f3bd0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb3311ccf2a924605464d657193627798dad26a232fab0a2091fb2c387f3bd0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:47Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:47 crc kubenswrapper[4696]: I1202 22:43:47.482597 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:47Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:47 crc kubenswrapper[4696]: I1202 22:43:47.515104 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36feb7e9f6a6bad97e87fe54ff9b661be62f6f4962352d55bcc905b80320a48d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85fbce2a7ea0550f53c1216216022e60a4e2eae0bb1d5c88254ace025fbd4503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64b0c2025c377ab19b3853017c94cb501f472f2485093afc60749f7739ba25d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22ce112f515d4ef59dca4c21bdc8623e88a8a0df222fb2ed0990a2f68e3f75f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52f3d5529aafac7c27c963a50b0318e99d9a9ca35e142d4c273431105c495e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32717a82edb718f5e690fcecd578f48279d134a5dc12f00824fa25f7ec727156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c89495285a741bfa9e535eb22f1110da359462742bc321664ea98562028cf21d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1282d2e6c1634035b4869e4f9a9689f9e5f599a52a82e02e77a81c6df3e13b0e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T22:43:13Z\\\",\\\"message\\\":\\\"t handler 2 for removal\\\\nI1202 22:43:13.297137 6337 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1202 22:43:13.297142 6337 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1202 22:43:13.297185 6337 handler.go:208] Removed *v1.Node event handler 2\\\\nI1202 22:43:13.297274 6337 handler.go:208] Removed *v1.Node event 
handler 7\\\\nI1202 22:43:13.297609 6337 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1202 22:43:13.297650 6337 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1202 22:43:13.297716 6337 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1202 22:43:13.297725 6337 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1202 22:43:13.297731 6337 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1202 22:43:13.297772 6337 factory.go:656] Stopping watch factory\\\\nI1202 22:43:13.297795 6337 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1202 22:43:13.297796 6337 ovnkube.go:599] Stopped ovnkube\\\\nI1202 22:43:13.297811 6337 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1202 22:43:13.297829 6337 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1202 22:43:13.297821 6337 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nF1202 22:43:13.297933 6337 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:43:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c89495285a741bfa9e535eb22f1110da359462742bc321664ea98562028cf21d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T22:43:46Z\\\",\\\"message\\\":\\\"workPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1202 22:43:46.218920 6764 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 22:43:46.219289 6764 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 22:43:46.219369 6764 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1202 22:43:46.219579 6764 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 22:43:46.219908 6764 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1202 22:43:46.220088 6764 handler.go:208] Removed *v1.Node event handler 2\\\\nI1202 22:43:46.220503 6764 factory.go:656] Stopping watch factory\\\\nI1202 22:43:46.246291 6764 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI1202 22:43:46.246326 6764 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI1202 22:43:46.246442 6764 ovnkube.go:599] Stopped ovnkube\\\\nI1202 22:43:46.246506 6764 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1202 22:43:46.246661 6764 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:43:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\"
:\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23b80358b654eec7dc1057ef35a6c09f88ccd7aca7a05336a3f29e7036bd55f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run
/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://492cfa3fb8646c09da06d92ec8a303562da558d4d944eaf68ca95de6f5b94da8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://492cfa3fb8646c09da06d92ec8a303562da558d4d944eaf68ca95de6f5b94da8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qb2zq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:47Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:47 crc kubenswrapper[4696]: I1202 22:43:47.524490 4696 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:47 crc kubenswrapper[4696]: I1202 22:43:47.524555 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:47 crc kubenswrapper[4696]: I1202 22:43:47.524568 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:47 crc kubenswrapper[4696]: I1202 22:43:47.524590 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:47 crc kubenswrapper[4696]: I1202 22:43:47.524604 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:47Z","lastTransitionTime":"2025-12-02T22:43:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:47 crc kubenswrapper[4696]: I1202 22:43:47.534138 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q9bfc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad00195c-ef4c-4d9b-941c-d01ebc498593\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zz6sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zz6sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:54Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q9bfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:47Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:47 crc 
kubenswrapper[4696]: I1202 22:43:47.551557 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:47Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:47 crc kubenswrapper[4696]: I1202 22:43:47.574098 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:47Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:47 crc kubenswrapper[4696]: I1202 22:43:47.587797 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:47Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:47 crc kubenswrapper[4696]: I1202 22:43:47.612364 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36feb7e9f6a6bad97e87fe54ff9b661be62f6f4962352d55bcc905b80320a48d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85fbce2a7ea0550f53c1216216022e60a4e2eae0bb1d5c88254ace025fbd4503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64b0c2025c377ab19b3853017c94cb501f472f2485093afc60749f7739ba25d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22ce112f515d4ef59dca4c21bdc8623e88a8a0df222fb2ed0990a2f68e3f75f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52f3d5529aafac7c27c963a50b0318e99d9a9ca35e142d4c273431105c495e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32717a82edb718f5e690fcecd578f48279d134a5dc12f00824fa25f7ec727156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c89495285a741bfa9e535eb22f1110da359462742bc321664ea98562028cf21d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1282d2e6c1634035b4869e4f9a9689f9e5f599a52a82e02e77a81c6df3e13b0e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T22:43:13Z\\\",\\\"message\\\":\\\"t handler 2 for removal\\\\nI1202 22:43:13.297137 6337 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1202 22:43:13.297142 6337 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1202 22:43:13.297185 6337 handler.go:208] Removed *v1.Node event handler 2\\\\nI1202 22:43:13.297274 6337 handler.go:208] Removed *v1.Node event 
handler 7\\\\nI1202 22:43:13.297609 6337 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1202 22:43:13.297650 6337 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1202 22:43:13.297716 6337 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1202 22:43:13.297725 6337 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1202 22:43:13.297731 6337 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1202 22:43:13.297772 6337 factory.go:656] Stopping watch factory\\\\nI1202 22:43:13.297795 6337 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1202 22:43:13.297796 6337 ovnkube.go:599] Stopped ovnkube\\\\nI1202 22:43:13.297811 6337 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1202 22:43:13.297829 6337 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1202 22:43:13.297821 6337 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nF1202 22:43:13.297933 6337 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:43:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c89495285a741bfa9e535eb22f1110da359462742bc321664ea98562028cf21d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T22:43:46Z\\\",\\\"message\\\":\\\"workPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1202 22:43:46.218920 6764 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 22:43:46.219289 6764 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 22:43:46.219369 6764 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1202 22:43:46.219579 6764 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 22:43:46.219908 6764 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1202 22:43:46.220088 6764 handler.go:208] Removed *v1.Node event handler 2\\\\nI1202 22:43:46.220503 6764 factory.go:656] Stopping watch factory\\\\nI1202 22:43:46.246291 6764 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI1202 22:43:46.246326 6764 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI1202 22:43:46.246442 6764 ovnkube.go:599] Stopped ovnkube\\\\nI1202 22:43:46.246506 6764 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1202 22:43:46.246661 6764 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:43:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\"
:\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23b80358b654eec7dc1057ef35a6c09f88ccd7aca7a05336a3f29e7036bd55f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run
/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://492cfa3fb8646c09da06d92ec8a303562da558d4d944eaf68ca95de6f5b94da8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://492cfa3fb8646c09da06d92ec8a303562da558d4d944eaf68ca95de6f5b94da8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qb2zq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:47Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:47 crc kubenswrapper[4696]: I1202 22:43:47.627389 4696 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:47 crc kubenswrapper[4696]: I1202 22:43:47.627671 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:47 crc kubenswrapper[4696]: I1202 22:43:47.627785 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:47 crc kubenswrapper[4696]: I1202 22:43:47.627814 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:47 crc kubenswrapper[4696]: I1202 22:43:47.627864 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:47Z","lastTransitionTime":"2025-12-02T22:43:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:47 crc kubenswrapper[4696]: I1202 22:43:47.629824 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q9bfc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad00195c-ef4c-4d9b-941c-d01ebc498593\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zz6sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zz6sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:54Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q9bfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:47Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:47 crc 
kubenswrapper[4696]: I1202 22:43:47.646101 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"649a9c67-6124-4d3d-b4c4-d095f1181c27\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaa2f02ac937663ab0258468d91f90599f4d6c2d1f68e36a71c6f9424289562a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://73634f725abb1c9e590f4fc31e5ea79d879b17daad24832385ceb37d9aa93224\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73634f725abb1c9e590f4fc31e5ea79d879b17daad24832385ceb37d9aa93224\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:47Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:47 crc kubenswrapper[4696]: I1202 22:43:47.663213 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"05940584-6baa-49e1-8959-610d50b6eeb5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b361514e74dd950d7c3e1da82304b7c75603829ce7bf03c1fbccab1607d65800\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67bf899df90f18f36310994af135552aa9758f0460117d62d00fdc1de067ce79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d15263c6c1f8208a998cbacdba376a2eb3b2e16134e8fc40f7d56d36daae5ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2fa27b4eeb8b50da5d49c905eb98070362e998c3e3e4aceac7cc09aaef5b5ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:47Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:47 crc kubenswrapper[4696]: I1202 22:43:47.686811 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sbjst" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a0d758e-6da6-4382-99f1-dd295b63eb98\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bfe4ce91e801752a3eae0d5cfdb5d0e7a4997952987a3a32c10c0292db170f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://495f007b6af2d409e3ed6cf70dfa90aec5a38c34361c9da9c4fd71d7732cdd76\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://495f007b6af2d409e3ed6cf70dfa90aec5a38c34361c9da9c4fd71d7732cdd76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5b3b6699e2f0b9a8d0f32c1f97c9b3e16d9f5479ad2f355633c25897dbe1bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5b3b6699e2f0b9a8d0f32c1f97c9b3e16d9f5479ad2f355633c25897dbe1bb1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:42Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d4a10437554cfb08f90a48d457ef157309ed0ab379c5b26deb62d3e8b73291b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d4a10437554cfb08f90a48d457ef157309ed0ab379c5b26deb62d3e8b73291b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76b52
b0824f65df456ead0ca134ada0a6c3120382a3cce18f0a5dfdcc98fa9ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76b52b0824f65df456ead0ca134ada0a6c3120382a3cce18f0a5dfdcc98fa9ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d38fb242bd536b1359095c21c4df58b5a4c64058cfe986b38a581ae0c40a29d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d38fb242bd536b1359095c21c4df58b5a4c64058cfe986b38a581ae0c40a29d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:45Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f17767f87136d0916230872e59379331bb9e7f88f3648115905c6c0172f257cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f17767f87136d0916230872e59379331bb9e7f88f3648115905c6c0172f257cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sbjst\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:47Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:47 crc kubenswrapper[4696]: I1202 22:43:47.702497 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-chq65" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53353260-c7c9-435c-91eb-3d5a1b441c4a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2d1f154b9c9139bf991adf86bd99e7ff14510c24ae93b87c1c184437a087f17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7tfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa016430a0a36d0628e0e5b0c4baf2fa724c888116e5d70517c8ffc4be1c37a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7tfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-chq65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:47Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:47 crc kubenswrapper[4696]: 
I1202 22:43:47.720670 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rgk7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07db09ff-2489-4357-af38-aca9655ac1d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9928cbc6ed1d9bb5d0280a8d3264d4490c601fc7d09d6f293f1403615cc4cde5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdd8x\\\",\\\"readOnly
\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rgk7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:47Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:47 crc kubenswrapper[4696]: I1202 22:43:47.730412 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:47 crc kubenswrapper[4696]: I1202 22:43:47.730457 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:47 crc kubenswrapper[4696]: I1202 22:43:47.730476 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:47 crc kubenswrapper[4696]: I1202 22:43:47.730499 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:47 crc kubenswrapper[4696]: I1202 22:43:47.730518 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:47Z","lastTransitionTime":"2025-12-02T22:43:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:47 crc kubenswrapper[4696]: I1202 22:43:47.739127 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1db9cf23af7fc9ff1373908e11df0bea7d6e03cd3ed18947cc583fc09a79a05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:47Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:47 crc kubenswrapper[4696]: I1202 22:43:47.758583 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce2fa3867298545942c4a2d2e68899c61a4f99a64b0974e75ef86851074ddfce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-id
entity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4db552c4bc9b395f977cb4c1b1b7fcbb1c7e9f326d81a7903d88eab180bcfb3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:47Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:47 crc kubenswrapper[4696]: I1202 22:43:47.775028 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27fb05354e4e9a6ad09946fad7dbabf07456fe56a9a66578fda0eb71dde2d199\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T22:43:47Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:47 crc kubenswrapper[4696]: I1202 22:43:47.794976 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wthxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86a37d2a-37c5-4fbd-b10b-f5e4706772f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32ae69a2c1cd350e1b62b8453e9d286e105623f4b9ea8ff27ce8c9d37b055e4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5ddd341a3a0b7b1d959ce57bb4d38153a4ca3ae5ec40ded309e0697db2be28d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T22:43:26Z\\\",\\\"message\\\":\\\"2025-12-02T22:42:41+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_c372d58b-05eb-4944-ba5c-2ee7d55c7d09\\\\n2025-12-02T22:42:41+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_c372d58b-05eb-4944-ba5c-2ee7d55c7d09 to /host/opt/cni/bin/\\\\n2025-12-02T22:42:41Z [verbose] multus-daemon started\\\\n2025-12-02T22:42:41Z [verbose] Readiness Indicator file check\\\\n2025-12-02T22:43:26Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:40Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:43:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xqg5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wthxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:47Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:47 crc kubenswrapper[4696]: I1202 22:43:47.811123 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ttxcl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca48946e-a7e0-4729-8b02-b223a96990c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55cb8b25a5af0f7f16b7a91ab00cbe24ed8a8b81477e992de74699237caf7dab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkjwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e10c7f45d8003857b8a736f70eb3125f0199
6bacbcace4ad08bc703fbdeb62f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkjwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ttxcl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:47Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:47 crc kubenswrapper[4696]: I1202 22:43:47.833274 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:47 crc kubenswrapper[4696]: I1202 22:43:47.833313 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:47 crc kubenswrapper[4696]: I1202 22:43:47.833325 4696 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:47 crc kubenswrapper[4696]: I1202 22:43:47.833341 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:47 crc kubenswrapper[4696]: I1202 22:43:47.833351 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:47Z","lastTransitionTime":"2025-12-02T22:43:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:43:47 crc kubenswrapper[4696]: I1202 22:43:47.845188 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3172221-5c4c-4409-b50f-5cd21d8b5030\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f25e7ebcca1d904cd0940a660fd97c6b4a3540fd63700abd0ad973e9f9c8f764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e
33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08fc65f2c8162f27bc40cd12b5801ca21c778313271f2cb5dd251ca4312842d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba390cca9997cf469ae88a4895532df556f0b947ce7b34ef7e1992ee236a2e16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866
be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0ccd6eaff0fa7ac496aadc31afffd6e7fee2776e9cfd1652af85fcec268d895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b2deed2650220f1d607cbb667b26879d3a8373146244feb486c882032f344b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/stati
c-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a565cb3800c1a50f784c327719158de2f6f29d20a33670478b97c1c360e0f6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a565cb3800c1a50f784c327719158de2f6f29d20a33670478b97c1c360e0f6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29a2b9cc76fc45b6874d85b2ab92165a02804807837501ac60190c232bab928d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29a2b9cc76fc45b6874d85b2ab92165a02804807837501ac60190c232bab928d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fb3311ccf2a924605464d657193627798dad26a232fa
b0a2091fb2c387f3bd0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb3311ccf2a924605464d657193627798dad26a232fab0a2091fb2c387f3bd0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:47Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:47 crc kubenswrapper[4696]: I1202 22:43:47.870249 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"158d80a5-8d90-46bb-83e7-ab3263f6d28f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3112751f86a62634d02f281d467d3eeb27878e27c245871210cca37eaeb6e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce744ea765d5bbcb0a9926f439a0256774c16863344f23ddfb0074b4ccbdb4ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1108d710330b056e2228205b2012df070ffbd660701ddd577eafbf0e99c3bf9f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d92ce9d53d9367503c76c5fbb45f340a5ddd93f3c8e1738ac434fe7b0210d1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6268352b995d99d383ab500b5ccd80b0612cf33716a752aeb1d92cb775a1971e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T22:42:38Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW1202 22:42:37.965183 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 22:42:37.965319 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 22:42:37.967400 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2369937012/tls.crt::/tmp/serving-cert-2369937012/tls.key\\\\\\\"\\\\nI1202 22:42:38.329833 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 22:42:38.331704 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 22:42:38.331724 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 22:42:38.331766 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 22:42:38.331782 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 22:42:38.336692 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 22:42:38.336734 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:42:38.336761 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:42:38.336769 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 22:42:38.336775 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 22:42:38.336779 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 22:42:38.336783 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 22:42:38.337216 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1202 22:42:38.338194 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://666f81c81a7cad67c01b2aebf2d05f162c33feea27f64c12f92761d6d2a6056e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055bef75732eec4908f0baf8a52cd3921fab03cf88e36d2e0d442344fb1af0c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://055bef75732eec4908f0baf8a52cd3921
fab03cf88e36d2e0d442344fb1af0c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:47Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:47 crc kubenswrapper[4696]: I1202 22:43:47.888845 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"102b6bc9-3f5d-462f-9814-67699eea05ab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6abcdac2e562f88dfe0b9c02e2a2efb68c3b4d5438abfa08606b27dde0da76b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95c2f64bb4118ce11d7f91dd949ab6e377708029cb0b3c93bd650ac70ea2b650\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e41b58b3df7b93de109c3904416a2943c8dca8cc7e9edc628333d4da461412d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7321d8133c89c6a58d6e42e8df0ba1e4bbdf9cc06fd01054a25afd64934d855a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://7321d8133c89c6a58d6e42e8df0ba1e4bbdf9cc06fd01054a25afd64934d855a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:47Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:47 crc kubenswrapper[4696]: I1202 22:43:47.906443 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:47Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:47 crc kubenswrapper[4696]: I1202 22:43:47.923014 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f57qk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf8ea236-9be9-4eb2-904a-103c4c279f28\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3106b29a0eb0a17827fd331aefd6c00b0c58f4ec0d0ca778bd3f4842730d5f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vxprw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f57qk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:47Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:47 crc kubenswrapper[4696]: I1202 22:43:47.936794 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:47 crc kubenswrapper[4696]: I1202 22:43:47.936856 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:47 crc kubenswrapper[4696]: I1202 22:43:47.936875 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:47 crc kubenswrapper[4696]: I1202 22:43:47.936905 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:47 crc kubenswrapper[4696]: I1202 22:43:47.936926 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:47Z","lastTransitionTime":"2025-12-02T22:43:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:48 crc kubenswrapper[4696]: I1202 22:43:48.039862 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:48 crc kubenswrapper[4696]: I1202 22:43:48.039918 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:48 crc kubenswrapper[4696]: I1202 22:43:48.039935 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:48 crc kubenswrapper[4696]: I1202 22:43:48.039961 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:48 crc kubenswrapper[4696]: I1202 22:43:48.039978 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:48Z","lastTransitionTime":"2025-12-02T22:43:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:48 crc kubenswrapper[4696]: I1202 22:43:48.143423 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:48 crc kubenswrapper[4696]: I1202 22:43:48.144055 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:48 crc kubenswrapper[4696]: I1202 22:43:48.144127 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:48 crc kubenswrapper[4696]: I1202 22:43:48.144201 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:48 crc kubenswrapper[4696]: I1202 22:43:48.144259 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:48Z","lastTransitionTime":"2025-12-02T22:43:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:48 crc kubenswrapper[4696]: I1202 22:43:48.185267 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qb2zq_c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b/ovnkube-controller/3.log" Dec 02 22:43:48 crc kubenswrapper[4696]: I1202 22:43:48.189922 4696 scope.go:117] "RemoveContainer" containerID="c89495285a741bfa9e535eb22f1110da359462742bc321664ea98562028cf21d" Dec 02 22:43:48 crc kubenswrapper[4696]: E1202 22:43:48.190212 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-qb2zq_openshift-ovn-kubernetes(c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b)\"" pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" podUID="c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b" Dec 02 22:43:48 crc kubenswrapper[4696]: I1202 22:43:48.210205 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1db9cf23af7fc9ff1373908e11df0bea7d6e03cd3ed18947cc583fc09a79a05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:48Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:48 crc kubenswrapper[4696]: I1202 22:43:48.228351 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce2fa3867298545942c4a2d2e68899c61a4f99a64b0974e75ef86851074ddfce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://4db552c4bc9b395f977cb4c1b1b7fcbb1c7e9f326d81a7903d88eab180bcfb3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:48Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:48 crc kubenswrapper[4696]: I1202 22:43:48.246424 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27fb05354e4e9a6ad09946fad7dbabf07456fe56a9a66578fda0eb71dde2d199\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T22:43:48Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:48 crc kubenswrapper[4696]: I1202 22:43:48.247498 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:48 crc kubenswrapper[4696]: I1202 22:43:48.247554 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:48 crc kubenswrapper[4696]: I1202 22:43:48.247572 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:48 crc kubenswrapper[4696]: I1202 22:43:48.247594 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:48 crc kubenswrapper[4696]: I1202 22:43:48.247610 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:48Z","lastTransitionTime":"2025-12-02T22:43:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:48 crc kubenswrapper[4696]: I1202 22:43:48.264961 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wthxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86a37d2a-37c5-4fbd-b10b-f5e4706772f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32ae69a2c1cd350e1b62b8453e9d286e105623f4b9ea8ff27ce8c9d37b055e4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5ddd341a3a0b7b1d959ce57bb4d38153a4ca3ae5ec40ded309e0697db2be28d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T22:43:26Z\\\",\\\"message\\\":\\\"2025-12-02T22:42:41+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_c372d58b-05eb-4944-ba5c-2ee7d55c7d09\\\\n2025-12-02T22:42:41+00:00 
[cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_c372d58b-05eb-4944-ba5c-2ee7d55c7d09 to /host/opt/cni/bin/\\\\n2025-12-02T22:42:41Z [verbose] multus-daemon started\\\\n2025-12-02T22:42:41Z [verbose] Readiness Indicator file check\\\\n2025-12-02T22:43:26Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:40Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:43:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xqg5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wthxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:48Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:48 crc kubenswrapper[4696]: I1202 22:43:48.280821 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ttxcl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca48946e-a7e0-4729-8b02-b223a96990c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55cb8b25a5af0f7f16b7a91ab00cbe24ed8a8b81477e992de74699237caf7dab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkjwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e10c7f45d8003857b8a736f70eb3125f0199
6bacbcace4ad08bc703fbdeb62f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkjwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ttxcl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:48Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:48 crc kubenswrapper[4696]: I1202 22:43:48.310206 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3172221-5c4c-4409-b50f-5cd21d8b5030\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f25e7ebcca1d904cd0940a660fd97c6b4a3540fd63700abd0ad973e9f9c8f764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08fc65f2c8162f27bc40cd12b5801ca21c778313271f2cb5dd251ca4312842d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba390cca9997cf469ae88a4895532df556f0b947ce7b34ef7e1992ee236a2e16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0ccd6eaff0fa7ac496aadc31afffd6e7fee2776e9cfd1652af85fcec268d895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b2deed2650220f1d607cbb667b26879d3a8373146244feb486c882032f344b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a565cb3800c1a50f784c327719158de2f6f29d20a33670478b97c1c360e0f6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a565cb3800c1a50f784c327719158de2f6f29d20a33670478b97c1c360e0f6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-02T22:42:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29a2b9cc76fc45b6874d85b2ab92165a02804807837501ac60190c232bab928d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29a2b9cc76fc45b6874d85b2ab92165a02804807837501ac60190c232bab928d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fb3311ccf2a924605464d657193627798dad26a232fab0a2091fb2c387f3bd0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb3311ccf2a924605464d657193627798dad26a232fab0a2091fb2c387f3bd0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:48Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:48 crc kubenswrapper[4696]: I1202 22:43:48.329992 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"158d80a5-8d90-46bb-83e7-ab3263f6d28f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3112751f86a62634d02f281d467d3eeb27878e27c245871210cca37eaeb6e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce744ea765d5bbcb0a9926f439a0256774c16863344f23ddfb0074b4ccbdb4ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1108d710330b056e2228205b2012df070ffbd660701ddd577eafbf0e99c3bf9f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d92ce9d53d9367503c76c5fbb45f340a5ddd93f3c8e1738ac434fe7b0210d1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6268352b995d99d383ab500b5ccd80b0612cf33716a752aeb1d92cb775a1971e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"le observer\\\\nW1202 22:42:37.965183 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 22:42:37.965319 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 22:42:37.967400 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2369937012/tls.crt::/tmp/serving-cert-2369937012/tls.key\\\\\\\"\\\\nI1202 22:42:38.329833 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 22:42:38.331704 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 22:42:38.331724 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 22:42:38.331766 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 22:42:38.331782 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 22:42:38.336692 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 22:42:38.336734 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:42:38.336761 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:42:38.336769 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 22:42:38.336775 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 22:42:38.336779 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 22:42:38.336783 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 22:42:38.337216 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 22:42:38.338194 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://666f81c81a7cad67c01b2aebf2d05f162c33feea27f64c12f92761d6d2a6056e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055bef75732eec4908f0baf8a52cd3921fab03cf88e36d2e0d442344fb1af0c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://055bef75732eec4908f0baf8a52cd3921fab03cf88e36d2e0d442344fb1af0c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:48Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:48 crc kubenswrapper[4696]: I1202 22:43:48.348818 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"102b6bc9-3f5d-462f-9814-67699eea05ab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6abcdac2e562f88dfe0b9c02e2a2efb68c3b4d5438abfa08606b27dde0da76b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95c2f64bb4118ce11d7f91dd949ab6e377708029cb0b3c93bd650ac70ea2b650\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e41b58b3df7b93de109c3904416a2943c8dca8cc7e9edc628333d4da461412d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7321d8133c89c6a58d6e42e8df0ba1e4bbdf9cc06fd01054a25afd64934d855a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://7321d8133c89c6a58d6e42e8df0ba1e4bbdf9cc06fd01054a25afd64934d855a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:48Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:48 crc kubenswrapper[4696]: I1202 22:43:48.352189 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:48 crc kubenswrapper[4696]: I1202 22:43:48.352238 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:48 crc kubenswrapper[4696]: I1202 22:43:48.352256 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:48 crc kubenswrapper[4696]: I1202 22:43:48.352281 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:48 crc kubenswrapper[4696]: I1202 22:43:48.352300 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:48Z","lastTransitionTime":"2025-12-02T22:43:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:48 crc kubenswrapper[4696]: I1202 22:43:48.370874 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:48Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:48 crc kubenswrapper[4696]: I1202 22:43:48.389149 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f57qk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf8ea236-9be9-4eb2-904a-103c4c279f28\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3106b29a0eb0a17827fd331aefd6c00b0c58f4ec0d0ca778bd3f4842730d5f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vxprw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f57qk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:48Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:48 crc kubenswrapper[4696]: I1202 22:43:48.412252 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:48Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:48 crc kubenswrapper[4696]: I1202 22:43:48.431710 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q9bfc" Dec 02 22:43:48 crc kubenswrapper[4696]: I1202 22:43:48.431794 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:43:48 crc kubenswrapper[4696]: E1202 22:43:48.431952 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q9bfc" podUID="ad00195c-ef4c-4d9b-941c-d01ebc498593" Dec 02 22:43:48 crc kubenswrapper[4696]: I1202 22:43:48.432069 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:43:48 crc kubenswrapper[4696]: I1202 22:43:48.432143 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:43:48 crc kubenswrapper[4696]: E1202 22:43:48.432201 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 22:43:48 crc kubenswrapper[4696]: E1202 22:43:48.432315 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 22:43:48 crc kubenswrapper[4696]: E1202 22:43:48.432381 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 22:43:48 crc kubenswrapper[4696]: I1202 22:43:48.433634 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:48Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:48 crc kubenswrapper[4696]: I1202 22:43:48.455244 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:48 crc kubenswrapper[4696]: I1202 22:43:48.455293 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:48 crc kubenswrapper[4696]: I1202 22:43:48.455310 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:48 crc kubenswrapper[4696]: I1202 22:43:48.455335 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:48 crc kubenswrapper[4696]: I1202 22:43:48.455354 4696 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:48Z","lastTransitionTime":"2025-12-02T22:43:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:43:48 crc kubenswrapper[4696]: I1202 22:43:48.462830 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36feb7e9f6a6bad97e87fe54ff9b661be62f6f4962352d55bcc905b80320a48d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85fbce2a7ea0550f53c1216216022e60a4e2eae0bb1d5c88254ace025fbd4503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64b0c2025c377ab19b3853017c94cb501f472f2485093afc60749f7739ba25d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22ce112f515d4ef59dca4c21bdc8623e88a8a0df222fb2ed0990a2f68e3f75f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52f3d5529aafac7c27c963a50b0318e99d9a9ca35e142d4c273431105c495e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32717a82edb718f5e690fcecd578f48279d134a5dc12f00824fa25f7ec727156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c89495285a741bfa9e535eb22f1110da359462742bc321664ea98562028cf21d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c89495285a741bfa9e535eb22f1110da359462742bc321664ea98562028cf21d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T22:43:46Z\\\",\\\"message\\\":\\\"workPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1202 22:43:46.218920 6764 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 22:43:46.219289 6764 reflector.go:311] Stopping reflector *v1.Namespace (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI1202 22:43:46.219369 6764 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1202 22:43:46.219579 6764 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 22:43:46.219908 6764 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1202 22:43:46.220088 6764 handler.go:208] Removed *v1.Node event handler 2\\\\nI1202 22:43:46.220503 6764 factory.go:656] Stopping watch factory\\\\nI1202 22:43:46.246291 6764 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI1202 22:43:46.246326 6764 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI1202 22:43:46.246442 6764 ovnkube.go:599] Stopped ovnkube\\\\nI1202 22:43:46.246506 6764 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1202 22:43:46.246661 6764 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:43:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qb2zq_openshift-ovn-kubernetes(c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23b80358b654eec7dc1057ef35a6c09f88ccd7aca7a05336a3f29e7036bd55f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://492cfa3fb8646c09da06d92ec8a303562da558d4d944eaf68ca95de6f5b94da8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://492cfa3fb8646c09da
06d92ec8a303562da558d4d944eaf68ca95de6f5b94da8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qb2zq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:48Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:48 crc kubenswrapper[4696]: I1202 22:43:48.482085 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q9bfc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad00195c-ef4c-4d9b-941c-d01ebc498593\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:54Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zz6sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zz6sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:54Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q9bfc\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:48Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:48 crc kubenswrapper[4696]: I1202 22:43:48.499407 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"649a9c67-6124-4d3d-b4c4-d095f1181c27\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaa2f02ac937663ab0258468d91f90599f4d6c2d1f68e36a71c6f9424289562a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}},\\
\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73634f725abb1c9e590f4fc31e5ea79d879b17daad24832385ceb37d9aa93224\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73634f725abb1c9e590f4fc31e5ea79d879b17daad24832385ceb37d9aa93224\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:48Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:48 crc kubenswrapper[4696]: I1202 22:43:48.520592 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"05940584-6baa-49e1-8959-610d50b6eeb5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b361514e74dd950d7c3e1da82304b7c75603829ce7bf03c1fbccab1607d65800\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67bf899df90f18f36310994af135552aa9758f0460117d62d00fdc1de067ce79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d15263c6c1f8208a998cbacdba376a2eb3b2e16134e8fc40f7d56d36daae5ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2fa27b4eeb8b50da5d49c905eb98070362e998c3e3e4aceac7cc09aaef5b5ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:48Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:48 crc kubenswrapper[4696]: I1202 22:43:48.553662 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sbjst" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a0d758e-6da6-4382-99f1-dd295b63eb98\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bfe4ce91e801752a3eae0d5cfdb5d0e7a4997952987a3a32c10c0292db170f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://495f007b6af2d409e3ed6cf70dfa90aec5a38c34361c9da9c4fd71d7732cdd76\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://495f007b6af2d409e3ed6cf70dfa90aec5a38c34361c9da9c4fd71d7732cdd76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5b3b6699e2f0b9a8d0f32c1f97c9b3e16d9f5479ad2f355633c25897dbe1bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5b3b6699e2f0b9a8d0f32c1f97c9b3e16d9f5479ad2f355633c25897dbe1bb1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:42Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d4a10437554cfb08f90a48d457ef157309ed0ab379c5b26deb62d3e8b73291b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d4a10437554cfb08f90a48d457ef157309ed0ab379c5b26deb62d3e8b73291b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76b52
b0824f65df456ead0ca134ada0a6c3120382a3cce18f0a5dfdcc98fa9ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76b52b0824f65df456ead0ca134ada0a6c3120382a3cce18f0a5dfdcc98fa9ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d38fb242bd536b1359095c21c4df58b5a4c64058cfe986b38a581ae0c40a29d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d38fb242bd536b1359095c21c4df58b5a4c64058cfe986b38a581ae0c40a29d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:45Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f17767f87136d0916230872e59379331bb9e7f88f3648115905c6c0172f257cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f17767f87136d0916230872e59379331bb9e7f88f3648115905c6c0172f257cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sbjst\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:48Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:48 crc kubenswrapper[4696]: I1202 22:43:48.559640 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:48 crc kubenswrapper[4696]: I1202 22:43:48.559685 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:48 crc kubenswrapper[4696]: I1202 22:43:48.559698 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:48 crc kubenswrapper[4696]: I1202 22:43:48.559721 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:48 crc kubenswrapper[4696]: I1202 22:43:48.559824 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:48Z","lastTransitionTime":"2025-12-02T22:43:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:48 crc kubenswrapper[4696]: I1202 22:43:48.568672 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-chq65" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53353260-c7c9-435c-91eb-3d5a1b441c4a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2d1f154b9c9139bf991adf86bd99e7ff14510c24ae93b87c1c184437a087f17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7tfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa016430a0a36d0628e0e5b0c4baf2fa724c888116e5d70517c8ffc4be1c37a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7tfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-chq65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:48Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:48 crc kubenswrapper[4696]: I1202 22:43:48.588279 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rgk7n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07db09ff-2489-4357-af38-aca9655ac1d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9928cbc6ed1d9bb5d0280a8d3264d4490c601fc7d09d6f293f1403615cc4cde5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdd8x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rgk7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:48Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:48 crc kubenswrapper[4696]: I1202 22:43:48.663816 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:48 crc kubenswrapper[4696]: I1202 22:43:48.663884 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:48 crc kubenswrapper[4696]: I1202 22:43:48.663898 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:48 crc kubenswrapper[4696]: I1202 22:43:48.663924 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:48 crc kubenswrapper[4696]: I1202 22:43:48.663946 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:48Z","lastTransitionTime":"2025-12-02T22:43:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:48 crc kubenswrapper[4696]: I1202 22:43:48.766658 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:48 crc kubenswrapper[4696]: I1202 22:43:48.766784 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:48 crc kubenswrapper[4696]: I1202 22:43:48.766805 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:48 crc kubenswrapper[4696]: I1202 22:43:48.766833 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:48 crc kubenswrapper[4696]: I1202 22:43:48.766854 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:48Z","lastTransitionTime":"2025-12-02T22:43:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:48 crc kubenswrapper[4696]: I1202 22:43:48.870290 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:48 crc kubenswrapper[4696]: I1202 22:43:48.870353 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:48 crc kubenswrapper[4696]: I1202 22:43:48.870372 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:48 crc kubenswrapper[4696]: I1202 22:43:48.870400 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:48 crc kubenswrapper[4696]: I1202 22:43:48.870419 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:48Z","lastTransitionTime":"2025-12-02T22:43:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:48 crc kubenswrapper[4696]: I1202 22:43:48.973991 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:48 crc kubenswrapper[4696]: I1202 22:43:48.974063 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:48 crc kubenswrapper[4696]: I1202 22:43:48.974085 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:48 crc kubenswrapper[4696]: I1202 22:43:48.974113 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:48 crc kubenswrapper[4696]: I1202 22:43:48.974132 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:48Z","lastTransitionTime":"2025-12-02T22:43:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:49 crc kubenswrapper[4696]: I1202 22:43:49.076687 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:49 crc kubenswrapper[4696]: I1202 22:43:49.076814 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:49 crc kubenswrapper[4696]: I1202 22:43:49.076842 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:49 crc kubenswrapper[4696]: I1202 22:43:49.076878 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:49 crc kubenswrapper[4696]: I1202 22:43:49.076903 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:49Z","lastTransitionTime":"2025-12-02T22:43:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:49 crc kubenswrapper[4696]: I1202 22:43:49.180231 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:49 crc kubenswrapper[4696]: I1202 22:43:49.180293 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:49 crc kubenswrapper[4696]: I1202 22:43:49.180306 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:49 crc kubenswrapper[4696]: I1202 22:43:49.180328 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:49 crc kubenswrapper[4696]: I1202 22:43:49.180343 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:49Z","lastTransitionTime":"2025-12-02T22:43:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:49 crc kubenswrapper[4696]: I1202 22:43:49.283671 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:49 crc kubenswrapper[4696]: I1202 22:43:49.283720 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:49 crc kubenswrapper[4696]: I1202 22:43:49.283731 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:49 crc kubenswrapper[4696]: I1202 22:43:49.283771 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:49 crc kubenswrapper[4696]: I1202 22:43:49.283786 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:49Z","lastTransitionTime":"2025-12-02T22:43:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:49 crc kubenswrapper[4696]: I1202 22:43:49.386866 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:49 crc kubenswrapper[4696]: I1202 22:43:49.386932 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:49 crc kubenswrapper[4696]: I1202 22:43:49.386943 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:49 crc kubenswrapper[4696]: I1202 22:43:49.386960 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:49 crc kubenswrapper[4696]: I1202 22:43:49.386970 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:49Z","lastTransitionTime":"2025-12-02T22:43:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:49 crc kubenswrapper[4696]: I1202 22:43:49.489820 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:49 crc kubenswrapper[4696]: I1202 22:43:49.489913 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:49 crc kubenswrapper[4696]: I1202 22:43:49.489926 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:49 crc kubenswrapper[4696]: I1202 22:43:49.489947 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:49 crc kubenswrapper[4696]: I1202 22:43:49.489961 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:49Z","lastTransitionTime":"2025-12-02T22:43:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:49 crc kubenswrapper[4696]: I1202 22:43:49.593124 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:49 crc kubenswrapper[4696]: I1202 22:43:49.593207 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:49 crc kubenswrapper[4696]: I1202 22:43:49.593228 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:49 crc kubenswrapper[4696]: I1202 22:43:49.593259 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:49 crc kubenswrapper[4696]: I1202 22:43:49.593280 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:49Z","lastTransitionTime":"2025-12-02T22:43:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:49 crc kubenswrapper[4696]: I1202 22:43:49.696592 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:49 crc kubenswrapper[4696]: I1202 22:43:49.696680 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:49 crc kubenswrapper[4696]: I1202 22:43:49.696697 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:49 crc kubenswrapper[4696]: I1202 22:43:49.696728 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:49 crc kubenswrapper[4696]: I1202 22:43:49.696794 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:49Z","lastTransitionTime":"2025-12-02T22:43:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:49 crc kubenswrapper[4696]: I1202 22:43:49.799545 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:49 crc kubenswrapper[4696]: I1202 22:43:49.799963 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:49 crc kubenswrapper[4696]: I1202 22:43:49.800184 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:49 crc kubenswrapper[4696]: I1202 22:43:49.800357 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:49 crc kubenswrapper[4696]: I1202 22:43:49.800555 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:49Z","lastTransitionTime":"2025-12-02T22:43:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:49 crc kubenswrapper[4696]: I1202 22:43:49.903731 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:49 crc kubenswrapper[4696]: I1202 22:43:49.903844 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:49 crc kubenswrapper[4696]: I1202 22:43:49.903871 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:49 crc kubenswrapper[4696]: I1202 22:43:49.903906 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:49 crc kubenswrapper[4696]: I1202 22:43:49.903931 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:49Z","lastTransitionTime":"2025-12-02T22:43:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:50 crc kubenswrapper[4696]: I1202 22:43:50.006904 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:50 crc kubenswrapper[4696]: I1202 22:43:50.007272 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:50 crc kubenswrapper[4696]: I1202 22:43:50.007465 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:50 crc kubenswrapper[4696]: I1202 22:43:50.007672 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:50 crc kubenswrapper[4696]: I1202 22:43:50.007952 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:50Z","lastTransitionTime":"2025-12-02T22:43:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:50 crc kubenswrapper[4696]: I1202 22:43:50.111189 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:50 crc kubenswrapper[4696]: I1202 22:43:50.111544 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:50 crc kubenswrapper[4696]: I1202 22:43:50.111718 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:50 crc kubenswrapper[4696]: I1202 22:43:50.111900 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:50 crc kubenswrapper[4696]: I1202 22:43:50.112047 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:50Z","lastTransitionTime":"2025-12-02T22:43:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:50 crc kubenswrapper[4696]: I1202 22:43:50.215141 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:50 crc kubenswrapper[4696]: I1202 22:43:50.215201 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:50 crc kubenswrapper[4696]: I1202 22:43:50.215219 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:50 crc kubenswrapper[4696]: I1202 22:43:50.215248 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:50 crc kubenswrapper[4696]: I1202 22:43:50.215267 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:50Z","lastTransitionTime":"2025-12-02T22:43:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:50 crc kubenswrapper[4696]: I1202 22:43:50.318156 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:50 crc kubenswrapper[4696]: I1202 22:43:50.318228 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:50 crc kubenswrapper[4696]: I1202 22:43:50.318247 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:50 crc kubenswrapper[4696]: I1202 22:43:50.318275 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:50 crc kubenswrapper[4696]: I1202 22:43:50.318296 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:50Z","lastTransitionTime":"2025-12-02T22:43:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:50 crc kubenswrapper[4696]: I1202 22:43:50.421066 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:50 crc kubenswrapper[4696]: I1202 22:43:50.421112 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:50 crc kubenswrapper[4696]: I1202 22:43:50.421125 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:50 crc kubenswrapper[4696]: I1202 22:43:50.421145 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:50 crc kubenswrapper[4696]: I1202 22:43:50.421158 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:50Z","lastTransitionTime":"2025-12-02T22:43:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:43:50 crc kubenswrapper[4696]: I1202 22:43:50.431318 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:43:50 crc kubenswrapper[4696]: E1202 22:43:50.431484 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 22:43:50 crc kubenswrapper[4696]: I1202 22:43:50.431776 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:43:50 crc kubenswrapper[4696]: E1202 22:43:50.431850 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 22:43:50 crc kubenswrapper[4696]: I1202 22:43:50.432007 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:43:50 crc kubenswrapper[4696]: E1202 22:43:50.432087 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 22:43:50 crc kubenswrapper[4696]: I1202 22:43:50.432252 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-q9bfc" Dec 02 22:43:50 crc kubenswrapper[4696]: E1202 22:43:50.432333 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q9bfc" podUID="ad00195c-ef4c-4d9b-941c-d01ebc498593" Dec 02 22:43:50 crc kubenswrapper[4696]: I1202 22:43:50.524272 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:50 crc kubenswrapper[4696]: I1202 22:43:50.524334 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:50 crc kubenswrapper[4696]: I1202 22:43:50.524353 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:50 crc kubenswrapper[4696]: I1202 22:43:50.524376 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:50 crc kubenswrapper[4696]: I1202 22:43:50.524396 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:50Z","lastTransitionTime":"2025-12-02T22:43:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:50 crc kubenswrapper[4696]: I1202 22:43:50.628062 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:50 crc kubenswrapper[4696]: I1202 22:43:50.628116 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:50 crc kubenswrapper[4696]: I1202 22:43:50.628126 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:50 crc kubenswrapper[4696]: I1202 22:43:50.628144 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:50 crc kubenswrapper[4696]: I1202 22:43:50.628156 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:50Z","lastTransitionTime":"2025-12-02T22:43:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:50 crc kubenswrapper[4696]: I1202 22:43:50.731183 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:50 crc kubenswrapper[4696]: I1202 22:43:50.731251 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:50 crc kubenswrapper[4696]: I1202 22:43:50.731270 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:50 crc kubenswrapper[4696]: I1202 22:43:50.731297 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:50 crc kubenswrapper[4696]: I1202 22:43:50.731319 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:50Z","lastTransitionTime":"2025-12-02T22:43:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:50 crc kubenswrapper[4696]: I1202 22:43:50.834898 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:50 crc kubenswrapper[4696]: I1202 22:43:50.835004 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:50 crc kubenswrapper[4696]: I1202 22:43:50.835034 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:50 crc kubenswrapper[4696]: I1202 22:43:50.835070 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:50 crc kubenswrapper[4696]: I1202 22:43:50.835095 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:50Z","lastTransitionTime":"2025-12-02T22:43:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:50 crc kubenswrapper[4696]: I1202 22:43:50.938261 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:50 crc kubenswrapper[4696]: I1202 22:43:50.938338 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:50 crc kubenswrapper[4696]: I1202 22:43:50.938360 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:50 crc kubenswrapper[4696]: I1202 22:43:50.938389 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:50 crc kubenswrapper[4696]: I1202 22:43:50.938410 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:50Z","lastTransitionTime":"2025-12-02T22:43:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:51 crc kubenswrapper[4696]: I1202 22:43:51.042313 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:51 crc kubenswrapper[4696]: I1202 22:43:51.042404 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:51 crc kubenswrapper[4696]: I1202 22:43:51.042427 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:51 crc kubenswrapper[4696]: I1202 22:43:51.042459 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:51 crc kubenswrapper[4696]: I1202 22:43:51.042481 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:51Z","lastTransitionTime":"2025-12-02T22:43:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:51 crc kubenswrapper[4696]: I1202 22:43:51.146924 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:51 crc kubenswrapper[4696]: I1202 22:43:51.146994 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:51 crc kubenswrapper[4696]: I1202 22:43:51.147013 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:51 crc kubenswrapper[4696]: I1202 22:43:51.147042 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:51 crc kubenswrapper[4696]: I1202 22:43:51.147060 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:51Z","lastTransitionTime":"2025-12-02T22:43:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:51 crc kubenswrapper[4696]: I1202 22:43:51.251253 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:51 crc kubenswrapper[4696]: I1202 22:43:51.251341 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:51 crc kubenswrapper[4696]: I1202 22:43:51.251360 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:51 crc kubenswrapper[4696]: I1202 22:43:51.251394 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:51 crc kubenswrapper[4696]: I1202 22:43:51.251414 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:51Z","lastTransitionTime":"2025-12-02T22:43:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:51 crc kubenswrapper[4696]: I1202 22:43:51.355135 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:51 crc kubenswrapper[4696]: I1202 22:43:51.355206 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:51 crc kubenswrapper[4696]: I1202 22:43:51.355225 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:51 crc kubenswrapper[4696]: I1202 22:43:51.355250 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:51 crc kubenswrapper[4696]: I1202 22:43:51.355268 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:51Z","lastTransitionTime":"2025-12-02T22:43:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:51 crc kubenswrapper[4696]: I1202 22:43:51.458202 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:51 crc kubenswrapper[4696]: I1202 22:43:51.458264 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:51 crc kubenswrapper[4696]: I1202 22:43:51.458282 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:51 crc kubenswrapper[4696]: I1202 22:43:51.458305 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:51 crc kubenswrapper[4696]: I1202 22:43:51.458323 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:51Z","lastTransitionTime":"2025-12-02T22:43:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:51 crc kubenswrapper[4696]: I1202 22:43:51.562288 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:51 crc kubenswrapper[4696]: I1202 22:43:51.562346 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:51 crc kubenswrapper[4696]: I1202 22:43:51.562364 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:51 crc kubenswrapper[4696]: I1202 22:43:51.562389 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:51 crc kubenswrapper[4696]: I1202 22:43:51.562407 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:51Z","lastTransitionTime":"2025-12-02T22:43:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:51 crc kubenswrapper[4696]: I1202 22:43:51.666419 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:51 crc kubenswrapper[4696]: I1202 22:43:51.666497 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:51 crc kubenswrapper[4696]: I1202 22:43:51.666514 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:51 crc kubenswrapper[4696]: I1202 22:43:51.666543 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:51 crc kubenswrapper[4696]: I1202 22:43:51.666565 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:51Z","lastTransitionTime":"2025-12-02T22:43:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:51 crc kubenswrapper[4696]: I1202 22:43:51.770096 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:51 crc kubenswrapper[4696]: I1202 22:43:51.770165 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:51 crc kubenswrapper[4696]: I1202 22:43:51.770183 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:51 crc kubenswrapper[4696]: I1202 22:43:51.770211 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:51 crc kubenswrapper[4696]: I1202 22:43:51.770230 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:51Z","lastTransitionTime":"2025-12-02T22:43:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:51 crc kubenswrapper[4696]: I1202 22:43:51.873933 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:51 crc kubenswrapper[4696]: I1202 22:43:51.873987 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:51 crc kubenswrapper[4696]: I1202 22:43:51.874003 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:51 crc kubenswrapper[4696]: I1202 22:43:51.874030 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:51 crc kubenswrapper[4696]: I1202 22:43:51.874048 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:51Z","lastTransitionTime":"2025-12-02T22:43:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:51 crc kubenswrapper[4696]: I1202 22:43:51.977319 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:51 crc kubenswrapper[4696]: I1202 22:43:51.977369 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:51 crc kubenswrapper[4696]: I1202 22:43:51.977388 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:51 crc kubenswrapper[4696]: I1202 22:43:51.977411 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:51 crc kubenswrapper[4696]: I1202 22:43:51.977427 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:51Z","lastTransitionTime":"2025-12-02T22:43:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:52 crc kubenswrapper[4696]: I1202 22:43:52.080138 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:52 crc kubenswrapper[4696]: I1202 22:43:52.080247 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:52 crc kubenswrapper[4696]: I1202 22:43:52.080272 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:52 crc kubenswrapper[4696]: I1202 22:43:52.080316 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:52 crc kubenswrapper[4696]: I1202 22:43:52.080343 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:52Z","lastTransitionTime":"2025-12-02T22:43:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:52 crc kubenswrapper[4696]: I1202 22:43:52.183711 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:52 crc kubenswrapper[4696]: I1202 22:43:52.183803 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:52 crc kubenswrapper[4696]: I1202 22:43:52.183821 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:52 crc kubenswrapper[4696]: I1202 22:43:52.183852 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:52 crc kubenswrapper[4696]: I1202 22:43:52.183870 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:52Z","lastTransitionTime":"2025-12-02T22:43:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:52 crc kubenswrapper[4696]: I1202 22:43:52.286294 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:52 crc kubenswrapper[4696]: I1202 22:43:52.286369 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:52 crc kubenswrapper[4696]: I1202 22:43:52.286423 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:52 crc kubenswrapper[4696]: I1202 22:43:52.286456 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:52 crc kubenswrapper[4696]: I1202 22:43:52.286480 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:52Z","lastTransitionTime":"2025-12-02T22:43:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:52 crc kubenswrapper[4696]: I1202 22:43:52.388955 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:52 crc kubenswrapper[4696]: I1202 22:43:52.388999 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:52 crc kubenswrapper[4696]: I1202 22:43:52.389009 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:52 crc kubenswrapper[4696]: I1202 22:43:52.389023 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:52 crc kubenswrapper[4696]: I1202 22:43:52.389032 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:52Z","lastTransitionTime":"2025-12-02T22:43:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:43:52 crc kubenswrapper[4696]: I1202 22:43:52.431569 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:43:52 crc kubenswrapper[4696]: I1202 22:43:52.431613 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q9bfc" Dec 02 22:43:52 crc kubenswrapper[4696]: I1202 22:43:52.431919 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:43:52 crc kubenswrapper[4696]: I1202 22:43:52.431986 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:43:52 crc kubenswrapper[4696]: E1202 22:43:52.432158 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 22:43:52 crc kubenswrapper[4696]: E1202 22:43:52.432357 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 22:43:52 crc kubenswrapper[4696]: E1202 22:43:52.432494 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q9bfc" podUID="ad00195c-ef4c-4d9b-941c-d01ebc498593" Dec 02 22:43:52 crc kubenswrapper[4696]: E1202 22:43:52.432671 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 22:43:52 crc kubenswrapper[4696]: I1202 22:43:52.492079 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:52 crc kubenswrapper[4696]: I1202 22:43:52.492119 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:52 crc kubenswrapper[4696]: I1202 22:43:52.492129 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:52 crc kubenswrapper[4696]: I1202 22:43:52.492144 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:52 crc kubenswrapper[4696]: I1202 22:43:52.492157 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:52Z","lastTransitionTime":"2025-12-02T22:43:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:52 crc kubenswrapper[4696]: I1202 22:43:52.546406 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" Dec 02 22:43:52 crc kubenswrapper[4696]: I1202 22:43:52.547036 4696 scope.go:117] "RemoveContainer" containerID="c89495285a741bfa9e535eb22f1110da359462742bc321664ea98562028cf21d" Dec 02 22:43:52 crc kubenswrapper[4696]: E1202 22:43:52.547172 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-qb2zq_openshift-ovn-kubernetes(c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b)\"" pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" podUID="c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b" Dec 02 22:43:52 crc kubenswrapper[4696]: I1202 22:43:52.595147 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:52 crc kubenswrapper[4696]: I1202 22:43:52.595193 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:52 crc kubenswrapper[4696]: I1202 22:43:52.595204 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:52 crc kubenswrapper[4696]: I1202 22:43:52.595221 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:52 crc kubenswrapper[4696]: I1202 22:43:52.595232 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:52Z","lastTransitionTime":"2025-12-02T22:43:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:52 crc kubenswrapper[4696]: I1202 22:43:52.698352 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:52 crc kubenswrapper[4696]: I1202 22:43:52.698407 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:52 crc kubenswrapper[4696]: I1202 22:43:52.698423 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:52 crc kubenswrapper[4696]: I1202 22:43:52.698450 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:52 crc kubenswrapper[4696]: I1202 22:43:52.698467 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:52Z","lastTransitionTime":"2025-12-02T22:43:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:52 crc kubenswrapper[4696]: I1202 22:43:52.802190 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:52 crc kubenswrapper[4696]: I1202 22:43:52.802250 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:52 crc kubenswrapper[4696]: I1202 22:43:52.802296 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:52 crc kubenswrapper[4696]: I1202 22:43:52.802320 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:52 crc kubenswrapper[4696]: I1202 22:43:52.802338 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:52Z","lastTransitionTime":"2025-12-02T22:43:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:52 crc kubenswrapper[4696]: I1202 22:43:52.906329 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:52 crc kubenswrapper[4696]: I1202 22:43:52.906394 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:52 crc kubenswrapper[4696]: I1202 22:43:52.906411 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:52 crc kubenswrapper[4696]: I1202 22:43:52.906437 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:52 crc kubenswrapper[4696]: I1202 22:43:52.906455 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:52Z","lastTransitionTime":"2025-12-02T22:43:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:53 crc kubenswrapper[4696]: I1202 22:43:53.010049 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:53 crc kubenswrapper[4696]: I1202 22:43:53.010129 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:53 crc kubenswrapper[4696]: I1202 22:43:53.010150 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:53 crc kubenswrapper[4696]: I1202 22:43:53.010184 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:53 crc kubenswrapper[4696]: I1202 22:43:53.010208 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:53Z","lastTransitionTime":"2025-12-02T22:43:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:53 crc kubenswrapper[4696]: I1202 22:43:53.113264 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:53 crc kubenswrapper[4696]: I1202 22:43:53.113314 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:53 crc kubenswrapper[4696]: I1202 22:43:53.113332 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:53 crc kubenswrapper[4696]: I1202 22:43:53.113361 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:53 crc kubenswrapper[4696]: I1202 22:43:53.113381 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:53Z","lastTransitionTime":"2025-12-02T22:43:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:53 crc kubenswrapper[4696]: I1202 22:43:53.231216 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:53 crc kubenswrapper[4696]: I1202 22:43:53.231282 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:53 crc kubenswrapper[4696]: I1202 22:43:53.231300 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:53 crc kubenswrapper[4696]: I1202 22:43:53.231330 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:53 crc kubenswrapper[4696]: I1202 22:43:53.231351 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:53Z","lastTransitionTime":"2025-12-02T22:43:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:53 crc kubenswrapper[4696]: I1202 22:43:53.334560 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:53 crc kubenswrapper[4696]: I1202 22:43:53.334619 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:53 crc kubenswrapper[4696]: I1202 22:43:53.334636 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:53 crc kubenswrapper[4696]: I1202 22:43:53.334665 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:53 crc kubenswrapper[4696]: I1202 22:43:53.334688 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:53Z","lastTransitionTime":"2025-12-02T22:43:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:53 crc kubenswrapper[4696]: I1202 22:43:53.438281 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:53 crc kubenswrapper[4696]: I1202 22:43:53.438345 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:53 crc kubenswrapper[4696]: I1202 22:43:53.438359 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:53 crc kubenswrapper[4696]: I1202 22:43:53.438379 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:53 crc kubenswrapper[4696]: I1202 22:43:53.438391 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:53Z","lastTransitionTime":"2025-12-02T22:43:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:53 crc kubenswrapper[4696]: I1202 22:43:53.542538 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:53 crc kubenswrapper[4696]: I1202 22:43:53.542595 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:53 crc kubenswrapper[4696]: I1202 22:43:53.542612 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:53 crc kubenswrapper[4696]: I1202 22:43:53.542630 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:53 crc kubenswrapper[4696]: I1202 22:43:53.542643 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:53Z","lastTransitionTime":"2025-12-02T22:43:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:53 crc kubenswrapper[4696]: I1202 22:43:53.647305 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:53 crc kubenswrapper[4696]: I1202 22:43:53.647384 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:53 crc kubenswrapper[4696]: I1202 22:43:53.647399 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:53 crc kubenswrapper[4696]: I1202 22:43:53.647423 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:53 crc kubenswrapper[4696]: I1202 22:43:53.647446 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:53Z","lastTransitionTime":"2025-12-02T22:43:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:53 crc kubenswrapper[4696]: I1202 22:43:53.751357 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:53 crc kubenswrapper[4696]: I1202 22:43:53.751435 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:53 crc kubenswrapper[4696]: I1202 22:43:53.751460 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:53 crc kubenswrapper[4696]: I1202 22:43:53.751499 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:53 crc kubenswrapper[4696]: I1202 22:43:53.751524 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:53Z","lastTransitionTime":"2025-12-02T22:43:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:53 crc kubenswrapper[4696]: I1202 22:43:53.855088 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:53 crc kubenswrapper[4696]: I1202 22:43:53.855147 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:53 crc kubenswrapper[4696]: I1202 22:43:53.855168 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:53 crc kubenswrapper[4696]: I1202 22:43:53.855197 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:53 crc kubenswrapper[4696]: I1202 22:43:53.855218 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:53Z","lastTransitionTime":"2025-12-02T22:43:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:53 crc kubenswrapper[4696]: I1202 22:43:53.958873 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:53 crc kubenswrapper[4696]: I1202 22:43:53.958916 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:53 crc kubenswrapper[4696]: I1202 22:43:53.958938 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:53 crc kubenswrapper[4696]: I1202 22:43:53.958963 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:53 crc kubenswrapper[4696]: I1202 22:43:53.958985 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:53Z","lastTransitionTime":"2025-12-02T22:43:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:54 crc kubenswrapper[4696]: I1202 22:43:54.063310 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:54 crc kubenswrapper[4696]: I1202 22:43:54.063378 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:54 crc kubenswrapper[4696]: I1202 22:43:54.063392 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:54 crc kubenswrapper[4696]: I1202 22:43:54.063416 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:54 crc kubenswrapper[4696]: I1202 22:43:54.063432 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:54Z","lastTransitionTime":"2025-12-02T22:43:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:54 crc kubenswrapper[4696]: I1202 22:43:54.166954 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:54 crc kubenswrapper[4696]: I1202 22:43:54.167039 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:54 crc kubenswrapper[4696]: I1202 22:43:54.167063 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:54 crc kubenswrapper[4696]: I1202 22:43:54.167093 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:54 crc kubenswrapper[4696]: I1202 22:43:54.167118 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:54Z","lastTransitionTime":"2025-12-02T22:43:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:54 crc kubenswrapper[4696]: I1202 22:43:54.270913 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:54 crc kubenswrapper[4696]: I1202 22:43:54.270984 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:54 crc kubenswrapper[4696]: I1202 22:43:54.271025 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:54 crc kubenswrapper[4696]: I1202 22:43:54.271053 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:54 crc kubenswrapper[4696]: I1202 22:43:54.271074 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:54Z","lastTransitionTime":"2025-12-02T22:43:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:54 crc kubenswrapper[4696]: I1202 22:43:54.374182 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:54 crc kubenswrapper[4696]: I1202 22:43:54.374242 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:54 crc kubenswrapper[4696]: I1202 22:43:54.374259 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:54 crc kubenswrapper[4696]: I1202 22:43:54.374283 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:54 crc kubenswrapper[4696]: I1202 22:43:54.374302 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:54Z","lastTransitionTime":"2025-12-02T22:43:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:43:54 crc kubenswrapper[4696]: I1202 22:43:54.431265 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q9bfc" Dec 02 22:43:54 crc kubenswrapper[4696]: E1202 22:43:54.431460 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q9bfc" podUID="ad00195c-ef4c-4d9b-941c-d01ebc498593" Dec 02 22:43:54 crc kubenswrapper[4696]: I1202 22:43:54.432011 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:43:54 crc kubenswrapper[4696]: I1202 22:43:54.432063 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:43:54 crc kubenswrapper[4696]: E1202 22:43:54.432176 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 22:43:54 crc kubenswrapper[4696]: E1202 22:43:54.432296 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 22:43:54 crc kubenswrapper[4696]: I1202 22:43:54.432412 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:43:54 crc kubenswrapper[4696]: E1202 22:43:54.432503 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 22:43:54 crc kubenswrapper[4696]: I1202 22:43:54.478332 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:54 crc kubenswrapper[4696]: I1202 22:43:54.478408 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:54 crc kubenswrapper[4696]: I1202 22:43:54.478425 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:54 crc kubenswrapper[4696]: I1202 22:43:54.478460 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:54 crc kubenswrapper[4696]: I1202 22:43:54.478477 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:54Z","lastTransitionTime":"2025-12-02T22:43:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:54 crc kubenswrapper[4696]: I1202 22:43:54.582850 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:54 crc kubenswrapper[4696]: I1202 22:43:54.582952 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:54 crc kubenswrapper[4696]: I1202 22:43:54.582972 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:54 crc kubenswrapper[4696]: I1202 22:43:54.583044 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:54 crc kubenswrapper[4696]: I1202 22:43:54.583088 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:54Z","lastTransitionTime":"2025-12-02T22:43:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:54 crc kubenswrapper[4696]: I1202 22:43:54.686323 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:54 crc kubenswrapper[4696]: I1202 22:43:54.686401 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:54 crc kubenswrapper[4696]: I1202 22:43:54.686425 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:54 crc kubenswrapper[4696]: I1202 22:43:54.686458 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:54 crc kubenswrapper[4696]: I1202 22:43:54.686478 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:54Z","lastTransitionTime":"2025-12-02T22:43:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:54 crc kubenswrapper[4696]: I1202 22:43:54.789156 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:54 crc kubenswrapper[4696]: I1202 22:43:54.789245 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:54 crc kubenswrapper[4696]: I1202 22:43:54.789279 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:54 crc kubenswrapper[4696]: I1202 22:43:54.789308 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:54 crc kubenswrapper[4696]: I1202 22:43:54.789328 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:54Z","lastTransitionTime":"2025-12-02T22:43:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:54 crc kubenswrapper[4696]: I1202 22:43:54.893149 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:54 crc kubenswrapper[4696]: I1202 22:43:54.893223 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:54 crc kubenswrapper[4696]: I1202 22:43:54.893248 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:54 crc kubenswrapper[4696]: I1202 22:43:54.893279 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:54 crc kubenswrapper[4696]: I1202 22:43:54.893300 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:54Z","lastTransitionTime":"2025-12-02T22:43:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:54 crc kubenswrapper[4696]: I1202 22:43:54.996783 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:54 crc kubenswrapper[4696]: I1202 22:43:54.996863 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:54 crc kubenswrapper[4696]: I1202 22:43:54.996883 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:54 crc kubenswrapper[4696]: I1202 22:43:54.996915 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:54 crc kubenswrapper[4696]: I1202 22:43:54.996943 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:54Z","lastTransitionTime":"2025-12-02T22:43:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:55 crc kubenswrapper[4696]: I1202 22:43:55.100855 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:55 crc kubenswrapper[4696]: I1202 22:43:55.100953 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:55 crc kubenswrapper[4696]: I1202 22:43:55.100980 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:55 crc kubenswrapper[4696]: I1202 22:43:55.101016 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:55 crc kubenswrapper[4696]: I1202 22:43:55.101037 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:55Z","lastTransitionTime":"2025-12-02T22:43:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:55 crc kubenswrapper[4696]: I1202 22:43:55.203992 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:55 crc kubenswrapper[4696]: I1202 22:43:55.204088 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:55 crc kubenswrapper[4696]: I1202 22:43:55.204114 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:55 crc kubenswrapper[4696]: I1202 22:43:55.204149 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:55 crc kubenswrapper[4696]: I1202 22:43:55.204171 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:55Z","lastTransitionTime":"2025-12-02T22:43:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:55 crc kubenswrapper[4696]: I1202 22:43:55.308666 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:55 crc kubenswrapper[4696]: I1202 22:43:55.308870 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:55 crc kubenswrapper[4696]: I1202 22:43:55.308913 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:55 crc kubenswrapper[4696]: I1202 22:43:55.308954 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:55 crc kubenswrapper[4696]: I1202 22:43:55.308979 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:55Z","lastTransitionTime":"2025-12-02T22:43:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:55 crc kubenswrapper[4696]: I1202 22:43:55.412377 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:55 crc kubenswrapper[4696]: I1202 22:43:55.412439 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:55 crc kubenswrapper[4696]: I1202 22:43:55.412456 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:55 crc kubenswrapper[4696]: I1202 22:43:55.412486 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:55 crc kubenswrapper[4696]: I1202 22:43:55.412506 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:55Z","lastTransitionTime":"2025-12-02T22:43:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:55 crc kubenswrapper[4696]: I1202 22:43:55.516313 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:55 crc kubenswrapper[4696]: I1202 22:43:55.516383 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:55 crc kubenswrapper[4696]: I1202 22:43:55.516405 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:55 crc kubenswrapper[4696]: I1202 22:43:55.516433 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:55 crc kubenswrapper[4696]: I1202 22:43:55.516451 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:55Z","lastTransitionTime":"2025-12-02T22:43:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:55 crc kubenswrapper[4696]: I1202 22:43:55.619445 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:55 crc kubenswrapper[4696]: I1202 22:43:55.619485 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:55 crc kubenswrapper[4696]: I1202 22:43:55.619495 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:55 crc kubenswrapper[4696]: I1202 22:43:55.619510 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:55 crc kubenswrapper[4696]: I1202 22:43:55.619520 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:55Z","lastTransitionTime":"2025-12-02T22:43:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:55 crc kubenswrapper[4696]: I1202 22:43:55.722214 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:55 crc kubenswrapper[4696]: I1202 22:43:55.722274 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:55 crc kubenswrapper[4696]: I1202 22:43:55.722288 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:55 crc kubenswrapper[4696]: I1202 22:43:55.722314 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:55 crc kubenswrapper[4696]: I1202 22:43:55.722330 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:55Z","lastTransitionTime":"2025-12-02T22:43:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:55 crc kubenswrapper[4696]: I1202 22:43:55.825457 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:55 crc kubenswrapper[4696]: I1202 22:43:55.825513 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:55 crc kubenswrapper[4696]: I1202 22:43:55.825524 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:55 crc kubenswrapper[4696]: I1202 22:43:55.825542 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:55 crc kubenswrapper[4696]: I1202 22:43:55.825552 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:55Z","lastTransitionTime":"2025-12-02T22:43:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:55 crc kubenswrapper[4696]: I1202 22:43:55.929400 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:55 crc kubenswrapper[4696]: I1202 22:43:55.929454 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:55 crc kubenswrapper[4696]: I1202 22:43:55.929469 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:55 crc kubenswrapper[4696]: I1202 22:43:55.929490 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:55 crc kubenswrapper[4696]: I1202 22:43:55.929505 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:55Z","lastTransitionTime":"2025-12-02T22:43:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:56 crc kubenswrapper[4696]: I1202 22:43:56.032858 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:56 crc kubenswrapper[4696]: I1202 22:43:56.032951 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:56 crc kubenswrapper[4696]: I1202 22:43:56.032978 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:56 crc kubenswrapper[4696]: I1202 22:43:56.033013 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:56 crc kubenswrapper[4696]: I1202 22:43:56.033038 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:56Z","lastTransitionTime":"2025-12-02T22:43:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:56 crc kubenswrapper[4696]: I1202 22:43:56.136669 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:56 crc kubenswrapper[4696]: I1202 22:43:56.136733 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:56 crc kubenswrapper[4696]: I1202 22:43:56.136773 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:56 crc kubenswrapper[4696]: I1202 22:43:56.136796 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:56 crc kubenswrapper[4696]: I1202 22:43:56.136818 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:56Z","lastTransitionTime":"2025-12-02T22:43:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:56 crc kubenswrapper[4696]: I1202 22:43:56.240383 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:56 crc kubenswrapper[4696]: I1202 22:43:56.241169 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:56 crc kubenswrapper[4696]: I1202 22:43:56.241494 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:56 crc kubenswrapper[4696]: I1202 22:43:56.241683 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:56 crc kubenswrapper[4696]: I1202 22:43:56.241871 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:56Z","lastTransitionTime":"2025-12-02T22:43:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:56 crc kubenswrapper[4696]: I1202 22:43:56.345668 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:56 crc kubenswrapper[4696]: I1202 22:43:56.345799 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:56 crc kubenswrapper[4696]: I1202 22:43:56.345826 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:56 crc kubenswrapper[4696]: I1202 22:43:56.345859 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:56 crc kubenswrapper[4696]: I1202 22:43:56.345881 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:56Z","lastTransitionTime":"2025-12-02T22:43:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:43:56 crc kubenswrapper[4696]: I1202 22:43:56.431573 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:43:56 crc kubenswrapper[4696]: I1202 22:43:56.431697 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:43:56 crc kubenswrapper[4696]: E1202 22:43:56.433119 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 22:43:56 crc kubenswrapper[4696]: I1202 22:43:56.431716 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q9bfc" Dec 02 22:43:56 crc kubenswrapper[4696]: I1202 22:43:56.431715 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:43:56 crc kubenswrapper[4696]: E1202 22:43:56.433210 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 22:43:56 crc kubenswrapper[4696]: E1202 22:43:56.433667 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 22:43:56 crc kubenswrapper[4696]: E1202 22:43:56.433686 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-q9bfc" podUID="ad00195c-ef4c-4d9b-941c-d01ebc498593" Dec 02 22:43:56 crc kubenswrapper[4696]: I1202 22:43:56.454139 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:56 crc kubenswrapper[4696]: I1202 22:43:56.454243 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:56 crc kubenswrapper[4696]: I1202 22:43:56.454267 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:56 crc kubenswrapper[4696]: I1202 22:43:56.454301 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:56 crc kubenswrapper[4696]: I1202 22:43:56.454329 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:56Z","lastTransitionTime":"2025-12-02T22:43:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:56 crc kubenswrapper[4696]: I1202 22:43:56.558997 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:56 crc kubenswrapper[4696]: I1202 22:43:56.559078 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:56 crc kubenswrapper[4696]: I1202 22:43:56.559097 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:56 crc kubenswrapper[4696]: I1202 22:43:56.559125 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:56 crc kubenswrapper[4696]: I1202 22:43:56.559146 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:56Z","lastTransitionTime":"2025-12-02T22:43:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:56 crc kubenswrapper[4696]: I1202 22:43:56.662606 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:56 crc kubenswrapper[4696]: I1202 22:43:56.662673 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:56 crc kubenswrapper[4696]: I1202 22:43:56.662690 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:56 crc kubenswrapper[4696]: I1202 22:43:56.662716 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:56 crc kubenswrapper[4696]: I1202 22:43:56.662737 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:56Z","lastTransitionTime":"2025-12-02T22:43:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:56 crc kubenswrapper[4696]: I1202 22:43:56.765488 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:56 crc kubenswrapper[4696]: I1202 22:43:56.765558 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:56 crc kubenswrapper[4696]: I1202 22:43:56.765570 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:56 crc kubenswrapper[4696]: I1202 22:43:56.765594 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:56 crc kubenswrapper[4696]: I1202 22:43:56.765612 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:56Z","lastTransitionTime":"2025-12-02T22:43:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:56 crc kubenswrapper[4696]: I1202 22:43:56.869251 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:56 crc kubenswrapper[4696]: I1202 22:43:56.869323 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:56 crc kubenswrapper[4696]: I1202 22:43:56.869340 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:56 crc kubenswrapper[4696]: I1202 22:43:56.869371 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:56 crc kubenswrapper[4696]: I1202 22:43:56.869391 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:56Z","lastTransitionTime":"2025-12-02T22:43:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:56 crc kubenswrapper[4696]: I1202 22:43:56.974001 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:56 crc kubenswrapper[4696]: I1202 22:43:56.974059 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:56 crc kubenswrapper[4696]: I1202 22:43:56.974069 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:56 crc kubenswrapper[4696]: I1202 22:43:56.974087 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:56 crc kubenswrapper[4696]: I1202 22:43:56.974098 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:56Z","lastTransitionTime":"2025-12-02T22:43:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:57 crc kubenswrapper[4696]: I1202 22:43:57.077549 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:57 crc kubenswrapper[4696]: I1202 22:43:57.077599 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:57 crc kubenswrapper[4696]: I1202 22:43:57.077612 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:57 crc kubenswrapper[4696]: I1202 22:43:57.077632 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:57 crc kubenswrapper[4696]: I1202 22:43:57.077644 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:57Z","lastTransitionTime":"2025-12-02T22:43:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:57 crc kubenswrapper[4696]: I1202 22:43:57.089867 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:57 crc kubenswrapper[4696]: I1202 22:43:57.089965 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:57 crc kubenswrapper[4696]: I1202 22:43:57.089993 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:57 crc kubenswrapper[4696]: I1202 22:43:57.090024 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:57 crc kubenswrapper[4696]: I1202 22:43:57.090048 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:57Z","lastTransitionTime":"2025-12-02T22:43:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:57 crc kubenswrapper[4696]: E1202 22:43:57.106298 4696 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:43:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:43:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:43:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:43:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b680025d-da08-4b46-a4a4-b21ac19e4f7b\\\",\\\"systemUUID\\\":\\\"be4c1bf2-c508-4b46-be45-7efaea566193\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:57Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:57 crc kubenswrapper[4696]: I1202 22:43:57.112206 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:57 crc kubenswrapper[4696]: I1202 22:43:57.112262 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:57 crc kubenswrapper[4696]: I1202 22:43:57.112274 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:57 crc kubenswrapper[4696]: I1202 22:43:57.112297 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:57 crc kubenswrapper[4696]: I1202 22:43:57.112308 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:57Z","lastTransitionTime":"2025-12-02T22:43:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:57 crc kubenswrapper[4696]: E1202 22:43:57.127175 4696 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:43:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:43:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:43:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:43:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b680025d-da08-4b46-a4a4-b21ac19e4f7b\\\",\\\"systemUUID\\\":\\\"be4c1bf2-c508-4b46-be45-7efaea566193\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:57Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:57 crc kubenswrapper[4696]: I1202 22:43:57.131163 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:57 crc kubenswrapper[4696]: I1202 22:43:57.131201 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:57 crc kubenswrapper[4696]: I1202 22:43:57.131210 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:57 crc kubenswrapper[4696]: I1202 22:43:57.131228 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:57 crc kubenswrapper[4696]: I1202 22:43:57.131240 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:57Z","lastTransitionTime":"2025-12-02T22:43:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:57 crc kubenswrapper[4696]: E1202 22:43:57.145047 4696 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:43:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:43:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:43:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:43:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b680025d-da08-4b46-a4a4-b21ac19e4f7b\\\",\\\"systemUUID\\\":\\\"be4c1bf2-c508-4b46-be45-7efaea566193\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:57Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:57 crc kubenswrapper[4696]: I1202 22:43:57.148819 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:57 crc kubenswrapper[4696]: I1202 22:43:57.148860 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:57 crc kubenswrapper[4696]: I1202 22:43:57.148872 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:57 crc kubenswrapper[4696]: I1202 22:43:57.148895 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:57 crc kubenswrapper[4696]: I1202 22:43:57.148910 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:57Z","lastTransitionTime":"2025-12-02T22:43:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:57 crc kubenswrapper[4696]: E1202 22:43:57.159980 4696 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:43:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:43:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:43:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:43:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b680025d-da08-4b46-a4a4-b21ac19e4f7b\\\",\\\"systemUUID\\\":\\\"be4c1bf2-c508-4b46-be45-7efaea566193\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:57Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:57 crc kubenswrapper[4696]: I1202 22:43:57.163592 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:57 crc kubenswrapper[4696]: I1202 22:43:57.163636 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:57 crc kubenswrapper[4696]: I1202 22:43:57.163648 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:57 crc kubenswrapper[4696]: I1202 22:43:57.163667 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:57 crc kubenswrapper[4696]: I1202 22:43:57.163677 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:57Z","lastTransitionTime":"2025-12-02T22:43:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:57 crc kubenswrapper[4696]: E1202 22:43:57.175824 4696 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:43:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:43:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:43:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:43:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b680025d-da08-4b46-a4a4-b21ac19e4f7b\\\",\\\"systemUUID\\\":\\\"be4c1bf2-c508-4b46-be45-7efaea566193\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:57Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:57 crc kubenswrapper[4696]: E1202 22:43:57.175953 4696 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 02 22:43:57 crc kubenswrapper[4696]: I1202 22:43:57.180282 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:57 crc kubenswrapper[4696]: I1202 22:43:57.180322 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:57 crc kubenswrapper[4696]: I1202 22:43:57.180334 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:57 crc kubenswrapper[4696]: I1202 22:43:57.180352 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:57 crc kubenswrapper[4696]: I1202 22:43:57.180362 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:57Z","lastTransitionTime":"2025-12-02T22:43:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:57 crc kubenswrapper[4696]: I1202 22:43:57.282792 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:57 crc kubenswrapper[4696]: I1202 22:43:57.282867 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:57 crc kubenswrapper[4696]: I1202 22:43:57.282890 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:57 crc kubenswrapper[4696]: I1202 22:43:57.282921 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:57 crc kubenswrapper[4696]: I1202 22:43:57.282942 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:57Z","lastTransitionTime":"2025-12-02T22:43:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:57 crc kubenswrapper[4696]: I1202 22:43:57.385458 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:57 crc kubenswrapper[4696]: I1202 22:43:57.385526 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:57 crc kubenswrapper[4696]: I1202 22:43:57.385543 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:57 crc kubenswrapper[4696]: I1202 22:43:57.385571 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:57 crc kubenswrapper[4696]: I1202 22:43:57.385607 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:57Z","lastTransitionTime":"2025-12-02T22:43:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:57 crc kubenswrapper[4696]: I1202 22:43:57.450821 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"649a9c67-6124-4d3d-b4c4-d095f1181c27\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaa2f02ac937663ab0258468d91f90599f4d6c2d1f68e36a71c6f9424289562a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73634f725abb1c9e590f4fc31e5ea79d879b17daad24832385ceb37d9aa93224\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73634f725abb1c9e590f4fc31e5ea79d879b17daad24832385ceb37d9aa93224\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:57Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:57 crc kubenswrapper[4696]: I1202 22:43:57.466659 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"05940584-6baa-49e1-8959-610d50b6eeb5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b361514e74dd950d7c3e1da82304b7c75603829ce7bf03c1fbccab1607d65800\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67bf899df90f18f36310994af135552aa9758f0460117d62d00fdc1de067ce79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d15263c6c1f8208a998cbacdba376a2eb3b2e16134e8fc40f7d56d36daae5ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2fa27b4eeb8b50da5d49c905eb98070362e998c3e3e4aceac7cc09aaef5b5ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:57Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:57 crc kubenswrapper[4696]: I1202 22:43:57.488699 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sbjst" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a0d758e-6da6-4382-99f1-dd295b63eb98\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bfe4ce91e801752a3eae0d5cfdb5d0e7a4997952987a3a32c10c0292db170f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://495f007b6af2d409e3ed6cf70dfa90aec5a38c34361c9da9c4fd71d7732cdd76\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://495f007b6af2d409e3ed6cf70dfa90aec5a38c34361c9da9c4fd71d7732cdd76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5b3b6699e2f0b9a8d0f32c1f97c9b3e16d9f5479ad2f355633c25897dbe1bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5b3b6699e2f0b9a8d0f32c1f97c9b3e16d9f5479ad2f355633c25897dbe1bb1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:42Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d4a10437554cfb08f90a48d457ef157309ed0ab379c5b26deb62d3e8b73291b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d4a10437554cfb08f90a48d457ef157309ed0ab379c5b26deb62d3e8b73291b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76b52
b0824f65df456ead0ca134ada0a6c3120382a3cce18f0a5dfdcc98fa9ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76b52b0824f65df456ead0ca134ada0a6c3120382a3cce18f0a5dfdcc98fa9ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d38fb242bd536b1359095c21c4df58b5a4c64058cfe986b38a581ae0c40a29d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d38fb242bd536b1359095c21c4df58b5a4c64058cfe986b38a581ae0c40a29d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:45Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f17767f87136d0916230872e59379331bb9e7f88f3648115905c6c0172f257cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f17767f87136d0916230872e59379331bb9e7f88f3648115905c6c0172f257cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sbjst\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:57Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:57 crc kubenswrapper[4696]: I1202 22:43:57.488971 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:57 crc kubenswrapper[4696]: I1202 22:43:57.489035 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:57 crc kubenswrapper[4696]: I1202 22:43:57.489047 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:57 crc kubenswrapper[4696]: I1202 22:43:57.489069 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:57 crc kubenswrapper[4696]: I1202 22:43:57.489081 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:57Z","lastTransitionTime":"2025-12-02T22:43:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:57 crc kubenswrapper[4696]: I1202 22:43:57.502734 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-chq65" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53353260-c7c9-435c-91eb-3d5a1b441c4a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2d1f154b9c9139bf991adf86bd99e7ff14510c24ae93b87c1c184437a087f17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7tfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa016430a0a36d0628e0e5b0c4baf2fa724c888116e5d70517c8ffc4be1c37a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7tfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-chq65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:57Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:57 crc kubenswrapper[4696]: I1202 22:43:57.513862 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rgk7n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07db09ff-2489-4357-af38-aca9655ac1d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9928cbc6ed1d9bb5d0280a8d3264d4490c601fc7d09d6f293f1403615cc4cde5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fdd8x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rgk7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:57Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:57 crc kubenswrapper[4696]: I1202 22:43:57.532399 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1db9cf23af7fc9ff1373908e11df0bea7d6e03cd3ed18947cc583fc09a79a05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:57Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:57 crc kubenswrapper[4696]: I1202 22:43:57.550983 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce2fa3867298545942c4a2d2e68899c61a4f99a64b0974e75ef86851074ddfce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4db552c4bc9b395f977cb4c1b1b7fcbb1c7e9f326d81a7903d88eab180bcfb3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:57Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:57 crc kubenswrapper[4696]: I1202 22:43:57.568421 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27fb05354e4e9a6ad09946fad7dbabf07456fe56a9a66578fda0eb71dde2d199\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T22:43:57Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:57 crc kubenswrapper[4696]: I1202 22:43:57.584133 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wthxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86a37d2a-37c5-4fbd-b10b-f5e4706772f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32ae69a2c1cd350e1b62b8453e9d286e105623f4b9ea8ff27ce8c9d37b055e4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5ddd341a3a0b7b1d959ce57bb4d38153a4ca3ae5ec40ded309e0697db2be28d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T22:43:26Z\\\",\\\"message\\\":\\\"2025-12-02T22:42:41+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_c372d58b-05eb-4944-ba5c-2ee7d55c7d09\\\\n2025-12-02T22:42:41+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_c372d58b-05eb-4944-ba5c-2ee7d55c7d09 to /host/opt/cni/bin/\\\\n2025-12-02T22:42:41Z [verbose] multus-daemon started\\\\n2025-12-02T22:42:41Z [verbose] Readiness Indicator file check\\\\n2025-12-02T22:43:26Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:40Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:43:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xqg5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wthxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:57Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:57 crc kubenswrapper[4696]: I1202 22:43:57.591506 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:57 crc kubenswrapper[4696]: I1202 22:43:57.591551 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:57 crc kubenswrapper[4696]: I1202 22:43:57.591560 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:57 crc kubenswrapper[4696]: I1202 22:43:57.591576 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:57 crc kubenswrapper[4696]: I1202 22:43:57.591588 4696 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:57Z","lastTransitionTime":"2025-12-02T22:43:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:43:57 crc kubenswrapper[4696]: I1202 22:43:57.600313 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ttxcl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca48946e-a7e0-4729-8b02-b223a96990c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55cb8b25a5af0f7f16b7a91ab00cbe24ed8a8b81477e992de74699237caf7dab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
e-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkjwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e10c7f45d8003857b8a736f70eb3125f01996bacbcace4ad08bc703fbdeb62f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkjwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ttxcl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:57Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:57 crc kubenswrapper[4696]: I1202 22:43:57.621780 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3172221-5c4c-4409-b50f-5cd21d8b5030\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f25e7ebcca1d904cd0940a660fd97c6b4a3540fd63700abd0ad973e9f9c8f764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-p
od-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08fc65f2c8162f27bc40cd12b5801ca21c778313271f2cb5dd251ca4312842d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba390cca9997cf469ae88a4895532df556f0b947ce7b34ef7e1992ee236a2e16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0ccd6eaff0fa7ac496aadc31afffd6e7fee2776e9cfd1652af85fcec268d895\\\",\\\"image\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b2deed2650220f1d607cbb667b26879d3a8373146244feb486c882032f344b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a565cb3800c1a50f784c327719158de2f6f29d20a33670478b97c1c360e0f6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a565cb3800c1a50f784c327719158de2f6f29d20a33670478b97c1c360e0f6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29a2b9cc76fc45b6874d85b2ab92165a02804807837501ac60190c232bab928d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29a2b9cc76fc45b6874d85b2ab92165a02804807837501ac60190c232bab928d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fb3311ccf2a924605464d657193627798dad26a232fab0a2091fb2c387f3bd0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb3311ccf2a924605464d657193627798d
ad26a232fab0a2091fb2c387f3bd0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:57Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:57 crc kubenswrapper[4696]: I1202 22:43:57.637150 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"158d80a5-8d90-46bb-83e7-ab3263f6d28f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3112751f86a62634d02f281d467d3eeb27878e27c245871210cca37eaeb6e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce744ea765d5bbcb0a9926f439a0256774c16863344f23ddfb0074b4ccbdb4ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1108d710330b056e2228205b2012df070ffbd660701ddd577eafbf0e99c3bf9f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d92ce9d53d9367503c76c5fbb45f340a5ddd93f3c8e1738ac434fe7b0210d1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6268352b995d99d383ab500b5ccd80b0612cf33716a752aeb1d92cb775a1971e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T22:42:38Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW1202 22:42:37.965183 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 22:42:37.965319 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 22:42:37.967400 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2369937012/tls.crt::/tmp/serving-cert-2369937012/tls.key\\\\\\\"\\\\nI1202 22:42:38.329833 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 22:42:38.331704 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 22:42:38.331724 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 22:42:38.331766 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 22:42:38.331782 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 22:42:38.336692 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 22:42:38.336734 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:42:38.336761 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:42:38.336769 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 22:42:38.336775 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 22:42:38.336779 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 22:42:38.336783 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 22:42:38.337216 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1202 22:42:38.338194 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://666f81c81a7cad67c01b2aebf2d05f162c33feea27f64c12f92761d6d2a6056e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055bef75732eec4908f0baf8a52cd3921fab03cf88e36d2e0d442344fb1af0c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://055bef75732eec4908f0baf8a52cd3921
fab03cf88e36d2e0d442344fb1af0c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:57Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:57 crc kubenswrapper[4696]: I1202 22:43:57.649182 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"102b6bc9-3f5d-462f-9814-67699eea05ab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:43:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6abcdac2e562f88dfe0b9c02e2a2efb68c3b4d5438abfa08606b27dde0da76b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95c2f64bb4118ce11d7f91dd949ab6e377708029cb0b3c93bd650ac70ea2b650\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e41b58b3df7b93de109c3904416a2943c8dca8cc7e9edc628333d4da461412d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7321d8133c89c6a58d6e42e8df0ba1e4bbdf9cc06fd01054a25afd64934d855a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://7321d8133c89c6a58d6e42e8df0ba1e4bbdf9cc06fd01054a25afd64934d855a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:18Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:17Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:57Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:57 crc kubenswrapper[4696]: I1202 22:43:57.663261 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:57Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:57 crc kubenswrapper[4696]: I1202 22:43:57.672632 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f57qk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf8ea236-9be9-4eb2-904a-103c4c279f28\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3106b29a0eb0a17827fd331aefd6c00b0c58f4ec0d0ca778bd3f4842730d5f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vxprw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f57qk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:57Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:57 crc kubenswrapper[4696]: I1202 22:43:57.685131 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:57Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:57 crc kubenswrapper[4696]: I1202 22:43:57.694440 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:57 crc kubenswrapper[4696]: I1202 22:43:57.694545 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:57 crc kubenswrapper[4696]: I1202 22:43:57.694568 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:57 crc kubenswrapper[4696]: I1202 22:43:57.694601 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:57 crc kubenswrapper[4696]: I1202 22:43:57.694620 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:57Z","lastTransitionTime":"2025-12-02T22:43:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:43:57 crc kubenswrapper[4696]: I1202 22:43:57.695988 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:57Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:57 crc kubenswrapper[4696]: I1202 22:43:57.723945 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36feb7e9f6a6bad97e87fe54ff9b661be62f6f4962352d55bcc905b80320a48d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85fbce2a7ea0550f53c1216216022e60a4e2eae0bb1d5c88254ace025fbd4503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64b0c2025c377ab19b3853017c94cb501f472f2485093afc60749f7739ba25d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22ce112f515d4ef59dca4c21bdc8623e88a8a0df222fb2ed0990a2f68e3f75f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52f3d5529aafac7c27c963a50b0318e99d9a9ca35e142d4c273431105c495e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32717a82edb718f5e690fcecd578f48279d134a5dc12f00824fa25f7ec727156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c89495285a741bfa9e535eb22f1110da359462742bc321664ea98562028cf21d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c89495285a741bfa9e535eb22f1110da359462742bc321664ea98562028cf21d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T22:43:46Z\\\",\\\"message\\\":\\\"workPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1202 22:43:46.218920 6764 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 22:43:46.219289 6764 reflector.go:311] Stopping reflector *v1.Namespace (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI1202 22:43:46.219369 6764 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1202 22:43:46.219579 6764 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 22:43:46.219908 6764 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1202 22:43:46.220088 6764 handler.go:208] Removed *v1.Node event handler 2\\\\nI1202 22:43:46.220503 6764 factory.go:656] Stopping watch factory\\\\nI1202 22:43:46.246291 6764 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI1202 22:43:46.246326 6764 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI1202 22:43:46.246442 6764 ovnkube.go:599] Stopped ovnkube\\\\nI1202 22:43:46.246506 6764 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1202 22:43:46.246661 6764 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:43:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qb2zq_openshift-ovn-kubernetes(c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23b80358b654eec7dc1057ef35a6c09f88ccd7aca7a05336a3f29e7036bd55f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:42:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://492cfa3fb8646c09da06d92ec8a303562da558d4d944eaf68ca95de6f5b94da8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://492cfa3fb8646c09da
06d92ec8a303562da558d4d944eaf68ca95de6f5b94da8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:42:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:42:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dk6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qb2zq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:57Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:57 crc kubenswrapper[4696]: I1202 22:43:57.740928 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q9bfc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad00195c-ef4c-4d9b-941c-d01ebc498593\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:54Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:42:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zz6sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zz6sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:42:54Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q9bfc\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:43:57Z is after 2025-08-24T17:21:41Z" Dec 02 22:43:57 crc kubenswrapper[4696]: I1202 22:43:57.798464 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:57 crc kubenswrapper[4696]: I1202 22:43:57.798549 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:57 crc kubenswrapper[4696]: I1202 22:43:57.798577 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:57 crc kubenswrapper[4696]: I1202 22:43:57.798612 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:57 crc kubenswrapper[4696]: I1202 22:43:57.798638 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:57Z","lastTransitionTime":"2025-12-02T22:43:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:57 crc kubenswrapper[4696]: I1202 22:43:57.901028 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:57 crc kubenswrapper[4696]: I1202 22:43:57.901094 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:57 crc kubenswrapper[4696]: I1202 22:43:57.901110 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:57 crc kubenswrapper[4696]: I1202 22:43:57.901133 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:57 crc kubenswrapper[4696]: I1202 22:43:57.901149 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:57Z","lastTransitionTime":"2025-12-02T22:43:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:58 crc kubenswrapper[4696]: I1202 22:43:58.005339 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:58 crc kubenswrapper[4696]: I1202 22:43:58.005408 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:58 crc kubenswrapper[4696]: I1202 22:43:58.005421 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:58 crc kubenswrapper[4696]: I1202 22:43:58.005440 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:58 crc kubenswrapper[4696]: I1202 22:43:58.005451 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:58Z","lastTransitionTime":"2025-12-02T22:43:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:58 crc kubenswrapper[4696]: I1202 22:43:58.107497 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:58 crc kubenswrapper[4696]: I1202 22:43:58.107542 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:58 crc kubenswrapper[4696]: I1202 22:43:58.107552 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:58 crc kubenswrapper[4696]: I1202 22:43:58.107569 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:58 crc kubenswrapper[4696]: I1202 22:43:58.107581 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:58Z","lastTransitionTime":"2025-12-02T22:43:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:58 crc kubenswrapper[4696]: I1202 22:43:58.210325 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:58 crc kubenswrapper[4696]: I1202 22:43:58.210823 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:58 crc kubenswrapper[4696]: I1202 22:43:58.210836 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:58 crc kubenswrapper[4696]: I1202 22:43:58.210857 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:58 crc kubenswrapper[4696]: I1202 22:43:58.210870 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:58Z","lastTransitionTime":"2025-12-02T22:43:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:58 crc kubenswrapper[4696]: I1202 22:43:58.313626 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:58 crc kubenswrapper[4696]: I1202 22:43:58.313718 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:58 crc kubenswrapper[4696]: I1202 22:43:58.313793 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:58 crc kubenswrapper[4696]: I1202 22:43:58.313832 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:58 crc kubenswrapper[4696]: I1202 22:43:58.313858 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:58Z","lastTransitionTime":"2025-12-02T22:43:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:58 crc kubenswrapper[4696]: I1202 22:43:58.417638 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:58 crc kubenswrapper[4696]: I1202 22:43:58.417723 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:58 crc kubenswrapper[4696]: I1202 22:43:58.417808 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:58 crc kubenswrapper[4696]: I1202 22:43:58.417848 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:58 crc kubenswrapper[4696]: I1202 22:43:58.417873 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:58Z","lastTransitionTime":"2025-12-02T22:43:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:43:58 crc kubenswrapper[4696]: I1202 22:43:58.431471 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q9bfc" Dec 02 22:43:58 crc kubenswrapper[4696]: I1202 22:43:58.431534 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:43:58 crc kubenswrapper[4696]: I1202 22:43:58.431552 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:43:58 crc kubenswrapper[4696]: I1202 22:43:58.431575 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:43:58 crc kubenswrapper[4696]: E1202 22:43:58.431630 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q9bfc" podUID="ad00195c-ef4c-4d9b-941c-d01ebc498593" Dec 02 22:43:58 crc kubenswrapper[4696]: E1202 22:43:58.431830 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 22:43:58 crc kubenswrapper[4696]: E1202 22:43:58.431983 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 22:43:58 crc kubenswrapper[4696]: E1202 22:43:58.432099 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 22:43:58 crc kubenswrapper[4696]: I1202 22:43:58.520666 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:58 crc kubenswrapper[4696]: I1202 22:43:58.520731 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:58 crc kubenswrapper[4696]: I1202 22:43:58.520786 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:58 crc kubenswrapper[4696]: I1202 22:43:58.520820 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:58 crc kubenswrapper[4696]: I1202 22:43:58.520840 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:58Z","lastTransitionTime":"2025-12-02T22:43:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:58 crc kubenswrapper[4696]: I1202 22:43:58.550591 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ad00195c-ef4c-4d9b-941c-d01ebc498593-metrics-certs\") pod \"network-metrics-daemon-q9bfc\" (UID: \"ad00195c-ef4c-4d9b-941c-d01ebc498593\") " pod="openshift-multus/network-metrics-daemon-q9bfc" Dec 02 22:43:58 crc kubenswrapper[4696]: E1202 22:43:58.550789 4696 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 02 22:43:58 crc kubenswrapper[4696]: E1202 22:43:58.550865 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ad00195c-ef4c-4d9b-941c-d01ebc498593-metrics-certs podName:ad00195c-ef4c-4d9b-941c-d01ebc498593 nodeName:}" failed. No retries permitted until 2025-12-02 22:45:02.550846796 +0000 UTC m=+165.431526807 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ad00195c-ef4c-4d9b-941c-d01ebc498593-metrics-certs") pod "network-metrics-daemon-q9bfc" (UID: "ad00195c-ef4c-4d9b-941c-d01ebc498593") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 02 22:43:58 crc kubenswrapper[4696]: I1202 22:43:58.625648 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:58 crc kubenswrapper[4696]: I1202 22:43:58.625807 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:58 crc kubenswrapper[4696]: I1202 22:43:58.625832 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:58 crc kubenswrapper[4696]: I1202 22:43:58.625865 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:58 crc kubenswrapper[4696]: I1202 22:43:58.625886 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:58Z","lastTransitionTime":"2025-12-02T22:43:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:58 crc kubenswrapper[4696]: I1202 22:43:58.729572 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:58 crc kubenswrapper[4696]: I1202 22:43:58.729663 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:58 crc kubenswrapper[4696]: I1202 22:43:58.729683 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:58 crc kubenswrapper[4696]: I1202 22:43:58.729709 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:58 crc kubenswrapper[4696]: I1202 22:43:58.729727 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:58Z","lastTransitionTime":"2025-12-02T22:43:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:58 crc kubenswrapper[4696]: I1202 22:43:58.832383 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:58 crc kubenswrapper[4696]: I1202 22:43:58.832427 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:58 crc kubenswrapper[4696]: I1202 22:43:58.832440 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:58 crc kubenswrapper[4696]: I1202 22:43:58.832463 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:58 crc kubenswrapper[4696]: I1202 22:43:58.832476 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:58Z","lastTransitionTime":"2025-12-02T22:43:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:58 crc kubenswrapper[4696]: I1202 22:43:58.935181 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:58 crc kubenswrapper[4696]: I1202 22:43:58.935229 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:58 crc kubenswrapper[4696]: I1202 22:43:58.935244 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:58 crc kubenswrapper[4696]: I1202 22:43:58.935269 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:58 crc kubenswrapper[4696]: I1202 22:43:58.935283 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:58Z","lastTransitionTime":"2025-12-02T22:43:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:59 crc kubenswrapper[4696]: I1202 22:43:59.038404 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:59 crc kubenswrapper[4696]: I1202 22:43:59.038467 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:59 crc kubenswrapper[4696]: I1202 22:43:59.038478 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:59 crc kubenswrapper[4696]: I1202 22:43:59.038494 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:59 crc kubenswrapper[4696]: I1202 22:43:59.038510 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:59Z","lastTransitionTime":"2025-12-02T22:43:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:59 crc kubenswrapper[4696]: I1202 22:43:59.143012 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:59 crc kubenswrapper[4696]: I1202 22:43:59.143087 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:59 crc kubenswrapper[4696]: I1202 22:43:59.143105 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:59 crc kubenswrapper[4696]: I1202 22:43:59.143136 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:59 crc kubenswrapper[4696]: I1202 22:43:59.143156 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:59Z","lastTransitionTime":"2025-12-02T22:43:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:59 crc kubenswrapper[4696]: I1202 22:43:59.245891 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:59 crc kubenswrapper[4696]: I1202 22:43:59.245969 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:59 crc kubenswrapper[4696]: I1202 22:43:59.245988 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:59 crc kubenswrapper[4696]: I1202 22:43:59.246017 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:59 crc kubenswrapper[4696]: I1202 22:43:59.246036 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:59Z","lastTransitionTime":"2025-12-02T22:43:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:59 crc kubenswrapper[4696]: I1202 22:43:59.348661 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:59 crc kubenswrapper[4696]: I1202 22:43:59.348725 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:59 crc kubenswrapper[4696]: I1202 22:43:59.348757 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:59 crc kubenswrapper[4696]: I1202 22:43:59.348780 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:59 crc kubenswrapper[4696]: I1202 22:43:59.348795 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:59Z","lastTransitionTime":"2025-12-02T22:43:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:59 crc kubenswrapper[4696]: I1202 22:43:59.451877 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:59 crc kubenswrapper[4696]: I1202 22:43:59.451949 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:59 crc kubenswrapper[4696]: I1202 22:43:59.451966 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:59 crc kubenswrapper[4696]: I1202 22:43:59.451988 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:59 crc kubenswrapper[4696]: I1202 22:43:59.452005 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:59Z","lastTransitionTime":"2025-12-02T22:43:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:59 crc kubenswrapper[4696]: I1202 22:43:59.556195 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:59 crc kubenswrapper[4696]: I1202 22:43:59.556261 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:59 crc kubenswrapper[4696]: I1202 22:43:59.556279 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:59 crc kubenswrapper[4696]: I1202 22:43:59.556305 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:59 crc kubenswrapper[4696]: I1202 22:43:59.556323 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:59Z","lastTransitionTime":"2025-12-02T22:43:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:59 crc kubenswrapper[4696]: I1202 22:43:59.659585 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:59 crc kubenswrapper[4696]: I1202 22:43:59.659639 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:59 crc kubenswrapper[4696]: I1202 22:43:59.659714 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:59 crc kubenswrapper[4696]: I1202 22:43:59.659772 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:59 crc kubenswrapper[4696]: I1202 22:43:59.659793 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:59Z","lastTransitionTime":"2025-12-02T22:43:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:59 crc kubenswrapper[4696]: I1202 22:43:59.763716 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:59 crc kubenswrapper[4696]: I1202 22:43:59.763821 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:59 crc kubenswrapper[4696]: I1202 22:43:59.763847 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:59 crc kubenswrapper[4696]: I1202 22:43:59.763889 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:59 crc kubenswrapper[4696]: I1202 22:43:59.763913 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:59Z","lastTransitionTime":"2025-12-02T22:43:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:59 crc kubenswrapper[4696]: I1202 22:43:59.867707 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:59 crc kubenswrapper[4696]: I1202 22:43:59.867817 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:59 crc kubenswrapper[4696]: I1202 22:43:59.867835 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:59 crc kubenswrapper[4696]: I1202 22:43:59.867866 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:59 crc kubenswrapper[4696]: I1202 22:43:59.867886 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:59Z","lastTransitionTime":"2025-12-02T22:43:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:43:59 crc kubenswrapper[4696]: I1202 22:43:59.971057 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:43:59 crc kubenswrapper[4696]: I1202 22:43:59.971123 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:43:59 crc kubenswrapper[4696]: I1202 22:43:59.971137 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:43:59 crc kubenswrapper[4696]: I1202 22:43:59.971159 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:43:59 crc kubenswrapper[4696]: I1202 22:43:59.971174 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:43:59Z","lastTransitionTime":"2025-12-02T22:43:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:44:00 crc kubenswrapper[4696]: I1202 22:44:00.073586 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:44:00 crc kubenswrapper[4696]: I1202 22:44:00.073635 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:44:00 crc kubenswrapper[4696]: I1202 22:44:00.073649 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:44:00 crc kubenswrapper[4696]: I1202 22:44:00.073667 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:44:00 crc kubenswrapper[4696]: I1202 22:44:00.073680 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:44:00Z","lastTransitionTime":"2025-12-02T22:44:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:44:00 crc kubenswrapper[4696]: I1202 22:44:00.176841 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:44:00 crc kubenswrapper[4696]: I1202 22:44:00.176892 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:44:00 crc kubenswrapper[4696]: I1202 22:44:00.176909 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:44:00 crc kubenswrapper[4696]: I1202 22:44:00.176929 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:44:00 crc kubenswrapper[4696]: I1202 22:44:00.176940 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:44:00Z","lastTransitionTime":"2025-12-02T22:44:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:44:00 crc kubenswrapper[4696]: I1202 22:44:00.279302 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:44:00 crc kubenswrapper[4696]: I1202 22:44:00.279380 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:44:00 crc kubenswrapper[4696]: I1202 22:44:00.279398 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:44:00 crc kubenswrapper[4696]: I1202 22:44:00.279422 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:44:00 crc kubenswrapper[4696]: I1202 22:44:00.279439 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:44:00Z","lastTransitionTime":"2025-12-02T22:44:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:44:00 crc kubenswrapper[4696]: I1202 22:44:00.383012 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:44:00 crc kubenswrapper[4696]: I1202 22:44:00.383076 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:44:00 crc kubenswrapper[4696]: I1202 22:44:00.383086 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:44:00 crc kubenswrapper[4696]: I1202 22:44:00.383104 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:44:00 crc kubenswrapper[4696]: I1202 22:44:00.383115 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:44:00Z","lastTransitionTime":"2025-12-02T22:44:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:44:00 crc kubenswrapper[4696]: I1202 22:44:00.431460 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q9bfc" Dec 02 22:44:00 crc kubenswrapper[4696]: I1202 22:44:00.431481 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:44:00 crc kubenswrapper[4696]: E1202 22:44:00.431723 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-q9bfc" podUID="ad00195c-ef4c-4d9b-941c-d01ebc498593" Dec 02 22:44:00 crc kubenswrapper[4696]: I1202 22:44:00.431482 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:44:00 crc kubenswrapper[4696]: E1202 22:44:00.431838 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 22:44:00 crc kubenswrapper[4696]: I1202 22:44:00.431490 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:44:00 crc kubenswrapper[4696]: E1202 22:44:00.431997 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 22:44:00 crc kubenswrapper[4696]: E1202 22:44:00.432077 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 22:44:00 crc kubenswrapper[4696]: I1202 22:44:00.486302 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:44:00 crc kubenswrapper[4696]: I1202 22:44:00.486348 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:44:00 crc kubenswrapper[4696]: I1202 22:44:00.486362 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:44:00 crc kubenswrapper[4696]: I1202 22:44:00.486383 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:44:00 crc kubenswrapper[4696]: I1202 22:44:00.486396 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:44:00Z","lastTransitionTime":"2025-12-02T22:44:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:44:00 crc kubenswrapper[4696]: I1202 22:44:00.589652 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:44:00 crc kubenswrapper[4696]: I1202 22:44:00.589720 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:44:00 crc kubenswrapper[4696]: I1202 22:44:00.589790 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:44:00 crc kubenswrapper[4696]: I1202 22:44:00.589829 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:44:00 crc kubenswrapper[4696]: I1202 22:44:00.589850 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:44:00Z","lastTransitionTime":"2025-12-02T22:44:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:44:00 crc kubenswrapper[4696]: I1202 22:44:00.693718 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:44:00 crc kubenswrapper[4696]: I1202 22:44:00.694041 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:44:00 crc kubenswrapper[4696]: I1202 22:44:00.694097 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:44:00 crc kubenswrapper[4696]: I1202 22:44:00.694124 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:44:00 crc kubenswrapper[4696]: I1202 22:44:00.694138 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:44:00Z","lastTransitionTime":"2025-12-02T22:44:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:44:00 crc kubenswrapper[4696]: I1202 22:44:00.797410 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:44:00 crc kubenswrapper[4696]: I1202 22:44:00.797931 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:44:00 crc kubenswrapper[4696]: I1202 22:44:00.798133 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:44:00 crc kubenswrapper[4696]: I1202 22:44:00.798331 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:44:00 crc kubenswrapper[4696]: I1202 22:44:00.798531 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:44:00Z","lastTransitionTime":"2025-12-02T22:44:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:44:00 crc kubenswrapper[4696]: I1202 22:44:00.902505 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:44:00 crc kubenswrapper[4696]: I1202 22:44:00.902576 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:44:00 crc kubenswrapper[4696]: I1202 22:44:00.902596 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:44:00 crc kubenswrapper[4696]: I1202 22:44:00.902624 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:44:00 crc kubenswrapper[4696]: I1202 22:44:00.902643 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:44:00Z","lastTransitionTime":"2025-12-02T22:44:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:44:01 crc kubenswrapper[4696]: I1202 22:44:01.006466 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:44:01 crc kubenswrapper[4696]: I1202 22:44:01.006566 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:44:01 crc kubenswrapper[4696]: I1202 22:44:01.006595 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:44:01 crc kubenswrapper[4696]: I1202 22:44:01.006638 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:44:01 crc kubenswrapper[4696]: I1202 22:44:01.006666 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:44:01Z","lastTransitionTime":"2025-12-02T22:44:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:44:01 crc kubenswrapper[4696]: I1202 22:44:01.109401 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:44:01 crc kubenswrapper[4696]: I1202 22:44:01.109464 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:44:01 crc kubenswrapper[4696]: I1202 22:44:01.109482 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:44:01 crc kubenswrapper[4696]: I1202 22:44:01.109509 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:44:01 crc kubenswrapper[4696]: I1202 22:44:01.109527 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:44:01Z","lastTransitionTime":"2025-12-02T22:44:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:44:01 crc kubenswrapper[4696]: I1202 22:44:01.212455 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:44:01 crc kubenswrapper[4696]: I1202 22:44:01.212493 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:44:01 crc kubenswrapper[4696]: I1202 22:44:01.212503 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:44:01 crc kubenswrapper[4696]: I1202 22:44:01.212519 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:44:01 crc kubenswrapper[4696]: I1202 22:44:01.212529 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:44:01Z","lastTransitionTime":"2025-12-02T22:44:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:44:01 crc kubenswrapper[4696]: I1202 22:44:01.315294 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:44:01 crc kubenswrapper[4696]: I1202 22:44:01.315346 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:44:01 crc kubenswrapper[4696]: I1202 22:44:01.315358 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:44:01 crc kubenswrapper[4696]: I1202 22:44:01.315378 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:44:01 crc kubenswrapper[4696]: I1202 22:44:01.315390 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:44:01Z","lastTransitionTime":"2025-12-02T22:44:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:44:01 crc kubenswrapper[4696]: I1202 22:44:01.418112 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:44:01 crc kubenswrapper[4696]: I1202 22:44:01.418169 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:44:01 crc kubenswrapper[4696]: I1202 22:44:01.418184 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:44:01 crc kubenswrapper[4696]: I1202 22:44:01.418204 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:44:01 crc kubenswrapper[4696]: I1202 22:44:01.418220 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:44:01Z","lastTransitionTime":"2025-12-02T22:44:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:44:01 crc kubenswrapper[4696]: I1202 22:44:01.520665 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:44:01 crc kubenswrapper[4696]: I1202 22:44:01.520717 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:44:01 crc kubenswrapper[4696]: I1202 22:44:01.520729 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:44:01 crc kubenswrapper[4696]: I1202 22:44:01.520763 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:44:01 crc kubenswrapper[4696]: I1202 22:44:01.520774 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:44:01Z","lastTransitionTime":"2025-12-02T22:44:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:44:01 crc kubenswrapper[4696]: I1202 22:44:01.624105 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:44:01 crc kubenswrapper[4696]: I1202 22:44:01.624177 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:44:01 crc kubenswrapper[4696]: I1202 22:44:01.624197 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:44:01 crc kubenswrapper[4696]: I1202 22:44:01.624228 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:44:01 crc kubenswrapper[4696]: I1202 22:44:01.624253 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:44:01Z","lastTransitionTime":"2025-12-02T22:44:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:44:01 crc kubenswrapper[4696]: I1202 22:44:01.727145 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:44:01 crc kubenswrapper[4696]: I1202 22:44:01.727210 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:44:01 crc kubenswrapper[4696]: I1202 22:44:01.727228 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:44:01 crc kubenswrapper[4696]: I1202 22:44:01.727254 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:44:01 crc kubenswrapper[4696]: I1202 22:44:01.727272 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:44:01Z","lastTransitionTime":"2025-12-02T22:44:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:44:01 crc kubenswrapper[4696]: I1202 22:44:01.830313 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:44:01 crc kubenswrapper[4696]: I1202 22:44:01.830376 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:44:01 crc kubenswrapper[4696]: I1202 22:44:01.830392 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:44:01 crc kubenswrapper[4696]: I1202 22:44:01.830413 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:44:01 crc kubenswrapper[4696]: I1202 22:44:01.830426 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:44:01Z","lastTransitionTime":"2025-12-02T22:44:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:44:01 crc kubenswrapper[4696]: I1202 22:44:01.933245 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:44:01 crc kubenswrapper[4696]: I1202 22:44:01.933296 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:44:01 crc kubenswrapper[4696]: I1202 22:44:01.933306 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:44:01 crc kubenswrapper[4696]: I1202 22:44:01.933368 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:44:01 crc kubenswrapper[4696]: I1202 22:44:01.933383 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:44:01Z","lastTransitionTime":"2025-12-02T22:44:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:44:02 crc kubenswrapper[4696]: I1202 22:44:02.037324 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:44:02 crc kubenswrapper[4696]: I1202 22:44:02.037397 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:44:02 crc kubenswrapper[4696]: I1202 22:44:02.037418 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:44:02 crc kubenswrapper[4696]: I1202 22:44:02.037448 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:44:02 crc kubenswrapper[4696]: I1202 22:44:02.037469 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:44:02Z","lastTransitionTime":"2025-12-02T22:44:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:44:02 crc kubenswrapper[4696]: I1202 22:44:02.140868 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:44:02 crc kubenswrapper[4696]: I1202 22:44:02.140939 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:44:02 crc kubenswrapper[4696]: I1202 22:44:02.140953 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:44:02 crc kubenswrapper[4696]: I1202 22:44:02.140976 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:44:02 crc kubenswrapper[4696]: I1202 22:44:02.140988 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:44:02Z","lastTransitionTime":"2025-12-02T22:44:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:44:02 crc kubenswrapper[4696]: I1202 22:44:02.243413 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:44:02 crc kubenswrapper[4696]: I1202 22:44:02.243456 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:44:02 crc kubenswrapper[4696]: I1202 22:44:02.243468 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:44:02 crc kubenswrapper[4696]: I1202 22:44:02.243485 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:44:02 crc kubenswrapper[4696]: I1202 22:44:02.243500 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:44:02Z","lastTransitionTime":"2025-12-02T22:44:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:44:02 crc kubenswrapper[4696]: I1202 22:44:02.346693 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:44:02 crc kubenswrapper[4696]: I1202 22:44:02.346764 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:44:02 crc kubenswrapper[4696]: I1202 22:44:02.346782 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:44:02 crc kubenswrapper[4696]: I1202 22:44:02.346802 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:44:02 crc kubenswrapper[4696]: I1202 22:44:02.346817 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:44:02Z","lastTransitionTime":"2025-12-02T22:44:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:44:02 crc kubenswrapper[4696]: I1202 22:44:02.431823 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:44:02 crc kubenswrapper[4696]: I1202 22:44:02.431939 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q9bfc" Dec 02 22:44:02 crc kubenswrapper[4696]: I1202 22:44:02.431847 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:44:02 crc kubenswrapper[4696]: E1202 22:44:02.432034 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 22:44:02 crc kubenswrapper[4696]: I1202 22:44:02.431948 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:44:02 crc kubenswrapper[4696]: E1202 22:44:02.432360 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q9bfc" podUID="ad00195c-ef4c-4d9b-941c-d01ebc498593" Dec 02 22:44:02 crc kubenswrapper[4696]: E1202 22:44:02.432592 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 22:44:02 crc kubenswrapper[4696]: E1202 22:44:02.432704 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 22:44:02 crc kubenswrapper[4696]: I1202 22:44:02.450173 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:44:02 crc kubenswrapper[4696]: I1202 22:44:02.450577 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:44:02 crc kubenswrapper[4696]: I1202 22:44:02.450708 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:44:02 crc kubenswrapper[4696]: I1202 22:44:02.450843 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:44:02 crc kubenswrapper[4696]: I1202 22:44:02.450940 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:44:02Z","lastTransitionTime":"2025-12-02T22:44:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:44:02 crc kubenswrapper[4696]: I1202 22:44:02.554634 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:44:02 crc kubenswrapper[4696]: I1202 22:44:02.554683 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:44:02 crc kubenswrapper[4696]: I1202 22:44:02.554695 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:44:02 crc kubenswrapper[4696]: I1202 22:44:02.554713 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:44:02 crc kubenswrapper[4696]: I1202 22:44:02.554727 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:44:02Z","lastTransitionTime":"2025-12-02T22:44:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:44:02 crc kubenswrapper[4696]: I1202 22:44:02.658310 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:44:02 crc kubenswrapper[4696]: I1202 22:44:02.658368 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:44:02 crc kubenswrapper[4696]: I1202 22:44:02.658390 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:44:02 crc kubenswrapper[4696]: I1202 22:44:02.658419 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:44:02 crc kubenswrapper[4696]: I1202 22:44:02.658440 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:44:02Z","lastTransitionTime":"2025-12-02T22:44:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:44:02 crc kubenswrapper[4696]: I1202 22:44:02.761708 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:44:02 crc kubenswrapper[4696]: I1202 22:44:02.761790 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:44:02 crc kubenswrapper[4696]: I1202 22:44:02.761809 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:44:02 crc kubenswrapper[4696]: I1202 22:44:02.761833 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:44:02 crc kubenswrapper[4696]: I1202 22:44:02.761850 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:44:02Z","lastTransitionTime":"2025-12-02T22:44:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:44:02 crc kubenswrapper[4696]: I1202 22:44:02.865606 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:44:02 crc kubenswrapper[4696]: I1202 22:44:02.865669 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:44:02 crc kubenswrapper[4696]: I1202 22:44:02.865686 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:44:02 crc kubenswrapper[4696]: I1202 22:44:02.865712 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:44:02 crc kubenswrapper[4696]: I1202 22:44:02.865730 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:44:02Z","lastTransitionTime":"2025-12-02T22:44:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:44:02 crc kubenswrapper[4696]: I1202 22:44:02.968890 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:44:02 crc kubenswrapper[4696]: I1202 22:44:02.968927 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:44:02 crc kubenswrapper[4696]: I1202 22:44:02.968939 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:44:02 crc kubenswrapper[4696]: I1202 22:44:02.968961 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:44:02 crc kubenswrapper[4696]: I1202 22:44:02.968975 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:44:02Z","lastTransitionTime":"2025-12-02T22:44:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:44:03 crc kubenswrapper[4696]: I1202 22:44:03.072953 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:44:03 crc kubenswrapper[4696]: I1202 22:44:03.073030 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:44:03 crc kubenswrapper[4696]: I1202 22:44:03.073056 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:44:03 crc kubenswrapper[4696]: I1202 22:44:03.073095 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:44:03 crc kubenswrapper[4696]: I1202 22:44:03.073121 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:44:03Z","lastTransitionTime":"2025-12-02T22:44:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:44:03 crc kubenswrapper[4696]: I1202 22:44:03.176681 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:44:03 crc kubenswrapper[4696]: I1202 22:44:03.176788 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:44:03 crc kubenswrapper[4696]: I1202 22:44:03.176817 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:44:03 crc kubenswrapper[4696]: I1202 22:44:03.176845 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:44:03 crc kubenswrapper[4696]: I1202 22:44:03.176871 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:44:03Z","lastTransitionTime":"2025-12-02T22:44:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:44:03 crc kubenswrapper[4696]: I1202 22:44:03.280264 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:44:03 crc kubenswrapper[4696]: I1202 22:44:03.280324 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:44:03 crc kubenswrapper[4696]: I1202 22:44:03.280340 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:44:03 crc kubenswrapper[4696]: I1202 22:44:03.280364 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:44:03 crc kubenswrapper[4696]: I1202 22:44:03.280382 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:44:03Z","lastTransitionTime":"2025-12-02T22:44:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:44:03 crc kubenswrapper[4696]: I1202 22:44:03.383497 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:44:03 crc kubenswrapper[4696]: I1202 22:44:03.383592 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:44:03 crc kubenswrapper[4696]: I1202 22:44:03.383615 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:44:03 crc kubenswrapper[4696]: I1202 22:44:03.383647 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:44:03 crc kubenswrapper[4696]: I1202 22:44:03.383668 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:44:03Z","lastTransitionTime":"2025-12-02T22:44:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:44:03 crc kubenswrapper[4696]: I1202 22:44:03.486329 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:44:03 crc kubenswrapper[4696]: I1202 22:44:03.486404 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:44:03 crc kubenswrapper[4696]: I1202 22:44:03.486416 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:44:03 crc kubenswrapper[4696]: I1202 22:44:03.486435 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:44:03 crc kubenswrapper[4696]: I1202 22:44:03.486446 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:44:03Z","lastTransitionTime":"2025-12-02T22:44:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:44:03 crc kubenswrapper[4696]: I1202 22:44:03.589438 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:44:03 crc kubenswrapper[4696]: I1202 22:44:03.589517 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:44:03 crc kubenswrapper[4696]: I1202 22:44:03.589540 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:44:03 crc kubenswrapper[4696]: I1202 22:44:03.589581 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:44:03 crc kubenswrapper[4696]: I1202 22:44:03.589605 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:44:03Z","lastTransitionTime":"2025-12-02T22:44:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:44:03 crc kubenswrapper[4696]: I1202 22:44:03.693255 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:44:03 crc kubenswrapper[4696]: I1202 22:44:03.693322 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:44:03 crc kubenswrapper[4696]: I1202 22:44:03.693345 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:44:03 crc kubenswrapper[4696]: I1202 22:44:03.693375 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:44:03 crc kubenswrapper[4696]: I1202 22:44:03.693395 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:44:03Z","lastTransitionTime":"2025-12-02T22:44:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:44:03 crc kubenswrapper[4696]: I1202 22:44:03.796471 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:44:03 crc kubenswrapper[4696]: I1202 22:44:03.796541 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:44:03 crc kubenswrapper[4696]: I1202 22:44:03.796559 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:44:03 crc kubenswrapper[4696]: I1202 22:44:03.796589 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:44:03 crc kubenswrapper[4696]: I1202 22:44:03.796611 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:44:03Z","lastTransitionTime":"2025-12-02T22:44:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:44:03 crc kubenswrapper[4696]: I1202 22:44:03.899851 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:44:03 crc kubenswrapper[4696]: I1202 22:44:03.899920 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:44:03 crc kubenswrapper[4696]: I1202 22:44:03.899936 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:44:03 crc kubenswrapper[4696]: I1202 22:44:03.899961 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:44:03 crc kubenswrapper[4696]: I1202 22:44:03.899979 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:44:03Z","lastTransitionTime":"2025-12-02T22:44:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:44:04 crc kubenswrapper[4696]: I1202 22:44:04.004134 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:44:04 crc kubenswrapper[4696]: I1202 22:44:04.004210 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:44:04 crc kubenswrapper[4696]: I1202 22:44:04.004230 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:44:04 crc kubenswrapper[4696]: I1202 22:44:04.004257 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:44:04 crc kubenswrapper[4696]: I1202 22:44:04.004275 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:44:04Z","lastTransitionTime":"2025-12-02T22:44:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:44:04 crc kubenswrapper[4696]: I1202 22:44:04.107603 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:44:04 crc kubenswrapper[4696]: I1202 22:44:04.107678 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:44:04 crc kubenswrapper[4696]: I1202 22:44:04.107700 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:44:04 crc kubenswrapper[4696]: I1202 22:44:04.107731 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:44:04 crc kubenswrapper[4696]: I1202 22:44:04.107799 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:44:04Z","lastTransitionTime":"2025-12-02T22:44:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:44:04 crc kubenswrapper[4696]: I1202 22:44:04.211865 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:44:04 crc kubenswrapper[4696]: I1202 22:44:04.211943 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:44:04 crc kubenswrapper[4696]: I1202 22:44:04.211961 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:44:04 crc kubenswrapper[4696]: I1202 22:44:04.211990 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:44:04 crc kubenswrapper[4696]: I1202 22:44:04.212012 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:44:04Z","lastTransitionTime":"2025-12-02T22:44:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:44:04 crc kubenswrapper[4696]: I1202 22:44:04.315257 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:44:04 crc kubenswrapper[4696]: I1202 22:44:04.315428 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:44:04 crc kubenswrapper[4696]: I1202 22:44:04.315461 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:44:04 crc kubenswrapper[4696]: I1202 22:44:04.315491 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:44:04 crc kubenswrapper[4696]: I1202 22:44:04.315512 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:44:04Z","lastTransitionTime":"2025-12-02T22:44:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:44:04 crc kubenswrapper[4696]: I1202 22:44:04.419528 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:44:04 crc kubenswrapper[4696]: I1202 22:44:04.419677 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:44:04 crc kubenswrapper[4696]: I1202 22:44:04.419806 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:44:04 crc kubenswrapper[4696]: I1202 22:44:04.419837 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:44:04 crc kubenswrapper[4696]: I1202 22:44:04.419895 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:44:04Z","lastTransitionTime":"2025-12-02T22:44:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:44:04 crc kubenswrapper[4696]: I1202 22:44:04.431098 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q9bfc" Dec 02 22:44:04 crc kubenswrapper[4696]: I1202 22:44:04.431194 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:44:04 crc kubenswrapper[4696]: I1202 22:44:04.431242 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:44:04 crc kubenswrapper[4696]: I1202 22:44:04.431424 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:44:04 crc kubenswrapper[4696]: E1202 22:44:04.431398 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q9bfc" podUID="ad00195c-ef4c-4d9b-941c-d01ebc498593" Dec 02 22:44:04 crc kubenswrapper[4696]: E1202 22:44:04.431845 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 22:44:04 crc kubenswrapper[4696]: E1202 22:44:04.432039 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 22:44:04 crc kubenswrapper[4696]: E1202 22:44:04.432166 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 22:44:04 crc kubenswrapper[4696]: I1202 22:44:04.523389 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:44:04 crc kubenswrapper[4696]: I1202 22:44:04.523484 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:44:04 crc kubenswrapper[4696]: I1202 22:44:04.523502 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:44:04 crc kubenswrapper[4696]: I1202 22:44:04.523529 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:44:04 crc kubenswrapper[4696]: I1202 22:44:04.523548 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:44:04Z","lastTransitionTime":"2025-12-02T22:44:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:44:04 crc kubenswrapper[4696]: I1202 22:44:04.635355 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:44:04 crc kubenswrapper[4696]: I1202 22:44:04.635799 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:44:04 crc kubenswrapper[4696]: I1202 22:44:04.635819 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:44:04 crc kubenswrapper[4696]: I1202 22:44:04.635850 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:44:04 crc kubenswrapper[4696]: I1202 22:44:04.635870 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:44:04Z","lastTransitionTime":"2025-12-02T22:44:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:44:04 crc kubenswrapper[4696]: I1202 22:44:04.739587 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:44:04 crc kubenswrapper[4696]: I1202 22:44:04.739708 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:44:04 crc kubenswrapper[4696]: I1202 22:44:04.739736 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:44:04 crc kubenswrapper[4696]: I1202 22:44:04.739904 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:44:04 crc kubenswrapper[4696]: I1202 22:44:04.739997 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:44:04Z","lastTransitionTime":"2025-12-02T22:44:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:44:04 crc kubenswrapper[4696]: I1202 22:44:04.844296 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:44:04 crc kubenswrapper[4696]: I1202 22:44:04.844396 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:44:04 crc kubenswrapper[4696]: I1202 22:44:04.844419 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:44:04 crc kubenswrapper[4696]: I1202 22:44:04.844443 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:44:04 crc kubenswrapper[4696]: I1202 22:44:04.844460 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:44:04Z","lastTransitionTime":"2025-12-02T22:44:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:44:04 crc kubenswrapper[4696]: I1202 22:44:04.947570 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:44:04 crc kubenswrapper[4696]: I1202 22:44:04.947635 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:44:04 crc kubenswrapper[4696]: I1202 22:44:04.947651 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:44:04 crc kubenswrapper[4696]: I1202 22:44:04.947679 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:44:04 crc kubenswrapper[4696]: I1202 22:44:04.947697 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:44:04Z","lastTransitionTime":"2025-12-02T22:44:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:44:05 crc kubenswrapper[4696]: I1202 22:44:05.050522 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:44:05 crc kubenswrapper[4696]: I1202 22:44:05.050598 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:44:05 crc kubenswrapper[4696]: I1202 22:44:05.050617 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:44:05 crc kubenswrapper[4696]: I1202 22:44:05.050645 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:44:05 crc kubenswrapper[4696]: I1202 22:44:05.050664 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:44:05Z","lastTransitionTime":"2025-12-02T22:44:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:44:05 crc kubenswrapper[4696]: I1202 22:44:05.154476 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:44:05 crc kubenswrapper[4696]: I1202 22:44:05.154895 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:44:05 crc kubenswrapper[4696]: I1202 22:44:05.154923 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:44:05 crc kubenswrapper[4696]: I1202 22:44:05.154949 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:44:05 crc kubenswrapper[4696]: I1202 22:44:05.154969 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:44:05Z","lastTransitionTime":"2025-12-02T22:44:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:44:05 crc kubenswrapper[4696]: I1202 22:44:05.258566 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:44:05 crc kubenswrapper[4696]: I1202 22:44:05.258611 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:44:05 crc kubenswrapper[4696]: I1202 22:44:05.258621 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:44:05 crc kubenswrapper[4696]: I1202 22:44:05.258638 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:44:05 crc kubenswrapper[4696]: I1202 22:44:05.258649 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:44:05Z","lastTransitionTime":"2025-12-02T22:44:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:44:05 crc kubenswrapper[4696]: I1202 22:44:05.362361 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:44:05 crc kubenswrapper[4696]: I1202 22:44:05.362452 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:44:05 crc kubenswrapper[4696]: I1202 22:44:05.362474 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:44:05 crc kubenswrapper[4696]: I1202 22:44:05.362505 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:44:05 crc kubenswrapper[4696]: I1202 22:44:05.362529 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:44:05Z","lastTransitionTime":"2025-12-02T22:44:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:44:05 crc kubenswrapper[4696]: I1202 22:44:05.466721 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:44:05 crc kubenswrapper[4696]: I1202 22:44:05.466857 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:44:05 crc kubenswrapper[4696]: I1202 22:44:05.466876 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:44:05 crc kubenswrapper[4696]: I1202 22:44:05.466903 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:44:05 crc kubenswrapper[4696]: I1202 22:44:05.466921 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:44:05Z","lastTransitionTime":"2025-12-02T22:44:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:44:05 crc kubenswrapper[4696]: I1202 22:44:05.570496 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:44:05 crc kubenswrapper[4696]: I1202 22:44:05.570597 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:44:05 crc kubenswrapper[4696]: I1202 22:44:05.570615 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:44:05 crc kubenswrapper[4696]: I1202 22:44:05.570646 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:44:05 crc kubenswrapper[4696]: I1202 22:44:05.570665 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:44:05Z","lastTransitionTime":"2025-12-02T22:44:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:44:05 crc kubenswrapper[4696]: I1202 22:44:05.673704 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:44:05 crc kubenswrapper[4696]: I1202 22:44:05.673829 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:44:05 crc kubenswrapper[4696]: I1202 22:44:05.673849 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:44:05 crc kubenswrapper[4696]: I1202 22:44:05.673880 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:44:05 crc kubenswrapper[4696]: I1202 22:44:05.673900 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:44:05Z","lastTransitionTime":"2025-12-02T22:44:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:44:05 crc kubenswrapper[4696]: I1202 22:44:05.777526 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:44:05 crc kubenswrapper[4696]: I1202 22:44:05.777623 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:44:05 crc kubenswrapper[4696]: I1202 22:44:05.777659 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:44:05 crc kubenswrapper[4696]: I1202 22:44:05.777698 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:44:05 crc kubenswrapper[4696]: I1202 22:44:05.777727 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:44:05Z","lastTransitionTime":"2025-12-02T22:44:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:44:05 crc kubenswrapper[4696]: I1202 22:44:05.881121 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:44:05 crc kubenswrapper[4696]: I1202 22:44:05.881198 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:44:05 crc kubenswrapper[4696]: I1202 22:44:05.881221 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:44:05 crc kubenswrapper[4696]: I1202 22:44:05.881287 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:44:05 crc kubenswrapper[4696]: I1202 22:44:05.881311 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:44:05Z","lastTransitionTime":"2025-12-02T22:44:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:44:05 crc kubenswrapper[4696]: I1202 22:44:05.984891 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:44:05 crc kubenswrapper[4696]: I1202 22:44:05.984967 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:44:05 crc kubenswrapper[4696]: I1202 22:44:05.984991 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:44:05 crc kubenswrapper[4696]: I1202 22:44:05.985053 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:44:05 crc kubenswrapper[4696]: I1202 22:44:05.985081 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:44:05Z","lastTransitionTime":"2025-12-02T22:44:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:44:06 crc kubenswrapper[4696]: I1202 22:44:06.088737 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:44:06 crc kubenswrapper[4696]: I1202 22:44:06.089056 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:44:06 crc kubenswrapper[4696]: I1202 22:44:06.089081 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:44:06 crc kubenswrapper[4696]: I1202 22:44:06.089110 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:44:06 crc kubenswrapper[4696]: I1202 22:44:06.089130 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:44:06Z","lastTransitionTime":"2025-12-02T22:44:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:44:06 crc kubenswrapper[4696]: I1202 22:44:06.192202 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:44:06 crc kubenswrapper[4696]: I1202 22:44:06.192291 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:44:06 crc kubenswrapper[4696]: I1202 22:44:06.192312 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:44:06 crc kubenswrapper[4696]: I1202 22:44:06.192341 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:44:06 crc kubenswrapper[4696]: I1202 22:44:06.192361 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:44:06Z","lastTransitionTime":"2025-12-02T22:44:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:44:06 crc kubenswrapper[4696]: I1202 22:44:06.295812 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:44:06 crc kubenswrapper[4696]: I1202 22:44:06.295869 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:44:06 crc kubenswrapper[4696]: I1202 22:44:06.295881 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:44:06 crc kubenswrapper[4696]: I1202 22:44:06.295906 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:44:06 crc kubenswrapper[4696]: I1202 22:44:06.295918 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:44:06Z","lastTransitionTime":"2025-12-02T22:44:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:44:06 crc kubenswrapper[4696]: I1202 22:44:06.399135 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:44:06 crc kubenswrapper[4696]: I1202 22:44:06.399194 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:44:06 crc kubenswrapper[4696]: I1202 22:44:06.399212 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:44:06 crc kubenswrapper[4696]: I1202 22:44:06.399236 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:44:06 crc kubenswrapper[4696]: I1202 22:44:06.399257 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:44:06Z","lastTransitionTime":"2025-12-02T22:44:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:44:06 crc kubenswrapper[4696]: I1202 22:44:06.430866 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q9bfc" Dec 02 22:44:06 crc kubenswrapper[4696]: E1202 22:44:06.431019 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q9bfc" podUID="ad00195c-ef4c-4d9b-941c-d01ebc498593" Dec 02 22:44:06 crc kubenswrapper[4696]: I1202 22:44:06.431040 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:44:06 crc kubenswrapper[4696]: I1202 22:44:06.431028 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:44:06 crc kubenswrapper[4696]: I1202 22:44:06.431430 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:44:06 crc kubenswrapper[4696]: E1202 22:44:06.431514 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 22:44:06 crc kubenswrapper[4696]: E1202 22:44:06.431794 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 22:44:06 crc kubenswrapper[4696]: E1202 22:44:06.431813 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 22:44:06 crc kubenswrapper[4696]: I1202 22:44:06.432183 4696 scope.go:117] "RemoveContainer" containerID="c89495285a741bfa9e535eb22f1110da359462742bc321664ea98562028cf21d" Dec 02 22:44:06 crc kubenswrapper[4696]: E1202 22:44:06.432455 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-qb2zq_openshift-ovn-kubernetes(c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b)\"" pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" podUID="c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b" Dec 02 22:44:06 crc kubenswrapper[4696]: I1202 22:44:06.502646 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:44:06 crc kubenswrapper[4696]: I1202 22:44:06.502715 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:44:06 crc kubenswrapper[4696]: I1202 22:44:06.502729 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:44:06 crc kubenswrapper[4696]: I1202 22:44:06.502800 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:44:06 crc kubenswrapper[4696]: I1202 22:44:06.502822 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:44:06Z","lastTransitionTime":"2025-12-02T22:44:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:44:06 crc kubenswrapper[4696]: I1202 22:44:06.605714 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:44:06 crc kubenswrapper[4696]: I1202 22:44:06.605790 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:44:06 crc kubenswrapper[4696]: I1202 22:44:06.605803 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:44:06 crc kubenswrapper[4696]: I1202 22:44:06.605825 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:44:06 crc kubenswrapper[4696]: I1202 22:44:06.605838 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:44:06Z","lastTransitionTime":"2025-12-02T22:44:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:44:06 crc kubenswrapper[4696]: I1202 22:44:06.708694 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:44:06 crc kubenswrapper[4696]: I1202 22:44:06.708801 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:44:06 crc kubenswrapper[4696]: I1202 22:44:06.708822 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:44:06 crc kubenswrapper[4696]: I1202 22:44:06.708849 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:44:06 crc kubenswrapper[4696]: I1202 22:44:06.708864 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:44:06Z","lastTransitionTime":"2025-12-02T22:44:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:44:06 crc kubenswrapper[4696]: I1202 22:44:06.812369 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:44:06 crc kubenswrapper[4696]: I1202 22:44:06.812440 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:44:06 crc kubenswrapper[4696]: I1202 22:44:06.812467 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:44:06 crc kubenswrapper[4696]: I1202 22:44:06.812495 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:44:06 crc kubenswrapper[4696]: I1202 22:44:06.812515 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:44:06Z","lastTransitionTime":"2025-12-02T22:44:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:44:06 crc kubenswrapper[4696]: I1202 22:44:06.916581 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:44:06 crc kubenswrapper[4696]: I1202 22:44:06.916676 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:44:06 crc kubenswrapper[4696]: I1202 22:44:06.916696 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:44:06 crc kubenswrapper[4696]: I1202 22:44:06.916727 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:44:06 crc kubenswrapper[4696]: I1202 22:44:06.916768 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:44:06Z","lastTransitionTime":"2025-12-02T22:44:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:44:07 crc kubenswrapper[4696]: I1202 22:44:07.019542 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:44:07 crc kubenswrapper[4696]: I1202 22:44:07.019590 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:44:07 crc kubenswrapper[4696]: I1202 22:44:07.019601 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:44:07 crc kubenswrapper[4696]: I1202 22:44:07.019622 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:44:07 crc kubenswrapper[4696]: I1202 22:44:07.019635 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:44:07Z","lastTransitionTime":"2025-12-02T22:44:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:44:07 crc kubenswrapper[4696]: I1202 22:44:07.122376 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:44:07 crc kubenswrapper[4696]: I1202 22:44:07.122472 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:44:07 crc kubenswrapper[4696]: I1202 22:44:07.122490 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:44:07 crc kubenswrapper[4696]: I1202 22:44:07.122559 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:44:07 crc kubenswrapper[4696]: I1202 22:44:07.122671 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:44:07Z","lastTransitionTime":"2025-12-02T22:44:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:44:07 crc kubenswrapper[4696]: I1202 22:44:07.226038 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:44:07 crc kubenswrapper[4696]: I1202 22:44:07.226116 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:44:07 crc kubenswrapper[4696]: I1202 22:44:07.226128 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:44:07 crc kubenswrapper[4696]: I1202 22:44:07.226146 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:44:07 crc kubenswrapper[4696]: I1202 22:44:07.226177 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:44:07Z","lastTransitionTime":"2025-12-02T22:44:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:44:07 crc kubenswrapper[4696]: I1202 22:44:07.328720 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:44:07 crc kubenswrapper[4696]: I1202 22:44:07.328784 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:44:07 crc kubenswrapper[4696]: I1202 22:44:07.328795 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:44:07 crc kubenswrapper[4696]: I1202 22:44:07.328810 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:44:07 crc kubenswrapper[4696]: I1202 22:44:07.328824 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:44:07Z","lastTransitionTime":"2025-12-02T22:44:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:44:07 crc kubenswrapper[4696]: I1202 22:44:07.432680 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:44:07 crc kubenswrapper[4696]: I1202 22:44:07.432763 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:44:07 crc kubenswrapper[4696]: I1202 22:44:07.432777 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:44:07 crc kubenswrapper[4696]: I1202 22:44:07.432801 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:44:07 crc kubenswrapper[4696]: I1202 22:44:07.432818 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:44:07Z","lastTransitionTime":"2025-12-02T22:44:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:44:07 crc kubenswrapper[4696]: I1202 22:44:07.457827 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-wthxr" podStartSLOduration=88.457798469 podStartE2EDuration="1m28.457798469s" podCreationTimestamp="2025-12-02 22:42:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 22:44:07.456920104 +0000 UTC m=+110.337600145" watchObservedRunningTime="2025-12-02 22:44:07.457798469 +0000 UTC m=+110.338478510" Dec 02 22:44:07 crc kubenswrapper[4696]: I1202 22:44:07.502199 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ttxcl" podStartSLOduration=87.50217058 podStartE2EDuration="1m27.50217058s" podCreationTimestamp="2025-12-02 22:42:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 22:44:07.476996979 +0000 UTC m=+110.357677010" watchObservedRunningTime="2025-12-02 22:44:07.50217058 +0000 UTC m=+110.382850621" Dec 02 22:44:07 crc kubenswrapper[4696]: I1202 22:44:07.535480 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:44:07 crc kubenswrapper[4696]: I1202 22:44:07.535902 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:44:07 crc kubenswrapper[4696]: I1202 22:44:07.536048 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:44:07 crc kubenswrapper[4696]: I1202 22:44:07.536153 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:44:07 crc kubenswrapper[4696]: I1202 22:44:07.536248 4696 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:44:07Z","lastTransitionTime":"2025-12-02T22:44:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:44:07 crc kubenswrapper[4696]: I1202 22:44:07.557857 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:44:07 crc kubenswrapper[4696]: I1202 22:44:07.557905 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:44:07 crc kubenswrapper[4696]: I1202 22:44:07.557942 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:44:07 crc kubenswrapper[4696]: I1202 22:44:07.557964 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:44:07 crc kubenswrapper[4696]: I1202 22:44:07.557980 4696 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:44:07Z","lastTransitionTime":"2025-12-02T22:44:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:44:07 crc kubenswrapper[4696]: I1202 22:44:07.598395 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-f57qk" podStartSLOduration=88.598370435 podStartE2EDuration="1m28.598370435s" podCreationTimestamp="2025-12-02 22:42:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 22:44:07.598065996 +0000 UTC m=+110.478745997" watchObservedRunningTime="2025-12-02 22:44:07.598370435 +0000 UTC m=+110.479050446" Dec 02 22:44:07 crc kubenswrapper[4696]: I1202 22:44:07.620665 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-7vgjz"] Dec 02 22:44:07 crc kubenswrapper[4696]: I1202 22:44:07.621443 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7vgjz" Dec 02 22:44:07 crc kubenswrapper[4696]: I1202 22:44:07.623955 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 02 22:44:07 crc kubenswrapper[4696]: I1202 22:44:07.624339 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 02 22:44:07 crc kubenswrapper[4696]: I1202 22:44:07.625131 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 02 22:44:07 crc kubenswrapper[4696]: I1202 22:44:07.628180 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 02 22:44:07 crc kubenswrapper[4696]: I1202 22:44:07.634479 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=88.634446148 podStartE2EDuration="1m28.634446148s" 
podCreationTimestamp="2025-12-02 22:42:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 22:44:07.628306412 +0000 UTC m=+110.508986483" watchObservedRunningTime="2025-12-02 22:44:07.634446148 +0000 UTC m=+110.515126189" Dec 02 22:44:07 crc kubenswrapper[4696]: I1202 22:44:07.648123 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=89.648102819 podStartE2EDuration="1m29.648102819s" podCreationTimestamp="2025-12-02 22:42:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 22:44:07.647683227 +0000 UTC m=+110.528363228" watchObservedRunningTime="2025-12-02 22:44:07.648102819 +0000 UTC m=+110.528782820" Dec 02 22:44:07 crc kubenswrapper[4696]: I1202 22:44:07.681772 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=63.681716152 podStartE2EDuration="1m3.681716152s" podCreationTimestamp="2025-12-02 22:43:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 22:44:07.670684956 +0000 UTC m=+110.551364957" watchObservedRunningTime="2025-12-02 22:44:07.681716152 +0000 UTC m=+110.562396143" Dec 02 22:44:07 crc kubenswrapper[4696]: I1202 22:44:07.754446 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/e233bd07-a896-4fc0-a07c-4555941211c2-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-7vgjz\" (UID: \"e233bd07-a896-4fc0-a07c-4555941211c2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7vgjz" Dec 02 22:44:07 crc kubenswrapper[4696]: I1202 22:44:07.754921 4696 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/e233bd07-a896-4fc0-a07c-4555941211c2-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-7vgjz\" (UID: \"e233bd07-a896-4fc0-a07c-4555941211c2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7vgjz" Dec 02 22:44:07 crc kubenswrapper[4696]: I1202 22:44:07.755012 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e233bd07-a896-4fc0-a07c-4555941211c2-service-ca\") pod \"cluster-version-operator-5c965bbfc6-7vgjz\" (UID: \"e233bd07-a896-4fc0-a07c-4555941211c2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7vgjz" Dec 02 22:44:07 crc kubenswrapper[4696]: I1202 22:44:07.755103 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e233bd07-a896-4fc0-a07c-4555941211c2-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-7vgjz\" (UID: \"e233bd07-a896-4fc0-a07c-4555941211c2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7vgjz" Dec 02 22:44:07 crc kubenswrapper[4696]: I1202 22:44:07.755195 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e233bd07-a896-4fc0-a07c-4555941211c2-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-7vgjz\" (UID: \"e233bd07-a896-4fc0-a07c-4555941211c2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7vgjz" Dec 02 22:44:07 crc kubenswrapper[4696]: I1202 22:44:07.766247 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-rgk7n" podStartSLOduration=88.766224742 podStartE2EDuration="1m28.766224742s" 
podCreationTimestamp="2025-12-02 22:42:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 22:44:07.765725308 +0000 UTC m=+110.646405309" watchObservedRunningTime="2025-12-02 22:44:07.766224742 +0000 UTC m=+110.646904743" Dec 02 22:44:07 crc kubenswrapper[4696]: I1202 22:44:07.767022 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podStartSLOduration=88.767015905 podStartE2EDuration="1m28.767015905s" podCreationTimestamp="2025-12-02 22:42:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 22:44:07.753058465 +0000 UTC m=+110.633738466" watchObservedRunningTime="2025-12-02 22:44:07.767015905 +0000 UTC m=+110.647695906" Dec 02 22:44:07 crc kubenswrapper[4696]: I1202 22:44:07.776483 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=27.776463925 podStartE2EDuration="27.776463925s" podCreationTimestamp="2025-12-02 22:43:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 22:44:07.775611091 +0000 UTC m=+110.656291092" watchObservedRunningTime="2025-12-02 22:44:07.776463925 +0000 UTC m=+110.657143926" Dec 02 22:44:07 crc kubenswrapper[4696]: I1202 22:44:07.792010 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=87.791963859 podStartE2EDuration="1m27.791963859s" podCreationTimestamp="2025-12-02 22:42:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 22:44:07.789530089 +0000 UTC 
m=+110.670210090" watchObservedRunningTime="2025-12-02 22:44:07.791963859 +0000 UTC m=+110.672643860" Dec 02 22:44:07 crc kubenswrapper[4696]: I1202 22:44:07.810842 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-sbjst" podStartSLOduration=88.810814639 podStartE2EDuration="1m28.810814639s" podCreationTimestamp="2025-12-02 22:42:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 22:44:07.810672135 +0000 UTC m=+110.691352406" watchObservedRunningTime="2025-12-02 22:44:07.810814639 +0000 UTC m=+110.691494640" Dec 02 22:44:07 crc kubenswrapper[4696]: I1202 22:44:07.856293 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/e233bd07-a896-4fc0-a07c-4555941211c2-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-7vgjz\" (UID: \"e233bd07-a896-4fc0-a07c-4555941211c2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7vgjz" Dec 02 22:44:07 crc kubenswrapper[4696]: I1202 22:44:07.856352 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/e233bd07-a896-4fc0-a07c-4555941211c2-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-7vgjz\" (UID: \"e233bd07-a896-4fc0-a07c-4555941211c2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7vgjz" Dec 02 22:44:07 crc kubenswrapper[4696]: I1202 22:44:07.856379 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e233bd07-a896-4fc0-a07c-4555941211c2-service-ca\") pod \"cluster-version-operator-5c965bbfc6-7vgjz\" (UID: \"e233bd07-a896-4fc0-a07c-4555941211c2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7vgjz" Dec 02 
22:44:07 crc kubenswrapper[4696]: I1202 22:44:07.856412 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e233bd07-a896-4fc0-a07c-4555941211c2-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-7vgjz\" (UID: \"e233bd07-a896-4fc0-a07c-4555941211c2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7vgjz" Dec 02 22:44:07 crc kubenswrapper[4696]: I1202 22:44:07.856436 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e233bd07-a896-4fc0-a07c-4555941211c2-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-7vgjz\" (UID: \"e233bd07-a896-4fc0-a07c-4555941211c2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7vgjz" Dec 02 22:44:07 crc kubenswrapper[4696]: I1202 22:44:07.856475 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/e233bd07-a896-4fc0-a07c-4555941211c2-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-7vgjz\" (UID: \"e233bd07-a896-4fc0-a07c-4555941211c2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7vgjz" Dec 02 22:44:07 crc kubenswrapper[4696]: I1202 22:44:07.856489 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/e233bd07-a896-4fc0-a07c-4555941211c2-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-7vgjz\" (UID: \"e233bd07-a896-4fc0-a07c-4555941211c2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7vgjz" Dec 02 22:44:07 crc kubenswrapper[4696]: I1202 22:44:07.857419 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e233bd07-a896-4fc0-a07c-4555941211c2-service-ca\") pod 
\"cluster-version-operator-5c965bbfc6-7vgjz\" (UID: \"e233bd07-a896-4fc0-a07c-4555941211c2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7vgjz" Dec 02 22:44:07 crc kubenswrapper[4696]: I1202 22:44:07.866624 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e233bd07-a896-4fc0-a07c-4555941211c2-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-7vgjz\" (UID: \"e233bd07-a896-4fc0-a07c-4555941211c2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7vgjz" Dec 02 22:44:07 crc kubenswrapper[4696]: I1202 22:44:07.872557 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e233bd07-a896-4fc0-a07c-4555941211c2-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-7vgjz\" (UID: \"e233bd07-a896-4fc0-a07c-4555941211c2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7vgjz" Dec 02 22:44:07 crc kubenswrapper[4696]: I1202 22:44:07.946153 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7vgjz" Dec 02 22:44:08 crc kubenswrapper[4696]: I1202 22:44:08.262022 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7vgjz" event={"ID":"e233bd07-a896-4fc0-a07c-4555941211c2","Type":"ContainerStarted","Data":"abc302037cd5995f5bb1d09f24907e36c9c3a1a794fc658504117be69d054002"} Dec 02 22:44:08 crc kubenswrapper[4696]: I1202 22:44:08.262092 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7vgjz" event={"ID":"e233bd07-a896-4fc0-a07c-4555941211c2","Type":"ContainerStarted","Data":"36b9617ed4facb80ab2b08c7ac8918eb4bd61b5cffc2825a319417a27dbfdf21"} Dec 02 22:44:08 crc kubenswrapper[4696]: I1202 22:44:08.431238 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:44:08 crc kubenswrapper[4696]: E1202 22:44:08.431461 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 22:44:08 crc kubenswrapper[4696]: I1202 22:44:08.431575 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:44:08 crc kubenswrapper[4696]: E1202 22:44:08.431657 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 22:44:08 crc kubenswrapper[4696]: I1202 22:44:08.432040 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:44:08 crc kubenswrapper[4696]: I1202 22:44:08.432123 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q9bfc" Dec 02 22:44:08 crc kubenswrapper[4696]: E1202 22:44:08.432324 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 22:44:08 crc kubenswrapper[4696]: E1202 22:44:08.432474 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-q9bfc" podUID="ad00195c-ef4c-4d9b-941c-d01ebc498593" Dec 02 22:44:10 crc kubenswrapper[4696]: I1202 22:44:10.431337 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:44:10 crc kubenswrapper[4696]: I1202 22:44:10.431345 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:44:10 crc kubenswrapper[4696]: I1202 22:44:10.431373 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q9bfc" Dec 02 22:44:10 crc kubenswrapper[4696]: E1202 22:44:10.432795 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 22:44:10 crc kubenswrapper[4696]: I1202 22:44:10.431403 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:44:10 crc kubenswrapper[4696]: E1202 22:44:10.433026 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-q9bfc" podUID="ad00195c-ef4c-4d9b-941c-d01ebc498593" Dec 02 22:44:10 crc kubenswrapper[4696]: E1202 22:44:10.433111 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 22:44:10 crc kubenswrapper[4696]: E1202 22:44:10.433345 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 22:44:12 crc kubenswrapper[4696]: I1202 22:44:12.431056 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q9bfc" Dec 02 22:44:12 crc kubenswrapper[4696]: I1202 22:44:12.431118 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:44:12 crc kubenswrapper[4696]: E1202 22:44:12.432072 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-q9bfc" podUID="ad00195c-ef4c-4d9b-941c-d01ebc498593" Dec 02 22:44:12 crc kubenswrapper[4696]: I1202 22:44:12.431216 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:44:12 crc kubenswrapper[4696]: I1202 22:44:12.431140 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:44:12 crc kubenswrapper[4696]: E1202 22:44:12.432301 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 22:44:12 crc kubenswrapper[4696]: E1202 22:44:12.432442 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 22:44:12 crc kubenswrapper[4696]: E1202 22:44:12.432603 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 22:44:13 crc kubenswrapper[4696]: I1202 22:44:13.278942 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wthxr_86a37d2a-37c5-4fbd-b10b-f5e4706772f4/kube-multus/1.log" Dec 02 22:44:13 crc kubenswrapper[4696]: I1202 22:44:13.279512 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wthxr_86a37d2a-37c5-4fbd-b10b-f5e4706772f4/kube-multus/0.log" Dec 02 22:44:13 crc kubenswrapper[4696]: I1202 22:44:13.279557 4696 generic.go:334] "Generic (PLEG): container finished" podID="86a37d2a-37c5-4fbd-b10b-f5e4706772f4" containerID="32ae69a2c1cd350e1b62b8453e9d286e105623f4b9ea8ff27ce8c9d37b055e4f" exitCode=1 Dec 02 22:44:13 crc kubenswrapper[4696]: I1202 22:44:13.279594 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wthxr" event={"ID":"86a37d2a-37c5-4fbd-b10b-f5e4706772f4","Type":"ContainerDied","Data":"32ae69a2c1cd350e1b62b8453e9d286e105623f4b9ea8ff27ce8c9d37b055e4f"} Dec 02 22:44:13 crc kubenswrapper[4696]: I1202 22:44:13.279636 4696 scope.go:117] "RemoveContainer" containerID="c5ddd341a3a0b7b1d959ce57bb4d38153a4ca3ae5ec40ded309e0697db2be28d" Dec 02 22:44:13 crc kubenswrapper[4696]: I1202 22:44:13.280069 4696 scope.go:117] "RemoveContainer" containerID="32ae69a2c1cd350e1b62b8453e9d286e105623f4b9ea8ff27ce8c9d37b055e4f" Dec 02 22:44:13 crc kubenswrapper[4696]: E1202 22:44:13.280226 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-wthxr_openshift-multus(86a37d2a-37c5-4fbd-b10b-f5e4706772f4)\"" pod="openshift-multus/multus-wthxr" podUID="86a37d2a-37c5-4fbd-b10b-f5e4706772f4" Dec 02 22:44:13 crc kubenswrapper[4696]: I1202 22:44:13.309287 4696 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7vgjz" podStartSLOduration=94.309223017 podStartE2EDuration="1m34.309223017s" podCreationTimestamp="2025-12-02 22:42:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 22:44:09.293939314 +0000 UTC m=+112.174619315" watchObservedRunningTime="2025-12-02 22:44:13.309223017 +0000 UTC m=+116.189903068" Dec 02 22:44:14 crc kubenswrapper[4696]: I1202 22:44:14.293487 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wthxr_86a37d2a-37c5-4fbd-b10b-f5e4706772f4/kube-multus/1.log" Dec 02 22:44:14 crc kubenswrapper[4696]: I1202 22:44:14.430682 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:44:14 crc kubenswrapper[4696]: I1202 22:44:14.430792 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:44:14 crc kubenswrapper[4696]: I1202 22:44:14.430704 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:44:14 crc kubenswrapper[4696]: E1202 22:44:14.430969 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 22:44:14 crc kubenswrapper[4696]: E1202 22:44:14.431099 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 22:44:14 crc kubenswrapper[4696]: I1202 22:44:14.431173 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q9bfc" Dec 02 22:44:14 crc kubenswrapper[4696]: E1202 22:44:14.431238 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 22:44:14 crc kubenswrapper[4696]: E1202 22:44:14.431307 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q9bfc" podUID="ad00195c-ef4c-4d9b-941c-d01ebc498593" Dec 02 22:44:16 crc kubenswrapper[4696]: I1202 22:44:16.430894 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-q9bfc" Dec 02 22:44:16 crc kubenswrapper[4696]: I1202 22:44:16.430996 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:44:16 crc kubenswrapper[4696]: I1202 22:44:16.430885 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:44:16 crc kubenswrapper[4696]: I1202 22:44:16.431116 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:44:16 crc kubenswrapper[4696]: E1202 22:44:16.431023 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q9bfc" podUID="ad00195c-ef4c-4d9b-941c-d01ebc498593" Dec 02 22:44:16 crc kubenswrapper[4696]: E1202 22:44:16.431257 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 22:44:16 crc kubenswrapper[4696]: E1202 22:44:16.431374 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 22:44:16 crc kubenswrapper[4696]: E1202 22:44:16.431559 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 22:44:17 crc kubenswrapper[4696]: E1202 22:44:17.459008 4696 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Dec 02 22:44:17 crc kubenswrapper[4696]: E1202 22:44:17.561430 4696 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 02 22:44:18 crc kubenswrapper[4696]: I1202 22:44:18.431473 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q9bfc" Dec 02 22:44:18 crc kubenswrapper[4696]: I1202 22:44:18.431491 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:44:18 crc kubenswrapper[4696]: I1202 22:44:18.431535 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:44:18 crc kubenswrapper[4696]: E1202 22:44:18.432572 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 22:44:18 crc kubenswrapper[4696]: I1202 22:44:18.431642 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:44:18 crc kubenswrapper[4696]: E1202 22:44:18.432827 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 22:44:18 crc kubenswrapper[4696]: E1202 22:44:18.432964 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 22:44:18 crc kubenswrapper[4696]: E1202 22:44:18.433113 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q9bfc" podUID="ad00195c-ef4c-4d9b-941c-d01ebc498593" Dec 02 22:44:20 crc kubenswrapper[4696]: I1202 22:44:20.431202 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:44:20 crc kubenswrapper[4696]: E1202 22:44:20.431467 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 22:44:20 crc kubenswrapper[4696]: I1202 22:44:20.431858 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:44:20 crc kubenswrapper[4696]: E1202 22:44:20.431957 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 22:44:20 crc kubenswrapper[4696]: I1202 22:44:20.433386 4696 scope.go:117] "RemoveContainer" containerID="c89495285a741bfa9e535eb22f1110da359462742bc321664ea98562028cf21d" Dec 02 22:44:20 crc kubenswrapper[4696]: E1202 22:44:20.433691 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-qb2zq_openshift-ovn-kubernetes(c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b)\"" pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" podUID="c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b" Dec 02 22:44:20 crc kubenswrapper[4696]: I1202 22:44:20.433949 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:44:20 crc kubenswrapper[4696]: E1202 22:44:20.434063 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 22:44:20 crc kubenswrapper[4696]: I1202 22:44:20.434326 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q9bfc" Dec 02 22:44:20 crc kubenswrapper[4696]: E1202 22:44:20.434450 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-q9bfc" podUID="ad00195c-ef4c-4d9b-941c-d01ebc498593" Dec 02 22:44:22 crc kubenswrapper[4696]: I1202 22:44:22.431135 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q9bfc" Dec 02 22:44:22 crc kubenswrapper[4696]: I1202 22:44:22.431184 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:44:22 crc kubenswrapper[4696]: I1202 22:44:22.431138 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:44:22 crc kubenswrapper[4696]: E1202 22:44:22.431535 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 22:44:22 crc kubenswrapper[4696]: E1202 22:44:22.431360 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q9bfc" podUID="ad00195c-ef4c-4d9b-941c-d01ebc498593" Dec 02 22:44:22 crc kubenswrapper[4696]: I1202 22:44:22.431624 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:44:22 crc kubenswrapper[4696]: E1202 22:44:22.431710 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 22:44:22 crc kubenswrapper[4696]: E1202 22:44:22.431844 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 22:44:22 crc kubenswrapper[4696]: E1202 22:44:22.563116 4696 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 02 22:44:24 crc kubenswrapper[4696]: I1202 22:44:24.431041 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q9bfc" Dec 02 22:44:24 crc kubenswrapper[4696]: I1202 22:44:24.431131 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:44:24 crc kubenswrapper[4696]: I1202 22:44:24.431191 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:44:24 crc kubenswrapper[4696]: E1202 22:44:24.431266 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q9bfc" podUID="ad00195c-ef4c-4d9b-941c-d01ebc498593" Dec 02 22:44:24 crc kubenswrapper[4696]: I1202 22:44:24.431286 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:44:24 crc kubenswrapper[4696]: E1202 22:44:24.431442 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 22:44:24 crc kubenswrapper[4696]: E1202 22:44:24.431802 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 22:44:24 crc kubenswrapper[4696]: E1202 22:44:24.431882 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 22:44:25 crc kubenswrapper[4696]: I1202 22:44:25.431945 4696 scope.go:117] "RemoveContainer" containerID="32ae69a2c1cd350e1b62b8453e9d286e105623f4b9ea8ff27ce8c9d37b055e4f" Dec 02 22:44:26 crc kubenswrapper[4696]: I1202 22:44:26.381278 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wthxr_86a37d2a-37c5-4fbd-b10b-f5e4706772f4/kube-multus/1.log" Dec 02 22:44:26 crc kubenswrapper[4696]: I1202 22:44:26.381908 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wthxr" event={"ID":"86a37d2a-37c5-4fbd-b10b-f5e4706772f4","Type":"ContainerStarted","Data":"953a9deff8f534d3434a995707bbba83840a74ff792d977deb83c2021c2d4427"} Dec 02 22:44:26 crc kubenswrapper[4696]: I1202 22:44:26.431091 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:44:26 crc kubenswrapper[4696]: E1202 22:44:26.431255 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 22:44:26 crc kubenswrapper[4696]: I1202 22:44:26.431507 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:44:26 crc kubenswrapper[4696]: E1202 22:44:26.431576 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 22:44:26 crc kubenswrapper[4696]: I1202 22:44:26.431704 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:44:26 crc kubenswrapper[4696]: E1202 22:44:26.431797 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 22:44:26 crc kubenswrapper[4696]: I1202 22:44:26.431978 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-q9bfc" Dec 02 22:44:26 crc kubenswrapper[4696]: E1202 22:44:26.432088 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q9bfc" podUID="ad00195c-ef4c-4d9b-941c-d01ebc498593" Dec 02 22:44:27 crc kubenswrapper[4696]: E1202 22:44:27.563825 4696 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 02 22:44:28 crc kubenswrapper[4696]: I1202 22:44:28.430903 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:44:28 crc kubenswrapper[4696]: I1202 22:44:28.430961 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q9bfc" Dec 02 22:44:28 crc kubenswrapper[4696]: I1202 22:44:28.431054 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:44:28 crc kubenswrapper[4696]: I1202 22:44:28.431125 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:44:28 crc kubenswrapper[4696]: E1202 22:44:28.431062 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 22:44:28 crc kubenswrapper[4696]: E1202 22:44:28.431361 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q9bfc" podUID="ad00195c-ef4c-4d9b-941c-d01ebc498593" Dec 02 22:44:28 crc kubenswrapper[4696]: E1202 22:44:28.431408 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 22:44:28 crc kubenswrapper[4696]: E1202 22:44:28.431536 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 22:44:30 crc kubenswrapper[4696]: I1202 22:44:30.430855 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q9bfc" Dec 02 22:44:30 crc kubenswrapper[4696]: I1202 22:44:30.430856 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:44:30 crc kubenswrapper[4696]: I1202 22:44:30.431126 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:44:30 crc kubenswrapper[4696]: E1202 22:44:30.431010 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q9bfc" podUID="ad00195c-ef4c-4d9b-941c-d01ebc498593" Dec 02 22:44:30 crc kubenswrapper[4696]: E1202 22:44:30.431229 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 22:44:30 crc kubenswrapper[4696]: E1202 22:44:30.431486 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 22:44:30 crc kubenswrapper[4696]: I1202 22:44:30.432006 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:44:30 crc kubenswrapper[4696]: E1202 22:44:30.432204 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 22:44:32 crc kubenswrapper[4696]: I1202 22:44:32.431239 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:44:32 crc kubenswrapper[4696]: I1202 22:44:32.431262 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:44:32 crc kubenswrapper[4696]: I1202 22:44:32.431379 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q9bfc" Dec 02 22:44:32 crc kubenswrapper[4696]: E1202 22:44:32.431569 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 22:44:32 crc kubenswrapper[4696]: I1202 22:44:32.431640 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:44:32 crc kubenswrapper[4696]: E1202 22:44:32.431705 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 22:44:32 crc kubenswrapper[4696]: E1202 22:44:32.432122 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 22:44:32 crc kubenswrapper[4696]: E1202 22:44:32.432285 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q9bfc" podUID="ad00195c-ef4c-4d9b-941c-d01ebc498593" Dec 02 22:44:32 crc kubenswrapper[4696]: E1202 22:44:32.565370 4696 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 02 22:44:34 crc kubenswrapper[4696]: I1202 22:44:34.431535 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-q9bfc" Dec 02 22:44:34 crc kubenswrapper[4696]: I1202 22:44:34.431596 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:44:34 crc kubenswrapper[4696]: I1202 22:44:34.431680 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:44:34 crc kubenswrapper[4696]: E1202 22:44:34.431845 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q9bfc" podUID="ad00195c-ef4c-4d9b-941c-d01ebc498593" Dec 02 22:44:34 crc kubenswrapper[4696]: E1202 22:44:34.431966 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 22:44:34 crc kubenswrapper[4696]: I1202 22:44:34.432052 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:44:34 crc kubenswrapper[4696]: E1202 22:44:34.432155 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 22:44:34 crc kubenswrapper[4696]: E1202 22:44:34.432428 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 22:44:35 crc kubenswrapper[4696]: I1202 22:44:35.432241 4696 scope.go:117] "RemoveContainer" containerID="c89495285a741bfa9e535eb22f1110da359462742bc321664ea98562028cf21d" Dec 02 22:44:36 crc kubenswrapper[4696]: I1202 22:44:36.430616 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q9bfc" Dec 02 22:44:36 crc kubenswrapper[4696]: I1202 22:44:36.430764 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:44:36 crc kubenswrapper[4696]: E1202 22:44:36.430798 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q9bfc" podUID="ad00195c-ef4c-4d9b-941c-d01ebc498593" Dec 02 22:44:36 crc kubenswrapper[4696]: I1202 22:44:36.430871 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:44:36 crc kubenswrapper[4696]: I1202 22:44:36.430921 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:44:36 crc kubenswrapper[4696]: E1202 22:44:36.431029 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 22:44:36 crc kubenswrapper[4696]: E1202 22:44:36.431171 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 22:44:36 crc kubenswrapper[4696]: E1202 22:44:36.431531 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 22:44:36 crc kubenswrapper[4696]: I1202 22:44:36.434108 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qb2zq_c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b/ovnkube-controller/3.log" Dec 02 22:44:36 crc kubenswrapper[4696]: I1202 22:44:36.443899 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" event={"ID":"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b","Type":"ContainerStarted","Data":"27ae2799ce649487a5ac78954e82e66a53c94a3525ba06300e8ffa7083c5884c"} Dec 02 22:44:36 crc kubenswrapper[4696]: I1202 22:44:36.445171 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" Dec 02 22:44:36 crc kubenswrapper[4696]: I1202 22:44:36.460587 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-q9bfc"] Dec 02 22:44:36 crc kubenswrapper[4696]: I1202 22:44:36.488908 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" podStartSLOduration=116.48888034 podStartE2EDuration="1m56.48888034s" podCreationTimestamp="2025-12-02 22:42:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 22:44:36.484325195 +0000 UTC m=+139.365005196" watchObservedRunningTime="2025-12-02 22:44:36.48888034 +0000 UTC m=+139.369560341" Dec 02 22:44:37 crc kubenswrapper[4696]: I1202 22:44:37.446597 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-q9bfc" Dec 02 22:44:37 crc kubenswrapper[4696]: E1202 22:44:37.447583 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q9bfc" podUID="ad00195c-ef4c-4d9b-941c-d01ebc498593" Dec 02 22:44:37 crc kubenswrapper[4696]: E1202 22:44:37.566281 4696 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 02 22:44:38 crc kubenswrapper[4696]: I1202 22:44:38.431053 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:44:38 crc kubenswrapper[4696]: I1202 22:44:38.431167 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:44:38 crc kubenswrapper[4696]: I1202 22:44:38.431229 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:44:38 crc kubenswrapper[4696]: E1202 22:44:38.431313 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 22:44:38 crc kubenswrapper[4696]: E1202 22:44:38.431495 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 22:44:38 crc kubenswrapper[4696]: E1202 22:44:38.431619 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 22:44:39 crc kubenswrapper[4696]: I1202 22:44:39.432028 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q9bfc" Dec 02 22:44:39 crc kubenswrapper[4696]: E1202 22:44:39.432210 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q9bfc" podUID="ad00195c-ef4c-4d9b-941c-d01ebc498593" Dec 02 22:44:40 crc kubenswrapper[4696]: I1202 22:44:40.430783 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:44:40 crc kubenswrapper[4696]: I1202 22:44:40.430903 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:44:40 crc kubenswrapper[4696]: I1202 22:44:40.430912 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:44:40 crc kubenswrapper[4696]: E1202 22:44:40.431050 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 22:44:40 crc kubenswrapper[4696]: E1202 22:44:40.431211 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 22:44:40 crc kubenswrapper[4696]: E1202 22:44:40.431471 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 22:44:41 crc kubenswrapper[4696]: I1202 22:44:41.431808 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q9bfc" Dec 02 22:44:41 crc kubenswrapper[4696]: E1202 22:44:41.432597 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q9bfc" podUID="ad00195c-ef4c-4d9b-941c-d01ebc498593" Dec 02 22:44:42 crc kubenswrapper[4696]: I1202 22:44:42.431449 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:44:42 crc kubenswrapper[4696]: E1202 22:44:42.431586 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 22:44:42 crc kubenswrapper[4696]: I1202 22:44:42.431791 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:44:42 crc kubenswrapper[4696]: E1202 22:44:42.431845 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 22:44:42 crc kubenswrapper[4696]: I1202 22:44:42.432101 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:44:42 crc kubenswrapper[4696]: E1202 22:44:42.432335 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 22:44:43 crc kubenswrapper[4696]: I1202 22:44:43.430851 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q9bfc" Dec 02 22:44:43 crc kubenswrapper[4696]: I1202 22:44:43.433087 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 02 22:44:43 crc kubenswrapper[4696]: I1202 22:44:43.434914 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 02 22:44:44 crc kubenswrapper[4696]: I1202 22:44:44.431412 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:44:44 crc kubenswrapper[4696]: I1202 22:44:44.431562 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:44:44 crc kubenswrapper[4696]: I1202 22:44:44.431449 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:44:44 crc kubenswrapper[4696]: I1202 22:44:44.436024 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 02 22:44:44 crc kubenswrapper[4696]: I1202 22:44:44.436156 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 02 22:44:44 crc kubenswrapper[4696]: I1202 22:44:44.436715 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 02 22:44:44 crc kubenswrapper[4696]: I1202 22:44:44.437588 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 02 22:44:46 crc kubenswrapper[4696]: I1202 22:44:46.430521 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 22:44:46 crc kubenswrapper[4696]: E1202 22:44:46.430959 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 22:46:48.430892179 +0000 UTC m=+271.311572310 (durationBeforeRetry 2m2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:46 crc kubenswrapper[4696]: I1202 22:44:46.532127 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:44:46 crc kubenswrapper[4696]: I1202 22:44:46.532207 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:44:46 crc kubenswrapper[4696]: I1202 22:44:46.532254 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:44:46 crc kubenswrapper[4696]: I1202 22:44:46.532308 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:44:46 crc kubenswrapper[4696]: I1202 22:44:46.543783 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:44:46 crc kubenswrapper[4696]: I1202 22:44:46.543970 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:44:46 crc kubenswrapper[4696]: I1202 22:44:46.544977 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:44:46 crc kubenswrapper[4696]: I1202 22:44:46.548864 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:44:46 crc kubenswrapper[4696]: I1202 22:44:46.559999 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:44:47 crc kubenswrapper[4696]: I1202 22:44:47.903530 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.066608 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:44:48 crc kubenswrapper[4696]: W1202 22:44:48.215148 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-e2fd831cad9af82c4606384c8d80ea6f23f1a8fa248133dc395609f05d9a7843 WatchSource:0}: Error finding container e2fd831cad9af82c4606384c8d80ea6f23f1a8fa248133dc395609f05d9a7843: Status 404 returned error can't find the container with id e2fd831cad9af82c4606384c8d80ea6f23f1a8fa248133dc395609f05d9a7843 Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.509717 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"5b81b76bea572afe2661fa56517e28ddc79daf6986bfd33aadda8ae7d0a44276"} Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.510182 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"7d1a3adbabe3a154979fe5c4ef0cf5221447aa4fe44aa88f9649c2aae4d1bf7d"} Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 
22:44:48.512338 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"2ce81baf4636f366789b0ed1f3e83f19b8c9c3b9bda256d376f8251fceefad24"} Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.512362 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"c29e229ebacf7e9a1b181da8a1cb84ff9ebaa00c2e172577b42c7a98d59542c1"} Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.512988 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.514628 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"ac8047473a64158e5781a749946d8f263ae822fd3909b3a48a9a450e18326157"} Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.514680 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"e2fd831cad9af82c4606384c8d80ea6f23f1a8fa248133dc395609f05d9a7843"} Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.706701 4696 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.739066 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-p44hz"] Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.739524 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-p44hz" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.743579 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.743869 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pstpv"] Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.744021 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.744159 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.744587 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pstpv" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.745535 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.745559 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.746196 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-s5dx8"] Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.746860 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-s5dx8" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.747081 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-8fxbs"] Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.748183 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-8fxbs" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.748711 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-9gpsg"] Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.749246 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9gpsg" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.751071 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.751555 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-rttcw"] Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.752304 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-rttcw" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.752436 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-v6k2l"] Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.752772 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-v6k2l" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.754083 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vt2b7"] Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.754657 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vt2b7" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.759997 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.760056 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.760458 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.760638 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.760785 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.761126 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.761438 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.761625 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 02 
22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.761786 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.762082 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.762302 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.762586 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.763393 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/9e20d7fd-98b9-4f31-aa29-b8069f8d2ea3-encryption-config\") pod \"apiserver-7bbb656c7d-9gpsg\" (UID: \"9e20d7fd-98b9-4f31-aa29-b8069f8d2ea3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9gpsg" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.763442 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/092e9b7e-6772-4cde-89b7-de81ae50222e-audit-dir\") pod \"apiserver-76f77b778f-8fxbs\" (UID: \"092e9b7e-6772-4cde-89b7-de81ae50222e\") " pod="openshift-apiserver/apiserver-76f77b778f-8fxbs" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.763468 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9e20d7fd-98b9-4f31-aa29-b8069f8d2ea3-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-9gpsg\" (UID: \"9e20d7fd-98b9-4f31-aa29-b8069f8d2ea3\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9gpsg" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.763491 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/93e854b6-0bab-4aa3-9d60-97542cd304eb-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-v6k2l\" (UID: \"93e854b6-0bab-4aa3-9d60-97542cd304eb\") " pod="openshift-authentication/oauth-openshift-558db77b4-v6k2l" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.763516 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/092e9b7e-6772-4cde-89b7-de81ae50222e-config\") pod \"apiserver-76f77b778f-8fxbs\" (UID: \"092e9b7e-6772-4cde-89b7-de81ae50222e\") " pod="openshift-apiserver/apiserver-76f77b778f-8fxbs" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.763537 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/93e854b6-0bab-4aa3-9d60-97542cd304eb-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-v6k2l\" (UID: \"93e854b6-0bab-4aa3-9d60-97542cd304eb\") " pod="openshift-authentication/oauth-openshift-558db77b4-v6k2l" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.763557 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkss8\" (UniqueName: \"kubernetes.io/projected/93e854b6-0bab-4aa3-9d60-97542cd304eb-kube-api-access-rkss8\") pod \"oauth-openshift-558db77b4-v6k2l\" (UID: \"93e854b6-0bab-4aa3-9d60-97542cd304eb\") " pod="openshift-authentication/oauth-openshift-558db77b4-v6k2l" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.763579 4696 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed298584-c8e2-43c3-88f1-d95aea472e00-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-vt2b7\" (UID: \"ed298584-c8e2-43c3-88f1-d95aea472e00\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vt2b7" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.763603 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/93e854b6-0bab-4aa3-9d60-97542cd304eb-audit-policies\") pod \"oauth-openshift-558db77b4-v6k2l\" (UID: \"93e854b6-0bab-4aa3-9d60-97542cd304eb\") " pod="openshift-authentication/oauth-openshift-558db77b4-v6k2l" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.763626 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8ef3caa5-75cd-444b-aa84-9116ea9ce1cd-serving-cert\") pod \"controller-manager-879f6c89f-p44hz\" (UID: \"8ef3caa5-75cd-444b-aa84-9116ea9ce1cd\") " pod="openshift-controller-manager/controller-manager-879f6c89f-p44hz" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.763648 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed298584-c8e2-43c3-88f1-d95aea472e00-config\") pod \"openshift-apiserver-operator-796bbdcf4f-vt2b7\" (UID: \"ed298584-c8e2-43c3-88f1-d95aea472e00\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vt2b7" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.763671 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9e20d7fd-98b9-4f31-aa29-b8069f8d2ea3-serving-cert\") pod \"apiserver-7bbb656c7d-9gpsg\" (UID: 
\"9e20d7fd-98b9-4f31-aa29-b8069f8d2ea3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9gpsg" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.763691 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/092e9b7e-6772-4cde-89b7-de81ae50222e-etcd-serving-ca\") pod \"apiserver-76f77b778f-8fxbs\" (UID: \"092e9b7e-6772-4cde-89b7-de81ae50222e\") " pod="openshift-apiserver/apiserver-76f77b778f-8fxbs" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.763715 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/93e854b6-0bab-4aa3-9d60-97542cd304eb-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-v6k2l\" (UID: \"93e854b6-0bab-4aa3-9d60-97542cd304eb\") " pod="openshift-authentication/oauth-openshift-558db77b4-v6k2l" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.763759 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5vmz\" (UniqueName: \"kubernetes.io/projected/9e20d7fd-98b9-4f31-aa29-b8069f8d2ea3-kube-api-access-m5vmz\") pod \"apiserver-7bbb656c7d-9gpsg\" (UID: \"9e20d7fd-98b9-4f31-aa29-b8069f8d2ea3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9gpsg" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.763782 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/092e9b7e-6772-4cde-89b7-de81ae50222e-image-import-ca\") pod \"apiserver-76f77b778f-8fxbs\" (UID: \"092e9b7e-6772-4cde-89b7-de81ae50222e\") " pod="openshift-apiserver/apiserver-76f77b778f-8fxbs" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.763807 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"etcd-client\" (UniqueName: \"kubernetes.io/secret/092e9b7e-6772-4cde-89b7-de81ae50222e-etcd-client\") pod \"apiserver-76f77b778f-8fxbs\" (UID: \"092e9b7e-6772-4cde-89b7-de81ae50222e\") " pod="openshift-apiserver/apiserver-76f77b778f-8fxbs" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.763837 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5r4l\" (UniqueName: \"kubernetes.io/projected/ed298584-c8e2-43c3-88f1-d95aea472e00-kube-api-access-g5r4l\") pod \"openshift-apiserver-operator-796bbdcf4f-vt2b7\" (UID: \"ed298584-c8e2-43c3-88f1-d95aea472e00\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vt2b7" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.763864 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/ffa64292-b071-4bfc-93d6-70d65b00847d-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-rttcw\" (UID: \"ffa64292-b071-4bfc-93d6-70d65b00847d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-rttcw" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.763941 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/93e854b6-0bab-4aa3-9d60-97542cd304eb-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-v6k2l\" (UID: \"93e854b6-0bab-4aa3-9d60-97542cd304eb\") " pod="openshift-authentication/oauth-openshift-558db77b4-v6k2l" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.764003 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e7122fdb-f764-4228-9a2e-2c3aedd5b4fb-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-pstpv\" (UID: 
\"e7122fdb-f764-4228-9a2e-2c3aedd5b4fb\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pstpv" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.764086 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ffa64292-b071-4bfc-93d6-70d65b00847d-images\") pod \"machine-api-operator-5694c8668f-rttcw\" (UID: \"ffa64292-b071-4bfc-93d6-70d65b00847d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-rttcw" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.764120 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/93e854b6-0bab-4aa3-9d60-97542cd304eb-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-v6k2l\" (UID: \"93e854b6-0bab-4aa3-9d60-97542cd304eb\") " pod="openshift-authentication/oauth-openshift-558db77b4-v6k2l" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.764155 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/092e9b7e-6772-4cde-89b7-de81ae50222e-audit\") pod \"apiserver-76f77b778f-8fxbs\" (UID: \"092e9b7e-6772-4cde-89b7-de81ae50222e\") " pod="openshift-apiserver/apiserver-76f77b778f-8fxbs" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.764203 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9e20d7fd-98b9-4f31-aa29-b8069f8d2ea3-audit-policies\") pod \"apiserver-7bbb656c7d-9gpsg\" (UID: \"9e20d7fd-98b9-4f31-aa29-b8069f8d2ea3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9gpsg" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.764230 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" 
(UniqueName: \"kubernetes.io/secret/9e20d7fd-98b9-4f31-aa29-b8069f8d2ea3-etcd-client\") pod \"apiserver-7bbb656c7d-9gpsg\" (UID: \"9e20d7fd-98b9-4f31-aa29-b8069f8d2ea3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9gpsg" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.764253 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmjxz\" (UniqueName: \"kubernetes.io/projected/e7122fdb-f764-4228-9a2e-2c3aedd5b4fb-kube-api-access-cmjxz\") pod \"cluster-samples-operator-665b6dd947-pstpv\" (UID: \"e7122fdb-f764-4228-9a2e-2c3aedd5b4fb\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pstpv" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.764294 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfwtq\" (UniqueName: \"kubernetes.io/projected/8ef3caa5-75cd-444b-aa84-9116ea9ce1cd-kube-api-access-rfwtq\") pod \"controller-manager-879f6c89f-p44hz\" (UID: \"8ef3caa5-75cd-444b-aa84-9116ea9ce1cd\") " pod="openshift-controller-manager/controller-manager-879f6c89f-p44hz" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.764322 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/092e9b7e-6772-4cde-89b7-de81ae50222e-node-pullsecrets\") pod \"apiserver-76f77b778f-8fxbs\" (UID: \"092e9b7e-6772-4cde-89b7-de81ae50222e\") " pod="openshift-apiserver/apiserver-76f77b778f-8fxbs" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.764343 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/93e854b6-0bab-4aa3-9d60-97542cd304eb-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-v6k2l\" (UID: \"93e854b6-0bab-4aa3-9d60-97542cd304eb\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-v6k2l" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.764370 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8ef3caa5-75cd-444b-aa84-9116ea9ce1cd-client-ca\") pod \"controller-manager-879f6c89f-p44hz\" (UID: \"8ef3caa5-75cd-444b-aa84-9116ea9ce1cd\") " pod="openshift-controller-manager/controller-manager-879f6c89f-p44hz" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.764396 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b1ef23a3-2b42-4fa9-8311-32f77bb7c8c0-serving-cert\") pod \"openshift-config-operator-7777fb866f-s5dx8\" (UID: \"b1ef23a3-2b42-4fa9-8311-32f77bb7c8c0\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-s5dx8" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.764421 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4w22b\" (UniqueName: \"kubernetes.io/projected/ffa64292-b071-4bfc-93d6-70d65b00847d-kube-api-access-4w22b\") pod \"machine-api-operator-5694c8668f-rttcw\" (UID: \"ffa64292-b071-4bfc-93d6-70d65b00847d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-rttcw" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.764443 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/93e854b6-0bab-4aa3-9d60-97542cd304eb-audit-dir\") pod \"oauth-openshift-558db77b4-v6k2l\" (UID: \"93e854b6-0bab-4aa3-9d60-97542cd304eb\") " pod="openshift-authentication/oauth-openshift-558db77b4-v6k2l" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.764463 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/93e854b6-0bab-4aa3-9d60-97542cd304eb-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-v6k2l\" (UID: \"93e854b6-0bab-4aa3-9d60-97542cd304eb\") " pod="openshift-authentication/oauth-openshift-558db77b4-v6k2l" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.764534 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/092e9b7e-6772-4cde-89b7-de81ae50222e-serving-cert\") pod \"apiserver-76f77b778f-8fxbs\" (UID: \"092e9b7e-6772-4cde-89b7-de81ae50222e\") " pod="openshift-apiserver/apiserver-76f77b778f-8fxbs" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.764560 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/092e9b7e-6772-4cde-89b7-de81ae50222e-encryption-config\") pod \"apiserver-76f77b778f-8fxbs\" (UID: \"092e9b7e-6772-4cde-89b7-de81ae50222e\") " pod="openshift-apiserver/apiserver-76f77b778f-8fxbs" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.764585 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/b1ef23a3-2b42-4fa9-8311-32f77bb7c8c0-available-featuregates\") pod \"openshift-config-operator-7777fb866f-s5dx8\" (UID: \"b1ef23a3-2b42-4fa9-8311-32f77bb7c8c0\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-s5dx8" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.764616 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/93e854b6-0bab-4aa3-9d60-97542cd304eb-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-v6k2l\" (UID: \"93e854b6-0bab-4aa3-9d60-97542cd304eb\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-v6k2l" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.764644 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6x66\" (UniqueName: \"kubernetes.io/projected/b1ef23a3-2b42-4fa9-8311-32f77bb7c8c0-kube-api-access-g6x66\") pod \"openshift-config-operator-7777fb866f-s5dx8\" (UID: \"b1ef23a3-2b42-4fa9-8311-32f77bb7c8c0\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-s5dx8" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.764677 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/092e9b7e-6772-4cde-89b7-de81ae50222e-trusted-ca-bundle\") pod \"apiserver-76f77b778f-8fxbs\" (UID: \"092e9b7e-6772-4cde-89b7-de81ae50222e\") " pod="openshift-apiserver/apiserver-76f77b778f-8fxbs" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.764701 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/93e854b6-0bab-4aa3-9d60-97542cd304eb-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-v6k2l\" (UID: \"93e854b6-0bab-4aa3-9d60-97542cd304eb\") " pod="openshift-authentication/oauth-openshift-558db77b4-v6k2l" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.764726 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/93e854b6-0bab-4aa3-9d60-97542cd304eb-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-v6k2l\" (UID: \"93e854b6-0bab-4aa3-9d60-97542cd304eb\") " pod="openshift-authentication/oauth-openshift-558db77b4-v6k2l" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.764827 4696 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ef3caa5-75cd-444b-aa84-9116ea9ce1cd-config\") pod \"controller-manager-879f6c89f-p44hz\" (UID: \"8ef3caa5-75cd-444b-aa84-9116ea9ce1cd\") " pod="openshift-controller-manager/controller-manager-879f6c89f-p44hz" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.764910 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/9e20d7fd-98b9-4f31-aa29-b8069f8d2ea3-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-9gpsg\" (UID: \"9e20d7fd-98b9-4f31-aa29-b8069f8d2ea3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9gpsg" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.764939 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8ef3caa5-75cd-444b-aa84-9116ea9ce1cd-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-p44hz\" (UID: \"8ef3caa5-75cd-444b-aa84-9116ea9ce1cd\") " pod="openshift-controller-manager/controller-manager-879f6c89f-p44hz" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.764995 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/93e854b6-0bab-4aa3-9d60-97542cd304eb-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-v6k2l\" (UID: \"93e854b6-0bab-4aa3-9d60-97542cd304eb\") " pod="openshift-authentication/oauth-openshift-558db77b4-v6k2l" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.765021 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9e20d7fd-98b9-4f31-aa29-b8069f8d2ea3-audit-dir\") pod \"apiserver-7bbb656c7d-9gpsg\" (UID: 
\"9e20d7fd-98b9-4f31-aa29-b8069f8d2ea3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9gpsg" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.765056 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bbwp\" (UniqueName: \"kubernetes.io/projected/092e9b7e-6772-4cde-89b7-de81ae50222e-kube-api-access-7bbwp\") pod \"apiserver-76f77b778f-8fxbs\" (UID: \"092e9b7e-6772-4cde-89b7-de81ae50222e\") " pod="openshift-apiserver/apiserver-76f77b778f-8fxbs" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.765083 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ffa64292-b071-4bfc-93d6-70d65b00847d-config\") pod \"machine-api-operator-5694c8668f-rttcw\" (UID: \"ffa64292-b071-4bfc-93d6-70d65b00847d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-rttcw" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.766782 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-zpshp"] Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.767283 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-f6wj6"] Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.767614 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-f6wj6" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.768077 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-zpshp" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.773577 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.774570 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.774630 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.776652 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dkb9g"] Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.777719 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dkb9g" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.782304 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.783350 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.783685 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.784017 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.784139 4696 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication"/"kube-root-ca.crt" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.784224 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.784292 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.784381 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.784416 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.784463 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.784516 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.784621 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.785084 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-jhcjw"] Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.785614 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-jhcjw" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.786611 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.786800 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.786925 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.787062 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.787087 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.787205 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.787274 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.787444 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.787552 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.787610 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.787684 4696 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.787831 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.787990 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.788137 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.788231 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ph95v"] Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.788322 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.788401 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.788434 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.788488 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.787215 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.788596 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ph95v" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.787998 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.788607 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.788708 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.788724 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.788820 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.788844 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.788902 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.788853 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.788914 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.794931 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.795148 4696 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-console-operator"/"serving-cert" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.795274 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.795396 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.795540 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.795812 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.795953 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.796161 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.819593 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.821497 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.821558 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.822850 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.823522 4696 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.854091 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.854709 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.854851 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.855143 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-4cx4s"] Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.855398 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.855916 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-rjkp2"] Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.856308 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-rjkp2" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.856505 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4cx4s" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.857321 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.858390 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.865896 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.866122 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9e20d7fd-98b9-4f31-aa29-b8069f8d2ea3-audit-dir\") pod \"apiserver-7bbb656c7d-9gpsg\" (UID: \"9e20d7fd-98b9-4f31-aa29-b8069f8d2ea3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9gpsg" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.866175 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.866440 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.866694 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.866813 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.867046 4696 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.867145 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.866172 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/93e854b6-0bab-4aa3-9d60-97542cd304eb-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-v6k2l\" (UID: \"93e854b6-0bab-4aa3-9d60-97542cd304eb\") " pod="openshift-authentication/oauth-openshift-558db77b4-v6k2l" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.867336 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7bbwp\" (UniqueName: \"kubernetes.io/projected/092e9b7e-6772-4cde-89b7-de81ae50222e-kube-api-access-7bbwp\") pod \"apiserver-76f77b778f-8fxbs\" (UID: \"092e9b7e-6772-4cde-89b7-de81ae50222e\") " pod="openshift-apiserver/apiserver-76f77b778f-8fxbs" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.867373 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ffa64292-b071-4bfc-93d6-70d65b00847d-config\") pod \"machine-api-operator-5694c8668f-rttcw\" (UID: \"ffa64292-b071-4bfc-93d6-70d65b00847d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-rttcw" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.867411 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cab80860-b375-43ce-9df7-16ed59a8247a-oauth-serving-cert\") pod \"console-f9d7485db-f6wj6\" (UID: \"cab80860-b375-43ce-9df7-16ed59a8247a\") " pod="openshift-console/console-f9d7485db-f6wj6" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.867453 4696 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/9e20d7fd-98b9-4f31-aa29-b8069f8d2ea3-encryption-config\") pod \"apiserver-7bbb656c7d-9gpsg\" (UID: \"9e20d7fd-98b9-4f31-aa29-b8069f8d2ea3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9gpsg" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.867479 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9e20d7fd-98b9-4f31-aa29-b8069f8d2ea3-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-9gpsg\" (UID: \"9e20d7fd-98b9-4f31-aa29-b8069f8d2ea3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9gpsg" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.867502 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/092e9b7e-6772-4cde-89b7-de81ae50222e-audit-dir\") pod \"apiserver-76f77b778f-8fxbs\" (UID: \"092e9b7e-6772-4cde-89b7-de81ae50222e\") " pod="openshift-apiserver/apiserver-76f77b778f-8fxbs" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.867531 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/88fbd4d8-6770-4812-aded-c20e16d0e24b-serving-cert\") pod \"console-operator-58897d9998-zpshp\" (UID: \"88fbd4d8-6770-4812-aded-c20e16d0e24b\") " pod="openshift-console-operator/console-operator-58897d9998-zpshp" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.867558 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/93e854b6-0bab-4aa3-9d60-97542cd304eb-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-v6k2l\" (UID: \"93e854b6-0bab-4aa3-9d60-97542cd304eb\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-v6k2l" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.867584 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cab80860-b375-43ce-9df7-16ed59a8247a-console-config\") pod \"console-f9d7485db-f6wj6\" (UID: \"cab80860-b375-43ce-9df7-16ed59a8247a\") " pod="openshift-console/console-f9d7485db-f6wj6" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.867612 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/092e9b7e-6772-4cde-89b7-de81ae50222e-config\") pod \"apiserver-76f77b778f-8fxbs\" (UID: \"092e9b7e-6772-4cde-89b7-de81ae50222e\") " pod="openshift-apiserver/apiserver-76f77b778f-8fxbs" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.867635 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/93e854b6-0bab-4aa3-9d60-97542cd304eb-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-v6k2l\" (UID: \"93e854b6-0bab-4aa3-9d60-97542cd304eb\") " pod="openshift-authentication/oauth-openshift-558db77b4-v6k2l" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.867673 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rkss8\" (UniqueName: \"kubernetes.io/projected/93e854b6-0bab-4aa3-9d60-97542cd304eb-kube-api-access-rkss8\") pod \"oauth-openshift-558db77b4-v6k2l\" (UID: \"93e854b6-0bab-4aa3-9d60-97542cd304eb\") " pod="openshift-authentication/oauth-openshift-558db77b4-v6k2l" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.867700 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed298584-c8e2-43c3-88f1-d95aea472e00-serving-cert\") pod 
\"openshift-apiserver-operator-796bbdcf4f-vt2b7\" (UID: \"ed298584-c8e2-43c3-88f1-d95aea472e00\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vt2b7" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.867722 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/93e854b6-0bab-4aa3-9d60-97542cd304eb-audit-policies\") pod \"oauth-openshift-558db77b4-v6k2l\" (UID: \"93e854b6-0bab-4aa3-9d60-97542cd304eb\") " pod="openshift-authentication/oauth-openshift-558db77b4-v6k2l" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.867777 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8ef3caa5-75cd-444b-aa84-9116ea9ce1cd-serving-cert\") pod \"controller-manager-879f6c89f-p44hz\" (UID: \"8ef3caa5-75cd-444b-aa84-9116ea9ce1cd\") " pod="openshift-controller-manager/controller-manager-879f6c89f-p44hz" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.867799 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9e20d7fd-98b9-4f31-aa29-b8069f8d2ea3-serving-cert\") pod \"apiserver-7bbb656c7d-9gpsg\" (UID: \"9e20d7fd-98b9-4f31-aa29-b8069f8d2ea3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9gpsg" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.867820 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/092e9b7e-6772-4cde-89b7-de81ae50222e-etcd-serving-ca\") pod \"apiserver-76f77b778f-8fxbs\" (UID: \"092e9b7e-6772-4cde-89b7-de81ae50222e\") " pod="openshift-apiserver/apiserver-76f77b778f-8fxbs" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.867858 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/93e854b6-0bab-4aa3-9d60-97542cd304eb-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-v6k2l\" (UID: \"93e854b6-0bab-4aa3-9d60-97542cd304eb\") " pod="openshift-authentication/oauth-openshift-558db77b4-v6k2l" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.867887 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed298584-c8e2-43c3-88f1-d95aea472e00-config\") pod \"openshift-apiserver-operator-796bbdcf4f-vt2b7\" (UID: \"ed298584-c8e2-43c3-88f1-d95aea472e00\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vt2b7" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.867914 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5vmz\" (UniqueName: \"kubernetes.io/projected/9e20d7fd-98b9-4f31-aa29-b8069f8d2ea3-kube-api-access-m5vmz\") pod \"apiserver-7bbb656c7d-9gpsg\" (UID: \"9e20d7fd-98b9-4f31-aa29-b8069f8d2ea3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9gpsg" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.867937 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/092e9b7e-6772-4cde-89b7-de81ae50222e-image-import-ca\") pod \"apiserver-76f77b778f-8fxbs\" (UID: \"092e9b7e-6772-4cde-89b7-de81ae50222e\") " pod="openshift-apiserver/apiserver-76f77b778f-8fxbs" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.867963 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/092e9b7e-6772-4cde-89b7-de81ae50222e-etcd-client\") pod \"apiserver-76f77b778f-8fxbs\" (UID: \"092e9b7e-6772-4cde-89b7-de81ae50222e\") " pod="openshift-apiserver/apiserver-76f77b778f-8fxbs" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.867987 4696 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e7122fdb-f764-4228-9a2e-2c3aedd5b4fb-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-pstpv\" (UID: \"e7122fdb-f764-4228-9a2e-2c3aedd5b4fb\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pstpv" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.868011 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5r4l\" (UniqueName: \"kubernetes.io/projected/ed298584-c8e2-43c3-88f1-d95aea472e00-kube-api-access-g5r4l\") pod \"openshift-apiserver-operator-796bbdcf4f-vt2b7\" (UID: \"ed298584-c8e2-43c3-88f1-d95aea472e00\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vt2b7" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.868069 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/ffa64292-b071-4bfc-93d6-70d65b00847d-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-rttcw\" (UID: \"ffa64292-b071-4bfc-93d6-70d65b00847d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-rttcw" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.868110 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/93e854b6-0bab-4aa3-9d60-97542cd304eb-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-v6k2l\" (UID: \"93e854b6-0bab-4aa3-9d60-97542cd304eb\") " pod="openshift-authentication/oauth-openshift-558db77b4-v6k2l" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.868135 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnbnm\" (UniqueName: \"kubernetes.io/projected/88fbd4d8-6770-4812-aded-c20e16d0e24b-kube-api-access-tnbnm\") pod \"console-operator-58897d9998-zpshp\" (UID: 
\"88fbd4d8-6770-4812-aded-c20e16d0e24b\") " pod="openshift-console-operator/console-operator-58897d9998-zpshp" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.868156 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c9a9ec1-4fa9-4d41-84e2-08a822f546c2-config\") pod \"route-controller-manager-6576b87f9c-ph95v\" (UID: \"2c9a9ec1-4fa9-4d41-84e2-08a822f546c2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ph95v" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.868183 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ffa64292-b071-4bfc-93d6-70d65b00847d-images\") pod \"machine-api-operator-5694c8668f-rttcw\" (UID: \"ffa64292-b071-4bfc-93d6-70d65b00847d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-rttcw" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.868205 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/93e854b6-0bab-4aa3-9d60-97542cd304eb-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-v6k2l\" (UID: \"93e854b6-0bab-4aa3-9d60-97542cd304eb\") " pod="openshift-authentication/oauth-openshift-558db77b4-v6k2l" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.868228 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/092e9b7e-6772-4cde-89b7-de81ae50222e-audit\") pod \"apiserver-76f77b778f-8fxbs\" (UID: \"092e9b7e-6772-4cde-89b7-de81ae50222e\") " pod="openshift-apiserver/apiserver-76f77b778f-8fxbs" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.868249 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/9e20d7fd-98b9-4f31-aa29-b8069f8d2ea3-audit-policies\") pod \"apiserver-7bbb656c7d-9gpsg\" (UID: \"9e20d7fd-98b9-4f31-aa29-b8069f8d2ea3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9gpsg" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.868273 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9e20d7fd-98b9-4f31-aa29-b8069f8d2ea3-etcd-client\") pod \"apiserver-7bbb656c7d-9gpsg\" (UID: \"9e20d7fd-98b9-4f31-aa29-b8069f8d2ea3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9gpsg" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.868295 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmjxz\" (UniqueName: \"kubernetes.io/projected/e7122fdb-f764-4228-9a2e-2c3aedd5b4fb-kube-api-access-cmjxz\") pod \"cluster-samples-operator-665b6dd947-pstpv\" (UID: \"e7122fdb-f764-4228-9a2e-2c3aedd5b4fb\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pstpv" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.868322 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/93e854b6-0bab-4aa3-9d60-97542cd304eb-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-v6k2l\" (UID: \"93e854b6-0bab-4aa3-9d60-97542cd304eb\") " pod="openshift-authentication/oauth-openshift-558db77b4-v6k2l" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.868343 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlj69\" (UniqueName: \"kubernetes.io/projected/3f45ac7c-8865-4924-8dbd-5826a21d028e-kube-api-access-jlj69\") pod \"downloads-7954f5f757-jhcjw\" (UID: \"3f45ac7c-8865-4924-8dbd-5826a21d028e\") " pod="openshift-console/downloads-7954f5f757-jhcjw" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.868363 
4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cab80860-b375-43ce-9df7-16ed59a8247a-trusted-ca-bundle\") pod \"console-f9d7485db-f6wj6\" (UID: \"cab80860-b375-43ce-9df7-16ed59a8247a\") " pod="openshift-console/console-f9d7485db-f6wj6" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.868387 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rfwtq\" (UniqueName: \"kubernetes.io/projected/8ef3caa5-75cd-444b-aa84-9116ea9ce1cd-kube-api-access-rfwtq\") pod \"controller-manager-879f6c89f-p44hz\" (UID: \"8ef3caa5-75cd-444b-aa84-9116ea9ce1cd\") " pod="openshift-controller-manager/controller-manager-879f6c89f-p44hz" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.868413 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/092e9b7e-6772-4cde-89b7-de81ae50222e-node-pullsecrets\") pod \"apiserver-76f77b778f-8fxbs\" (UID: \"092e9b7e-6772-4cde-89b7-de81ae50222e\") " pod="openshift-apiserver/apiserver-76f77b778f-8fxbs" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.868437 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4w22b\" (UniqueName: \"kubernetes.io/projected/ffa64292-b071-4bfc-93d6-70d65b00847d-kube-api-access-4w22b\") pod \"machine-api-operator-5694c8668f-rttcw\" (UID: \"ffa64292-b071-4bfc-93d6-70d65b00847d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-rttcw" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.868458 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/93e854b6-0bab-4aa3-9d60-97542cd304eb-audit-dir\") pod \"oauth-openshift-558db77b4-v6k2l\" (UID: \"93e854b6-0bab-4aa3-9d60-97542cd304eb\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-v6k2l" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.868478 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/93e854b6-0bab-4aa3-9d60-97542cd304eb-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-v6k2l\" (UID: \"93e854b6-0bab-4aa3-9d60-97542cd304eb\") " pod="openshift-authentication/oauth-openshift-558db77b4-v6k2l" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.868502 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8ef3caa5-75cd-444b-aa84-9116ea9ce1cd-client-ca\") pod \"controller-manager-879f6c89f-p44hz\" (UID: \"8ef3caa5-75cd-444b-aa84-9116ea9ce1cd\") " pod="openshift-controller-manager/controller-manager-879f6c89f-p44hz" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.868523 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b1ef23a3-2b42-4fa9-8311-32f77bb7c8c0-serving-cert\") pod \"openshift-config-operator-7777fb866f-s5dx8\" (UID: \"b1ef23a3-2b42-4fa9-8311-32f77bb7c8c0\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-s5dx8" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.868553 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/092e9b7e-6772-4cde-89b7-de81ae50222e-serving-cert\") pod \"apiserver-76f77b778f-8fxbs\" (UID: \"092e9b7e-6772-4cde-89b7-de81ae50222e\") " pod="openshift-apiserver/apiserver-76f77b778f-8fxbs" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.868575 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/092e9b7e-6772-4cde-89b7-de81ae50222e-encryption-config\") pod 
\"apiserver-76f77b778f-8fxbs\" (UID: \"092e9b7e-6772-4cde-89b7-de81ae50222e\") " pod="openshift-apiserver/apiserver-76f77b778f-8fxbs" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.868596 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/b1ef23a3-2b42-4fa9-8311-32f77bb7c8c0-available-featuregates\") pod \"openshift-config-operator-7777fb866f-s5dx8\" (UID: \"b1ef23a3-2b42-4fa9-8311-32f77bb7c8c0\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-s5dx8" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.868623 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/93e854b6-0bab-4aa3-9d60-97542cd304eb-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-v6k2l\" (UID: \"93e854b6-0bab-4aa3-9d60-97542cd304eb\") " pod="openshift-authentication/oauth-openshift-558db77b4-v6k2l" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.868648 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6x66\" (UniqueName: \"kubernetes.io/projected/b1ef23a3-2b42-4fa9-8311-32f77bb7c8c0-kube-api-access-g6x66\") pod \"openshift-config-operator-7777fb866f-s5dx8\" (UID: \"b1ef23a3-2b42-4fa9-8311-32f77bb7c8c0\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-s5dx8" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.868671 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2c9a9ec1-4fa9-4d41-84e2-08a822f546c2-serving-cert\") pod \"route-controller-manager-6576b87f9c-ph95v\" (UID: \"2c9a9ec1-4fa9-4d41-84e2-08a822f546c2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ph95v" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.868693 4696 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cab80860-b375-43ce-9df7-16ed59a8247a-console-serving-cert\") pod \"console-f9d7485db-f6wj6\" (UID: \"cab80860-b375-43ce-9df7-16ed59a8247a\") " pod="openshift-console/console-f9d7485db-f6wj6" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.868733 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/88fbd4d8-6770-4812-aded-c20e16d0e24b-trusted-ca\") pod \"console-operator-58897d9998-zpshp\" (UID: \"88fbd4d8-6770-4812-aded-c20e16d0e24b\") " pod="openshift-console-operator/console-operator-58897d9998-zpshp" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.868773 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/092e9b7e-6772-4cde-89b7-de81ae50222e-trusted-ca-bundle\") pod \"apiserver-76f77b778f-8fxbs\" (UID: \"092e9b7e-6772-4cde-89b7-de81ae50222e\") " pod="openshift-apiserver/apiserver-76f77b778f-8fxbs" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.868797 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/93e854b6-0bab-4aa3-9d60-97542cd304eb-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-v6k2l\" (UID: \"93e854b6-0bab-4aa3-9d60-97542cd304eb\") " pod="openshift-authentication/oauth-openshift-558db77b4-v6k2l" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.868818 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/88fbd4d8-6770-4812-aded-c20e16d0e24b-config\") pod \"console-operator-58897d9998-zpshp\" (UID: \"88fbd4d8-6770-4812-aded-c20e16d0e24b\") " 
pod="openshift-console-operator/console-operator-58897d9998-zpshp" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.868841 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ef3caa5-75cd-444b-aa84-9116ea9ce1cd-config\") pod \"controller-manager-879f6c89f-p44hz\" (UID: \"8ef3caa5-75cd-444b-aa84-9116ea9ce1cd\") " pod="openshift-controller-manager/controller-manager-879f6c89f-p44hz" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.868862 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/93e854b6-0bab-4aa3-9d60-97542cd304eb-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-v6k2l\" (UID: \"93e854b6-0bab-4aa3-9d60-97542cd304eb\") " pod="openshift-authentication/oauth-openshift-558db77b4-v6k2l" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.868883 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2c9a9ec1-4fa9-4d41-84e2-08a822f546c2-client-ca\") pod \"route-controller-manager-6576b87f9c-ph95v\" (UID: \"2c9a9ec1-4fa9-4d41-84e2-08a822f546c2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ph95v" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.868903 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wmm4\" (UniqueName: \"kubernetes.io/projected/2c9a9ec1-4fa9-4d41-84e2-08a822f546c2-kube-api-access-2wmm4\") pod \"route-controller-manager-6576b87f9c-ph95v\" (UID: \"2c9a9ec1-4fa9-4d41-84e2-08a822f546c2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ph95v" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.868923 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"service-ca\" (UniqueName: \"kubernetes.io/configmap/cab80860-b375-43ce-9df7-16ed59a8247a-service-ca\") pod \"console-f9d7485db-f6wj6\" (UID: \"cab80860-b375-43ce-9df7-16ed59a8247a\") " pod="openshift-console/console-f9d7485db-f6wj6" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.868964 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmhvw\" (UniqueName: \"kubernetes.io/projected/cab80860-b375-43ce-9df7-16ed59a8247a-kube-api-access-hmhvw\") pod \"console-f9d7485db-f6wj6\" (UID: \"cab80860-b375-43ce-9df7-16ed59a8247a\") " pod="openshift-console/console-f9d7485db-f6wj6" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.868988 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/9e20d7fd-98b9-4f31-aa29-b8069f8d2ea3-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-9gpsg\" (UID: \"9e20d7fd-98b9-4f31-aa29-b8069f8d2ea3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9gpsg" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.869010 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8ef3caa5-75cd-444b-aa84-9116ea9ce1cd-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-p44hz\" (UID: \"8ef3caa5-75cd-444b-aa84-9116ea9ce1cd\") " pod="openshift-controller-manager/controller-manager-879f6c89f-p44hz" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.869036 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cab80860-b375-43ce-9df7-16ed59a8247a-console-oauth-config\") pod \"console-f9d7485db-f6wj6\" (UID: \"cab80860-b375-43ce-9df7-16ed59a8247a\") " pod="openshift-console/console-f9d7485db-f6wj6" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.869114 4696 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ffa64292-b071-4bfc-93d6-70d65b00847d-config\") pod \"machine-api-operator-5694c8668f-rttcw\" (UID: \"ffa64292-b071-4bfc-93d6-70d65b00847d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-rttcw" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.867408 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-p6njc"] Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.869691 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-p6njc" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.869965 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/092e9b7e-6772-4cde-89b7-de81ae50222e-audit-dir\") pod \"apiserver-76f77b778f-8fxbs\" (UID: \"092e9b7e-6772-4cde-89b7-de81ae50222e\") " pod="openshift-apiserver/apiserver-76f77b778f-8fxbs" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.870076 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/092e9b7e-6772-4cde-89b7-de81ae50222e-config\") pod \"apiserver-76f77b778f-8fxbs\" (UID: \"092e9b7e-6772-4cde-89b7-de81ae50222e\") " pod="openshift-apiserver/apiserver-76f77b778f-8fxbs" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.870571 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9e20d7fd-98b9-4f31-aa29-b8069f8d2ea3-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-9gpsg\" (UID: \"9e20d7fd-98b9-4f31-aa29-b8069f8d2ea3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9gpsg" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.871468 4696 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/092e9b7e-6772-4cde-89b7-de81ae50222e-node-pullsecrets\") pod \"apiserver-76f77b778f-8fxbs\" (UID: \"092e9b7e-6772-4cde-89b7-de81ae50222e\") " pod="openshift-apiserver/apiserver-76f77b778f-8fxbs" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.871581 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/93e854b6-0bab-4aa3-9d60-97542cd304eb-audit-dir\") pod \"oauth-openshift-558db77b4-v6k2l\" (UID: \"93e854b6-0bab-4aa3-9d60-97542cd304eb\") " pod="openshift-authentication/oauth-openshift-558db77b4-v6k2l" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.876528 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.876693 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/93e854b6-0bab-4aa3-9d60-97542cd304eb-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-v6k2l\" (UID: \"93e854b6-0bab-4aa3-9d60-97542cd304eb\") " pod="openshift-authentication/oauth-openshift-558db77b4-v6k2l" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.876724 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9e20d7fd-98b9-4f31-aa29-b8069f8d2ea3-audit-dir\") pod \"apiserver-7bbb656c7d-9gpsg\" (UID: \"9e20d7fd-98b9-4f31-aa29-b8069f8d2ea3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9gpsg" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.877129 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.877249 4696 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.877317 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.878343 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ef3caa5-75cd-444b-aa84-9116ea9ce1cd-config\") pod \"controller-manager-879f6c89f-p44hz\" (UID: \"8ef3caa5-75cd-444b-aa84-9116ea9ce1cd\") " pod="openshift-controller-manager/controller-manager-879f6c89f-p44hz" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.878731 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.887082 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/9e20d7fd-98b9-4f31-aa29-b8069f8d2ea3-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-9gpsg\" (UID: \"9e20d7fd-98b9-4f31-aa29-b8069f8d2ea3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9gpsg" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.887100 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.880124 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8ef3caa5-75cd-444b-aa84-9116ea9ce1cd-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-p44hz\" (UID: \"8ef3caa5-75cd-444b-aa84-9116ea9ce1cd\") " pod="openshift-controller-manager/controller-manager-879f6c89f-p44hz" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.884008 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" 
(UniqueName: \"kubernetes.io/configmap/092e9b7e-6772-4cde-89b7-de81ae50222e-etcd-serving-ca\") pod \"apiserver-76f77b778f-8fxbs\" (UID: \"092e9b7e-6772-4cde-89b7-de81ae50222e\") " pod="openshift-apiserver/apiserver-76f77b778f-8fxbs" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.886547 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/93e854b6-0bab-4aa3-9d60-97542cd304eb-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-v6k2l\" (UID: \"93e854b6-0bab-4aa3-9d60-97542cd304eb\") " pod="openshift-authentication/oauth-openshift-558db77b4-v6k2l" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.887225 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/092e9b7e-6772-4cde-89b7-de81ae50222e-trusted-ca-bundle\") pod \"apiserver-76f77b778f-8fxbs\" (UID: \"092e9b7e-6772-4cde-89b7-de81ae50222e\") " pod="openshift-apiserver/apiserver-76f77b778f-8fxbs" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.878986 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/93e854b6-0bab-4aa3-9d60-97542cd304eb-audit-policies\") pod \"oauth-openshift-558db77b4-v6k2l\" (UID: \"93e854b6-0bab-4aa3-9d60-97542cd304eb\") " pod="openshift-authentication/oauth-openshift-558db77b4-v6k2l" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.887297 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-qpwvl"] Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.887315 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/93e854b6-0bab-4aa3-9d60-97542cd304eb-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-v6k2l\" (UID: 
\"93e854b6-0bab-4aa3-9d60-97542cd304eb\") " pod="openshift-authentication/oauth-openshift-558db77b4-v6k2l" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.887790 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6vz6p"] Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.883549 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed298584-c8e2-43c3-88f1-d95aea472e00-config\") pod \"openshift-apiserver-operator-796bbdcf4f-vt2b7\" (UID: \"ed298584-c8e2-43c3-88f1-d95aea472e00\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vt2b7" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.923855 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/092e9b7e-6772-4cde-89b7-de81ae50222e-audit\") pod \"apiserver-76f77b778f-8fxbs\" (UID: \"092e9b7e-6772-4cde-89b7-de81ae50222e\") " pod="openshift-apiserver/apiserver-76f77b778f-8fxbs" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.924320 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/092e9b7e-6772-4cde-89b7-de81ae50222e-image-import-ca\") pod \"apiserver-76f77b778f-8fxbs\" (UID: \"092e9b7e-6772-4cde-89b7-de81ae50222e\") " pod="openshift-apiserver/apiserver-76f77b778f-8fxbs" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.924506 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9e20d7fd-98b9-4f31-aa29-b8069f8d2ea3-audit-policies\") pod \"apiserver-7bbb656c7d-9gpsg\" (UID: \"9e20d7fd-98b9-4f31-aa29-b8069f8d2ea3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9gpsg" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.927257 4696 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.928120 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.928376 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/b1ef23a3-2b42-4fa9-8311-32f77bb7c8c0-available-featuregates\") pod \"openshift-config-operator-7777fb866f-s5dx8\" (UID: \"b1ef23a3-2b42-4fa9-8311-32f77bb7c8c0\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-s5dx8" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.932278 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.933010 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/93e854b6-0bab-4aa3-9d60-97542cd304eb-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-v6k2l\" (UID: \"93e854b6-0bab-4aa3-9d60-97542cd304eb\") " pod="openshift-authentication/oauth-openshift-558db77b4-v6k2l" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.933171 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8ef3caa5-75cd-444b-aa84-9116ea9ce1cd-client-ca\") pod \"controller-manager-879f6c89f-p44hz\" (UID: \"8ef3caa5-75cd-444b-aa84-9116ea9ce1cd\") " pod="openshift-controller-manager/controller-manager-879f6c89f-p44hz" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.933841 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9e20d7fd-98b9-4f31-aa29-b8069f8d2ea3-serving-cert\") pod 
\"apiserver-7bbb656c7d-9gpsg\" (UID: \"9e20d7fd-98b9-4f31-aa29-b8069f8d2ea3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9gpsg" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.934422 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.937057 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/93e854b6-0bab-4aa3-9d60-97542cd304eb-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-v6k2l\" (UID: \"93e854b6-0bab-4aa3-9d60-97542cd304eb\") " pod="openshift-authentication/oauth-openshift-558db77b4-v6k2l" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.938690 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/092e9b7e-6772-4cde-89b7-de81ae50222e-etcd-client\") pod \"apiserver-76f77b778f-8fxbs\" (UID: \"092e9b7e-6772-4cde-89b7-de81ae50222e\") " pod="openshift-apiserver/apiserver-76f77b778f-8fxbs" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.939083 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/9e20d7fd-98b9-4f31-aa29-b8069f8d2ea3-encryption-config\") pod \"apiserver-7bbb656c7d-9gpsg\" (UID: \"9e20d7fd-98b9-4f31-aa29-b8069f8d2ea3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9gpsg" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.939413 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed298584-c8e2-43c3-88f1-d95aea472e00-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-vt2b7\" (UID: \"ed298584-c8e2-43c3-88f1-d95aea472e00\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vt2b7" Dec 02 
22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.939435 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/93e854b6-0bab-4aa3-9d60-97542cd304eb-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-v6k2l\" (UID: \"93e854b6-0bab-4aa3-9d60-97542cd304eb\") " pod="openshift-authentication/oauth-openshift-558db77b4-v6k2l" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.939700 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/092e9b7e-6772-4cde-89b7-de81ae50222e-encryption-config\") pod \"apiserver-76f77b778f-8fxbs\" (UID: \"092e9b7e-6772-4cde-89b7-de81ae50222e\") " pod="openshift-apiserver/apiserver-76f77b778f-8fxbs" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.940198 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8ef3caa5-75cd-444b-aa84-9116ea9ce1cd-serving-cert\") pod \"controller-manager-879f6c89f-p44hz\" (UID: \"8ef3caa5-75cd-444b-aa84-9116ea9ce1cd\") " pod="openshift-controller-manager/controller-manager-879f6c89f-p44hz" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.940252 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/ffa64292-b071-4bfc-93d6-70d65b00847d-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-rttcw\" (UID: \"ffa64292-b071-4bfc-93d6-70d65b00847d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-rttcw" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.940306 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-k7pzs"] Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.942574 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" 
(UniqueName: \"kubernetes.io/secret/93e854b6-0bab-4aa3-9d60-97542cd304eb-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-v6k2l\" (UID: \"93e854b6-0bab-4aa3-9d60-97542cd304eb\") " pod="openshift-authentication/oauth-openshift-558db77b4-v6k2l" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.954896 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.955282 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/93e854b6-0bab-4aa3-9d60-97542cd304eb-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-v6k2l\" (UID: \"93e854b6-0bab-4aa3-9d60-97542cd304eb\") " pod="openshift-authentication/oauth-openshift-558db77b4-v6k2l" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.955330 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-qpwvl" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.955887 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/93e854b6-0bab-4aa3-9d60-97542cd304eb-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-v6k2l\" (UID: \"93e854b6-0bab-4aa3-9d60-97542cd304eb\") " pod="openshift-authentication/oauth-openshift-558db77b4-v6k2l" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.955950 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/93e854b6-0bab-4aa3-9d60-97542cd304eb-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-v6k2l\" (UID: \"93e854b6-0bab-4aa3-9d60-97542cd304eb\") " pod="openshift-authentication/oauth-openshift-558db77b4-v6k2l" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.956092 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6vz6p" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.956417 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.956544 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.956892 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9e20d7fd-98b9-4f31-aa29-b8069f8d2ea3-etcd-client\") pod \"apiserver-7bbb656c7d-9gpsg\" (UID: \"9e20d7fd-98b9-4f31-aa29-b8069f8d2ea3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9gpsg" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.956907 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-k7pzs" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.957005 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bbwp\" (UniqueName: \"kubernetes.io/projected/092e9b7e-6772-4cde-89b7-de81ae50222e-kube-api-access-7bbwp\") pod \"apiserver-76f77b778f-8fxbs\" (UID: \"092e9b7e-6772-4cde-89b7-de81ae50222e\") " pod="openshift-apiserver/apiserver-76f77b778f-8fxbs" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.957828 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ffa64292-b071-4bfc-93d6-70d65b00847d-images\") pod \"machine-api-operator-5694c8668f-rttcw\" (UID: \"ffa64292-b071-4bfc-93d6-70d65b00847d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-rttcw" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.959013 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/092e9b7e-6772-4cde-89b7-de81ae50222e-serving-cert\") pod \"apiserver-76f77b778f-8fxbs\" (UID: \"092e9b7e-6772-4cde-89b7-de81ae50222e\") " pod="openshift-apiserver/apiserver-76f77b778f-8fxbs" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.959068 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-bhp72"] Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.959176 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e7122fdb-f764-4228-9a2e-2c3aedd5b4fb-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-pstpv\" (UID: \"e7122fdb-f764-4228-9a2e-2c3aedd5b4fb\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pstpv" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.959504 4696 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qxljp"] Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.959528 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-bhp72" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.960249 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qxljp" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.960344 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/93e854b6-0bab-4aa3-9d60-97542cd304eb-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-v6k2l\" (UID: \"93e854b6-0bab-4aa3-9d60-97542cd304eb\") " pod="openshift-authentication/oauth-openshift-558db77b4-v6k2l" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.960904 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b1ef23a3-2b42-4fa9-8311-32f77bb7c8c0-serving-cert\") pod \"openshift-config-operator-7777fb866f-s5dx8\" (UID: \"b1ef23a3-2b42-4fa9-8311-32f77bb7c8c0\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-s5dx8" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.961513 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sw26l"] Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.961529 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4w22b\" (UniqueName: \"kubernetes.io/projected/ffa64292-b071-4bfc-93d6-70d65b00847d-kube-api-access-4w22b\") pod \"machine-api-operator-5694c8668f-rttcw\" (UID: \"ffa64292-b071-4bfc-93d6-70d65b00847d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-rttcw" Dec 02 
22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.961899 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sw26l" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.966389 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-jtktn"] Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.967668 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jtktn" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.969385 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5r4l\" (UniqueName: \"kubernetes.io/projected/ed298584-c8e2-43c3-88f1-d95aea472e00-kube-api-access-g5r4l\") pod \"openshift-apiserver-operator-796bbdcf4f-vt2b7\" (UID: \"ed298584-c8e2-43c3-88f1-d95aea472e00\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vt2b7" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.969800 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cab80860-b375-43ce-9df7-16ed59a8247a-oauth-serving-cert\") pod \"console-f9d7485db-f6wj6\" (UID: \"cab80860-b375-43ce-9df7-16ed59a8247a\") " pod="openshift-console/console-f9d7485db-f6wj6" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.969857 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/88fbd4d8-6770-4812-aded-c20e16d0e24b-serving-cert\") pod \"console-operator-58897d9998-zpshp\" (UID: \"88fbd4d8-6770-4812-aded-c20e16d0e24b\") " pod="openshift-console-operator/console-operator-58897d9998-zpshp" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.969884 4696 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cab80860-b375-43ce-9df7-16ed59a8247a-console-config\") pod \"console-f9d7485db-f6wj6\" (UID: \"cab80860-b375-43ce-9df7-16ed59a8247a\") " pod="openshift-console/console-f9d7485db-f6wj6" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.969973 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnbnm\" (UniqueName: \"kubernetes.io/projected/88fbd4d8-6770-4812-aded-c20e16d0e24b-kube-api-access-tnbnm\") pod \"console-operator-58897d9998-zpshp\" (UID: \"88fbd4d8-6770-4812-aded-c20e16d0e24b\") " pod="openshift-console-operator/console-operator-58897d9998-zpshp" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.970027 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c9a9ec1-4fa9-4d41-84e2-08a822f546c2-config\") pod \"route-controller-manager-6576b87f9c-ph95v\" (UID: \"2c9a9ec1-4fa9-4d41-84e2-08a822f546c2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ph95v" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.970083 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jlj69\" (UniqueName: \"kubernetes.io/projected/3f45ac7c-8865-4924-8dbd-5826a21d028e-kube-api-access-jlj69\") pod \"downloads-7954f5f757-jhcjw\" (UID: \"3f45ac7c-8865-4924-8dbd-5826a21d028e\") " pod="openshift-console/downloads-7954f5f757-jhcjw" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.970103 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cab80860-b375-43ce-9df7-16ed59a8247a-trusted-ca-bundle\") pod \"console-f9d7485db-f6wj6\" (UID: \"cab80860-b375-43ce-9df7-16ed59a8247a\") " pod="openshift-console/console-f9d7485db-f6wj6" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 
22:44:48.970130 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2c9a9ec1-4fa9-4d41-84e2-08a822f546c2-serving-cert\") pod \"route-controller-manager-6576b87f9c-ph95v\" (UID: \"2c9a9ec1-4fa9-4d41-84e2-08a822f546c2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ph95v" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.970171 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cab80860-b375-43ce-9df7-16ed59a8247a-console-serving-cert\") pod \"console-f9d7485db-f6wj6\" (UID: \"cab80860-b375-43ce-9df7-16ed59a8247a\") " pod="openshift-console/console-f9d7485db-f6wj6" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.970190 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/88fbd4d8-6770-4812-aded-c20e16d0e24b-trusted-ca\") pod \"console-operator-58897d9998-zpshp\" (UID: \"88fbd4d8-6770-4812-aded-c20e16d0e24b\") " pod="openshift-console-operator/console-operator-58897d9998-zpshp" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.970206 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2c9a9ec1-4fa9-4d41-84e2-08a822f546c2-client-ca\") pod \"route-controller-manager-6576b87f9c-ph95v\" (UID: \"2c9a9ec1-4fa9-4d41-84e2-08a822f546c2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ph95v" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.970244 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wmm4\" (UniqueName: \"kubernetes.io/projected/2c9a9ec1-4fa9-4d41-84e2-08a822f546c2-kube-api-access-2wmm4\") pod \"route-controller-manager-6576b87f9c-ph95v\" (UID: \"2c9a9ec1-4fa9-4d41-84e2-08a822f546c2\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ph95v" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.970269 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/88fbd4d8-6770-4812-aded-c20e16d0e24b-config\") pod \"console-operator-58897d9998-zpshp\" (UID: \"88fbd4d8-6770-4812-aded-c20e16d0e24b\") " pod="openshift-console-operator/console-operator-58897d9998-zpshp" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.970286 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cab80860-b375-43ce-9df7-16ed59a8247a-service-ca\") pod \"console-f9d7485db-f6wj6\" (UID: \"cab80860-b375-43ce-9df7-16ed59a8247a\") " pod="openshift-console/console-f9d7485db-f6wj6" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.970339 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmhvw\" (UniqueName: \"kubernetes.io/projected/cab80860-b375-43ce-9df7-16ed59a8247a-kube-api-access-hmhvw\") pod \"console-f9d7485db-f6wj6\" (UID: \"cab80860-b375-43ce-9df7-16ed59a8247a\") " pod="openshift-console/console-f9d7485db-f6wj6" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.970372 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cab80860-b375-43ce-9df7-16ed59a8247a-console-oauth-config\") pod \"console-f9d7485db-f6wj6\" (UID: \"cab80860-b375-43ce-9df7-16ed59a8247a\") " pod="openshift-console/console-f9d7485db-f6wj6" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.971632 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cab80860-b375-43ce-9df7-16ed59a8247a-trusted-ca-bundle\") pod \"console-f9d7485db-f6wj6\" (UID: \"cab80860-b375-43ce-9df7-16ed59a8247a\") " 
pod="openshift-console/console-f9d7485db-f6wj6" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.972275 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-5rtdf"] Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.973168 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-5rtdf" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.973812 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-rttcw" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.974501 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c9a9ec1-4fa9-4d41-84e2-08a822f546c2-config\") pod \"route-controller-manager-6576b87f9c-ph95v\" (UID: \"2c9a9ec1-4fa9-4d41-84e2-08a822f546c2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ph95v" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.975525 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2c9a9ec1-4fa9-4d41-84e2-08a822f546c2-client-ca\") pod \"route-controller-manager-6576b87f9c-ph95v\" (UID: \"2c9a9ec1-4fa9-4d41-84e2-08a822f546c2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ph95v" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.975652 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/88fbd4d8-6770-4812-aded-c20e16d0e24b-config\") pod \"console-operator-58897d9998-zpshp\" (UID: \"88fbd4d8-6770-4812-aded-c20e16d0e24b\") " pod="openshift-console-operator/console-operator-58897d9998-zpshp" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.975828 4696 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-q5df2"] Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.977047 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-p44hz"] Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.977142 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-q5df2" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.978183 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-6xbr4"] Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.978650 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-6xbr4" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.980237 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/88fbd4d8-6770-4812-aded-c20e16d0e24b-trusted-ca\") pod \"console-operator-58897d9998-zpshp\" (UID: \"88fbd4d8-6770-4812-aded-c20e16d0e24b\") " pod="openshift-console-operator/console-operator-58897d9998-zpshp" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.980280 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-j5tcx"] Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.980638 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cab80860-b375-43ce-9df7-16ed59a8247a-oauth-serving-cert\") pod \"console-f9d7485db-f6wj6\" (UID: \"cab80860-b375-43ce-9df7-16ed59a8247a\") " pod="openshift-console/console-f9d7485db-f6wj6" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.980881 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j5tcx" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.981355 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kgqqs"] Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.981477 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cab80860-b375-43ce-9df7-16ed59a8247a-console-config\") pod \"console-f9d7485db-f6wj6\" (UID: \"cab80860-b375-43ce-9df7-16ed59a8247a\") " pod="openshift-console/console-f9d7485db-f6wj6" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.982070 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kgqqs" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.984966 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cab80860-b375-43ce-9df7-16ed59a8247a-service-ca\") pod \"console-f9d7485db-f6wj6\" (UID: \"cab80860-b375-43ce-9df7-16ed59a8247a\") " pod="openshift-console/console-f9d7485db-f6wj6" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.985485 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cab80860-b375-43ce-9df7-16ed59a8247a-console-oauth-config\") pod \"console-f9d7485db-f6wj6\" (UID: \"cab80860-b375-43ce-9df7-16ed59a8247a\") " pod="openshift-console/console-f9d7485db-f6wj6" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.985535 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5vmz\" (UniqueName: \"kubernetes.io/projected/9e20d7fd-98b9-4f31-aa29-b8069f8d2ea3-kube-api-access-m5vmz\") pod 
\"apiserver-7bbb656c7d-9gpsg\" (UID: \"9e20d7fd-98b9-4f31-aa29-b8069f8d2ea3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9gpsg" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.986952 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-58dc7"] Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.987709 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-58dc7" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.988623 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-984hk"] Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.989598 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pstpv"] Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.989703 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-984hk" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.990475 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-qghph"] Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.991637 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6vtkj"] Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.991777 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qghph" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.992449 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6vtkj" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.993511 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-6bghq"] Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.994030 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-6bghq" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.995806 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfwtq\" (UniqueName: \"kubernetes.io/projected/8ef3caa5-75cd-444b-aa84-9116ea9ce1cd-kube-api-access-rfwtq\") pod \"controller-manager-879f6c89f-p44hz\" (UID: \"8ef3caa5-75cd-444b-aa84-9116ea9ce1cd\") " pod="openshift-controller-manager/controller-manager-879f6c89f-p44hz" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.997568 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2c9a9ec1-4fa9-4d41-84e2-08a822f546c2-serving-cert\") pod \"route-controller-manager-6576b87f9c-ph95v\" (UID: \"2c9a9ec1-4fa9-4d41-84e2-08a822f546c2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ph95v" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.997639 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-v2rkn"] Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.998277 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkss8\" (UniqueName: \"kubernetes.io/projected/93e854b6-0bab-4aa3-9d60-97542cd304eb-kube-api-access-rkss8\") pod \"oauth-openshift-558db77b4-v6k2l\" (UID: \"93e854b6-0bab-4aa3-9d60-97542cd304eb\") " pod="openshift-authentication/oauth-openshift-558db77b4-v6k2l" Dec 02 22:44:48 crc kubenswrapper[4696]: I1202 22:44:48.998577 4696 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-v2rkn" Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.003444 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vt2b7" Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.004301 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-s5dx8"] Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.005384 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/88fbd4d8-6770-4812-aded-c20e16d0e24b-serving-cert\") pod \"console-operator-58897d9998-zpshp\" (UID: \"88fbd4d8-6770-4812-aded-c20e16d0e24b\") " pod="openshift-console-operator/console-operator-58897d9998-zpshp" Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.006121 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cab80860-b375-43ce-9df7-16ed59a8247a-console-serving-cert\") pod \"console-f9d7485db-f6wj6\" (UID: \"cab80860-b375-43ce-9df7-16ed59a8247a\") " pod="openshift-console/console-f9d7485db-f6wj6" Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.007562 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-58dmz"] Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.008312 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-58dmz" Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.008872 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-jhcjw"] Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.009909 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-jwz6f"] Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.010329 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-jwz6f" Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.012219 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411910-cc46z"] Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.012578 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-dtrjk"] Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.012972 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-dtrjk" Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.013083 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411910-cc46z" Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.013775 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-rttcw"] Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.015786 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-8fxbs"] Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.017979 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-6bghq"] Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.019760 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ph95v"] Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.022692 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-zpshp"] Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.024139 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.025903 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-5rtdf"] Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.026993 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-jtktn"] Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.028179 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-p6njc"] Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.030179 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-984hk"] Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.033188 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-j5tcx"] Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.033491 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6vz6p"] Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.035372 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dkb9g"] Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.041754 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-jtr26"] Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.042930 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-jtr26" Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.042933 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-9gpsg"] Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.052580 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.056679 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-v6k2l"] Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.058387 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-q5df2"] Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.059365 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kgqqs"] Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.060614 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-qghph"] Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.062138 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-bhp72"] Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.062421 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.063506 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-f6wj6"] Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.065019 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-jwz6f"] Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.066288 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-rjkp2"] Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.067588 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-6xbr4"] Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.068580 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sw26l"] Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.070181 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vt2b7"] Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.071456 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qxljp"] Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.072243 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6c10678a-82ec-492c-b8c9-e1689fb7c63b-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-dkb9g\" (UID: \"6c10678a-82ec-492c-b8c9-e1689fb7c63b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dkb9g" Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.072350 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/24488e8a-3522-4214-ab83-684d76eb1501-bound-sa-token\") pod \"image-registry-697d97f7c8-rjkp2\" (UID: \"24488e8a-3522-4214-ab83-684d76eb1501\") " pod="openshift-image-registry/image-registry-697d97f7c8-rjkp2" Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.072425 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/24488e8a-3522-4214-ab83-684d76eb1501-ca-trust-extracted\") pod \"image-registry-697d97f7c8-rjkp2\" (UID: \"24488e8a-3522-4214-ab83-684d76eb1501\") " pod="openshift-image-registry/image-registry-697d97f7c8-rjkp2" Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.072478 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5f5d77f1-55f7-49d2-b5db-19d7150b882f-auth-proxy-config\") pod \"machine-approver-56656f9798-4cx4s\" (UID: \"5f5d77f1-55f7-49d2-b5db-19d7150b882f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4cx4s" Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.072510 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f5d77f1-55f7-49d2-b5db-19d7150b882f-config\") pod \"machine-approver-56656f9798-4cx4s\" (UID: \"5f5d77f1-55f7-49d2-b5db-19d7150b882f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4cx4s" Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.072854 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/24488e8a-3522-4214-ab83-684d76eb1501-registry-tls\") pod \"image-registry-697d97f7c8-rjkp2\" (UID: \"24488e8a-3522-4214-ab83-684d76eb1501\") " pod="openshift-image-registry/image-registry-697d97f7c8-rjkp2" Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.072884 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/24488e8a-3522-4214-ab83-684d76eb1501-trusted-ca\") pod \"image-registry-697d97f7c8-rjkp2\" (UID: \"24488e8a-3522-4214-ab83-684d76eb1501\") " pod="openshift-image-registry/image-registry-697d97f7c8-rjkp2" Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.073111 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvkj5\" (UniqueName: \"kubernetes.io/projected/24488e8a-3522-4214-ab83-684d76eb1501-kube-api-access-tvkj5\") pod \"image-registry-697d97f7c8-rjkp2\" (UID: \"24488e8a-3522-4214-ab83-684d76eb1501\") " pod="openshift-image-registry/image-registry-697d97f7c8-rjkp2" Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.073144 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/6c10678a-82ec-492c-b8c9-e1689fb7c63b-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-dkb9g\" (UID: \"6c10678a-82ec-492c-b8c9-e1689fb7c63b\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dkb9g" Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.073204 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6c10678a-82ec-492c-b8c9-e1689fb7c63b-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-dkb9g\" (UID: \"6c10678a-82ec-492c-b8c9-e1689fb7c63b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dkb9g" Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.073256 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/24488e8a-3522-4214-ab83-684d76eb1501-installation-pull-secrets\") pod \"image-registry-697d97f7c8-rjkp2\" (UID: \"24488e8a-3522-4214-ab83-684d76eb1501\") " pod="openshift-image-registry/image-registry-697d97f7c8-rjkp2" Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.073409 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rjkp2\" (UID: \"24488e8a-3522-4214-ab83-684d76eb1501\") " pod="openshift-image-registry/image-registry-697d97f7c8-rjkp2" Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.073441 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnmvm\" (UniqueName: \"kubernetes.io/projected/6c10678a-82ec-492c-b8c9-e1689fb7c63b-kube-api-access-wnmvm\") pod \"cluster-image-registry-operator-dc59b4c8b-dkb9g\" (UID: \"6c10678a-82ec-492c-b8c9-e1689fb7c63b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dkb9g" Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.073478 4696 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/5f5d77f1-55f7-49d2-b5db-19d7150b882f-machine-approver-tls\") pod \"machine-approver-56656f9798-4cx4s\" (UID: \"5f5d77f1-55f7-49d2-b5db-19d7150b882f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4cx4s" Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.073566 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/24488e8a-3522-4214-ab83-684d76eb1501-registry-certificates\") pod \"image-registry-697d97f7c8-rjkp2\" (UID: \"24488e8a-3522-4214-ab83-684d76eb1501\") " pod="openshift-image-registry/image-registry-697d97f7c8-rjkp2" Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.073599 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnvxk\" (UniqueName: \"kubernetes.io/projected/5f5d77f1-55f7-49d2-b5db-19d7150b882f-kube-api-access-hnvxk\") pod \"machine-approver-56656f9798-4cx4s\" (UID: \"5f5d77f1-55f7-49d2-b5db-19d7150b882f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4cx4s" Dec 02 22:44:49 crc kubenswrapper[4696]: E1202 22:44:49.075578 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 22:44:49.575560278 +0000 UTC m=+152.456240479 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rjkp2" (UID: "24488e8a-3522-4214-ab83-684d76eb1501") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.077132 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-6s4j2"] Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.077946 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-v2rkn"] Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.078037 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-6s4j2" Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.080046 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-p44hz" Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.082620 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.088095 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-58dc7"] Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.089353 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6vtkj"] Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.090402 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-2jhb9"] Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.091714 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-k7pzs"] Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.091882 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-2jhb9" Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.093925 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411910-cc46z"] Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.101366 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-6s4j2"] Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.104878 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-58dmz"] Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.104901 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-2jhb9"] Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.104911 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-dtrjk"] Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.107437 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-4jgzb"] Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.108257 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-4jgzb"] Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.108352 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-4jgzb" Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.111456 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.121797 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.145105 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.172016 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-8fxbs" Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.174431 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.174606 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6c10678a-82ec-492c-b8c9-e1689fb7c63b-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-dkb9g\" (UID: \"6c10678a-82ec-492c-b8c9-e1689fb7c63b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dkb9g" Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.174656 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/24488e8a-3522-4214-ab83-684d76eb1501-installation-pull-secrets\") pod \"image-registry-697d97f7c8-rjkp2\" (UID: \"24488e8a-3522-4214-ab83-684d76eb1501\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-rjkp2" Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.174721 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wnmvm\" (UniqueName: \"kubernetes.io/projected/6c10678a-82ec-492c-b8c9-e1689fb7c63b-kube-api-access-wnmvm\") pod \"cluster-image-registry-operator-dc59b4c8b-dkb9g\" (UID: \"6c10678a-82ec-492c-b8c9-e1689fb7c63b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dkb9g" Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.174782 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/5f5d77f1-55f7-49d2-b5db-19d7150b882f-machine-approver-tls\") pod \"machine-approver-56656f9798-4cx4s\" (UID: \"5f5d77f1-55f7-49d2-b5db-19d7150b882f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4cx4s" Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.174813 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/24488e8a-3522-4214-ab83-684d76eb1501-registry-certificates\") pod \"image-registry-697d97f7c8-rjkp2\" (UID: \"24488e8a-3522-4214-ab83-684d76eb1501\") " pod="openshift-image-registry/image-registry-697d97f7c8-rjkp2" Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.174855 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hnvxk\" (UniqueName: \"kubernetes.io/projected/5f5d77f1-55f7-49d2-b5db-19d7150b882f-kube-api-access-hnvxk\") pod \"machine-approver-56656f9798-4cx4s\" (UID: \"5f5d77f1-55f7-49d2-b5db-19d7150b882f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4cx4s" Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.174889 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/6c10678a-82ec-492c-b8c9-e1689fb7c63b-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-dkb9g\" (UID: \"6c10678a-82ec-492c-b8c9-e1689fb7c63b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dkb9g" Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.174940 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/24488e8a-3522-4214-ab83-684d76eb1501-bound-sa-token\") pod \"image-registry-697d97f7c8-rjkp2\" (UID: \"24488e8a-3522-4214-ab83-684d76eb1501\") " pod="openshift-image-registry/image-registry-697d97f7c8-rjkp2" Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.174962 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/24488e8a-3522-4214-ab83-684d76eb1501-ca-trust-extracted\") pod \"image-registry-697d97f7c8-rjkp2\" (UID: \"24488e8a-3522-4214-ab83-684d76eb1501\") " pod="openshift-image-registry/image-registry-697d97f7c8-rjkp2" Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.174981 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5f5d77f1-55f7-49d2-b5db-19d7150b882f-auth-proxy-config\") pod \"machine-approver-56656f9798-4cx4s\" (UID: \"5f5d77f1-55f7-49d2-b5db-19d7150b882f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4cx4s" Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.175025 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f5d77f1-55f7-49d2-b5db-19d7150b882f-config\") pod \"machine-approver-56656f9798-4cx4s\" (UID: \"5f5d77f1-55f7-49d2-b5db-19d7150b882f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4cx4s" Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.175063 4696 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/24488e8a-3522-4214-ab83-684d76eb1501-registry-tls\") pod \"image-registry-697d97f7c8-rjkp2\" (UID: \"24488e8a-3522-4214-ab83-684d76eb1501\") " pod="openshift-image-registry/image-registry-697d97f7c8-rjkp2" Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.175104 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/24488e8a-3522-4214-ab83-684d76eb1501-trusted-ca\") pod \"image-registry-697d97f7c8-rjkp2\" (UID: \"24488e8a-3522-4214-ab83-684d76eb1501\") " pod="openshift-image-registry/image-registry-697d97f7c8-rjkp2" Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.175222 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvkj5\" (UniqueName: \"kubernetes.io/projected/24488e8a-3522-4214-ab83-684d76eb1501-kube-api-access-tvkj5\") pod \"image-registry-697d97f7c8-rjkp2\" (UID: \"24488e8a-3522-4214-ab83-684d76eb1501\") " pod="openshift-image-registry/image-registry-697d97f7c8-rjkp2" Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.175259 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/6c10678a-82ec-492c-b8c9-e1689fb7c63b-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-dkb9g\" (UID: \"6c10678a-82ec-492c-b8c9-e1689fb7c63b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dkb9g" Dec 02 22:44:49 crc kubenswrapper[4696]: E1202 22:44:49.177312 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-02 22:44:49.677288647 +0000 UTC m=+152.557968638 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.178365 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f5d77f1-55f7-49d2-b5db-19d7150b882f-config\") pod \"machine-approver-56656f9798-4cx4s\" (UID: \"5f5d77f1-55f7-49d2-b5db-19d7150b882f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4cx4s" Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.178650 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/24488e8a-3522-4214-ab83-684d76eb1501-registry-certificates\") pod \"image-registry-697d97f7c8-rjkp2\" (UID: \"24488e8a-3522-4214-ab83-684d76eb1501\") " pod="openshift-image-registry/image-registry-697d97f7c8-rjkp2" Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.179388 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/24488e8a-3522-4214-ab83-684d76eb1501-ca-trust-extracted\") pod \"image-registry-697d97f7c8-rjkp2\" (UID: \"24488e8a-3522-4214-ab83-684d76eb1501\") " pod="openshift-image-registry/image-registry-697d97f7c8-rjkp2" Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.179588 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/5f5d77f1-55f7-49d2-b5db-19d7150b882f-auth-proxy-config\") pod \"machine-approver-56656f9798-4cx4s\" (UID: \"5f5d77f1-55f7-49d2-b5db-19d7150b882f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4cx4s" Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.179817 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6c10678a-82ec-492c-b8c9-e1689fb7c63b-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-dkb9g\" (UID: \"6c10678a-82ec-492c-b8c9-e1689fb7c63b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dkb9g" Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.182914 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.183507 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/24488e8a-3522-4214-ab83-684d76eb1501-trusted-ca\") pod \"image-registry-697d97f7c8-rjkp2\" (UID: \"24488e8a-3522-4214-ab83-684d76eb1501\") " pod="openshift-image-registry/image-registry-697d97f7c8-rjkp2" Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.185780 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6x66\" (UniqueName: \"kubernetes.io/projected/b1ef23a3-2b42-4fa9-8311-32f77bb7c8c0-kube-api-access-g6x66\") pod \"openshift-config-operator-7777fb866f-s5dx8\" (UID: \"b1ef23a3-2b42-4fa9-8311-32f77bb7c8c0\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-s5dx8" Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.185900 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/6c10678a-82ec-492c-b8c9-e1689fb7c63b-image-registry-operator-tls\") pod 
\"cluster-image-registry-operator-dc59b4c8b-dkb9g\" (UID: \"6c10678a-82ec-492c-b8c9-e1689fb7c63b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dkb9g" Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.187969 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/24488e8a-3522-4214-ab83-684d76eb1501-registry-tls\") pod \"image-registry-697d97f7c8-rjkp2\" (UID: \"24488e8a-3522-4214-ab83-684d76eb1501\") " pod="openshift-image-registry/image-registry-697d97f7c8-rjkp2" Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.188313 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/24488e8a-3522-4214-ab83-684d76eb1501-installation-pull-secrets\") pod \"image-registry-697d97f7c8-rjkp2\" (UID: \"24488e8a-3522-4214-ab83-684d76eb1501\") " pod="openshift-image-registry/image-registry-697d97f7c8-rjkp2" Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.203317 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/5f5d77f1-55f7-49d2-b5db-19d7150b882f-machine-approver-tls\") pod \"machine-approver-56656f9798-4cx4s\" (UID: \"5f5d77f1-55f7-49d2-b5db-19d7150b882f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4cx4s" Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.204134 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.210283 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vt2b7"] Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.221793 4696 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 02 22:44:49 crc kubenswrapper[4696]: W1202 22:44:49.239111 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded298584_c8e2_43c3_88f1_d95aea472e00.slice/crio-02167c05fdb5bb2363e8c6e9152d8df75427c6acd663b9a74e3f9cd092a7b1b6 WatchSource:0}: Error finding container 02167c05fdb5bb2363e8c6e9152d8df75427c6acd663b9a74e3f9cd092a7b1b6: Status 404 returned error can't find the container with id 02167c05fdb5bb2363e8c6e9152d8df75427c6acd663b9a74e3f9cd092a7b1b6 Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.248990 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.252506 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-rttcw"] Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.257814 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9gpsg" Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.264645 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 02 22:44:49 crc kubenswrapper[4696]: W1202 22:44:49.265523 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podffa64292_b071_4bfc_93d6_70d65b00847d.slice/crio-a4a0a2f2a6bf305a68a4cde7a8b6a47b9469dc0d898d95913b4da9bb05026d05 WatchSource:0}: Error finding container a4a0a2f2a6bf305a68a4cde7a8b6a47b9469dc0d898d95913b4da9bb05026d05: Status 404 returned error can't find the container with id a4a0a2f2a6bf305a68a4cde7a8b6a47b9469dc0d898d95913b4da9bb05026d05 Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.276349 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rjkp2\" (UID: \"24488e8a-3522-4214-ab83-684d76eb1501\") " pod="openshift-image-registry/image-registry-697d97f7c8-rjkp2" Dec 02 22:44:49 crc kubenswrapper[4696]: E1202 22:44:49.276800 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 22:44:49.776764859 +0000 UTC m=+152.657444860 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rjkp2" (UID: "24488e8a-3522-4214-ab83-684d76eb1501") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.282085 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.290905 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-v6k2l" Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.303914 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-p44hz"] Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.305918 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.323322 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.341646 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.362545 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.378149 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 22:44:49 crc kubenswrapper[4696]: E1202 22:44:49.378572 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 22:44:49.87854941 +0000 UTC m=+152.759229421 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:49 crc kubenswrapper[4696]: W1202 22:44:49.380590 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ef3caa5_75cd_444b_aa84_9116ea9ce1cd.slice/crio-1d776ec15daed1cd0442bd7aa55aa6096bbcc314c3e60ce5e0bdc4f985470464 WatchSource:0}: Error finding container 1d776ec15daed1cd0442bd7aa55aa6096bbcc314c3e60ce5e0bdc4f985470464: Status 404 returned error can't find the container with id 1d776ec15daed1cd0442bd7aa55aa6096bbcc314c3e60ce5e0bdc4f985470464 Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.383021 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.411271 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.423365 4696 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-s5dx8" Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.424883 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.445259 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.463136 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.479237 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rjkp2\" (UID: \"24488e8a-3522-4214-ab83-684d76eb1501\") " pod="openshift-image-registry/image-registry-697d97f7c8-rjkp2" Dec 02 22:44:49 crc kubenswrapper[4696]: E1202 22:44:49.479711 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 22:44:49.979696171 +0000 UTC m=+152.860376162 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rjkp2" (UID: "24488e8a-3522-4214-ab83-684d76eb1501") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.486173 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.502696 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.521767 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vt2b7" event={"ID":"ed298584-c8e2-43c3-88f1-d95aea472e00","Type":"ContainerStarted","Data":"02167c05fdb5bb2363e8c6e9152d8df75427c6acd663b9a74e3f9cd092a7b1b6"} Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.524037 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-p44hz" event={"ID":"8ef3caa5-75cd-444b-aa84-9116ea9ce1cd","Type":"ContainerStarted","Data":"1d776ec15daed1cd0442bd7aa55aa6096bbcc314c3e60ce5e0bdc4f985470464"} Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.524046 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.526347 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-rttcw" 
event={"ID":"ffa64292-b071-4bfc-93d6-70d65b00847d","Type":"ContainerStarted","Data":"a4a0a2f2a6bf305a68a4cde7a8b6a47b9469dc0d898d95913b4da9bb05026d05"} Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.539697 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-8fxbs"] Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.543140 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.564453 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.587020 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 22:44:49 crc kubenswrapper[4696]: E1202 22:44:49.587426 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 22:44:50.087403147 +0000 UTC m=+152.968083148 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.591558 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.603787 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.622192 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.645231 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.676148 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-9gpsg"] Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.678277 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-v6k2l"] Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.683405 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.684465 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-cmjxz\" (UniqueName: \"kubernetes.io/projected/e7122fdb-f764-4228-9a2e-2c3aedd5b4fb-kube-api-access-cmjxz\") pod \"cluster-samples-operator-665b6dd947-pstpv\" (UID: \"e7122fdb-f764-4228-9a2e-2c3aedd5b4fb\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pstpv" Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.688672 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rjkp2\" (UID: \"24488e8a-3522-4214-ab83-684d76eb1501\") " pod="openshift-image-registry/image-registry-697d97f7c8-rjkp2" Dec 02 22:44:49 crc kubenswrapper[4696]: E1202 22:44:49.689165 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 22:44:50.189148736 +0000 UTC m=+153.069828737 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rjkp2" (UID: "24488e8a-3522-4214-ab83-684d76eb1501") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:49 crc kubenswrapper[4696]: W1202 22:44:49.691304 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e20d7fd_98b9_4f31_aa29_b8069f8d2ea3.slice/crio-7a9eac6040542cd3214e032fd07199229a35eebae5fc9584bdd38ba686412918 WatchSource:0}: Error finding container 7a9eac6040542cd3214e032fd07199229a35eebae5fc9584bdd38ba686412918: Status 404 returned error can't find the container with id 7a9eac6040542cd3214e032fd07199229a35eebae5fc9584bdd38ba686412918 Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.701688 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.704878 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pstpv" Dec 02 22:44:49 crc kubenswrapper[4696]: W1202 22:44:49.714899 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93e854b6_0bab_4aa3_9d60_97542cd304eb.slice/crio-3122f045d76f7c887bb00ef8223891509c09466698ff514bbf0dbe6e8e3ccf56 WatchSource:0}: Error finding container 3122f045d76f7c887bb00ef8223891509c09466698ff514bbf0dbe6e8e3ccf56: Status 404 returned error can't find the container with id 3122f045d76f7c887bb00ef8223891509c09466698ff514bbf0dbe6e8e3ccf56 Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.725465 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.731363 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-s5dx8"] Dec 02 22:44:49 crc kubenswrapper[4696]: W1202 22:44:49.752583 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb1ef23a3_2b42_4fa9_8311_32f77bb7c8c0.slice/crio-9546eb4fb48fb0e7cc4b08c7aa264b846593cc964ddaeb8e82c13ae11a10ef7b WatchSource:0}: Error finding container 9546eb4fb48fb0e7cc4b08c7aa264b846593cc964ddaeb8e82c13ae11a10ef7b: Status 404 returned error can't find the container with id 9546eb4fb48fb0e7cc4b08c7aa264b846593cc964ddaeb8e82c13ae11a10ef7b Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.760390 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnbnm\" (UniqueName: \"kubernetes.io/projected/88fbd4d8-6770-4812-aded-c20e16d0e24b-kube-api-access-tnbnm\") pod \"console-operator-58897d9998-zpshp\" (UID: \"88fbd4d8-6770-4812-aded-c20e16d0e24b\") " 
pod="openshift-console-operator/console-operator-58897d9998-zpshp" Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.769193 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.784019 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.789729 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 22:44:49 crc kubenswrapper[4696]: E1202 22:44:49.790166 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 22:44:50.290150824 +0000 UTC m=+153.170830825 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.820300 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlj69\" (UniqueName: \"kubernetes.io/projected/3f45ac7c-8865-4924-8dbd-5826a21d028e-kube-api-access-jlj69\") pod \"downloads-7954f5f757-jhcjw\" (UID: \"3f45ac7c-8865-4924-8dbd-5826a21d028e\") " pod="openshift-console/downloads-7954f5f757-jhcjw" Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.846519 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.862181 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.866023 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wmm4\" (UniqueName: \"kubernetes.io/projected/2c9a9ec1-4fa9-4d41-84e2-08a822f546c2-kube-api-access-2wmm4\") pod \"route-controller-manager-6576b87f9c-ph95v\" (UID: \"2c9a9ec1-4fa9-4d41-84e2-08a822f546c2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ph95v" Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.885312 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.891580 4696 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rjkp2\" (UID: \"24488e8a-3522-4214-ab83-684d76eb1501\") " pod="openshift-image-registry/image-registry-697d97f7c8-rjkp2" Dec 02 22:44:49 crc kubenswrapper[4696]: E1202 22:44:49.892091 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 22:44:50.392071818 +0000 UTC m=+153.272751819 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rjkp2" (UID: "24488e8a-3522-4214-ab83-684d76eb1501") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.902306 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.930760 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.930989 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-zpshp" Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.942340 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.952455 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-jhcjw" Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.959382 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pstpv"] Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.959600 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ph95v" Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.962463 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 02 22:44:49 crc kubenswrapper[4696]: I1202 22:44:49.987583 4696 request.go:700] Waited for 1.00890671s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console/serviceaccounts/console/token Dec 02 22:44:50 crc kubenswrapper[4696]: I1202 22:44:50.000987 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 22:44:50 crc kubenswrapper[4696]: E1202 22:44:50.001166 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 22:44:50.501135914 +0000 UTC m=+153.381815935 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:50 crc kubenswrapper[4696]: I1202 22:44:50.001593 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rjkp2\" (UID: \"24488e8a-3522-4214-ab83-684d76eb1501\") " pod="openshift-image-registry/image-registry-697d97f7c8-rjkp2" Dec 02 22:44:50 crc kubenswrapper[4696]: E1202 22:44:50.002054 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 22:44:50.502031831 +0000 UTC m=+153.382711832 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rjkp2" (UID: "24488e8a-3522-4214-ab83-684d76eb1501") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:50 crc kubenswrapper[4696]: I1202 22:44:50.004193 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 02 22:44:50 crc kubenswrapper[4696]: I1202 22:44:50.010273 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmhvw\" (UniqueName: \"kubernetes.io/projected/cab80860-b375-43ce-9df7-16ed59a8247a-kube-api-access-hmhvw\") pod \"console-f9d7485db-f6wj6\" (UID: \"cab80860-b375-43ce-9df7-16ed59a8247a\") " pod="openshift-console/console-f9d7485db-f6wj6" Dec 02 22:44:50 crc kubenswrapper[4696]: I1202 22:44:50.022011 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 02 22:44:50 crc kubenswrapper[4696]: I1202 22:44:50.046250 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 02 22:44:50 crc kubenswrapper[4696]: I1202 22:44:50.062521 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 02 22:44:50 crc kubenswrapper[4696]: I1202 22:44:50.100371 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 02 22:44:50 crc kubenswrapper[4696]: I1202 22:44:50.102281 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 22:44:50 crc kubenswrapper[4696]: E1202 22:44:50.102732 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 22:44:50.602710178 +0000 UTC m=+153.483390179 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:50 crc kubenswrapper[4696]: I1202 22:44:50.107297 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 02 22:44:50 crc kubenswrapper[4696]: I1202 22:44:50.122813 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 02 22:44:50 crc kubenswrapper[4696]: I1202 22:44:50.143089 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 02 22:44:50 crc kubenswrapper[4696]: I1202 22:44:50.166316 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 02 22:44:50 crc kubenswrapper[4696]: I1202 22:44:50.189144 4696 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 02 22:44:50 crc kubenswrapper[4696]: I1202 22:44:50.203224 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rjkp2\" (UID: \"24488e8a-3522-4214-ab83-684d76eb1501\") " pod="openshift-image-registry/image-registry-697d97f7c8-rjkp2" Dec 02 22:44:50 crc kubenswrapper[4696]: E1202 22:44:50.203669 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 22:44:50.703654484 +0000 UTC m=+153.584334485 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rjkp2" (UID: "24488e8a-3522-4214-ab83-684d76eb1501") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:50 crc kubenswrapper[4696]: I1202 22:44:50.206536 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 02 22:44:50 crc kubenswrapper[4696]: I1202 22:44:50.214902 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-zpshp"] Dec 02 22:44:50 crc kubenswrapper[4696]: I1202 22:44:50.222406 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-f6wj6" Dec 02 22:44:50 crc kubenswrapper[4696]: I1202 22:44:50.223840 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 02 22:44:50 crc kubenswrapper[4696]: I1202 22:44:50.263766 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 02 22:44:50 crc kubenswrapper[4696]: I1202 22:44:50.276065 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ph95v"] Dec 02 22:44:50 crc kubenswrapper[4696]: I1202 22:44:50.277479 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-jhcjw"] Dec 02 22:44:50 crc kubenswrapper[4696]: I1202 22:44:50.285992 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 02 22:44:50 crc kubenswrapper[4696]: I1202 22:44:50.304700 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 22:44:50 crc kubenswrapper[4696]: E1202 22:44:50.305180 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 22:44:50.805157085 +0000 UTC m=+153.685837086 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:50 crc kubenswrapper[4696]: I1202 22:44:50.305491 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 02 22:44:50 crc kubenswrapper[4696]: I1202 22:44:50.321586 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 02 22:44:50 crc kubenswrapper[4696]: I1202 22:44:50.344087 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 02 22:44:50 crc kubenswrapper[4696]: I1202 22:44:50.362383 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 02 22:44:50 crc kubenswrapper[4696]: I1202 22:44:50.383089 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 02 22:44:50 crc kubenswrapper[4696]: I1202 22:44:50.406582 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rjkp2\" (UID: \"24488e8a-3522-4214-ab83-684d76eb1501\") " pod="openshift-image-registry/image-registry-697d97f7c8-rjkp2" Dec 02 22:44:50 crc kubenswrapper[4696]: I1202 22:44:50.406761 4696 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 02 22:44:50 crc kubenswrapper[4696]: E1202 22:44:50.407011 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 22:44:50.906993287 +0000 UTC m=+153.787673278 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rjkp2" (UID: "24488e8a-3522-4214-ab83-684d76eb1501") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:50 crc kubenswrapper[4696]: I1202 22:44:50.425083 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 02 22:44:50 crc kubenswrapper[4696]: I1202 22:44:50.442824 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 02 22:44:50 crc kubenswrapper[4696]: I1202 22:44:50.467665 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 02 22:44:50 crc kubenswrapper[4696]: I1202 22:44:50.489006 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 02 22:44:50 crc kubenswrapper[4696]: I1202 22:44:50.504692 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 02 22:44:50 crc kubenswrapper[4696]: I1202 22:44:50.507383 4696 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 22:44:50 crc kubenswrapper[4696]: E1202 22:44:50.507803 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 22:44:51.007783438 +0000 UTC m=+153.888463459 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:50 crc kubenswrapper[4696]: I1202 22:44:50.522419 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 02 22:44:50 crc kubenswrapper[4696]: I1202 22:44:50.543217 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 02 22:44:50 crc kubenswrapper[4696]: I1202 22:44:50.548721 4696 generic.go:334] "Generic (PLEG): container finished" podID="b1ef23a3-2b42-4fa9-8311-32f77bb7c8c0" containerID="3fc9a5bf31b60329b152505d7066f511bcc814e237b197a25481f53bddfbe469" exitCode=0 Dec 02 22:44:50 crc kubenswrapper[4696]: I1202 22:44:50.548860 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-s5dx8" 
event={"ID":"b1ef23a3-2b42-4fa9-8311-32f77bb7c8c0","Type":"ContainerDied","Data":"3fc9a5bf31b60329b152505d7066f511bcc814e237b197a25481f53bddfbe469"} Dec 02 22:44:50 crc kubenswrapper[4696]: I1202 22:44:50.548908 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-s5dx8" event={"ID":"b1ef23a3-2b42-4fa9-8311-32f77bb7c8c0","Type":"ContainerStarted","Data":"9546eb4fb48fb0e7cc4b08c7aa264b846593cc964ddaeb8e82c13ae11a10ef7b"} Dec 02 22:44:50 crc kubenswrapper[4696]: I1202 22:44:50.556516 4696 generic.go:334] "Generic (PLEG): container finished" podID="9e20d7fd-98b9-4f31-aa29-b8069f8d2ea3" containerID="e67073a9190974d392a7bd1a18ca189b8b443a94b8aa36172bf07033d678a57e" exitCode=0 Dec 02 22:44:50 crc kubenswrapper[4696]: I1202 22:44:50.556857 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9gpsg" event={"ID":"9e20d7fd-98b9-4f31-aa29-b8069f8d2ea3","Type":"ContainerDied","Data":"e67073a9190974d392a7bd1a18ca189b8b443a94b8aa36172bf07033d678a57e"} Dec 02 22:44:50 crc kubenswrapper[4696]: I1202 22:44:50.556922 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9gpsg" event={"ID":"9e20d7fd-98b9-4f31-aa29-b8069f8d2ea3","Type":"ContainerStarted","Data":"7a9eac6040542cd3214e032fd07199229a35eebae5fc9584bdd38ba686412918"} Dec 02 22:44:50 crc kubenswrapper[4696]: I1202 22:44:50.564733 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 02 22:44:50 crc kubenswrapper[4696]: I1202 22:44:50.565000 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-rttcw" event={"ID":"ffa64292-b071-4bfc-93d6-70d65b00847d","Type":"ContainerStarted","Data":"4fc3619d08f2dd6e4deff4c42a5e90aa24521d272996c724b44d1cd6846afdd5"} Dec 02 22:44:50 crc kubenswrapper[4696]: I1202 22:44:50.565069 4696 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-rttcw" event={"ID":"ffa64292-b071-4bfc-93d6-70d65b00847d","Type":"ContainerStarted","Data":"8cc6d33ad537b100e7fb7336cd0587a0788122d893dda95a10821d196b019774"} Dec 02 22:44:50 crc kubenswrapper[4696]: I1202 22:44:50.580711 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ph95v" event={"ID":"2c9a9ec1-4fa9-4d41-84e2-08a822f546c2","Type":"ContainerStarted","Data":"701323f2e949f39f8cec2f8adec773d55414c1c1d863d4da1224add9cce92b9a"} Dec 02 22:44:50 crc kubenswrapper[4696]: I1202 22:44:50.582811 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pstpv" event={"ID":"e7122fdb-f764-4228-9a2e-2c3aedd5b4fb","Type":"ContainerStarted","Data":"39a0b3040cc70d12b8d8bb2607e65dab6402f6dbe3186fcbcf2eb55c5511312d"} Dec 02 22:44:50 crc kubenswrapper[4696]: I1202 22:44:50.586908 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 02 22:44:50 crc kubenswrapper[4696]: I1202 22:44:50.587757 4696 generic.go:334] "Generic (PLEG): container finished" podID="092e9b7e-6772-4cde-89b7-de81ae50222e" containerID="a6312ac6ca3d32d0d3b33a0f557e55738963d7c0a8aee381484175f6fcad7b17" exitCode=0 Dec 02 22:44:50 crc kubenswrapper[4696]: I1202 22:44:50.588346 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-8fxbs" event={"ID":"092e9b7e-6772-4cde-89b7-de81ae50222e","Type":"ContainerDied","Data":"a6312ac6ca3d32d0d3b33a0f557e55738963d7c0a8aee381484175f6fcad7b17"} Dec 02 22:44:50 crc kubenswrapper[4696]: I1202 22:44:50.588398 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-8fxbs" 
event={"ID":"092e9b7e-6772-4cde-89b7-de81ae50222e","Type":"ContainerStarted","Data":"59e7d4358cc787bc2480c6619d0390f66cb1b6f91cf44dacd26638e7ccaa1a6a"} Dec 02 22:44:50 crc kubenswrapper[4696]: I1202 22:44:50.607206 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 02 22:44:50 crc kubenswrapper[4696]: I1202 22:44:50.609635 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rjkp2\" (UID: \"24488e8a-3522-4214-ab83-684d76eb1501\") " pod="openshift-image-registry/image-registry-697d97f7c8-rjkp2" Dec 02 22:44:50 crc kubenswrapper[4696]: E1202 22:44:50.610106 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 22:44:51.110085764 +0000 UTC m=+153.990765765 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rjkp2" (UID: "24488e8a-3522-4214-ab83-684d76eb1501") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:50 crc kubenswrapper[4696]: I1202 22:44:50.617355 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vt2b7" event={"ID":"ed298584-c8e2-43c3-88f1-d95aea472e00","Type":"ContainerStarted","Data":"52a0525bc172c705018fb6a2e657eb98cc6dd61929298b95ba729a3abf034aaf"} Dec 02 22:44:50 crc kubenswrapper[4696]: I1202 22:44:50.618994 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-f6wj6"] Dec 02 22:44:50 crc kubenswrapper[4696]: I1202 22:44:50.621455 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-v6k2l" event={"ID":"93e854b6-0bab-4aa3-9d60-97542cd304eb","Type":"ContainerStarted","Data":"5d139003bf42d5bd3f336efdcae1c11aedc0f7421f3d1df24f7fbcd28e477113"} Dec 02 22:44:50 crc kubenswrapper[4696]: I1202 22:44:50.621521 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-v6k2l" event={"ID":"93e854b6-0bab-4aa3-9d60-97542cd304eb","Type":"ContainerStarted","Data":"3122f045d76f7c887bb00ef8223891509c09466698ff514bbf0dbe6e8e3ccf56"} Dec 02 22:44:50 crc kubenswrapper[4696]: I1202 22:44:50.622114 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 02 22:44:50 crc kubenswrapper[4696]: I1202 22:44:50.622526 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-authentication/oauth-openshift-558db77b4-v6k2l" Dec 02 22:44:50 crc kubenswrapper[4696]: I1202 22:44:50.629546 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-zpshp" event={"ID":"88fbd4d8-6770-4812-aded-c20e16d0e24b","Type":"ContainerStarted","Data":"6d03a73f9281b129ad39507a6ac557fcad937dd2778821f640b35017e123eef6"} Dec 02 22:44:50 crc kubenswrapper[4696]: I1202 22:44:50.630819 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-zpshp" Dec 02 22:44:50 crc kubenswrapper[4696]: I1202 22:44:50.638952 4696 patch_prober.go:28] interesting pod/console-operator-58897d9998-zpshp container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/readyz\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Dec 02 22:44:50 crc kubenswrapper[4696]: I1202 22:44:50.639030 4696 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-zpshp" podUID="88fbd4d8-6770-4812-aded-c20e16d0e24b" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.11:8443/readyz\": dial tcp 10.217.0.11:8443: connect: connection refused" Dec 02 22:44:50 crc kubenswrapper[4696]: I1202 22:44:50.639427 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-p44hz" event={"ID":"8ef3caa5-75cd-444b-aa84-9116ea9ce1cd","Type":"ContainerStarted","Data":"5c5d2571cbf6118626ac9e73d31b8eb8974f0ebfb36b72119faace2881a0e442"} Dec 02 22:44:50 crc kubenswrapper[4696]: I1202 22:44:50.639817 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-p44hz" Dec 02 22:44:50 crc kubenswrapper[4696]: I1202 22:44:50.641866 4696 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 02 22:44:50 crc kubenswrapper[4696]: I1202 22:44:50.650350 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-p44hz" Dec 02 22:44:50 crc kubenswrapper[4696]: I1202 22:44:50.671139 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 02 22:44:50 crc kubenswrapper[4696]: I1202 22:44:50.671829 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-jhcjw" event={"ID":"3f45ac7c-8865-4924-8dbd-5826a21d028e","Type":"ContainerStarted","Data":"b4c97aa3d87167eb7eda33da24a54e1ea405eeeb00f2ef432443a0385569e48e"} Dec 02 22:44:50 crc kubenswrapper[4696]: I1202 22:44:50.672925 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-jhcjw" Dec 02 22:44:50 crc kubenswrapper[4696]: I1202 22:44:50.676542 4696 patch_prober.go:28] interesting pod/downloads-7954f5f757-jhcjw container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Dec 02 22:44:50 crc kubenswrapper[4696]: I1202 22:44:50.676595 4696 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-jhcjw" podUID="3f45ac7c-8865-4924-8dbd-5826a21d028e" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Dec 02 22:44:50 crc kubenswrapper[4696]: I1202 22:44:50.684414 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 02 22:44:50 crc kubenswrapper[4696]: I1202 22:44:50.709720 4696 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-service-ca-operator"/"serving-cert" Dec 02 22:44:50 crc kubenswrapper[4696]: I1202 22:44:50.710337 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 22:44:50 crc kubenswrapper[4696]: E1202 22:44:50.710674 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 22:44:51.210655008 +0000 UTC m=+154.091334999 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:50 crc kubenswrapper[4696]: I1202 22:44:50.723737 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 02 22:44:50 crc kubenswrapper[4696]: I1202 22:44:50.748996 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 02 22:44:50 crc kubenswrapper[4696]: I1202 22:44:50.764925 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 02 22:44:50 crc kubenswrapper[4696]: I1202 22:44:50.792581 4696 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 02 22:44:50 crc kubenswrapper[4696]: I1202 22:44:50.804382 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 02 22:44:50 crc kubenswrapper[4696]: I1202 22:44:50.812329 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rjkp2\" (UID: \"24488e8a-3522-4214-ab83-684d76eb1501\") " pod="openshift-image-registry/image-registry-697d97f7c8-rjkp2" Dec 02 22:44:50 crc kubenswrapper[4696]: E1202 22:44:50.812998 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 22:44:51.312972345 +0000 UTC m=+154.193652346 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rjkp2" (UID: "24488e8a-3522-4214-ab83-684d76eb1501") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:50 crc kubenswrapper[4696]: I1202 22:44:50.824371 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 02 22:44:50 crc kubenswrapper[4696]: I1202 22:44:50.842035 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 02 22:44:50 crc kubenswrapper[4696]: I1202 22:44:50.861842 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 02 22:44:50 crc kubenswrapper[4696]: I1202 22:44:50.882417 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 02 22:44:50 crc kubenswrapper[4696]: I1202 22:44:50.902950 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 02 22:44:50 crc kubenswrapper[4696]: I1202 22:44:50.913410 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 22:44:50 crc kubenswrapper[4696]: E1202 22:44:50.913876 4696 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 22:44:51.413853148 +0000 UTC m=+154.294533149 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:50 crc kubenswrapper[4696]: I1202 22:44:50.923155 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 02 22:44:50 crc kubenswrapper[4696]: I1202 22:44:50.941485 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 02 22:44:50 crc kubenswrapper[4696]: I1202 22:44:50.962760 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 02 22:44:50 crc kubenswrapper[4696]: I1202 22:44:50.982513 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.000117 4696 request.go:700] Waited for 1.907868578s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/hostpath-provisioner/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&limit=500&resourceVersion=0 Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.002411 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 02 22:44:51 crc 
kubenswrapper[4696]: I1202 22:44:51.015568 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rjkp2\" (UID: \"24488e8a-3522-4214-ab83-684d76eb1501\") " pod="openshift-image-registry/image-registry-697d97f7c8-rjkp2" Dec 02 22:44:51 crc kubenswrapper[4696]: E1202 22:44:51.016181 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 22:44:51.516040331 +0000 UTC m=+154.396720332 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rjkp2" (UID: "24488e8a-3522-4214-ab83-684d76eb1501") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.024096 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.043104 4696 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.063011 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.085259 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 02 22:44:51 crc 
kubenswrapper[4696]: I1202 22:44:51.102817 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.116986 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 22:44:51 crc kubenswrapper[4696]: E1202 22:44:51.117269 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 22:44:51.617235704 +0000 UTC m=+154.497915705 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.117411 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rjkp2\" (UID: \"24488e8a-3522-4214-ab83-684d76eb1501\") " pod="openshift-image-registry/image-registry-697d97f7c8-rjkp2" Dec 02 22:44:51 crc kubenswrapper[4696]: E1202 22:44:51.117795 4696 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 22:44:51.6177874 +0000 UTC m=+154.498467401 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rjkp2" (UID: "24488e8a-3522-4214-ab83-684d76eb1501") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.162919 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6c10678a-82ec-492c-b8c9-e1689fb7c63b-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-dkb9g\" (UID: \"6c10678a-82ec-492c-b8c9-e1689fb7c63b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dkb9g" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.196767 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnmvm\" (UniqueName: \"kubernetes.io/projected/6c10678a-82ec-492c-b8c9-e1689fb7c63b-kube-api-access-wnmvm\") pod \"cluster-image-registry-operator-dc59b4c8b-dkb9g\" (UID: \"6c10678a-82ec-492c-b8c9-e1689fb7c63b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dkb9g" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.204510 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnvxk\" (UniqueName: \"kubernetes.io/projected/5f5d77f1-55f7-49d2-b5db-19d7150b882f-kube-api-access-hnvxk\") pod \"machine-approver-56656f9798-4cx4s\" (UID: \"5f5d77f1-55f7-49d2-b5db-19d7150b882f\") " 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4cx4s" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.218608 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 22:44:51 crc kubenswrapper[4696]: E1202 22:44:51.218849 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 22:44:51.718816788 +0000 UTC m=+154.599496789 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.218924 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rjkp2\" (UID: \"24488e8a-3522-4214-ab83-684d76eb1501\") " pod="openshift-image-registry/image-registry-697d97f7c8-rjkp2" Dec 02 22:44:51 crc kubenswrapper[4696]: E1202 22:44:51.219299 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: 
nodeName:}" failed. No retries permitted until 2025-12-02 22:44:51.719290922 +0000 UTC m=+154.599970923 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rjkp2" (UID: "24488e8a-3522-4214-ab83-684d76eb1501") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.229863 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4cx4s" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.230627 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvkj5\" (UniqueName: \"kubernetes.io/projected/24488e8a-3522-4214-ab83-684d76eb1501-kube-api-access-tvkj5\") pod \"image-registry-697d97f7c8-rjkp2\" (UID: \"24488e8a-3522-4214-ab83-684d76eb1501\") " pod="openshift-image-registry/image-registry-697d97f7c8-rjkp2" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.242896 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/24488e8a-3522-4214-ab83-684d76eb1501-bound-sa-token\") pod \"image-registry-697d97f7c8-rjkp2\" (UID: \"24488e8a-3522-4214-ab83-684d76eb1501\") " pod="openshift-image-registry/image-registry-697d97f7c8-rjkp2" Dec 02 22:44:51 crc kubenswrapper[4696]: W1202 22:44:51.284252 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f5d77f1_55f7_49d2_b5db_19d7150b882f.slice/crio-eef53446872d89db697ea1d91b6a5d9ad938d27ea453282dc6a0db05b0b4a790 WatchSource:0}: Error finding container 
eef53446872d89db697ea1d91b6a5d9ad938d27ea453282dc6a0db05b0b4a790: Status 404 returned error can't find the container with id eef53446872d89db697ea1d91b6a5d9ad938d27ea453282dc6a0db05b0b4a790 Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.320238 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.320536 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/96c61760-b182-43b4-a0d8-a461ba742b85-profile-collector-cert\") pod \"olm-operator-6b444d44fb-58dmz\" (UID: \"96c61760-b182-43b4-a0d8-a461ba742b85\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-58dmz" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.320564 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bj2s2\" (UniqueName: \"kubernetes.io/projected/6ba68eb4-959e-4a8a-a35e-f50abaef4cf9-kube-api-access-bj2s2\") pod \"machine-config-server-jtr26\" (UID: \"6ba68eb4-959e-4a8a-a35e-f50abaef4cf9\") " pod="openshift-machine-config-operator/machine-config-server-jtr26" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.320585 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fj6lq\" (UniqueName: \"kubernetes.io/projected/02277212-4ddf-49e2-8ac0-58c20fd973a3-kube-api-access-fj6lq\") pod \"ingress-operator-5b745b69d9-j5tcx\" (UID: \"02277212-4ddf-49e2-8ac0-58c20fd973a3\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j5tcx" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.320601 4696 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/65e8a28b-6bd6-4100-a3e7-80faf9aaeeef-config-volume\") pod \"collect-profiles-29411910-cc46z\" (UID: \"65e8a28b-6bd6-4100-a3e7-80faf9aaeeef\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411910-cc46z" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.320651 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/54d79d46-f5a5-42d4-8000-61e91b72e0f7-cert\") pod \"ingress-canary-6s4j2\" (UID: \"54d79d46-f5a5-42d4-8000-61e91b72e0f7\") " pod="openshift-ingress-canary/ingress-canary-6s4j2" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.320670 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-882hk\" (UniqueName: \"kubernetes.io/projected/7fe15835-a31c-46df-aef5-21aade83fa88-kube-api-access-882hk\") pod \"authentication-operator-69f744f599-q5df2\" (UID: \"7fe15835-a31c-46df-aef5-21aade83fa88\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-q5df2" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.320698 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/e1e19685-0a6c-47e3-b4d6-b98aa6ca63af-plugins-dir\") pod \"csi-hostpathplugin-2jhb9\" (UID: \"e1e19685-0a6c-47e3-b4d6-b98aa6ca63af\") " pod="hostpath-provisioner/csi-hostpathplugin-2jhb9" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.320806 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnp9j\" (UniqueName: \"kubernetes.io/projected/1198dca5-128d-4625-95e1-2d9d9a86b263-kube-api-access-bnp9j\") pod \"service-ca-9c57cc56f-6bghq\" (UID: 
\"1198dca5-128d-4625-95e1-2d9d9a86b263\") " pod="openshift-service-ca/service-ca-9c57cc56f-6bghq" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.320832 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/6ba68eb4-959e-4a8a-a35e-f50abaef4cf9-node-bootstrap-token\") pod \"machine-config-server-jtr26\" (UID: \"6ba68eb4-959e-4a8a-a35e-f50abaef4cf9\") " pod="openshift-machine-config-operator/machine-config-server-jtr26" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.320867 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ks74t\" (UniqueName: \"kubernetes.io/projected/65e8a28b-6bd6-4100-a3e7-80faf9aaeeef-kube-api-access-ks74t\") pod \"collect-profiles-29411910-cc46z\" (UID: \"65e8a28b-6bd6-4100-a3e7-80faf9aaeeef\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411910-cc46z" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.320885 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/772fa8b4-e1ff-40c2-aaaa-1b5c12143dac-etcd-client\") pod \"etcd-operator-b45778765-k7pzs\" (UID: \"772fa8b4-e1ff-40c2-aaaa-1b5c12143dac\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k7pzs" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.320914 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vw5c\" (UniqueName: \"kubernetes.io/projected/96c61760-b182-43b4-a0d8-a461ba742b85-kube-api-access-8vw5c\") pod \"olm-operator-6b444d44fb-58dmz\" (UID: \"96c61760-b182-43b4-a0d8-a461ba742b85\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-58dmz" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.320929 4696 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb0f1308-0862-4ac2-b741-849ae33c2776-service-ca-bundle\") pod \"router-default-5444994796-qpwvl\" (UID: \"fb0f1308-0862-4ac2-b741-849ae33c2776\") " pod="openshift-ingress/router-default-5444994796-qpwvl" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.320944 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwzm8\" (UniqueName: \"kubernetes.io/projected/e1e19685-0a6c-47e3-b4d6-b98aa6ca63af-kube-api-access-qwzm8\") pod \"csi-hostpathplugin-2jhb9\" (UID: \"e1e19685-0a6c-47e3-b4d6-b98aa6ca63af\") " pod="hostpath-provisioner/csi-hostpathplugin-2jhb9" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.320995 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/772fa8b4-e1ff-40c2-aaaa-1b5c12143dac-etcd-service-ca\") pod \"etcd-operator-b45778765-k7pzs\" (UID: \"772fa8b4-e1ff-40c2-aaaa-1b5c12143dac\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k7pzs" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.321012 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ps879\" (UniqueName: \"kubernetes.io/projected/3c4885ba-062b-41c3-be71-997c9781e537-kube-api-access-ps879\") pod \"openshift-controller-manager-operator-756b6f6bc6-p6njc\" (UID: \"3c4885ba-062b-41c3-be71-997c9781e537\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-p6njc" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.321057 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbh9c\" (UniqueName: \"kubernetes.io/projected/4feef0d1-3fad-4990-8d96-65beb52e89b3-kube-api-access-qbh9c\") pod 
\"machine-config-controller-84d6567774-5rtdf\" (UID: \"4feef0d1-3fad-4990-8d96-65beb52e89b3\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-5rtdf" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.321083 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ts4lb\" (UniqueName: \"kubernetes.io/projected/54d79d46-f5a5-42d4-8000-61e91b72e0f7-kube-api-access-ts4lb\") pod \"ingress-canary-6s4j2\" (UID: \"54d79d46-f5a5-42d4-8000-61e91b72e0f7\") " pod="openshift-ingress-canary/ingress-canary-6s4j2" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.321098 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/02277212-4ddf-49e2-8ac0-58c20fd973a3-metrics-tls\") pod \"ingress-operator-5b745b69d9-j5tcx\" (UID: \"02277212-4ddf-49e2-8ac0-58c20fd973a3\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j5tcx" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.321115 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/104b39f4-a0ab-4b71-a5d9-7c2cafd48976-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-qxljp\" (UID: \"104b39f4-a0ab-4b71-a5d9-7c2cafd48976\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qxljp" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.321144 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/1198dca5-128d-4625-95e1-2d9d9a86b263-signing-key\") pod \"service-ca-9c57cc56f-6bghq\" (UID: \"1198dca5-128d-4625-95e1-2d9d9a86b263\") " pod="openshift-service-ca/service-ca-9c57cc56f-6bghq" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.321160 4696 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/772fa8b4-e1ff-40c2-aaaa-1b5c12143dac-serving-cert\") pod \"etcd-operator-b45778765-k7pzs\" (UID: \"772fa8b4-e1ff-40c2-aaaa-1b5c12143dac\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k7pzs" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.321197 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tk2cv\" (UniqueName: \"kubernetes.io/projected/3a98fb36-4189-4e59-b07a-b42a8aaee322-kube-api-access-tk2cv\") pod \"packageserver-d55dfcdfc-v2rkn\" (UID: \"3a98fb36-4189-4e59-b07a-b42a8aaee322\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-v2rkn" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.321214 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9n2z\" (UniqueName: \"kubernetes.io/projected/5b22a349-1f5e-49f6-982f-e192aed0933d-kube-api-access-h9n2z\") pod \"dns-default-4jgzb\" (UID: \"5b22a349-1f5e-49f6-982f-e192aed0933d\") " pod="openshift-dns/dns-default-4jgzb" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.321238 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zn2hg\" (UniqueName: \"kubernetes.io/projected/ed9ebc4a-9236-49f7-b365-79a6890e2bc8-kube-api-access-zn2hg\") pod \"migrator-59844c95c7-qghph\" (UID: \"ed9ebc4a-9236-49f7-b365-79a6890e2bc8\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qghph" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.321257 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/f72e09f8-e3a5-4434-b1b7-3978e84c472a-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-6xbr4\" (UID: \"f72e09f8-e3a5-4434-b1b7-3978e84c472a\") " 
pod="openshift-multus/multus-admission-controller-857f4d67dd-6xbr4" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.321273 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/fb0f1308-0862-4ac2-b741-849ae33c2776-default-certificate\") pod \"router-default-5444994796-qpwvl\" (UID: \"fb0f1308-0862-4ac2-b741-849ae33c2776\") " pod="openshift-ingress/router-default-5444994796-qpwvl" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.321291 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/bfd55522-63bd-40f3-a429-eb0c85fe5b9c-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-58dc7\" (UID: \"bfd55522-63bd-40f3-a429-eb0c85fe5b9c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-58dc7" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.321308 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2x4dw\" (UniqueName: \"kubernetes.io/projected/fb0f1308-0862-4ac2-b741-849ae33c2776-kube-api-access-2x4dw\") pod \"router-default-5444994796-qpwvl\" (UID: \"fb0f1308-0862-4ac2-b741-849ae33c2776\") " pod="openshift-ingress/router-default-5444994796-qpwvl" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.321350 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/32ed738d-b1d3-4966-a487-1c2aa92c6f20-srv-cert\") pod \"catalog-operator-68c6474976-984hk\" (UID: \"32ed738d-b1d3-4966-a487-1c2aa92c6f20\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-984hk" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.321366 4696 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3a98fb36-4189-4e59-b07a-b42a8aaee322-webhook-cert\") pod \"packageserver-d55dfcdfc-v2rkn\" (UID: \"3a98fb36-4189-4e59-b07a-b42a8aaee322\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-v2rkn" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.321393 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7fe15835-a31c-46df-aef5-21aade83fa88-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-q5df2\" (UID: \"7fe15835-a31c-46df-aef5-21aade83fa88\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-q5df2" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.321409 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8qs5\" (UniqueName: \"kubernetes.io/projected/a4ed1132-b227-4d20-9ca7-7530ff4c0163-kube-api-access-q8qs5\") pod \"service-ca-operator-777779d784-jwz6f\" (UID: \"a4ed1132-b227-4d20-9ca7-7530ff4c0163\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-jwz6f" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.321448 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/94966c3d-5233-480c-a199-6813c47a1e04-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-6vtkj\" (UID: \"94966c3d-5233-480c-a199-6813c47a1e04\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6vtkj" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.321472 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gkqm\" (UniqueName: 
\"kubernetes.io/projected/db82b3f2-cd94-42ef-a662-8a4b4c8fac85-kube-api-access-8gkqm\") pod \"marketplace-operator-79b997595-dtrjk\" (UID: \"db82b3f2-cd94-42ef-a662-8a4b4c8fac85\") " pod="openshift-marketplace/marketplace-operator-79b997595-dtrjk" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.321526 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/238dedc6-7624-4ad3-9e79-4d536ac8acda-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-kgqqs\" (UID: \"238dedc6-7624-4ad3-9e79-4d536ac8acda\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kgqqs" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.321544 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fb0f1308-0862-4ac2-b741-849ae33c2776-metrics-certs\") pod \"router-default-5444994796-qpwvl\" (UID: \"fb0f1308-0862-4ac2-b741-849ae33c2776\") " pod="openshift-ingress/router-default-5444994796-qpwvl" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.321565 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/65e8a28b-6bd6-4100-a3e7-80faf9aaeeef-secret-volume\") pod \"collect-profiles-29411910-cc46z\" (UID: \"65e8a28b-6bd6-4100-a3e7-80faf9aaeeef\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411910-cc46z" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.321583 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7893eeb2-3d1b-4e3d-8752-438c53238019-auth-proxy-config\") pod \"machine-config-operator-74547568cd-jtktn\" (UID: \"7893eeb2-3d1b-4e3d-8752-438c53238019\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jtktn" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.321615 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/02277212-4ddf-49e2-8ac0-58c20fd973a3-trusted-ca\") pod \"ingress-operator-5b745b69d9-j5tcx\" (UID: \"02277212-4ddf-49e2-8ac0-58c20fd973a3\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j5tcx" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.321645 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55vw6\" (UniqueName: \"kubernetes.io/projected/238dedc6-7624-4ad3-9e79-4d536ac8acda-kube-api-access-55vw6\") pod \"kube-storage-version-migrator-operator-b67b599dd-kgqqs\" (UID: \"238dedc6-7624-4ad3-9e79-4d536ac8acda\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kgqqs" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.321662 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/fb0f1308-0862-4ac2-b741-849ae33c2776-stats-auth\") pod \"router-default-5444994796-qpwvl\" (UID: \"fb0f1308-0862-4ac2-b741-849ae33c2776\") " pod="openshift-ingress/router-default-5444994796-qpwvl" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.321699 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5b22a349-1f5e-49f6-982f-e192aed0933d-config-volume\") pod \"dns-default-4jgzb\" (UID: \"5b22a349-1f5e-49f6-982f-e192aed0933d\") " pod="openshift-dns/dns-default-4jgzb" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.321750 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3992f4d2-df09-400c-a4a9-516a3c95514a-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-6vz6p\" (UID: \"3992f4d2-df09-400c-a4a9-516a3c95514a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6vz6p" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.321768 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/104b39f4-a0ab-4b71-a5d9-7c2cafd48976-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-qxljp\" (UID: \"104b39f4-a0ab-4b71-a5d9-7c2cafd48976\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qxljp" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.321783 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7fe15835-a31c-46df-aef5-21aade83fa88-service-ca-bundle\") pod \"authentication-operator-69f744f599-q5df2\" (UID: \"7fe15835-a31c-46df-aef5-21aade83fa88\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-q5df2" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.321800 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e9c0bdb8-cf1f-408c-b341-afb3fea179dd-metrics-tls\") pod \"dns-operator-744455d44c-bhp72\" (UID: \"e9c0bdb8-cf1f-408c-b341-afb3fea179dd\") " pod="openshift-dns-operator/dns-operator-744455d44c-bhp72" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.321818 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7893eeb2-3d1b-4e3d-8752-438c53238019-proxy-tls\") pod \"machine-config-operator-74547568cd-jtktn\" (UID: 
\"7893eeb2-3d1b-4e3d-8752-438c53238019\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jtktn" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.321835 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5b22a349-1f5e-49f6-982f-e192aed0933d-metrics-tls\") pod \"dns-default-4jgzb\" (UID: \"5b22a349-1f5e-49f6-982f-e192aed0933d\") " pod="openshift-dns/dns-default-4jgzb" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.321863 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/6ba68eb4-959e-4a8a-a35e-f50abaef4cf9-certs\") pod \"machine-config-server-jtr26\" (UID: \"6ba68eb4-959e-4a8a-a35e-f50abaef4cf9\") " pod="openshift-machine-config-operator/machine-config-server-jtr26" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.321880 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3a98fb36-4189-4e59-b07a-b42a8aaee322-apiservice-cert\") pod \"packageserver-d55dfcdfc-v2rkn\" (UID: \"3a98fb36-4189-4e59-b07a-b42a8aaee322\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-v2rkn" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.321914 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/238dedc6-7624-4ad3-9e79-4d536ac8acda-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-kgqqs\" (UID: \"238dedc6-7624-4ad3-9e79-4d536ac8acda\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kgqqs" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.321949 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"srv-cert\" (UniqueName: \"kubernetes.io/secret/96c61760-b182-43b4-a0d8-a461ba742b85-srv-cert\") pod \"olm-operator-6b444d44fb-58dmz\" (UID: \"96c61760-b182-43b4-a0d8-a461ba742b85\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-58dmz" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.321965 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e1e19685-0a6c-47e3-b4d6-b98aa6ca63af-registration-dir\") pod \"csi-hostpathplugin-2jhb9\" (UID: \"e1e19685-0a6c-47e3-b4d6-b98aa6ca63af\") " pod="hostpath-provisioner/csi-hostpathplugin-2jhb9" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.321982 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a4ed1132-b227-4d20-9ca7-7530ff4c0163-serving-cert\") pod \"service-ca-operator-777779d784-jwz6f\" (UID: \"a4ed1132-b227-4d20-9ca7-7530ff4c0163\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-jwz6f" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.322007 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4feef0d1-3fad-4990-8d96-65beb52e89b3-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-5rtdf\" (UID: \"4feef0d1-3fad-4990-8d96-65beb52e89b3\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-5rtdf" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.322033 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/e1e19685-0a6c-47e3-b4d6-b98aa6ca63af-csi-data-dir\") pod \"csi-hostpathplugin-2jhb9\" (UID: \"e1e19685-0a6c-47e3-b4d6-b98aa6ca63af\") " pod="hostpath-provisioner/csi-hostpathplugin-2jhb9" Dec 02 
22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.322058 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/7893eeb2-3d1b-4e3d-8752-438c53238019-images\") pod \"machine-config-operator-74547568cd-jtktn\" (UID: \"7893eeb2-3d1b-4e3d-8752-438c53238019\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jtktn" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.322074 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/1198dca5-128d-4625-95e1-2d9d9a86b263-signing-cabundle\") pod \"service-ca-9c57cc56f-6bghq\" (UID: \"1198dca5-128d-4625-95e1-2d9d9a86b263\") " pod="openshift-service-ca/service-ca-9c57cc56f-6bghq" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.322094 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9prf\" (UniqueName: \"kubernetes.io/projected/bfd55522-63bd-40f3-a429-eb0c85fe5b9c-kube-api-access-k9prf\") pod \"control-plane-machine-set-operator-78cbb6b69f-58dc7\" (UID: \"bfd55522-63bd-40f3-a429-eb0c85fe5b9c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-58dc7" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.322173 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/db82b3f2-cd94-42ef-a662-8a4b4c8fac85-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-dtrjk\" (UID: \"db82b3f2-cd94-42ef-a662-8a4b4c8fac85\") " pod="openshift-marketplace/marketplace-operator-79b997595-dtrjk" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.322191 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/1b7406ed-3f1e-476e-9d81-a0612ad16002-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-sw26l\" (UID: \"1b7406ed-3f1e-476e-9d81-a0612ad16002\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sw26l" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.322229 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/104b39f4-a0ab-4b71-a5d9-7c2cafd48976-config\") pod \"kube-apiserver-operator-766d6c64bb-qxljp\" (UID: \"104b39f4-a0ab-4b71-a5d9-7c2cafd48976\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qxljp" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.322281 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/32ed738d-b1d3-4966-a487-1c2aa92c6f20-profile-collector-cert\") pod \"catalog-operator-68c6474976-984hk\" (UID: \"32ed738d-b1d3-4966-a487-1c2aa92c6f20\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-984hk" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.322299 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4feef0d1-3fad-4990-8d96-65beb52e89b3-proxy-tls\") pod \"machine-config-controller-84d6567774-5rtdf\" (UID: \"4feef0d1-3fad-4990-8d96-65beb52e89b3\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-5rtdf" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.322316 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3992f4d2-df09-400c-a4a9-516a3c95514a-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-6vz6p\" (UID: 
\"3992f4d2-df09-400c-a4a9-516a3c95514a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6vz6p" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.322331 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/3a98fb36-4189-4e59-b07a-b42a8aaee322-tmpfs\") pod \"packageserver-d55dfcdfc-v2rkn\" (UID: \"3a98fb36-4189-4e59-b07a-b42a8aaee322\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-v2rkn" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.322347 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c4885ba-062b-41c3-be71-997c9781e537-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-p6njc\" (UID: \"3c4885ba-062b-41c3-be71-997c9781e537\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-p6njc" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.322383 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49d79\" (UniqueName: \"kubernetes.io/projected/7893eeb2-3d1b-4e3d-8752-438c53238019-kube-api-access-49d79\") pod \"machine-config-operator-74547568cd-jtktn\" (UID: \"7893eeb2-3d1b-4e3d-8752-438c53238019\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jtktn" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.322400 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b7406ed-3f1e-476e-9d81-a0612ad16002-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-sw26l\" (UID: \"1b7406ed-3f1e-476e-9d81-a0612ad16002\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sw26l" Dec 02 22:44:51 crc 
kubenswrapper[4696]: I1202 22:44:51.322438 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3992f4d2-df09-400c-a4a9-516a3c95514a-config\") pod \"kube-controller-manager-operator-78b949d7b-6vz6p\" (UID: \"3992f4d2-df09-400c-a4a9-516a3c95514a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6vz6p" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.322457 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2nqs9\" (UniqueName: \"kubernetes.io/projected/f72e09f8-e3a5-4434-b1b7-3978e84c472a-kube-api-access-2nqs9\") pod \"multus-admission-controller-857f4d67dd-6xbr4\" (UID: \"f72e09f8-e3a5-4434-b1b7-3978e84c472a\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-6xbr4" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.322473 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e1e19685-0a6c-47e3-b4d6-b98aa6ca63af-socket-dir\") pod \"csi-hostpathplugin-2jhb9\" (UID: \"e1e19685-0a6c-47e3-b4d6-b98aa6ca63af\") " pod="hostpath-provisioner/csi-hostpathplugin-2jhb9" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.322516 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/db82b3f2-cd94-42ef-a662-8a4b4c8fac85-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-dtrjk\" (UID: \"db82b3f2-cd94-42ef-a662-8a4b4c8fac85\") " pod="openshift-marketplace/marketplace-operator-79b997595-dtrjk" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.322532 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: 
\"kubernetes.io/configmap/772fa8b4-e1ff-40c2-aaaa-1b5c12143dac-etcd-ca\") pod \"etcd-operator-b45778765-k7pzs\" (UID: \"772fa8b4-e1ff-40c2-aaaa-1b5c12143dac\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k7pzs" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.322548 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3c4885ba-062b-41c3-be71-997c9781e537-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-p6njc\" (UID: \"3c4885ba-062b-41c3-be71-997c9781e537\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-p6njc" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.322585 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkksh\" (UniqueName: \"kubernetes.io/projected/32ed738d-b1d3-4966-a487-1c2aa92c6f20-kube-api-access-dkksh\") pod \"catalog-operator-68c6474976-984hk\" (UID: \"32ed738d-b1d3-4966-a487-1c2aa92c6f20\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-984hk" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.322600 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2s5q\" (UniqueName: \"kubernetes.io/projected/772fa8b4-e1ff-40c2-aaaa-1b5c12143dac-kube-api-access-c2s5q\") pod \"etcd-operator-b45778765-k7pzs\" (UID: \"772fa8b4-e1ff-40c2-aaaa-1b5c12143dac\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k7pzs" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.322615 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7fe15835-a31c-46df-aef5-21aade83fa88-serving-cert\") pod \"authentication-operator-69f744f599-q5df2\" (UID: \"7fe15835-a31c-46df-aef5-21aade83fa88\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-q5df2" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.322633 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/772fa8b4-e1ff-40c2-aaaa-1b5c12143dac-config\") pod \"etcd-operator-b45778765-k7pzs\" (UID: \"772fa8b4-e1ff-40c2-aaaa-1b5c12143dac\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k7pzs" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.322676 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8mtg\" (UniqueName: \"kubernetes.io/projected/e9c0bdb8-cf1f-408c-b341-afb3fea179dd-kube-api-access-n8mtg\") pod \"dns-operator-744455d44c-bhp72\" (UID: \"e9c0bdb8-cf1f-408c-b341-afb3fea179dd\") " pod="openshift-dns-operator/dns-operator-744455d44c-bhp72" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.322724 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnx6t\" (UniqueName: \"kubernetes.io/projected/94966c3d-5233-480c-a199-6813c47a1e04-kube-api-access-mnx6t\") pod \"package-server-manager-789f6589d5-6vtkj\" (UID: \"94966c3d-5233-480c-a199-6813c47a1e04\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6vtkj" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.322754 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4ed1132-b227-4d20-9ca7-7530ff4c0163-config\") pod \"service-ca-operator-777779d784-jwz6f\" (UID: \"a4ed1132-b227-4d20-9ca7-7530ff4c0163\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-jwz6f" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.322814 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/7fe15835-a31c-46df-aef5-21aade83fa88-config\") pod \"authentication-operator-69f744f599-q5df2\" (UID: \"7fe15835-a31c-46df-aef5-21aade83fa88\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-q5df2" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.322851 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1b7406ed-3f1e-476e-9d81-a0612ad16002-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-sw26l\" (UID: \"1b7406ed-3f1e-476e-9d81-a0612ad16002\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sw26l" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.322867 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/02277212-4ddf-49e2-8ac0-58c20fd973a3-bound-sa-token\") pod \"ingress-operator-5b745b69d9-j5tcx\" (UID: \"02277212-4ddf-49e2-8ac0-58c20fd973a3\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j5tcx" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.322884 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/e1e19685-0a6c-47e3-b4d6-b98aa6ca63af-mountpoint-dir\") pod \"csi-hostpathplugin-2jhb9\" (UID: \"e1e19685-0a6c-47e3-b4d6-b98aa6ca63af\") " pod="hostpath-provisioner/csi-hostpathplugin-2jhb9" Dec 02 22:44:51 crc kubenswrapper[4696]: E1202 22:44:51.323001 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 22:44:51.822983949 +0000 UTC m=+154.703663950 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.424177 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5b22a349-1f5e-49f6-982f-e192aed0933d-config-volume\") pod \"dns-default-4jgzb\" (UID: \"5b22a349-1f5e-49f6-982f-e192aed0933d\") " pod="openshift-dns/dns-default-4jgzb" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.424642 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/104b39f4-a0ab-4b71-a5d9-7c2cafd48976-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-qxljp\" (UID: \"104b39f4-a0ab-4b71-a5d9-7c2cafd48976\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qxljp" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.424662 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7fe15835-a31c-46df-aef5-21aade83fa88-service-ca-bundle\") pod \"authentication-operator-69f744f599-q5df2\" (UID: \"7fe15835-a31c-46df-aef5-21aade83fa88\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-q5df2" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.424683 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3992f4d2-df09-400c-a4a9-516a3c95514a-kube-api-access\") pod 
\"kube-controller-manager-operator-78b949d7b-6vz6p\" (UID: \"3992f4d2-df09-400c-a4a9-516a3c95514a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6vz6p" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.424699 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e9c0bdb8-cf1f-408c-b341-afb3fea179dd-metrics-tls\") pod \"dns-operator-744455d44c-bhp72\" (UID: \"e9c0bdb8-cf1f-408c-b341-afb3fea179dd\") " pod="openshift-dns-operator/dns-operator-744455d44c-bhp72" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.424715 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7893eeb2-3d1b-4e3d-8752-438c53238019-proxy-tls\") pod \"machine-config-operator-74547568cd-jtktn\" (UID: \"7893eeb2-3d1b-4e3d-8752-438c53238019\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jtktn" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.424767 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5b22a349-1f5e-49f6-982f-e192aed0933d-metrics-tls\") pod \"dns-default-4jgzb\" (UID: \"5b22a349-1f5e-49f6-982f-e192aed0933d\") " pod="openshift-dns/dns-default-4jgzb" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.424789 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/6ba68eb4-959e-4a8a-a35e-f50abaef4cf9-certs\") pod \"machine-config-server-jtr26\" (UID: \"6ba68eb4-959e-4a8a-a35e-f50abaef4cf9\") " pod="openshift-machine-config-operator/machine-config-server-jtr26" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.424807 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/3a98fb36-4189-4e59-b07a-b42a8aaee322-apiservice-cert\") pod \"packageserver-d55dfcdfc-v2rkn\" (UID: \"3a98fb36-4189-4e59-b07a-b42a8aaee322\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-v2rkn" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.424828 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/238dedc6-7624-4ad3-9e79-4d536ac8acda-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-kgqqs\" (UID: \"238dedc6-7624-4ad3-9e79-4d536ac8acda\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kgqqs" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.424846 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/96c61760-b182-43b4-a0d8-a461ba742b85-srv-cert\") pod \"olm-operator-6b444d44fb-58dmz\" (UID: \"96c61760-b182-43b4-a0d8-a461ba742b85\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-58dmz" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.424861 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e1e19685-0a6c-47e3-b4d6-b98aa6ca63af-registration-dir\") pod \"csi-hostpathplugin-2jhb9\" (UID: \"e1e19685-0a6c-47e3-b4d6-b98aa6ca63af\") " pod="hostpath-provisioner/csi-hostpathplugin-2jhb9" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.424879 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a4ed1132-b227-4d20-9ca7-7530ff4c0163-serving-cert\") pod \"service-ca-operator-777779d784-jwz6f\" (UID: \"a4ed1132-b227-4d20-9ca7-7530ff4c0163\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-jwz6f" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.424897 4696 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4feef0d1-3fad-4990-8d96-65beb52e89b3-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-5rtdf\" (UID: \"4feef0d1-3fad-4990-8d96-65beb52e89b3\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-5rtdf" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.424912 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/e1e19685-0a6c-47e3-b4d6-b98aa6ca63af-csi-data-dir\") pod \"csi-hostpathplugin-2jhb9\" (UID: \"e1e19685-0a6c-47e3-b4d6-b98aa6ca63af\") " pod="hostpath-provisioner/csi-hostpathplugin-2jhb9" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.424928 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/7893eeb2-3d1b-4e3d-8752-438c53238019-images\") pod \"machine-config-operator-74547568cd-jtktn\" (UID: \"7893eeb2-3d1b-4e3d-8752-438c53238019\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jtktn" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.424944 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/1198dca5-128d-4625-95e1-2d9d9a86b263-signing-cabundle\") pod \"service-ca-9c57cc56f-6bghq\" (UID: \"1198dca5-128d-4625-95e1-2d9d9a86b263\") " pod="openshift-service-ca/service-ca-9c57cc56f-6bghq" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.424962 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9prf\" (UniqueName: \"kubernetes.io/projected/bfd55522-63bd-40f3-a429-eb0c85fe5b9c-kube-api-access-k9prf\") pod \"control-plane-machine-set-operator-78cbb6b69f-58dc7\" (UID: \"bfd55522-63bd-40f3-a429-eb0c85fe5b9c\") " 
pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-58dc7" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.424981 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/db82b3f2-cd94-42ef-a662-8a4b4c8fac85-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-dtrjk\" (UID: \"db82b3f2-cd94-42ef-a662-8a4b4c8fac85\") " pod="openshift-marketplace/marketplace-operator-79b997595-dtrjk" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.424998 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1b7406ed-3f1e-476e-9d81-a0612ad16002-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-sw26l\" (UID: \"1b7406ed-3f1e-476e-9d81-a0612ad16002\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sw26l" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.425016 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/104b39f4-a0ab-4b71-a5d9-7c2cafd48976-config\") pod \"kube-apiserver-operator-766d6c64bb-qxljp\" (UID: \"104b39f4-a0ab-4b71-a5d9-7c2cafd48976\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qxljp" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.425033 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/32ed738d-b1d3-4966-a487-1c2aa92c6f20-profile-collector-cert\") pod \"catalog-operator-68c6474976-984hk\" (UID: \"32ed738d-b1d3-4966-a487-1c2aa92c6f20\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-984hk" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.425051 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" 
(UniqueName: \"kubernetes.io/secret/4feef0d1-3fad-4990-8d96-65beb52e89b3-proxy-tls\") pod \"machine-config-controller-84d6567774-5rtdf\" (UID: \"4feef0d1-3fad-4990-8d96-65beb52e89b3\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-5rtdf" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.425071 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c4885ba-062b-41c3-be71-997c9781e537-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-p6njc\" (UID: \"3c4885ba-062b-41c3-be71-997c9781e537\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-p6njc" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.425089 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3992f4d2-df09-400c-a4a9-516a3c95514a-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-6vz6p\" (UID: \"3992f4d2-df09-400c-a4a9-516a3c95514a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6vz6p" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.425109 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/3a98fb36-4189-4e59-b07a-b42a8aaee322-tmpfs\") pod \"packageserver-d55dfcdfc-v2rkn\" (UID: \"3a98fb36-4189-4e59-b07a-b42a8aaee322\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-v2rkn" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.425128 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49d79\" (UniqueName: \"kubernetes.io/projected/7893eeb2-3d1b-4e3d-8752-438c53238019-kube-api-access-49d79\") pod \"machine-config-operator-74547568cd-jtktn\" (UID: \"7893eeb2-3d1b-4e3d-8752-438c53238019\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jtktn" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.425144 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b7406ed-3f1e-476e-9d81-a0612ad16002-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-sw26l\" (UID: \"1b7406ed-3f1e-476e-9d81-a0612ad16002\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sw26l" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.425162 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3992f4d2-df09-400c-a4a9-516a3c95514a-config\") pod \"kube-controller-manager-operator-78b949d7b-6vz6p\" (UID: \"3992f4d2-df09-400c-a4a9-516a3c95514a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6vz6p" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.425183 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2nqs9\" (UniqueName: \"kubernetes.io/projected/f72e09f8-e3a5-4434-b1b7-3978e84c472a-kube-api-access-2nqs9\") pod \"multus-admission-controller-857f4d67dd-6xbr4\" (UID: \"f72e09f8-e3a5-4434-b1b7-3978e84c472a\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-6xbr4" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.425200 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e1e19685-0a6c-47e3-b4d6-b98aa6ca63af-socket-dir\") pod \"csi-hostpathplugin-2jhb9\" (UID: \"e1e19685-0a6c-47e3-b4d6-b98aa6ca63af\") " pod="hostpath-provisioner/csi-hostpathplugin-2jhb9" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.425220 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: 
\"kubernetes.io/configmap/772fa8b4-e1ff-40c2-aaaa-1b5c12143dac-etcd-ca\") pod \"etcd-operator-b45778765-k7pzs\" (UID: \"772fa8b4-e1ff-40c2-aaaa-1b5c12143dac\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k7pzs" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.425237 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3c4885ba-062b-41c3-be71-997c9781e537-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-p6njc\" (UID: \"3c4885ba-062b-41c3-be71-997c9781e537\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-p6njc" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.425256 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/db82b3f2-cd94-42ef-a662-8a4b4c8fac85-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-dtrjk\" (UID: \"db82b3f2-cd94-42ef-a662-8a4b4c8fac85\") " pod="openshift-marketplace/marketplace-operator-79b997595-dtrjk" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.425274 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkksh\" (UniqueName: \"kubernetes.io/projected/32ed738d-b1d3-4966-a487-1c2aa92c6f20-kube-api-access-dkksh\") pod \"catalog-operator-68c6474976-984hk\" (UID: \"32ed738d-b1d3-4966-a487-1c2aa92c6f20\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-984hk" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.425292 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2s5q\" (UniqueName: \"kubernetes.io/projected/772fa8b4-e1ff-40c2-aaaa-1b5c12143dac-kube-api-access-c2s5q\") pod \"etcd-operator-b45778765-k7pzs\" (UID: \"772fa8b4-e1ff-40c2-aaaa-1b5c12143dac\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k7pzs" Dec 02 22:44:51 crc 
kubenswrapper[4696]: I1202 22:44:51.425311 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7fe15835-a31c-46df-aef5-21aade83fa88-serving-cert\") pod \"authentication-operator-69f744f599-q5df2\" (UID: \"7fe15835-a31c-46df-aef5-21aade83fa88\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-q5df2" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.425327 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/772fa8b4-e1ff-40c2-aaaa-1b5c12143dac-config\") pod \"etcd-operator-b45778765-k7pzs\" (UID: \"772fa8b4-e1ff-40c2-aaaa-1b5c12143dac\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k7pzs" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.425344 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8mtg\" (UniqueName: \"kubernetes.io/projected/e9c0bdb8-cf1f-408c-b341-afb3fea179dd-kube-api-access-n8mtg\") pod \"dns-operator-744455d44c-bhp72\" (UID: \"e9c0bdb8-cf1f-408c-b341-afb3fea179dd\") " pod="openshift-dns-operator/dns-operator-744455d44c-bhp72" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.425361 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnx6t\" (UniqueName: \"kubernetes.io/projected/94966c3d-5233-480c-a199-6813c47a1e04-kube-api-access-mnx6t\") pod \"package-server-manager-789f6589d5-6vtkj\" (UID: \"94966c3d-5233-480c-a199-6813c47a1e04\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6vtkj" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.425354 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7fe15835-a31c-46df-aef5-21aade83fa88-service-ca-bundle\") pod \"authentication-operator-69f744f599-q5df2\" (UID: 
\"7fe15835-a31c-46df-aef5-21aade83fa88\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-q5df2" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.425380 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4ed1132-b227-4d20-9ca7-7530ff4c0163-config\") pod \"service-ca-operator-777779d784-jwz6f\" (UID: \"a4ed1132-b227-4d20-9ca7-7530ff4c0163\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-jwz6f" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.425476 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7fe15835-a31c-46df-aef5-21aade83fa88-config\") pod \"authentication-operator-69f744f599-q5df2\" (UID: \"7fe15835-a31c-46df-aef5-21aade83fa88\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-q5df2" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.425507 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1b7406ed-3f1e-476e-9d81-a0612ad16002-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-sw26l\" (UID: \"1b7406ed-3f1e-476e-9d81-a0612ad16002\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sw26l" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.425536 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/02277212-4ddf-49e2-8ac0-58c20fd973a3-bound-sa-token\") pod \"ingress-operator-5b745b69d9-j5tcx\" (UID: \"02277212-4ddf-49e2-8ac0-58c20fd973a3\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j5tcx" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.425555 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: 
\"kubernetes.io/host-path/e1e19685-0a6c-47e3-b4d6-b98aa6ca63af-mountpoint-dir\") pod \"csi-hostpathplugin-2jhb9\" (UID: \"e1e19685-0a6c-47e3-b4d6-b98aa6ca63af\") " pod="hostpath-provisioner/csi-hostpathplugin-2jhb9" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.425602 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/96c61760-b182-43b4-a0d8-a461ba742b85-profile-collector-cert\") pod \"olm-operator-6b444d44fb-58dmz\" (UID: \"96c61760-b182-43b4-a0d8-a461ba742b85\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-58dmz" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.425624 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fj6lq\" (UniqueName: \"kubernetes.io/projected/02277212-4ddf-49e2-8ac0-58c20fd973a3-kube-api-access-fj6lq\") pod \"ingress-operator-5b745b69d9-j5tcx\" (UID: \"02277212-4ddf-49e2-8ac0-58c20fd973a3\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j5tcx" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.425649 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bj2s2\" (UniqueName: \"kubernetes.io/projected/6ba68eb4-959e-4a8a-a35e-f50abaef4cf9-kube-api-access-bj2s2\") pod \"machine-config-server-jtr26\" (UID: \"6ba68eb4-959e-4a8a-a35e-f50abaef4cf9\") " pod="openshift-machine-config-operator/machine-config-server-jtr26" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.425676 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/65e8a28b-6bd6-4100-a3e7-80faf9aaeeef-config-volume\") pod \"collect-profiles-29411910-cc46z\" (UID: \"65e8a28b-6bd6-4100-a3e7-80faf9aaeeef\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411910-cc46z" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.425703 4696 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-882hk\" (UniqueName: \"kubernetes.io/projected/7fe15835-a31c-46df-aef5-21aade83fa88-kube-api-access-882hk\") pod \"authentication-operator-69f744f599-q5df2\" (UID: \"7fe15835-a31c-46df-aef5-21aade83fa88\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-q5df2" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.425725 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/54d79d46-f5a5-42d4-8000-61e91b72e0f7-cert\") pod \"ingress-canary-6s4j2\" (UID: \"54d79d46-f5a5-42d4-8000-61e91b72e0f7\") " pod="openshift-ingress-canary/ingress-canary-6s4j2" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.425780 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/e1e19685-0a6c-47e3-b4d6-b98aa6ca63af-plugins-dir\") pod \"csi-hostpathplugin-2jhb9\" (UID: \"e1e19685-0a6c-47e3-b4d6-b98aa6ca63af\") " pod="hostpath-provisioner/csi-hostpathplugin-2jhb9" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.425812 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bnp9j\" (UniqueName: \"kubernetes.io/projected/1198dca5-128d-4625-95e1-2d9d9a86b263-kube-api-access-bnp9j\") pod \"service-ca-9c57cc56f-6bghq\" (UID: \"1198dca5-128d-4625-95e1-2d9d9a86b263\") " pod="openshift-service-ca/service-ca-9c57cc56f-6bghq" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.425831 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/6ba68eb4-959e-4a8a-a35e-f50abaef4cf9-node-bootstrap-token\") pod \"machine-config-server-jtr26\" (UID: \"6ba68eb4-959e-4a8a-a35e-f50abaef4cf9\") " pod="openshift-machine-config-operator/machine-config-server-jtr26" Dec 02 22:44:51 crc 
kubenswrapper[4696]: I1202 22:44:51.425868 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rjkp2\" (UID: \"24488e8a-3522-4214-ab83-684d76eb1501\") " pod="openshift-image-registry/image-registry-697d97f7c8-rjkp2" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.425893 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ks74t\" (UniqueName: \"kubernetes.io/projected/65e8a28b-6bd6-4100-a3e7-80faf9aaeeef-kube-api-access-ks74t\") pod \"collect-profiles-29411910-cc46z\" (UID: \"65e8a28b-6bd6-4100-a3e7-80faf9aaeeef\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411910-cc46z" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.425919 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/772fa8b4-e1ff-40c2-aaaa-1b5c12143dac-etcd-client\") pod \"etcd-operator-b45778765-k7pzs\" (UID: \"772fa8b4-e1ff-40c2-aaaa-1b5c12143dac\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k7pzs" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.425947 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vw5c\" (UniqueName: \"kubernetes.io/projected/96c61760-b182-43b4-a0d8-a461ba742b85-kube-api-access-8vw5c\") pod \"olm-operator-6b444d44fb-58dmz\" (UID: \"96c61760-b182-43b4-a0d8-a461ba742b85\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-58dmz" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.425975 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb0f1308-0862-4ac2-b741-849ae33c2776-service-ca-bundle\") pod \"router-default-5444994796-qpwvl\" 
(UID: \"fb0f1308-0862-4ac2-b741-849ae33c2776\") " pod="openshift-ingress/router-default-5444994796-qpwvl" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.425998 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwzm8\" (UniqueName: \"kubernetes.io/projected/e1e19685-0a6c-47e3-b4d6-b98aa6ca63af-kube-api-access-qwzm8\") pod \"csi-hostpathplugin-2jhb9\" (UID: \"e1e19685-0a6c-47e3-b4d6-b98aa6ca63af\") " pod="hostpath-provisioner/csi-hostpathplugin-2jhb9" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.426028 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ps879\" (UniqueName: \"kubernetes.io/projected/3c4885ba-062b-41c3-be71-997c9781e537-kube-api-access-ps879\") pod \"openshift-controller-manager-operator-756b6f6bc6-p6njc\" (UID: \"3c4885ba-062b-41c3-be71-997c9781e537\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-p6njc" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.426053 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/772fa8b4-e1ff-40c2-aaaa-1b5c12143dac-etcd-service-ca\") pod \"etcd-operator-b45778765-k7pzs\" (UID: \"772fa8b4-e1ff-40c2-aaaa-1b5c12143dac\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k7pzs" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.426082 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qbh9c\" (UniqueName: \"kubernetes.io/projected/4feef0d1-3fad-4990-8d96-65beb52e89b3-kube-api-access-qbh9c\") pod \"machine-config-controller-84d6567774-5rtdf\" (UID: \"4feef0d1-3fad-4990-8d96-65beb52e89b3\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-5rtdf" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.426107 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-ts4lb\" (UniqueName: \"kubernetes.io/projected/54d79d46-f5a5-42d4-8000-61e91b72e0f7-kube-api-access-ts4lb\") pod \"ingress-canary-6s4j2\" (UID: \"54d79d46-f5a5-42d4-8000-61e91b72e0f7\") " pod="openshift-ingress-canary/ingress-canary-6s4j2" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.426129 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/02277212-4ddf-49e2-8ac0-58c20fd973a3-metrics-tls\") pod \"ingress-operator-5b745b69d9-j5tcx\" (UID: \"02277212-4ddf-49e2-8ac0-58c20fd973a3\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j5tcx" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.426161 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/104b39f4-a0ab-4b71-a5d9-7c2cafd48976-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-qxljp\" (UID: \"104b39f4-a0ab-4b71-a5d9-7c2cafd48976\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qxljp" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.426202 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/1198dca5-128d-4625-95e1-2d9d9a86b263-signing-key\") pod \"service-ca-9c57cc56f-6bghq\" (UID: \"1198dca5-128d-4625-95e1-2d9d9a86b263\") " pod="openshift-service-ca/service-ca-9c57cc56f-6bghq" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.426228 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/772fa8b4-e1ff-40c2-aaaa-1b5c12143dac-serving-cert\") pod \"etcd-operator-b45778765-k7pzs\" (UID: \"772fa8b4-e1ff-40c2-aaaa-1b5c12143dac\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k7pzs" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.426219 4696 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7fe15835-a31c-46df-aef5-21aade83fa88-config\") pod \"authentication-operator-69f744f599-q5df2\" (UID: \"7fe15835-a31c-46df-aef5-21aade83fa88\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-q5df2" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.426254 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tk2cv\" (UniqueName: \"kubernetes.io/projected/3a98fb36-4189-4e59-b07a-b42a8aaee322-kube-api-access-tk2cv\") pod \"packageserver-d55dfcdfc-v2rkn\" (UID: \"3a98fb36-4189-4e59-b07a-b42a8aaee322\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-v2rkn" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.426322 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9n2z\" (UniqueName: \"kubernetes.io/projected/5b22a349-1f5e-49f6-982f-e192aed0933d-kube-api-access-h9n2z\") pod \"dns-default-4jgzb\" (UID: \"5b22a349-1f5e-49f6-982f-e192aed0933d\") " pod="openshift-dns/dns-default-4jgzb" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.426355 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/f72e09f8-e3a5-4434-b1b7-3978e84c472a-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-6xbr4\" (UID: \"f72e09f8-e3a5-4434-b1b7-3978e84c472a\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-6xbr4" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.426382 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/fb0f1308-0862-4ac2-b741-849ae33c2776-default-certificate\") pod \"router-default-5444994796-qpwvl\" (UID: \"fb0f1308-0862-4ac2-b741-849ae33c2776\") " pod="openshift-ingress/router-default-5444994796-qpwvl" Dec 02 22:44:51 crc 
kubenswrapper[4696]: I1202 22:44:51.426417 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/bfd55522-63bd-40f3-a429-eb0c85fe5b9c-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-58dc7\" (UID: \"bfd55522-63bd-40f3-a429-eb0c85fe5b9c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-58dc7" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.426448 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zn2hg\" (UniqueName: \"kubernetes.io/projected/ed9ebc4a-9236-49f7-b365-79a6890e2bc8-kube-api-access-zn2hg\") pod \"migrator-59844c95c7-qghph\" (UID: \"ed9ebc4a-9236-49f7-b365-79a6890e2bc8\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qghph" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.426474 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/32ed738d-b1d3-4966-a487-1c2aa92c6f20-srv-cert\") pod \"catalog-operator-68c6474976-984hk\" (UID: \"32ed738d-b1d3-4966-a487-1c2aa92c6f20\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-984hk" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.426500 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3a98fb36-4189-4e59-b07a-b42a8aaee322-webhook-cert\") pod \"packageserver-d55dfcdfc-v2rkn\" (UID: \"3a98fb36-4189-4e59-b07a-b42a8aaee322\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-v2rkn" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.426521 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2x4dw\" (UniqueName: \"kubernetes.io/projected/fb0f1308-0862-4ac2-b741-849ae33c2776-kube-api-access-2x4dw\") pod 
\"router-default-5444994796-qpwvl\" (UID: \"fb0f1308-0862-4ac2-b741-849ae33c2776\") " pod="openshift-ingress/router-default-5444994796-qpwvl" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.426547 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7fe15835-a31c-46df-aef5-21aade83fa88-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-q5df2\" (UID: \"7fe15835-a31c-46df-aef5-21aade83fa88\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-q5df2" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.426636 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/94966c3d-5233-480c-a199-6813c47a1e04-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-6vtkj\" (UID: \"94966c3d-5233-480c-a199-6813c47a1e04\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6vtkj" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.426670 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8qs5\" (UniqueName: \"kubernetes.io/projected/a4ed1132-b227-4d20-9ca7-7530ff4c0163-kube-api-access-q8qs5\") pod \"service-ca-operator-777779d784-jwz6f\" (UID: \"a4ed1132-b227-4d20-9ca7-7530ff4c0163\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-jwz6f" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.426698 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8gkqm\" (UniqueName: \"kubernetes.io/projected/db82b3f2-cd94-42ef-a662-8a4b4c8fac85-kube-api-access-8gkqm\") pod \"marketplace-operator-79b997595-dtrjk\" (UID: \"db82b3f2-cd94-42ef-a662-8a4b4c8fac85\") " pod="openshift-marketplace/marketplace-operator-79b997595-dtrjk" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.426726 4696 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/238dedc6-7624-4ad3-9e79-4d536ac8acda-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-kgqqs\" (UID: \"238dedc6-7624-4ad3-9e79-4d536ac8acda\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kgqqs" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.426764 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fb0f1308-0862-4ac2-b741-849ae33c2776-metrics-certs\") pod \"router-default-5444994796-qpwvl\" (UID: \"fb0f1308-0862-4ac2-b741-849ae33c2776\") " pod="openshift-ingress/router-default-5444994796-qpwvl" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.426785 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7893eeb2-3d1b-4e3d-8752-438c53238019-auth-proxy-config\") pod \"machine-config-operator-74547568cd-jtktn\" (UID: \"7893eeb2-3d1b-4e3d-8752-438c53238019\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jtktn" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.426806 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/65e8a28b-6bd6-4100-a3e7-80faf9aaeeef-secret-volume\") pod \"collect-profiles-29411910-cc46z\" (UID: \"65e8a28b-6bd6-4100-a3e7-80faf9aaeeef\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411910-cc46z" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.426826 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/02277212-4ddf-49e2-8ac0-58c20fd973a3-trusted-ca\") pod \"ingress-operator-5b745b69d9-j5tcx\" (UID: 
\"02277212-4ddf-49e2-8ac0-58c20fd973a3\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j5tcx" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.426847 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55vw6\" (UniqueName: \"kubernetes.io/projected/238dedc6-7624-4ad3-9e79-4d536ac8acda-kube-api-access-55vw6\") pod \"kube-storage-version-migrator-operator-b67b599dd-kgqqs\" (UID: \"238dedc6-7624-4ad3-9e79-4d536ac8acda\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kgqqs" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.426863 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/fb0f1308-0862-4ac2-b741-849ae33c2776-stats-auth\") pod \"router-default-5444994796-qpwvl\" (UID: \"fb0f1308-0862-4ac2-b741-849ae33c2776\") " pod="openshift-ingress/router-default-5444994796-qpwvl" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.428487 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7fe15835-a31c-46df-aef5-21aade83fa88-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-q5df2\" (UID: \"7fe15835-a31c-46df-aef5-21aade83fa88\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-q5df2" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.428530 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5b22a349-1f5e-49f6-982f-e192aed0933d-config-volume\") pod \"dns-default-4jgzb\" (UID: \"5b22a349-1f5e-49f6-982f-e192aed0933d\") " pod="openshift-dns/dns-default-4jgzb" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.431059 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: 
\"kubernetes.io/secret/fb0f1308-0862-4ac2-b741-849ae33c2776-stats-auth\") pod \"router-default-5444994796-qpwvl\" (UID: \"fb0f1308-0862-4ac2-b741-849ae33c2776\") " pod="openshift-ingress/router-default-5444994796-qpwvl" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.431829 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/e1e19685-0a6c-47e3-b4d6-b98aa6ca63af-mountpoint-dir\") pod \"csi-hostpathplugin-2jhb9\" (UID: \"e1e19685-0a6c-47e3-b4d6-b98aa6ca63af\") " pod="hostpath-provisioner/csi-hostpathplugin-2jhb9" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.434360 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fb0f1308-0862-4ac2-b741-849ae33c2776-metrics-certs\") pod \"router-default-5444994796-qpwvl\" (UID: \"fb0f1308-0862-4ac2-b741-849ae33c2776\") " pod="openshift-ingress/router-default-5444994796-qpwvl" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.435376 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/96c61760-b182-43b4-a0d8-a461ba742b85-profile-collector-cert\") pod \"olm-operator-6b444d44fb-58dmz\" (UID: \"96c61760-b182-43b4-a0d8-a461ba742b85\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-58dmz" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.436200 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/f72e09f8-e3a5-4434-b1b7-3978e84c472a-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-6xbr4\" (UID: \"f72e09f8-e3a5-4434-b1b7-3978e84c472a\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-6xbr4" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.436498 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: 
\"kubernetes.io/host-path/e1e19685-0a6c-47e3-b4d6-b98aa6ca63af-plugins-dir\") pod \"csi-hostpathplugin-2jhb9\" (UID: \"e1e19685-0a6c-47e3-b4d6-b98aa6ca63af\") " pod="hostpath-provisioner/csi-hostpathplugin-2jhb9" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.436632 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/238dedc6-7624-4ad3-9e79-4d536ac8acda-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-kgqqs\" (UID: \"238dedc6-7624-4ad3-9e79-4d536ac8acda\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kgqqs" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.437072 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/238dedc6-7624-4ad3-9e79-4d536ac8acda-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-kgqqs\" (UID: \"238dedc6-7624-4ad3-9e79-4d536ac8acda\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kgqqs" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.437898 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb0f1308-0862-4ac2-b741-849ae33c2776-service-ca-bundle\") pod \"router-default-5444994796-qpwvl\" (UID: \"fb0f1308-0862-4ac2-b741-849ae33c2776\") " pod="openshift-ingress/router-default-5444994796-qpwvl" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.438137 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/3a98fb36-4189-4e59-b07a-b42a8aaee322-tmpfs\") pod \"packageserver-d55dfcdfc-v2rkn\" (UID: \"3a98fb36-4189-4e59-b07a-b42a8aaee322\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-v2rkn" Dec 02 22:44:51 crc kubenswrapper[4696]: E1202 22:44:51.438674 4696 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 22:44:51.93865731 +0000 UTC m=+154.819337311 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rjkp2" (UID: "24488e8a-3522-4214-ab83-684d76eb1501") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.438704 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/65e8a28b-6bd6-4100-a3e7-80faf9aaeeef-config-volume\") pod \"collect-profiles-29411910-cc46z\" (UID: \"65e8a28b-6bd6-4100-a3e7-80faf9aaeeef\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411910-cc46z" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.438935 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b7406ed-3f1e-476e-9d81-a0612ad16002-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-sw26l\" (UID: \"1b7406ed-3f1e-476e-9d81-a0612ad16002\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sw26l" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.439274 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5b22a349-1f5e-49f6-982f-e192aed0933d-metrics-tls\") pod \"dns-default-4jgzb\" (UID: \"5b22a349-1f5e-49f6-982f-e192aed0933d\") " pod="openshift-dns/dns-default-4jgzb" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 
22:44:51.439512 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3992f4d2-df09-400c-a4a9-516a3c95514a-config\") pod \"kube-controller-manager-operator-78b949d7b-6vz6p\" (UID: \"3992f4d2-df09-400c-a4a9-516a3c95514a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6vz6p" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.439664 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e1e19685-0a6c-47e3-b4d6-b98aa6ca63af-socket-dir\") pod \"csi-hostpathplugin-2jhb9\" (UID: \"e1e19685-0a6c-47e3-b4d6-b98aa6ca63af\") " pod="hostpath-provisioner/csi-hostpathplugin-2jhb9" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.440188 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/96c61760-b182-43b4-a0d8-a461ba742b85-srv-cert\") pod \"olm-operator-6b444d44fb-58dmz\" (UID: \"96c61760-b182-43b4-a0d8-a461ba742b85\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-58dmz" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.440283 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e1e19685-0a6c-47e3-b4d6-b98aa6ca63af-registration-dir\") pod \"csi-hostpathplugin-2jhb9\" (UID: \"e1e19685-0a6c-47e3-b4d6-b98aa6ca63af\") " pod="hostpath-provisioner/csi-hostpathplugin-2jhb9" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.441702 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7893eeb2-3d1b-4e3d-8752-438c53238019-proxy-tls\") pod \"machine-config-operator-74547568cd-jtktn\" (UID: \"7893eeb2-3d1b-4e3d-8752-438c53238019\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jtktn" Dec 02 22:44:51 crc 
kubenswrapper[4696]: I1202 22:44:51.443575 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e9c0bdb8-cf1f-408c-b341-afb3fea179dd-metrics-tls\") pod \"dns-operator-744455d44c-bhp72\" (UID: \"e9c0bdb8-cf1f-408c-b341-afb3fea179dd\") " pod="openshift-dns-operator/dns-operator-744455d44c-bhp72" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.444772 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7893eeb2-3d1b-4e3d-8752-438c53238019-auth-proxy-config\") pod \"machine-config-operator-74547568cd-jtktn\" (UID: \"7893eeb2-3d1b-4e3d-8752-438c53238019\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jtktn" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.445932 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/db82b3f2-cd94-42ef-a662-8a4b4c8fac85-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-dtrjk\" (UID: \"db82b3f2-cd94-42ef-a662-8a4b4c8fac85\") " pod="openshift-marketplace/marketplace-operator-79b997595-dtrjk" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.447015 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dkb9g" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.448922 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/e1e19685-0a6c-47e3-b4d6-b98aa6ca63af-csi-data-dir\") pod \"csi-hostpathplugin-2jhb9\" (UID: \"e1e19685-0a6c-47e3-b4d6-b98aa6ca63af\") " pod="hostpath-provisioner/csi-hostpathplugin-2jhb9" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.449343 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/772fa8b4-e1ff-40c2-aaaa-1b5c12143dac-etcd-ca\") pod \"etcd-operator-b45778765-k7pzs\" (UID: \"772fa8b4-e1ff-40c2-aaaa-1b5c12143dac\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k7pzs" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.449770 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/772fa8b4-e1ff-40c2-aaaa-1b5c12143dac-etcd-service-ca\") pod \"etcd-operator-b45778765-k7pzs\" (UID: \"772fa8b4-e1ff-40c2-aaaa-1b5c12143dac\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k7pzs" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.450092 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4feef0d1-3fad-4990-8d96-65beb52e89b3-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-5rtdf\" (UID: \"4feef0d1-3fad-4990-8d96-65beb52e89b3\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-5rtdf" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.451948 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/6ba68eb4-959e-4a8a-a35e-f50abaef4cf9-node-bootstrap-token\") pod 
\"machine-config-server-jtr26\" (UID: \"6ba68eb4-959e-4a8a-a35e-f50abaef4cf9\") " pod="openshift-machine-config-operator/machine-config-server-jtr26" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.452187 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/1198dca5-128d-4625-95e1-2d9d9a86b263-signing-cabundle\") pod \"service-ca-9c57cc56f-6bghq\" (UID: \"1198dca5-128d-4625-95e1-2d9d9a86b263\") " pod="openshift-service-ca/service-ca-9c57cc56f-6bghq" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.452540 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3c4885ba-062b-41c3-be71-997c9781e537-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-p6njc\" (UID: \"3c4885ba-062b-41c3-be71-997c9781e537\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-p6njc" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.452912 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7fe15835-a31c-46df-aef5-21aade83fa88-serving-cert\") pod \"authentication-operator-69f744f599-q5df2\" (UID: \"7fe15835-a31c-46df-aef5-21aade83fa88\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-q5df2" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.453069 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/fb0f1308-0862-4ac2-b741-849ae33c2776-default-certificate\") pod \"router-default-5444994796-qpwvl\" (UID: \"fb0f1308-0862-4ac2-b741-849ae33c2776\") " pod="openshift-ingress/router-default-5444994796-qpwvl" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.453094 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/3c4885ba-062b-41c3-be71-997c9781e537-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-p6njc\" (UID: \"3c4885ba-062b-41c3-be71-997c9781e537\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-p6njc" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.453247 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/54d79d46-f5a5-42d4-8000-61e91b72e0f7-cert\") pod \"ingress-canary-6s4j2\" (UID: \"54d79d46-f5a5-42d4-8000-61e91b72e0f7\") " pod="openshift-ingress-canary/ingress-canary-6s4j2" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.453724 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a4ed1132-b227-4d20-9ca7-7530ff4c0163-serving-cert\") pod \"service-ca-operator-777779d784-jwz6f\" (UID: \"a4ed1132-b227-4d20-9ca7-7530ff4c0163\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-jwz6f" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.453843 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3a98fb36-4189-4e59-b07a-b42a8aaee322-webhook-cert\") pod \"packageserver-d55dfcdfc-v2rkn\" (UID: \"3a98fb36-4189-4e59-b07a-b42a8aaee322\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-v2rkn" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.453855 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/104b39f4-a0ab-4b71-a5d9-7c2cafd48976-config\") pod \"kube-apiserver-operator-766d6c64bb-qxljp\" (UID: \"104b39f4-a0ab-4b71-a5d9-7c2cafd48976\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qxljp" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.453990 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4ed1132-b227-4d20-9ca7-7530ff4c0163-config\") pod \"service-ca-operator-777779d784-jwz6f\" (UID: \"a4ed1132-b227-4d20-9ca7-7530ff4c0163\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-jwz6f" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.454240 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/104b39f4-a0ab-4b71-a5d9-7c2cafd48976-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-qxljp\" (UID: \"104b39f4-a0ab-4b71-a5d9-7c2cafd48976\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qxljp" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.454621 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/db82b3f2-cd94-42ef-a662-8a4b4c8fac85-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-dtrjk\" (UID: \"db82b3f2-cd94-42ef-a662-8a4b4c8fac85\") " pod="openshift-marketplace/marketplace-operator-79b997595-dtrjk" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.454941 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/772fa8b4-e1ff-40c2-aaaa-1b5c12143dac-etcd-client\") pod \"etcd-operator-b45778765-k7pzs\" (UID: \"772fa8b4-e1ff-40c2-aaaa-1b5c12143dac\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k7pzs" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.455139 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/32ed738d-b1d3-4966-a487-1c2aa92c6f20-srv-cert\") pod \"catalog-operator-68c6474976-984hk\" (UID: \"32ed738d-b1d3-4966-a487-1c2aa92c6f20\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-984hk" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.455297 4696 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/65e8a28b-6bd6-4100-a3e7-80faf9aaeeef-secret-volume\") pod \"collect-profiles-29411910-cc46z\" (UID: \"65e8a28b-6bd6-4100-a3e7-80faf9aaeeef\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411910-cc46z" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.455302 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1b7406ed-3f1e-476e-9d81-a0612ad16002-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-sw26l\" (UID: \"1b7406ed-3f1e-476e-9d81-a0612ad16002\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sw26l" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.455374 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3a98fb36-4189-4e59-b07a-b42a8aaee322-apiservice-cert\") pod \"packageserver-d55dfcdfc-v2rkn\" (UID: \"3a98fb36-4189-4e59-b07a-b42a8aaee322\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-v2rkn" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.456369 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/94966c3d-5233-480c-a199-6813c47a1e04-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-6vtkj\" (UID: \"94966c3d-5233-480c-a199-6813c47a1e04\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6vtkj" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.456683 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/772fa8b4-e1ff-40c2-aaaa-1b5c12143dac-config\") pod \"etcd-operator-b45778765-k7pzs\" (UID: \"772fa8b4-e1ff-40c2-aaaa-1b5c12143dac\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-k7pzs" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.456789 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/7893eeb2-3d1b-4e3d-8752-438c53238019-images\") pod \"machine-config-operator-74547568cd-jtktn\" (UID: \"7893eeb2-3d1b-4e3d-8752-438c53238019\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jtktn" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.457445 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4feef0d1-3fad-4990-8d96-65beb52e89b3-proxy-tls\") pod \"machine-config-controller-84d6567774-5rtdf\" (UID: \"4feef0d1-3fad-4990-8d96-65beb52e89b3\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-5rtdf" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.457530 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/32ed738d-b1d3-4966-a487-1c2aa92c6f20-profile-collector-cert\") pod \"catalog-operator-68c6474976-984hk\" (UID: \"32ed738d-b1d3-4966-a487-1c2aa92c6f20\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-984hk" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.459990 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/bfd55522-63bd-40f3-a429-eb0c85fe5b9c-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-58dc7\" (UID: \"bfd55522-63bd-40f3-a429-eb0c85fe5b9c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-58dc7" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.460514 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/772fa8b4-e1ff-40c2-aaaa-1b5c12143dac-serving-cert\") pod \"etcd-operator-b45778765-k7pzs\" (UID: \"772fa8b4-e1ff-40c2-aaaa-1b5c12143dac\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k7pzs" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.463445 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/1198dca5-128d-4625-95e1-2d9d9a86b263-signing-key\") pod \"service-ca-9c57cc56f-6bghq\" (UID: \"1198dca5-128d-4625-95e1-2d9d9a86b263\") " pod="openshift-service-ca/service-ca-9c57cc56f-6bghq" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.473803 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3992f4d2-df09-400c-a4a9-516a3c95514a-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-6vz6p\" (UID: \"3992f4d2-df09-400c-a4a9-516a3c95514a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6vz6p" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.505074 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/02277212-4ddf-49e2-8ac0-58c20fd973a3-trusted-ca\") pod \"ingress-operator-5b745b69d9-j5tcx\" (UID: \"02277212-4ddf-49e2-8ac0-58c20fd973a3\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j5tcx" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.508646 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/02277212-4ddf-49e2-8ac0-58c20fd973a3-metrics-tls\") pod \"ingress-operator-5b745b69d9-j5tcx\" (UID: \"02277212-4ddf-49e2-8ac0-58c20fd973a3\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j5tcx" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.514365 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" 
(UniqueName: \"kubernetes.io/secret/6ba68eb4-959e-4a8a-a35e-f50abaef4cf9-certs\") pod \"machine-config-server-jtr26\" (UID: \"6ba68eb4-959e-4a8a-a35e-f50abaef4cf9\") " pod="openshift-machine-config-operator/machine-config-server-jtr26" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.522683 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3992f4d2-df09-400c-a4a9-516a3c95514a-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-6vz6p\" (UID: \"3992f4d2-df09-400c-a4a9-516a3c95514a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6vz6p" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.524384 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tk2cv\" (UniqueName: \"kubernetes.io/projected/3a98fb36-4189-4e59-b07a-b42a8aaee322-kube-api-access-tk2cv\") pod \"packageserver-d55dfcdfc-v2rkn\" (UID: \"3a98fb36-4189-4e59-b07a-b42a8aaee322\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-v2rkn" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.527105 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/104b39f4-a0ab-4b71-a5d9-7c2cafd48976-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-qxljp\" (UID: \"104b39f4-a0ab-4b71-a5d9-7c2cafd48976\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qxljp" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.527648 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 22:44:51 crc kubenswrapper[4696]: E1202 
22:44:51.528037 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 22:44:52.027994893 +0000 UTC m=+154.908674894 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.529924 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rjkp2\" (UID: \"24488e8a-3522-4214-ab83-684d76eb1501\") " pod="openshift-image-registry/image-registry-697d97f7c8-rjkp2" Dec 02 22:44:51 crc kubenswrapper[4696]: E1202 22:44:51.530318 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 22:44:52.030303621 +0000 UTC m=+154.910983622 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rjkp2" (UID: "24488e8a-3522-4214-ab83-684d76eb1501") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.547330 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8qs5\" (UniqueName: \"kubernetes.io/projected/a4ed1132-b227-4d20-9ca7-7530ff4c0163-kube-api-access-q8qs5\") pod \"service-ca-operator-777779d784-jwz6f\" (UID: \"a4ed1132-b227-4d20-9ca7-7530ff4c0163\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-jwz6f" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.572166 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6vz6p" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.573872 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8gkqm\" (UniqueName: \"kubernetes.io/projected/db82b3f2-cd94-42ef-a662-8a4b4c8fac85-kube-api-access-8gkqm\") pod \"marketplace-operator-79b997595-dtrjk\" (UID: \"db82b3f2-cd94-42ef-a662-8a4b4c8fac85\") " pod="openshift-marketplace/marketplace-operator-79b997595-dtrjk" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.584333 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ks74t\" (UniqueName: \"kubernetes.io/projected/65e8a28b-6bd6-4100-a3e7-80faf9aaeeef-kube-api-access-ks74t\") pod \"collect-profiles-29411910-cc46z\" (UID: \"65e8a28b-6bd6-4100-a3e7-80faf9aaeeef\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411910-cc46z" Dec 02 22:44:51 crc 
kubenswrapper[4696]: I1202 22:44:51.609950 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/02277212-4ddf-49e2-8ac0-58c20fd973a3-bound-sa-token\") pod \"ingress-operator-5b745b69d9-j5tcx\" (UID: \"02277212-4ddf-49e2-8ac0-58c20fd973a3\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j5tcx" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.624826 4696 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-v6k2l container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.10:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.624894 4696 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-v6k2l" podUID="93e854b6-0bab-4aa3-9d60-97542cd304eb" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.10:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.630545 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qxljp" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.631202 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 22:44:51 crc kubenswrapper[4696]: E1202 22:44:51.631710 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 22:44:52.13169104 +0000 UTC m=+155.012371041 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.657482 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zn2hg\" (UniqueName: \"kubernetes.io/projected/ed9ebc4a-9236-49f7-b365-79a6890e2bc8-kube-api-access-zn2hg\") pod \"migrator-59844c95c7-qghph\" (UID: \"ed9ebc4a-9236-49f7-b365-79a6890e2bc8\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qghph" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.662552 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9n2z\" (UniqueName: 
\"kubernetes.io/projected/5b22a349-1f5e-49f6-982f-e192aed0933d-kube-api-access-h9n2z\") pod \"dns-default-4jgzb\" (UID: \"5b22a349-1f5e-49f6-982f-e192aed0933d\") " pod="openshift-dns/dns-default-4jgzb" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.678399 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnp9j\" (UniqueName: \"kubernetes.io/projected/1198dca5-128d-4625-95e1-2d9d9a86b263-kube-api-access-bnp9j\") pod \"service-ca-9c57cc56f-6bghq\" (UID: \"1198dca5-128d-4625-95e1-2d9d9a86b263\") " pod="openshift-service-ca/service-ca-9c57cc56f-6bghq" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.693093 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-6bghq" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.701820 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-v2rkn" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.709502 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwzm8\" (UniqueName: \"kubernetes.io/projected/e1e19685-0a6c-47e3-b4d6-b98aa6ca63af-kube-api-access-qwzm8\") pod \"csi-hostpathplugin-2jhb9\" (UID: \"e1e19685-0a6c-47e3-b4d6-b98aa6ca63af\") " pod="hostpath-provisioner/csi-hostpathplugin-2jhb9" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.712424 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbh9c\" (UniqueName: \"kubernetes.io/projected/4feef0d1-3fad-4990-8d96-65beb52e89b3-kube-api-access-qbh9c\") pod \"machine-config-controller-84d6567774-5rtdf\" (UID: \"4feef0d1-3fad-4990-8d96-65beb52e89b3\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-5rtdf" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.722085 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-apiserver/apiserver-76f77b778f-8fxbs" event={"ID":"092e9b7e-6772-4cde-89b7-de81ae50222e","Type":"ContainerStarted","Data":"acb5b3d7831560d5d15e9a7914e1b5703bf67f86842ddb708ea1a38a7889a10f"} Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.730507 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-jwz6f" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.733144 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rjkp2\" (UID: \"24488e8a-3522-4214-ab83-684d76eb1501\") " pod="openshift-image-registry/image-registry-697d97f7c8-rjkp2" Dec 02 22:44:51 crc kubenswrapper[4696]: E1202 22:44:51.733536 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 22:44:52.233521881 +0000 UTC m=+155.114201882 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rjkp2" (UID: "24488e8a-3522-4214-ab83-684d76eb1501") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.741952 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ph95v" event={"ID":"2c9a9ec1-4fa9-4d41-84e2-08a822f546c2","Type":"ContainerStarted","Data":"690c855a032ba55ea7b7e1f0d54f787b71f42f38d5b836500381207da8fbf237"} Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.742685 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-dtrjk" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.743099 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ph95v" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.755938 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vw5c\" (UniqueName: \"kubernetes.io/projected/96c61760-b182-43b4-a0d8-a461ba742b85-kube-api-access-8vw5c\") pod \"olm-operator-6b444d44fb-58dmz\" (UID: \"96c61760-b182-43b4-a0d8-a461ba742b85\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-58dmz" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.758872 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ph95v" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.763655 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-console/downloads-7954f5f757-jhcjw" event={"ID":"3f45ac7c-8865-4924-8dbd-5826a21d028e","Type":"ContainerStarted","Data":"ce1263c43d123cb1d09084ae722ae0380822e257eff19dbe063a0dfc53bf9828"} Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.766479 4696 patch_prober.go:28] interesting pod/downloads-7954f5f757-jhcjw container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.766518 4696 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-jhcjw" podUID="3f45ac7c-8865-4924-8dbd-5826a21d028e" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.767224 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ts4lb\" (UniqueName: \"kubernetes.io/projected/54d79d46-f5a5-42d4-8000-61e91b72e0f7-kube-api-access-ts4lb\") pod \"ingress-canary-6s4j2\" (UID: \"54d79d46-f5a5-42d4-8000-61e91b72e0f7\") " pod="openshift-ingress-canary/ingress-canary-6s4j2" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.780060 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411910-cc46z" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.780649 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4cx4s" event={"ID":"5f5d77f1-55f7-49d2-b5db-19d7150b882f","Type":"ContainerStarted","Data":"eef53446872d89db697ea1d91b6a5d9ad938d27ea453282dc6a0db05b0b4a790"} Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.780931 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-5rtdf" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.784937 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bj2s2\" (UniqueName: \"kubernetes.io/projected/6ba68eb4-959e-4a8a-a35e-f50abaef4cf9-kube-api-access-bj2s2\") pod \"machine-config-server-jtr26\" (UID: \"6ba68eb4-959e-4a8a-a35e-f50abaef4cf9\") " pod="openshift-machine-config-operator/machine-config-server-jtr26" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.786848 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fj6lq\" (UniqueName: \"kubernetes.io/projected/02277212-4ddf-49e2-8ac0-58c20fd973a3-kube-api-access-fj6lq\") pod \"ingress-operator-5b745b69d9-j5tcx\" (UID: \"02277212-4ddf-49e2-8ac0-58c20fd973a3\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j5tcx" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.792013 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-s5dx8" event={"ID":"b1ef23a3-2b42-4fa9-8311-32f77bb7c8c0","Type":"ContainerStarted","Data":"641a15b56a16b0951f2fa7dfca81107dfe5daa337490918c7d28672d5c8360d0"} Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.792052 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-s5dx8" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.798628 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49d79\" (UniqueName: \"kubernetes.io/projected/7893eeb2-3d1b-4e3d-8752-438c53238019-kube-api-access-49d79\") pod \"machine-config-operator-74547568cd-jtktn\" (UID: \"7893eeb2-3d1b-4e3d-8752-438c53238019\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jtktn" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 
22:44:51.799597 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9gpsg" event={"ID":"9e20d7fd-98b9-4f31-aa29-b8069f8d2ea3","Type":"ContainerStarted","Data":"1c31767b0cb73ecf56419aee927995cfb1c683790af30582dc3b86bffab198a6"} Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.800709 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-zpshp" event={"ID":"88fbd4d8-6770-4812-aded-c20e16d0e24b","Type":"ContainerStarted","Data":"e70f85fee221c9bd4b7ad292757d4e6573811fb2e52947ae4a0f2ff3bfa090cd"} Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.805345 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-zpshp" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.820998 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-f6wj6" event={"ID":"cab80860-b375-43ce-9df7-16ed59a8247a","Type":"ContainerStarted","Data":"f6be4a8b680a81b35c99eb6be900fc50326d748ca3b91a6439e06e2b7f8aa689"} Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.821044 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-f6wj6" event={"ID":"cab80860-b375-43ce-9df7-16ed59a8247a","Type":"ContainerStarted","Data":"7672f2c34177fee4a6f5508e91fcd14d12c64a1d94145d6f411daccf3c4cdc74"} Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.831237 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-6s4j2" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.835075 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 22:44:51 crc kubenswrapper[4696]: E1202 22:44:51.836697 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 22:44:52.336678782 +0000 UTC m=+155.217358783 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.843521 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-882hk\" (UniqueName: \"kubernetes.io/projected/7fe15835-a31c-46df-aef5-21aade83fa88-kube-api-access-882hk\") pod \"authentication-operator-69f744f599-q5df2\" (UID: \"7fe15835-a31c-46df-aef5-21aade83fa88\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-q5df2" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.845683 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-2jhb9" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.847084 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pstpv" event={"ID":"e7122fdb-f764-4228-9a2e-2c3aedd5b4fb","Type":"ContainerStarted","Data":"114ccb7472aabcf416c8334aa4d7926f730667a4a135bf96c8b99f02c11e8c67"} Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.847117 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pstpv" event={"ID":"e7122fdb-f764-4228-9a2e-2c3aedd5b4fb","Type":"ContainerStarted","Data":"36b11f0fded4fbc381abc960ad371dae5759d7819953a97c1c5c3b1acee8ccdf"} Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.851849 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-4jgzb" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.854839 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2nqs9\" (UniqueName: \"kubernetes.io/projected/f72e09f8-e3a5-4434-b1b7-3978e84c472a-kube-api-access-2nqs9\") pod \"multus-admission-controller-857f4d67dd-6xbr4\" (UID: \"f72e09f8-e3a5-4434-b1b7-3978e84c472a\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-6xbr4" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.871761 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ps879\" (UniqueName: \"kubernetes.io/projected/3c4885ba-062b-41c3-be71-997c9781e537-kube-api-access-ps879\") pod \"openshift-controller-manager-operator-756b6f6bc6-p6njc\" (UID: \"3c4885ba-062b-41c3-be71-997c9781e537\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-p6njc" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.887554 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-6xbr4" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.902322 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j5tcx" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.907552 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55vw6\" (UniqueName: \"kubernetes.io/projected/238dedc6-7624-4ad3-9e79-4d536ac8acda-kube-api-access-55vw6\") pod \"kube-storage-version-migrator-operator-b67b599dd-kgqqs\" (UID: \"238dedc6-7624-4ad3-9e79-4d536ac8acda\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kgqqs" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.923061 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kgqqs" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.939709 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rjkp2\" (UID: \"24488e8a-3522-4214-ab83-684d76eb1501\") " pod="openshift-image-registry/image-registry-697d97f7c8-rjkp2" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.940502 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkksh\" (UniqueName: \"kubernetes.io/projected/32ed738d-b1d3-4966-a487-1c2aa92c6f20-kube-api-access-dkksh\") pod \"catalog-operator-68c6474976-984hk\" (UID: \"32ed738d-b1d3-4966-a487-1c2aa92c6f20\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-984hk" Dec 02 22:44:51 crc kubenswrapper[4696]: E1202 22:44:51.941437 4696 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 22:44:52.44142271 +0000 UTC m=+155.322102711 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rjkp2" (UID: "24488e8a-3522-4214-ab83-684d76eb1501") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.941434 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8mtg\" (UniqueName: \"kubernetes.io/projected/e9c0bdb8-cf1f-408c-b341-afb3fea179dd-kube-api-access-n8mtg\") pod \"dns-operator-744455d44c-bhp72\" (UID: \"e9c0bdb8-cf1f-408c-b341-afb3fea179dd\") " pod="openshift-dns-operator/dns-operator-744455d44c-bhp72" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.946602 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-984hk" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.958100 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qghph" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.959374 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2x4dw\" (UniqueName: \"kubernetes.io/projected/fb0f1308-0862-4ac2-b741-849ae33c2776-kube-api-access-2x4dw\") pod \"router-default-5444994796-qpwvl\" (UID: \"fb0f1308-0862-4ac2-b741-849ae33c2776\") " pod="openshift-ingress/router-default-5444994796-qpwvl" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.959552 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jtktn" Dec 02 22:44:51 crc kubenswrapper[4696]: I1202 22:44:51.969459 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dkb9g"] Dec 02 22:44:52 crc kubenswrapper[4696]: I1202 22:44:52.006240 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6vz6p"] Dec 02 22:44:52 crc kubenswrapper[4696]: I1202 22:44:52.021316 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-58dmz" Dec 02 22:44:52 crc kubenswrapper[4696]: I1202 22:44:52.023554 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2s5q\" (UniqueName: \"kubernetes.io/projected/772fa8b4-e1ff-40c2-aaaa-1b5c12143dac-kube-api-access-c2s5q\") pod \"etcd-operator-b45778765-k7pzs\" (UID: \"772fa8b4-e1ff-40c2-aaaa-1b5c12143dac\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k7pzs" Dec 02 22:44:52 crc kubenswrapper[4696]: I1202 22:44:52.025182 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9prf\" (UniqueName: \"kubernetes.io/projected/bfd55522-63bd-40f3-a429-eb0c85fe5b9c-kube-api-access-k9prf\") pod \"control-plane-machine-set-operator-78cbb6b69f-58dc7\" (UID: \"bfd55522-63bd-40f3-a429-eb0c85fe5b9c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-58dc7" Dec 02 22:44:52 crc kubenswrapper[4696]: I1202 22:44:52.033865 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1b7406ed-3f1e-476e-9d81-a0612ad16002-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-sw26l\" (UID: \"1b7406ed-3f1e-476e-9d81-a0612ad16002\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sw26l" Dec 02 22:44:52 crc kubenswrapper[4696]: I1202 22:44:52.040185 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnx6t\" (UniqueName: \"kubernetes.io/projected/94966c3d-5233-480c-a199-6813c47a1e04-kube-api-access-mnx6t\") pod \"package-server-manager-789f6589d5-6vtkj\" (UID: \"94966c3d-5233-480c-a199-6813c47a1e04\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6vtkj" Dec 02 22:44:52 crc kubenswrapper[4696]: I1202 22:44:52.041105 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 22:44:52 crc kubenswrapper[4696]: E1202 22:44:52.041560 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 22:44:52.541534931 +0000 UTC m=+155.422214932 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:52 crc kubenswrapper[4696]: I1202 22:44:52.085565 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-jtr26" Dec 02 22:44:52 crc kubenswrapper[4696]: I1202 22:44:52.129489 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-q5df2" Dec 02 22:44:52 crc kubenswrapper[4696]: I1202 22:44:52.144331 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-p6njc" Dec 02 22:44:52 crc kubenswrapper[4696]: I1202 22:44:52.146128 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rjkp2\" (UID: \"24488e8a-3522-4214-ab83-684d76eb1501\") " pod="openshift-image-registry/image-registry-697d97f7c8-rjkp2" Dec 02 22:44:52 crc kubenswrapper[4696]: E1202 22:44:52.146798 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 22:44:52.646771854 +0000 UTC m=+155.527451855 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rjkp2" (UID: "24488e8a-3522-4214-ab83-684d76eb1501") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:52 crc kubenswrapper[4696]: I1202 22:44:52.165708 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-qpwvl" Dec 02 22:44:52 crc kubenswrapper[4696]: I1202 22:44:52.180366 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-k7pzs" Dec 02 22:44:52 crc kubenswrapper[4696]: I1202 22:44:52.196198 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-v6k2l" Dec 02 22:44:52 crc kubenswrapper[4696]: I1202 22:44:52.217247 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-bhp72" Dec 02 22:44:52 crc kubenswrapper[4696]: I1202 22:44:52.237021 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-58dc7" Dec 02 22:44:52 crc kubenswrapper[4696]: I1202 22:44:52.256974 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 22:44:52 crc kubenswrapper[4696]: E1202 22:44:52.257452 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 22:44:52.757431267 +0000 UTC m=+155.638111268 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:52 crc kubenswrapper[4696]: I1202 22:44:52.257625 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sw26l" Dec 02 22:44:52 crc kubenswrapper[4696]: I1202 22:44:52.260258 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qxljp"] Dec 02 22:44:52 crc kubenswrapper[4696]: I1202 22:44:52.277135 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6vtkj" Dec 02 22:44:52 crc kubenswrapper[4696]: I1202 22:44:52.363888 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rjkp2\" (UID: \"24488e8a-3522-4214-ab83-684d76eb1501\") " pod="openshift-image-registry/image-registry-697d97f7c8-rjkp2" Dec 02 22:44:52 crc kubenswrapper[4696]: E1202 22:44:52.364728 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 22:44:52.86470242 +0000 UTC m=+155.745382421 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rjkp2" (UID: "24488e8a-3522-4214-ab83-684d76eb1501") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:52 crc kubenswrapper[4696]: I1202 22:44:52.394310 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ph95v" podStartSLOduration=132.394275034 podStartE2EDuration="2m12.394275034s" podCreationTimestamp="2025-12-02 22:42:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 22:44:52.355516218 +0000 UTC m=+155.236196219" watchObservedRunningTime="2025-12-02 22:44:52.394275034 +0000 UTC m=+155.274955035" Dec 02 22:44:52 crc kubenswrapper[4696]: I1202 22:44:52.395255 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-6bghq"] Dec 02 22:44:52 crc kubenswrapper[4696]: I1202 22:44:52.463768 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vt2b7" podStartSLOduration=133.463723478 podStartE2EDuration="2m13.463723478s" podCreationTimestamp="2025-12-02 22:42:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 22:44:52.462351028 +0000 UTC m=+155.343031039" watchObservedRunningTime="2025-12-02 22:44:52.463723478 +0000 UTC m=+155.344403479" Dec 02 22:44:52 crc kubenswrapper[4696]: I1202 22:44:52.465572 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 22:44:52 crc kubenswrapper[4696]: E1202 22:44:52.466038 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 22:44:52.965994796 +0000 UTC m=+155.846674797 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:52 crc kubenswrapper[4696]: I1202 22:44:52.498428 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-dtrjk"] Dec 02 22:44:52 crc kubenswrapper[4696]: I1202 22:44:52.527688 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-jwz6f"] Dec 02 22:44:52 crc kubenswrapper[4696]: I1202 22:44:52.563325 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-v2rkn"] Dec 02 22:44:52 crc kubenswrapper[4696]: I1202 22:44:52.567342 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-rjkp2\" (UID: \"24488e8a-3522-4214-ab83-684d76eb1501\") " pod="openshift-image-registry/image-registry-697d97f7c8-rjkp2" Dec 02 22:44:52 crc kubenswrapper[4696]: E1202 22:44:52.567691 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 22:44:53.067678153 +0000 UTC m=+155.948358154 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rjkp2" (UID: "24488e8a-3522-4214-ab83-684d76eb1501") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:52 crc kubenswrapper[4696]: I1202 22:44:52.661171 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" Dec 02 22:44:52 crc kubenswrapper[4696]: I1202 22:44:52.669561 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 22:44:52 crc kubenswrapper[4696]: E1202 22:44:52.669989 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 22:44:53.169971969 +0000 UTC m=+156.050651970 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:52 crc kubenswrapper[4696]: I1202 22:44:52.694679 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411910-cc46z"] Dec 02 22:44:52 crc kubenswrapper[4696]: I1202 22:44:52.765242 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-s5dx8" podStartSLOduration=133.765218616 podStartE2EDuration="2m13.765218616s" podCreationTimestamp="2025-12-02 22:42:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 22:44:52.762293479 +0000 UTC m=+155.642973480" watchObservedRunningTime="2025-12-02 22:44:52.765218616 +0000 UTC m=+155.645898617" Dec 02 22:44:52 crc kubenswrapper[4696]: I1202 22:44:52.772407 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rjkp2\" (UID: \"24488e8a-3522-4214-ab83-684d76eb1501\") " pod="openshift-image-registry/image-registry-697d97f7c8-rjkp2" Dec 02 22:44:52 crc kubenswrapper[4696]: E1202 22:44:52.774100 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-02 22:44:53.274082798 +0000 UTC m=+156.154762809 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rjkp2" (UID: "24488e8a-3522-4214-ab83-684d76eb1501") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:52 crc kubenswrapper[4696]: I1202 22:44:52.789234 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-6s4j2"] Dec 02 22:44:52 crc kubenswrapper[4696]: W1202 22:44:52.807844 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb82b3f2_cd94_42ef_a662_8a4b4c8fac85.slice/crio-12099e6fd7c79da47a5cdbe285bb618c85b43504b6c2e2eaf3bda96deae11f2e WatchSource:0}: Error finding container 12099e6fd7c79da47a5cdbe285bb618c85b43504b6c2e2eaf3bda96deae11f2e: Status 404 returned error can't find the container with id 12099e6fd7c79da47a5cdbe285bb618c85b43504b6c2e2eaf3bda96deae11f2e Dec 02 22:44:52 crc kubenswrapper[4696]: I1202 22:44:52.842895 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-f6wj6" podStartSLOduration=133.842875613 podStartE2EDuration="2m13.842875613s" podCreationTimestamp="2025-12-02 22:42:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 22:44:52.794574104 +0000 UTC m=+155.675254105" watchObservedRunningTime="2025-12-02 22:44:52.842875613 +0000 UTC m=+155.723555614" Dec 02 22:44:52 crc kubenswrapper[4696]: W1202 22:44:52.855479 4696 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a98fb36_4189_4e59_b07a_b42a8aaee322.slice/crio-e88605d5d44ca0ca3d2e352b1a2c9050f188779b32665c19c9bfb3e00beab99d WatchSource:0}: Error finding container e88605d5d44ca0ca3d2e352b1a2c9050f188779b32665c19c9bfb3e00beab99d: Status 404 returned error can't find the container with id e88605d5d44ca0ca3d2e352b1a2c9050f188779b32665c19c9bfb3e00beab99d Dec 02 22:44:52 crc kubenswrapper[4696]: I1202 22:44:52.876727 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 22:44:52 crc kubenswrapper[4696]: E1202 22:44:52.877242 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 22:44:53.377225369 +0000 UTC m=+156.257905370 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:52 crc kubenswrapper[4696]: I1202 22:44:52.889752 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-p44hz" podStartSLOduration=132.889715018 podStartE2EDuration="2m12.889715018s" podCreationTimestamp="2025-12-02 22:42:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 22:44:52.843495281 +0000 UTC m=+155.724175282" watchObservedRunningTime="2025-12-02 22:44:52.889715018 +0000 UTC m=+155.770395019" Dec 02 22:44:52 crc kubenswrapper[4696]: I1202 22:44:52.890927 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pstpv" podStartSLOduration=133.890919064 podStartE2EDuration="2m13.890919064s" podCreationTimestamp="2025-12-02 22:42:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 22:44:52.889218943 +0000 UTC m=+155.769898944" watchObservedRunningTime="2025-12-02 22:44:52.890919064 +0000 UTC m=+155.771599065" Dec 02 22:44:52 crc kubenswrapper[4696]: I1202 22:44:52.919916 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qxljp" 
event={"ID":"104b39f4-a0ab-4b71-a5d9-7c2cafd48976","Type":"ContainerStarted","Data":"be4cde4138a23a808bf8aa16994d4508d276b3491c47d8ed3a07acd3b81a363b"} Dec 02 22:44:52 crc kubenswrapper[4696]: I1202 22:44:52.940932 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dkb9g" event={"ID":"6c10678a-82ec-492c-b8c9-e1689fb7c63b","Type":"ContainerStarted","Data":"f7e34991e8f3727bae63e659d7cd09933d50575b2ab5e5a6004f278dc80c0a0c"} Dec 02 22:44:52 crc kubenswrapper[4696]: I1202 22:44:52.940979 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dkb9g" event={"ID":"6c10678a-82ec-492c-b8c9-e1689fb7c63b","Type":"ContainerStarted","Data":"447814b0d49e9d795c2d96abc5f7debf8d1618d7fdb6dd962eb85bc7a239399a"} Dec 02 22:44:52 crc kubenswrapper[4696]: I1202 22:44:52.967012 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-j5tcx"] Dec 02 22:44:52 crc kubenswrapper[4696]: I1202 22:44:52.971016 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6vz6p" event={"ID":"3992f4d2-df09-400c-a4a9-516a3c95514a","Type":"ContainerStarted","Data":"bfd2f7eeabd5e0304a19a93368c5347f077b5d6f90cd6a9a341cb8dba2e27385"} Dec 02 22:44:52 crc kubenswrapper[4696]: I1202 22:44:52.971595 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6vz6p" event={"ID":"3992f4d2-df09-400c-a4a9-516a3c95514a","Type":"ContainerStarted","Data":"1e6cd745d30921a100c5d18ec7e26baef6e7b159ddbf89d9ee5bbd2ff48ff8f4"} Dec 02 22:44:52 crc kubenswrapper[4696]: I1202 22:44:52.974483 4696 patch_prober.go:28] interesting pod/machine-config-daemon-chq65 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 22:44:52 crc kubenswrapper[4696]: I1202 22:44:52.974796 4696 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 22:44:52 crc kubenswrapper[4696]: I1202 22:44:52.992967 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rjkp2\" (UID: \"24488e8a-3522-4214-ab83-684d76eb1501\") " pod="openshift-image-registry/image-registry-697d97f7c8-rjkp2" Dec 02 22:44:52 crc kubenswrapper[4696]: E1202 22:44:52.994555 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 22:44:53.494540878 +0000 UTC m=+156.375220879 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rjkp2" (UID: "24488e8a-3522-4214-ab83-684d76eb1501") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:52 crc kubenswrapper[4696]: I1202 22:44:52.997119 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-jwz6f" event={"ID":"a4ed1132-b227-4d20-9ca7-7530ff4c0163","Type":"ContainerStarted","Data":"bc992609be63eb01c28b55892bb7cd86eb2103b469e802346b4be382f3d45f9c"} Dec 02 22:44:52 crc kubenswrapper[4696]: I1202 22:44:52.998239 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-6bghq" event={"ID":"1198dca5-128d-4625-95e1-2d9d9a86b263","Type":"ContainerStarted","Data":"6200e2d33492fd69234cb56dc189a50f3b5b8a8fe8b3f7748b572ed4ebf096a8"} Dec 02 22:44:52 crc kubenswrapper[4696]: I1202 22:44:52.998981 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-dtrjk" event={"ID":"db82b3f2-cd94-42ef-a662-8a4b4c8fac85","Type":"ContainerStarted","Data":"12099e6fd7c79da47a5cdbe285bb618c85b43504b6c2e2eaf3bda96deae11f2e"} Dec 02 22:44:53 crc kubenswrapper[4696]: I1202 22:44:53.027967 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-8fxbs" event={"ID":"092e9b7e-6772-4cde-89b7-de81ae50222e","Type":"ContainerStarted","Data":"0c8e03d3a1974cf47f08418778bd87885fa2e266d83fac913affad078545f3e7"} Dec 02 22:44:53 crc kubenswrapper[4696]: I1202 22:44:53.050453 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4cx4s" 
event={"ID":"5f5d77f1-55f7-49d2-b5db-19d7150b882f","Type":"ContainerStarted","Data":"d0a63f558f2f26b988b8502e516912e5d629b1162637292274c88fa00c2d7bce"} Dec 02 22:44:53 crc kubenswrapper[4696]: I1202 22:44:53.052780 4696 patch_prober.go:28] interesting pod/downloads-7954f5f757-jhcjw container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Dec 02 22:44:53 crc kubenswrapper[4696]: I1202 22:44:53.052984 4696 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-jhcjw" podUID="3f45ac7c-8865-4924-8dbd-5826a21d028e" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Dec 02 22:44:53 crc kubenswrapper[4696]: I1202 22:44:53.095848 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 22:44:53 crc kubenswrapper[4696]: E1202 22:44:53.097449 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 22:44:53.597432422 +0000 UTC m=+156.478112423 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:53 crc kubenswrapper[4696]: I1202 22:44:53.178697 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-rttcw" podStartSLOduration=133.178676054 podStartE2EDuration="2m13.178676054s" podCreationTimestamp="2025-12-02 22:42:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 22:44:53.138120585 +0000 UTC m=+156.018800586" watchObservedRunningTime="2025-12-02 22:44:53.178676054 +0000 UTC m=+156.059356055" Dec 02 22:44:53 crc kubenswrapper[4696]: I1202 22:44:53.197630 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rjkp2\" (UID: \"24488e8a-3522-4214-ab83-684d76eb1501\") " pod="openshift-image-registry/image-registry-697d97f7c8-rjkp2" Dec 02 22:44:53 crc kubenswrapper[4696]: E1202 22:44:53.202659 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 22:44:53.702643083 +0000 UTC m=+156.583323084 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rjkp2" (UID: "24488e8a-3522-4214-ab83-684d76eb1501") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:53 crc kubenswrapper[4696]: I1202 22:44:53.232069 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-jhcjw" podStartSLOduration=134.232045683 podStartE2EDuration="2m14.232045683s" podCreationTimestamp="2025-12-02 22:42:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 22:44:53.181492728 +0000 UTC m=+156.062172749" watchObservedRunningTime="2025-12-02 22:44:53.232045683 +0000 UTC m=+156.112725684" Dec 02 22:44:53 crc kubenswrapper[4696]: I1202 22:44:53.304616 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 22:44:53 crc kubenswrapper[4696]: E1202 22:44:53.304954 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 22:44:53.804919008 +0000 UTC m=+156.685599009 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:53 crc kubenswrapper[4696]: I1202 22:44:53.308605 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rjkp2\" (UID: \"24488e8a-3522-4214-ab83-684d76eb1501\") " pod="openshift-image-registry/image-registry-697d97f7c8-rjkp2" Dec 02 22:44:53 crc kubenswrapper[4696]: E1202 22:44:53.309013 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 22:44:53.808999949 +0000 UTC m=+156.689679950 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rjkp2" (UID: "24488e8a-3522-4214-ab83-684d76eb1501") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:53 crc kubenswrapper[4696]: I1202 22:44:53.414525 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 22:44:53 crc kubenswrapper[4696]: E1202 22:44:53.415138 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 22:44:53.915109058 +0000 UTC m=+156.795789059 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:53 crc kubenswrapper[4696]: I1202 22:44:53.415270 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rjkp2\" (UID: \"24488e8a-3522-4214-ab83-684d76eb1501\") " pod="openshift-image-registry/image-registry-697d97f7c8-rjkp2" Dec 02 22:44:53 crc kubenswrapper[4696]: E1202 22:44:53.415715 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 22:44:53.915698675 +0000 UTC m=+156.796378686 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rjkp2" (UID: "24488e8a-3522-4214-ab83-684d76eb1501") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:53 crc kubenswrapper[4696]: I1202 22:44:53.516518 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 22:44:53 crc kubenswrapper[4696]: E1202 22:44:53.516806 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 22:44:54.016789775 +0000 UTC m=+156.897469776 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:53 crc kubenswrapper[4696]: I1202 22:44:53.591114 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-v6k2l" podStartSLOduration=134.591094753 podStartE2EDuration="2m14.591094753s" podCreationTimestamp="2025-12-02 22:42:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 22:44:53.578900712 +0000 UTC m=+156.459580713" watchObservedRunningTime="2025-12-02 22:44:53.591094753 +0000 UTC m=+156.471774754" Dec 02 22:44:53 crc kubenswrapper[4696]: I1202 22:44:53.617920 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rjkp2\" (UID: \"24488e8a-3522-4214-ab83-684d76eb1501\") " pod="openshift-image-registry/image-registry-697d97f7c8-rjkp2" Dec 02 22:44:53 crc kubenswrapper[4696]: E1202 22:44:53.618326 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 22:44:54.118307347 +0000 UTC m=+156.998987348 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rjkp2" (UID: "24488e8a-3522-4214-ab83-684d76eb1501") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:53 crc kubenswrapper[4696]: I1202 22:44:53.671807 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-zpshp" podStartSLOduration=134.662726111 podStartE2EDuration="2m14.662726111s" podCreationTimestamp="2025-12-02 22:42:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 22:44:53.658245769 +0000 UTC m=+156.538925790" watchObservedRunningTime="2025-12-02 22:44:53.662726111 +0000 UTC m=+156.543406112" Dec 02 22:44:53 crc kubenswrapper[4696]: I1202 22:44:53.718820 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 22:44:53 crc kubenswrapper[4696]: E1202 22:44:53.719054 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 22:44:54.219027096 +0000 UTC m=+157.099707097 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:53 crc kubenswrapper[4696]: E1202 22:44:53.723166 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 22:44:54.223111837 +0000 UTC m=+157.103791838 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rjkp2" (UID: "24488e8a-3522-4214-ab83-684d76eb1501") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:53 crc kubenswrapper[4696]: I1202 22:44:53.719801 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rjkp2\" (UID: \"24488e8a-3522-4214-ab83-684d76eb1501\") " pod="openshift-image-registry/image-registry-697d97f7c8-rjkp2" Dec 02 22:44:53 crc kubenswrapper[4696]: I1202 22:44:53.750910 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-p6njc"] Dec 02 22:44:53 crc kubenswrapper[4696]: I1202 22:44:53.808805 
4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-6xbr4"] Dec 02 22:44:53 crc kubenswrapper[4696]: I1202 22:44:53.824582 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sw26l"] Dec 02 22:44:53 crc kubenswrapper[4696]: I1202 22:44:53.836039 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 22:44:53 crc kubenswrapper[4696]: E1202 22:44:53.836399 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 22:44:54.336375547 +0000 UTC m=+157.217055548 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:53 crc kubenswrapper[4696]: I1202 22:44:53.904823 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6vz6p" podStartSLOduration=133.90479308 podStartE2EDuration="2m13.90479308s" podCreationTimestamp="2025-12-02 22:42:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 22:44:53.896445003 +0000 UTC m=+156.777125004" watchObservedRunningTime="2025-12-02 22:44:53.90479308 +0000 UTC m=+156.785473081" Dec 02 22:44:53 crc kubenswrapper[4696]: I1202 22:44:53.937272 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rjkp2\" (UID: \"24488e8a-3522-4214-ab83-684d76eb1501\") " pod="openshift-image-registry/image-registry-697d97f7c8-rjkp2" Dec 02 22:44:53 crc kubenswrapper[4696]: E1202 22:44:53.937694 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 22:44:54.437678612 +0000 UTC m=+157.318358613 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rjkp2" (UID: "24488e8a-3522-4214-ab83-684d76eb1501") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:53 crc kubenswrapper[4696]: I1202 22:44:53.997734 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dkb9g" podStartSLOduration=133.997709108 podStartE2EDuration="2m13.997709108s" podCreationTimestamp="2025-12-02 22:42:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 22:44:53.951076439 +0000 UTC m=+156.831756440" watchObservedRunningTime="2025-12-02 22:44:53.997709108 +0000 UTC m=+156.878389109" Dec 02 22:44:53 crc kubenswrapper[4696]: I1202 22:44:53.999058 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9gpsg" podStartSLOduration=133.999051118 podStartE2EDuration="2m13.999051118s" podCreationTimestamp="2025-12-02 22:42:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 22:44:53.997490511 +0000 UTC m=+156.878170512" watchObservedRunningTime="2025-12-02 22:44:53.999051118 +0000 UTC m=+156.879731119" Dec 02 22:44:54 crc kubenswrapper[4696]: I1202 22:44:54.040483 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" 
(UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 22:44:54 crc kubenswrapper[4696]: E1202 22:44:54.040650 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 22:44:54.540621277 +0000 UTC m=+157.421301278 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:54 crc kubenswrapper[4696]: I1202 22:44:54.040857 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rjkp2\" (UID: \"24488e8a-3522-4214-ab83-684d76eb1501\") " pod="openshift-image-registry/image-registry-697d97f7c8-rjkp2" Dec 02 22:44:54 crc kubenswrapper[4696]: E1202 22:44:54.041293 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 22:44:54.541275216 +0000 UTC m=+157.421955217 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rjkp2" (UID: "24488e8a-3522-4214-ab83-684d76eb1501") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:54 crc kubenswrapper[4696]: I1202 22:44:54.142343 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 22:44:54 crc kubenswrapper[4696]: E1202 22:44:54.143137 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 22:44:54.643120329 +0000 UTC m=+157.523800330 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:54 crc kubenswrapper[4696]: I1202 22:44:54.177932 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-8fxbs" Dec 02 22:44:54 crc kubenswrapper[4696]: I1202 22:44:54.178320 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-8fxbs" Dec 02 22:44:54 crc kubenswrapper[4696]: I1202 22:44:54.246633 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rjkp2\" (UID: \"24488e8a-3522-4214-ab83-684d76eb1501\") " pod="openshift-image-registry/image-registry-697d97f7c8-rjkp2" Dec 02 22:44:54 crc kubenswrapper[4696]: E1202 22:44:54.247080 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 22:44:54.747066653 +0000 UTC m=+157.627746654 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rjkp2" (UID: "24488e8a-3522-4214-ab83-684d76eb1501") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:54 crc kubenswrapper[4696]: I1202 22:44:54.265876 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9gpsg" Dec 02 22:44:54 crc kubenswrapper[4696]: I1202 22:44:54.266317 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9gpsg" Dec 02 22:44:54 crc kubenswrapper[4696]: I1202 22:44:54.269924 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-6bghq" event={"ID":"1198dca5-128d-4625-95e1-2d9d9a86b263","Type":"ContainerStarted","Data":"b9e6d7be271ad393ed03fd3abd4d0d49e22d0478a05a53c81aabd5854cdef46b"} Dec 02 22:44:54 crc kubenswrapper[4696]: I1202 22:44:54.298329 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sw26l" event={"ID":"1b7406ed-3f1e-476e-9d81-a0612ad16002","Type":"ContainerStarted","Data":"6eed3a3643cc72695623696f0e978eea29c9d4468c00cd3bc206aae041fba306"} Dec 02 22:44:54 crc kubenswrapper[4696]: I1202 22:44:54.308654 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9gpsg" Dec 02 22:44:54 crc kubenswrapper[4696]: I1202 22:44:54.310652 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j5tcx" 
event={"ID":"02277212-4ddf-49e2-8ac0-58c20fd973a3","Type":"ContainerStarted","Data":"a0d51d44766741dc520fd2d8a7606b890f91b8e80dfc5e88fdb40163b287cccb"} Dec 02 22:44:54 crc kubenswrapper[4696]: I1202 22:44:54.310715 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-6bghq" podStartSLOduration=134.310684925 podStartE2EDuration="2m14.310684925s" podCreationTimestamp="2025-12-02 22:42:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 22:44:54.298286878 +0000 UTC m=+157.178966879" watchObservedRunningTime="2025-12-02 22:44:54.310684925 +0000 UTC m=+157.191364926" Dec 02 22:44:54 crc kubenswrapper[4696]: I1202 22:44:54.312764 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-8fxbs" podStartSLOduration=135.312756326 podStartE2EDuration="2m15.312756326s" podCreationTimestamp="2025-12-02 22:42:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 22:44:54.08499746 +0000 UTC m=+156.965677461" watchObservedRunningTime="2025-12-02 22:44:54.312756326 +0000 UTC m=+157.193436327" Dec 02 22:44:54 crc kubenswrapper[4696]: I1202 22:44:54.317831 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6vtkj"] Dec 02 22:44:54 crc kubenswrapper[4696]: I1202 22:44:54.328365 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-jtktn"] Dec 02 22:44:54 crc kubenswrapper[4696]: I1202 22:44:54.329906 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-dtrjk" 
event={"ID":"db82b3f2-cd94-42ef-a662-8a4b4c8fac85","Type":"ContainerStarted","Data":"d82a972304308e4f9d55201e1859c3750e48880e70bdff16ec8f4aa7e92a6812"} Dec 02 22:44:54 crc kubenswrapper[4696]: I1202 22:44:54.330477 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-dtrjk" Dec 02 22:44:54 crc kubenswrapper[4696]: I1202 22:44:54.345110 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-jtr26" event={"ID":"6ba68eb4-959e-4a8a-a35e-f50abaef4cf9","Type":"ContainerStarted","Data":"042ab73ebeb05ff4dd4e2edb5ff666626e25dae392c0954f909b73beac4c7b3c"} Dec 02 22:44:54 crc kubenswrapper[4696]: I1202 22:44:54.348418 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 22:44:54 crc kubenswrapper[4696]: E1202 22:44:54.349007 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 22:44:54.848984488 +0000 UTC m=+157.729664489 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:54 crc kubenswrapper[4696]: I1202 22:44:54.350192 4696 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-dtrjk container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.41:8080/healthz\": dial tcp 10.217.0.41:8080: connect: connection refused" start-of-body= Dec 02 22:44:54 crc kubenswrapper[4696]: I1202 22:44:54.350229 4696 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-dtrjk" podUID="db82b3f2-cd94-42ef-a662-8a4b4c8fac85" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.41:8080/healthz\": dial tcp 10.217.0.41:8080: connect: connection refused" Dec 02 22:44:54 crc kubenswrapper[4696]: I1202 22:44:54.352782 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-58dmz"] Dec 02 22:44:54 crc kubenswrapper[4696]: I1202 22:44:54.381433 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-6s4j2" event={"ID":"54d79d46-f5a5-42d4-8000-61e91b72e0f7","Type":"ContainerStarted","Data":"f168ec98f0d158d43bb248a2b2ce2c57c87a622da65c35c2e19d375d1561cecb"} Dec 02 22:44:54 crc kubenswrapper[4696]: I1202 22:44:54.381474 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-6s4j2" 
event={"ID":"54d79d46-f5a5-42d4-8000-61e91b72e0f7","Type":"ContainerStarted","Data":"b75995affd4c9f7c81c8eb4ae165548206f4d1d94ae617f7624c6cd12b5fd4c0"} Dec 02 22:44:54 crc kubenswrapper[4696]: I1202 22:44:54.382549 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-bhp72"] Dec 02 22:44:54 crc kubenswrapper[4696]: I1202 22:44:54.389625 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-qpwvl" event={"ID":"fb0f1308-0862-4ac2-b741-849ae33c2776","Type":"ContainerStarted","Data":"e7f321b1b54e78fa92437f4da3fefbb43c2d20d4ba19d27569c10971dde36de1"} Dec 02 22:44:54 crc kubenswrapper[4696]: I1202 22:44:54.402044 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-dtrjk" podStartSLOduration=134.402018506 podStartE2EDuration="2m14.402018506s" podCreationTimestamp="2025-12-02 22:42:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 22:44:54.379562942 +0000 UTC m=+157.260242943" watchObservedRunningTime="2025-12-02 22:44:54.402018506 +0000 UTC m=+157.282698517" Dec 02 22:44:54 crc kubenswrapper[4696]: I1202 22:44:54.409535 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-6s4j2" podStartSLOduration=6.409513878 podStartE2EDuration="6.409513878s" podCreationTimestamp="2025-12-02 22:44:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 22:44:54.403391737 +0000 UTC m=+157.284071758" watchObservedRunningTime="2025-12-02 22:44:54.409513878 +0000 UTC m=+157.290193879" Dec 02 22:44:54 crc kubenswrapper[4696]: I1202 22:44:54.414555 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-jwz6f" event={"ID":"a4ed1132-b227-4d20-9ca7-7530ff4c0163","Type":"ContainerStarted","Data":"8a0c47207c9e3f65a76ee89bcfcb52bb1c3c30251ca82d3065bfa10a8643f4d8"} Dec 02 22:44:54 crc kubenswrapper[4696]: I1202 22:44:54.420450 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411910-cc46z" event={"ID":"65e8a28b-6bd6-4100-a3e7-80faf9aaeeef","Type":"ContainerStarted","Data":"ba2992bf060ed354ca9bfa75b836d171ec07c0c17e73ae1445d0b89f9aacb65b"} Dec 02 22:44:54 crc kubenswrapper[4696]: I1202 22:44:54.420506 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411910-cc46z" event={"ID":"65e8a28b-6bd6-4100-a3e7-80faf9aaeeef","Type":"ContainerStarted","Data":"4db2e2ba61c8c65f4f97bea26ce7f3ad77410f8ce62d15102f36ac9fe367f5e0"} Dec 02 22:44:54 crc kubenswrapper[4696]: I1202 22:44:54.420851 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-4jgzb"] Dec 02 22:44:54 crc kubenswrapper[4696]: I1202 22:44:54.444248 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-jwz6f" podStartSLOduration=134.444227924 podStartE2EDuration="2m14.444227924s" podCreationTimestamp="2025-12-02 22:42:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 22:44:54.438369741 +0000 UTC m=+157.319049742" watchObservedRunningTime="2025-12-02 22:44:54.444227924 +0000 UTC m=+157.324907925" Dec 02 22:44:54 crc kubenswrapper[4696]: I1202 22:44:54.452031 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-rjkp2\" (UID: \"24488e8a-3522-4214-ab83-684d76eb1501\") " pod="openshift-image-registry/image-registry-697d97f7c8-rjkp2" Dec 02 22:44:54 crc kubenswrapper[4696]: E1202 22:44:54.453958 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 22:44:54.953935052 +0000 UTC m=+157.834615263 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rjkp2" (UID: "24488e8a-3522-4214-ab83-684d76eb1501") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:54 crc kubenswrapper[4696]: I1202 22:44:54.456621 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-6xbr4" event={"ID":"f72e09f8-e3a5-4434-b1b7-3978e84c472a","Type":"ContainerStarted","Data":"0d7dd3f7503cfb222af20f42552e8bbc28d4ee65fe82642e92ef12af455a7c83"} Dec 02 22:44:54 crc kubenswrapper[4696]: I1202 22:44:54.490478 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29411910-cc46z" podStartSLOduration=135.490443871 podStartE2EDuration="2m15.490443871s" podCreationTimestamp="2025-12-02 22:42:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 22:44:54.476464578 +0000 UTC m=+157.357144589" watchObservedRunningTime="2025-12-02 22:44:54.490443871 +0000 UTC m=+157.371123872" Dec 02 22:44:54 crc kubenswrapper[4696]: I1202 22:44:54.505982 4696 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-p6njc" event={"ID":"3c4885ba-062b-41c3-be71-997c9781e537","Type":"ContainerStarted","Data":"aa9f435fe8fc3b614fe914ed0c590f2a52786905c26f38a51bd876a71958ca5d"} Dec 02 22:44:54 crc kubenswrapper[4696]: I1202 22:44:54.532410 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-2jhb9"] Dec 02 22:44:54 crc kubenswrapper[4696]: I1202 22:44:54.532467 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4cx4s" podStartSLOduration=135.532444894 podStartE2EDuration="2m15.532444894s" podCreationTimestamp="2025-12-02 22:42:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 22:44:54.509177075 +0000 UTC m=+157.389857076" watchObservedRunningTime="2025-12-02 22:44:54.532444894 +0000 UTC m=+157.413124895" Dec 02 22:44:54 crc kubenswrapper[4696]: I1202 22:44:54.532488 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-v2rkn" event={"ID":"3a98fb36-4189-4e59-b07a-b42a8aaee322","Type":"ContainerStarted","Data":"e88605d5d44ca0ca3d2e352b1a2c9050f188779b32665c19c9bfb3e00beab99d"} Dec 02 22:44:54 crc kubenswrapper[4696]: I1202 22:44:54.542316 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-58dc7"] Dec 02 22:44:54 crc kubenswrapper[4696]: W1202 22:44:54.543372 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbfd55522_63bd_40f3_a429_eb0c85fe5b9c.slice/crio-b25396f1a5468d0315542c26fa59f6ae0f6bfb10eabbadc7ff70ff3d68ed1295 WatchSource:0}: Error finding container 
b25396f1a5468d0315542c26fa59f6ae0f6bfb10eabbadc7ff70ff3d68ed1295: Status 404 returned error can't find the container with id b25396f1a5468d0315542c26fa59f6ae0f6bfb10eabbadc7ff70ff3d68ed1295 Dec 02 22:44:54 crc kubenswrapper[4696]: I1202 22:44:54.560516 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kgqqs"] Dec 02 22:44:54 crc kubenswrapper[4696]: I1202 22:44:54.561646 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9gpsg" Dec 02 22:44:54 crc kubenswrapper[4696]: I1202 22:44:54.561498 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 22:44:54 crc kubenswrapper[4696]: E1202 22:44:54.561561 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 22:44:55.061543154 +0000 UTC m=+157.942223145 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:54 crc kubenswrapper[4696]: I1202 22:44:54.562215 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rjkp2\" (UID: \"24488e8a-3522-4214-ab83-684d76eb1501\") " pod="openshift-image-registry/image-registry-697d97f7c8-rjkp2" Dec 02 22:44:54 crc kubenswrapper[4696]: I1202 22:44:54.561283 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-p6njc" podStartSLOduration=134.561265266 podStartE2EDuration="2m14.561265266s" podCreationTimestamp="2025-12-02 22:42:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 22:44:54.556385312 +0000 UTC m=+157.437065313" watchObservedRunningTime="2025-12-02 22:44:54.561265266 +0000 UTC m=+157.441945267" Dec 02 22:44:54 crc kubenswrapper[4696]: E1202 22:44:54.569183 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 22:44:55.06916668 +0000 UTC m=+157.949846681 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rjkp2" (UID: "24488e8a-3522-4214-ab83-684d76eb1501") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:54 crc kubenswrapper[4696]: I1202 22:44:54.587884 4696 patch_prober.go:28] interesting pod/apiserver-76f77b778f-8fxbs container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Dec 02 22:44:54 crc kubenswrapper[4696]: [+]log ok Dec 02 22:44:54 crc kubenswrapper[4696]: [+]etcd ok Dec 02 22:44:54 crc kubenswrapper[4696]: [+]poststarthook/start-apiserver-admission-initializer ok Dec 02 22:44:54 crc kubenswrapper[4696]: [+]poststarthook/generic-apiserver-start-informers ok Dec 02 22:44:54 crc kubenswrapper[4696]: [+]poststarthook/max-in-flight-filter ok Dec 02 22:44:54 crc kubenswrapper[4696]: [+]poststarthook/storage-object-count-tracker-hook ok Dec 02 22:44:54 crc kubenswrapper[4696]: [+]poststarthook/image.openshift.io-apiserver-caches ok Dec 02 22:44:54 crc kubenswrapper[4696]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Dec 02 22:44:54 crc kubenswrapper[4696]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Dec 02 22:44:54 crc kubenswrapper[4696]: [+]poststarthook/project.openshift.io-projectcache ok Dec 02 22:44:54 crc kubenswrapper[4696]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Dec 02 22:44:54 crc kubenswrapper[4696]: [+]poststarthook/openshift.io-startinformers ok Dec 02 22:44:54 crc kubenswrapper[4696]: [+]poststarthook/openshift.io-restmapperupdater ok Dec 02 22:44:54 crc 
kubenswrapper[4696]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Dec 02 22:44:54 crc kubenswrapper[4696]: livez check failed Dec 02 22:44:54 crc kubenswrapper[4696]: I1202 22:44:54.587968 4696 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-8fxbs" podUID="092e9b7e-6772-4cde-89b7-de81ae50222e" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 02 22:44:54 crc kubenswrapper[4696]: I1202 22:44:54.605832 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-5rtdf"] Dec 02 22:44:54 crc kubenswrapper[4696]: I1202 22:44:54.618277 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-v2rkn" podStartSLOduration=134.618246771 podStartE2EDuration="2m14.618246771s" podCreationTimestamp="2025-12-02 22:42:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 22:44:54.60737273 +0000 UTC m=+157.488052731" watchObservedRunningTime="2025-12-02 22:44:54.618246771 +0000 UTC m=+157.498926772" Dec 02 22:44:54 crc kubenswrapper[4696]: I1202 22:44:54.621884 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-q5df2"] Dec 02 22:44:54 crc kubenswrapper[4696]: I1202 22:44:54.642667 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-qghph"] Dec 02 22:44:54 crc kubenswrapper[4696]: I1202 22:44:54.653148 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-k7pzs"] Dec 02 22:44:54 crc kubenswrapper[4696]: I1202 22:44:54.667352 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 22:44:54 crc kubenswrapper[4696]: I1202 22:44:54.668432 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-984hk"] Dec 02 22:44:54 crc kubenswrapper[4696]: E1202 22:44:54.676149 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 22:44:55.176111033 +0000 UTC m=+158.056791034 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:54 crc kubenswrapper[4696]: I1202 22:44:54.773811 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rjkp2\" (UID: \"24488e8a-3522-4214-ab83-684d76eb1501\") " pod="openshift-image-registry/image-registry-697d97f7c8-rjkp2" Dec 02 22:44:54 crc kubenswrapper[4696]: W1202 22:44:54.774440 4696 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod772fa8b4_e1ff_40c2_aaaa_1b5c12143dac.slice/crio-e0fea9518475c9d40328b8fe6d9fc4707859392e24f08c369bacf03752f4153d WatchSource:0}: Error finding container e0fea9518475c9d40328b8fe6d9fc4707859392e24f08c369bacf03752f4153d: Status 404 returned error can't find the container with id e0fea9518475c9d40328b8fe6d9fc4707859392e24f08c369bacf03752f4153d Dec 02 22:44:54 crc kubenswrapper[4696]: E1202 22:44:54.774649 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 22:44:55.274635217 +0000 UTC m=+158.155315218 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rjkp2" (UID: "24488e8a-3522-4214-ab83-684d76eb1501") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:54 crc kubenswrapper[4696]: E1202 22:44:54.875917 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 22:44:55.375888102 +0000 UTC m=+158.256568103 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:54 crc kubenswrapper[4696]: I1202 22:44:54.875946 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 22:44:54 crc kubenswrapper[4696]: I1202 22:44:54.876283 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rjkp2\" (UID: \"24488e8a-3522-4214-ab83-684d76eb1501\") " pod="openshift-image-registry/image-registry-697d97f7c8-rjkp2" Dec 02 22:44:54 crc kubenswrapper[4696]: E1202 22:44:54.876648 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 22:44:55.376641244 +0000 UTC m=+158.257321245 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rjkp2" (UID: "24488e8a-3522-4214-ab83-684d76eb1501") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:54 crc kubenswrapper[4696]: I1202 22:44:54.977682 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 22:44:54 crc kubenswrapper[4696]: E1202 22:44:54.978066 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 22:44:55.478047463 +0000 UTC m=+158.358727464 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:55 crc kubenswrapper[4696]: I1202 22:44:55.079137 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rjkp2\" (UID: \"24488e8a-3522-4214-ab83-684d76eb1501\") " pod="openshift-image-registry/image-registry-697d97f7c8-rjkp2" Dec 02 22:44:55 crc kubenswrapper[4696]: E1202 22:44:55.079591 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 22:44:55.579570966 +0000 UTC m=+158.460250967 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rjkp2" (UID: "24488e8a-3522-4214-ab83-684d76eb1501") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:55 crc kubenswrapper[4696]: I1202 22:44:55.180651 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 22:44:55 crc kubenswrapper[4696]: E1202 22:44:55.181112 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 22:44:55.681061968 +0000 UTC m=+158.561741979 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:55 crc kubenswrapper[4696]: I1202 22:44:55.181383 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rjkp2\" (UID: \"24488e8a-3522-4214-ab83-684d76eb1501\") " pod="openshift-image-registry/image-registry-697d97f7c8-rjkp2" Dec 02 22:44:55 crc kubenswrapper[4696]: E1202 22:44:55.181815 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 22:44:55.68179866 +0000 UTC m=+158.562478661 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rjkp2" (UID: "24488e8a-3522-4214-ab83-684d76eb1501") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:55 crc kubenswrapper[4696]: I1202 22:44:55.284270 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 22:44:55 crc kubenswrapper[4696]: E1202 22:44:55.284485 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 22:44:55.784445575 +0000 UTC m=+158.665125576 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:55 crc kubenswrapper[4696]: I1202 22:44:55.284572 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rjkp2\" (UID: \"24488e8a-3522-4214-ab83-684d76eb1501\") " pod="openshift-image-registry/image-registry-697d97f7c8-rjkp2" Dec 02 22:44:55 crc kubenswrapper[4696]: E1202 22:44:55.284968 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 22:44:55.78495462 +0000 UTC m=+158.665634621 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rjkp2" (UID: "24488e8a-3522-4214-ab83-684d76eb1501") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:55 crc kubenswrapper[4696]: I1202 22:44:55.386913 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 22:44:55 crc kubenswrapper[4696]: E1202 22:44:55.387936 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 22:44:55.887918206 +0000 UTC m=+158.768598207 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:55 crc kubenswrapper[4696]: I1202 22:44:55.490008 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rjkp2\" (UID: \"24488e8a-3522-4214-ab83-684d76eb1501\") " pod="openshift-image-registry/image-registry-697d97f7c8-rjkp2" Dec 02 22:44:55 crc kubenswrapper[4696]: E1202 22:44:55.490523 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 22:44:55.99050565 +0000 UTC m=+158.871185651 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rjkp2" (UID: "24488e8a-3522-4214-ab83-684d76eb1501") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:55 crc kubenswrapper[4696]: I1202 22:44:55.530772 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-s5dx8" Dec 02 22:44:55 crc kubenswrapper[4696]: I1202 22:44:55.549431 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-jtr26" event={"ID":"6ba68eb4-959e-4a8a-a35e-f50abaef4cf9","Type":"ContainerStarted","Data":"b94b4ad658a50d3a6cafa4c55d19a436434a56f84ebf6947c1fbebb0bfabce9c"} Dec 02 22:44:55 crc kubenswrapper[4696]: I1202 22:44:55.552360 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-p6njc" event={"ID":"3c4885ba-062b-41c3-be71-997c9781e537","Type":"ContainerStarted","Data":"3ec5451f881d5fd3654abe76130daa0dbcb6de4fe4b3b2c938b0f1a9ae5e08bb"} Dec 02 22:44:55 crc kubenswrapper[4696]: I1202 22:44:55.557808 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-6xbr4" event={"ID":"f72e09f8-e3a5-4434-b1b7-3978e84c472a","Type":"ContainerStarted","Data":"0a6c40d5d17d93a946f03ef5694ab20479cac98cf8db4b8bb35b1f2f0a937352"} Dec 02 22:44:55 crc kubenswrapper[4696]: I1202 22:44:55.559963 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qxljp" 
event={"ID":"104b39f4-a0ab-4b71-a5d9-7c2cafd48976","Type":"ContainerStarted","Data":"de31e418f95a545967371e2ddb62a068b4bb3b42b17254e67b00b09a0dd27ff2"} Dec 02 22:44:55 crc kubenswrapper[4696]: I1202 22:44:55.573085 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sw26l" event={"ID":"1b7406ed-3f1e-476e-9d81-a0612ad16002","Type":"ContainerStarted","Data":"918f69240afc220fae1bfa4f680d6dd4ec2015074f1310e5c1d73ee340940670"} Dec 02 22:44:55 crc kubenswrapper[4696]: I1202 22:44:55.592022 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 22:44:55 crc kubenswrapper[4696]: E1202 22:44:55.592553 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 22:44:56.092536158 +0000 UTC m=+158.973216159 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:55 crc kubenswrapper[4696]: I1202 22:44:55.595291 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-qpwvl" event={"ID":"fb0f1308-0862-4ac2-b741-849ae33c2776","Type":"ContainerStarted","Data":"e9298814738e622e39206a8ef82e29a5f860d8ffd0941eb00fc6eeceadd1da65"} Dec 02 22:44:55 crc kubenswrapper[4696]: I1202 22:44:55.597257 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-58dc7" event={"ID":"bfd55522-63bd-40f3-a429-eb0c85fe5b9c","Type":"ContainerStarted","Data":"b25396f1a5468d0315542c26fa59f6ae0f6bfb10eabbadc7ff70ff3d68ed1295"} Dec 02 22:44:55 crc kubenswrapper[4696]: I1202 22:44:55.597959 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-2jhb9" event={"ID":"e1e19685-0a6c-47e3-b4d6-b98aa6ca63af","Type":"ContainerStarted","Data":"95c47ce943f8239e095c6fe913a8d9e1141baaaeb57ae85162a10b7b013451db"} Dec 02 22:44:55 crc kubenswrapper[4696]: I1202 22:44:55.599814 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-5rtdf" event={"ID":"4feef0d1-3fad-4990-8d96-65beb52e89b3","Type":"ContainerStarted","Data":"825aab9a05eab8519c372c611a08e3c0a6756dd19e7c3a15dc76bdc2a09ae433"} Dec 02 22:44:55 crc kubenswrapper[4696]: I1202 22:44:55.608076 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-bhp72" 
event={"ID":"e9c0bdb8-cf1f-408c-b341-afb3fea179dd","Type":"ContainerStarted","Data":"7804e53865580f86b126333274ff969aa1bcec1dc9118b37e6decb85e36b71e3"} Dec 02 22:44:55 crc kubenswrapper[4696]: I1202 22:44:55.611790 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-bhp72" event={"ID":"e9c0bdb8-cf1f-408c-b341-afb3fea179dd","Type":"ContainerStarted","Data":"6871817adcbdf9624c60d6a5646eec5ea7fe33d6397eab8d354726e5989d4d38"} Dec 02 22:44:55 crc kubenswrapper[4696]: I1202 22:44:55.617916 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-58dmz" event={"ID":"96c61760-b182-43b4-a0d8-a461ba742b85","Type":"ContainerStarted","Data":"6ca27c5462d123ba9e66e40d43d6a8506393bc9fec53d5ebaf0e488933a4d4d3"} Dec 02 22:44:55 crc kubenswrapper[4696]: I1202 22:44:55.617991 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-58dmz" event={"ID":"96c61760-b182-43b4-a0d8-a461ba742b85","Type":"ContainerStarted","Data":"d647e3a688b6a662e9eb7fb2eb606acdaa4a97b7f838003bc83a0f4981193925"} Dec 02 22:44:55 crc kubenswrapper[4696]: I1202 22:44:55.619054 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-58dmz" Dec 02 22:44:55 crc kubenswrapper[4696]: I1202 22:44:55.639020 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-v2rkn" event={"ID":"3a98fb36-4189-4e59-b07a-b42a8aaee322","Type":"ContainerStarted","Data":"52deb162a4f8c1ea30e88ca1ba2b2509f56797f134b19cd3f39e92e995db534c"} Dec 02 22:44:55 crc kubenswrapper[4696]: I1202 22:44:55.640442 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-v2rkn" Dec 02 22:44:55 crc kubenswrapper[4696]: I1202 22:44:55.641732 4696 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-58dmz" Dec 02 22:44:55 crc kubenswrapper[4696]: I1202 22:44:55.648650 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-984hk" event={"ID":"32ed738d-b1d3-4966-a487-1c2aa92c6f20","Type":"ContainerStarted","Data":"bc115f6aa6420c030fc8225819af55ad9f221fe4bb132154cbc73cd05fdfa280"} Dec 02 22:44:55 crc kubenswrapper[4696]: I1202 22:44:55.658561 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-jtr26" podStartSLOduration=7.65853843 podStartE2EDuration="7.65853843s" podCreationTimestamp="2025-12-02 22:44:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 22:44:55.621369421 +0000 UTC m=+158.502049422" watchObservedRunningTime="2025-12-02 22:44:55.65853843 +0000 UTC m=+158.539218441" Dec 02 22:44:55 crc kubenswrapper[4696]: I1202 22:44:55.659811 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qxljp" podStartSLOduration=135.659803447 podStartE2EDuration="2m15.659803447s" podCreationTimestamp="2025-12-02 22:42:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 22:44:55.657835889 +0000 UTC m=+158.538515900" watchObservedRunningTime="2025-12-02 22:44:55.659803447 +0000 UTC m=+158.540483448" Dec 02 22:44:55 crc kubenswrapper[4696]: I1202 22:44:55.685128 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4cx4s" 
event={"ID":"5f5d77f1-55f7-49d2-b5db-19d7150b882f","Type":"ContainerStarted","Data":"adeef89d8acfad386839bfc6164c5126b69930af23090eba71796097a0f9b0d6"} Dec 02 22:44:55 crc kubenswrapper[4696]: I1202 22:44:55.696765 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rjkp2\" (UID: \"24488e8a-3522-4214-ab83-684d76eb1501\") " pod="openshift-image-registry/image-registry-697d97f7c8-rjkp2" Dec 02 22:44:55 crc kubenswrapper[4696]: E1202 22:44:55.700631 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 22:44:56.200612684 +0000 UTC m=+159.081292685 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rjkp2" (UID: "24488e8a-3522-4214-ab83-684d76eb1501") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:55 crc kubenswrapper[4696]: I1202 22:44:55.725296 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sw26l" podStartSLOduration=135.725273914 podStartE2EDuration="2m15.725273914s" podCreationTimestamp="2025-12-02 22:42:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 22:44:55.691262588 +0000 UTC m=+158.571942589" watchObservedRunningTime="2025-12-02 
22:44:55.725273914 +0000 UTC m=+158.605953915" Dec 02 22:44:55 crc kubenswrapper[4696]: I1202 22:44:55.750627 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qghph" event={"ID":"ed9ebc4a-9236-49f7-b365-79a6890e2bc8","Type":"ContainerStarted","Data":"de6eabb00a83f5ebd4ec61f4efe647f0b6e2df5f114a64ad16c59533cbb7aeba"} Dec 02 22:44:55 crc kubenswrapper[4696]: I1202 22:44:55.772664 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-58dmz" podStartSLOduration=135.772642865 podStartE2EDuration="2m15.772642865s" podCreationTimestamp="2025-12-02 22:42:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 22:44:55.726291514 +0000 UTC m=+158.606971515" watchObservedRunningTime="2025-12-02 22:44:55.772642865 +0000 UTC m=+158.653322866" Dec 02 22:44:55 crc kubenswrapper[4696]: I1202 22:44:55.800677 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-qpwvl" podStartSLOduration=135.800651813 podStartE2EDuration="2m15.800651813s" podCreationTimestamp="2025-12-02 22:42:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 22:44:55.773925743 +0000 UTC m=+158.654605744" watchObservedRunningTime="2025-12-02 22:44:55.800651813 +0000 UTC m=+158.681331814" Dec 02 22:44:55 crc kubenswrapper[4696]: I1202 22:44:55.800721 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-4jgzb" event={"ID":"5b22a349-1f5e-49f6-982f-e192aed0933d","Type":"ContainerStarted","Data":"776fcffebad743110b55a60ead9039e208ca470dfe23213814a3b2aed85c05e4"} Dec 02 22:44:55 crc kubenswrapper[4696]: I1202 22:44:55.800790 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-dns/dns-default-4jgzb" event={"ID":"5b22a349-1f5e-49f6-982f-e192aed0933d","Type":"ContainerStarted","Data":"f444dc43198978195cd9a8d967199f9062486e0022a3c71af56214d0225290c0"} Dec 02 22:44:55 crc kubenswrapper[4696]: I1202 22:44:55.801681 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 22:44:55 crc kubenswrapper[4696]: E1202 22:44:55.801865 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 22:44:56.301843418 +0000 UTC m=+159.182523419 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:55 crc kubenswrapper[4696]: I1202 22:44:55.802040 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rjkp2\" (UID: \"24488e8a-3522-4214-ab83-684d76eb1501\") " pod="openshift-image-registry/image-registry-697d97f7c8-rjkp2" Dec 02 22:44:55 crc kubenswrapper[4696]: E1202 22:44:55.811233 4696 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 22:44:56.311212275 +0000 UTC m=+159.191892266 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rjkp2" (UID: "24488e8a-3522-4214-ab83-684d76eb1501") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:55 crc kubenswrapper[4696]: I1202 22:44:55.812596 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j5tcx" event={"ID":"02277212-4ddf-49e2-8ac0-58c20fd973a3","Type":"ContainerStarted","Data":"213e02f5fd8c848a7970aad22cae32258b70ef59f388f43fa67bd4498990b0d1"} Dec 02 22:44:55 crc kubenswrapper[4696]: I1202 22:44:55.812638 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j5tcx" event={"ID":"02277212-4ddf-49e2-8ac0-58c20fd973a3","Type":"ContainerStarted","Data":"3629a47025114ef8d3db659eb45fd497cc12118b0f2639c681d2482d8386b630"} Dec 02 22:44:55 crc kubenswrapper[4696]: I1202 22:44:55.834958 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-k7pzs" event={"ID":"772fa8b4-e1ff-40c2-aaaa-1b5c12143dac","Type":"ContainerStarted","Data":"e0fea9518475c9d40328b8fe6d9fc4707859392e24f08c369bacf03752f4153d"} Dec 02 22:44:55 crc kubenswrapper[4696]: I1202 22:44:55.839765 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-q5df2" 
event={"ID":"7fe15835-a31c-46df-aef5-21aade83fa88","Type":"ContainerStarted","Data":"7865f931faac9db3bf4f40ad74f397bb4039de2a77874edb4143934c50f97182"} Dec 02 22:44:55 crc kubenswrapper[4696]: I1202 22:44:55.840838 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6vtkj" event={"ID":"94966c3d-5233-480c-a199-6813c47a1e04","Type":"ContainerStarted","Data":"0703b9e7da4f5eee40ace4ad50d44bdf083971d54ede18ab01ab4ff7f020d46a"} Dec 02 22:44:55 crc kubenswrapper[4696]: I1202 22:44:55.840863 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6vtkj" event={"ID":"94966c3d-5233-480c-a199-6813c47a1e04","Type":"ContainerStarted","Data":"bec1b7f51ea1d2c9af2113a79664960988adf32bc2f4ef6fb401732a94b39bd0"} Dec 02 22:44:55 crc kubenswrapper[4696]: I1202 22:44:55.842098 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jtktn" event={"ID":"7893eeb2-3d1b-4e3d-8752-438c53238019","Type":"ContainerStarted","Data":"915df6ca9923bf871b452049a240d3509cfa4869caf76c91f896cd70ebc764fc"} Dec 02 22:44:55 crc kubenswrapper[4696]: I1202 22:44:55.842125 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jtktn" event={"ID":"7893eeb2-3d1b-4e3d-8752-438c53238019","Type":"ContainerStarted","Data":"13cd757174e8f0557a1551086a8e23684321054b1de8a82f29c402d9ecd1dd53"} Dec 02 22:44:55 crc kubenswrapper[4696]: I1202 22:44:55.844166 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kgqqs" event={"ID":"238dedc6-7624-4ad3-9e79-4d536ac8acda","Type":"ContainerStarted","Data":"0226567a04473e9a75439374b31f67ce3c5bb265bb533d8e940ab09574c0cdf7"} Dec 02 22:44:55 crc kubenswrapper[4696]: I1202 
22:44:55.852368 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j5tcx" podStartSLOduration=135.852346952 podStartE2EDuration="2m15.852346952s" podCreationTimestamp="2025-12-02 22:42:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 22:44:55.851474406 +0000 UTC m=+158.732154417" watchObservedRunningTime="2025-12-02 22:44:55.852346952 +0000 UTC m=+158.733026953" Dec 02 22:44:55 crc kubenswrapper[4696]: I1202 22:44:55.858197 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-dtrjk" Dec 02 22:44:55 crc kubenswrapper[4696]: I1202 22:44:55.903806 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 22:44:55 crc kubenswrapper[4696]: E1202 22:44:55.904801 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 22:44:56.404767392 +0000 UTC m=+159.285447383 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:56 crc kubenswrapper[4696]: I1202 22:44:56.005943 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rjkp2\" (UID: \"24488e8a-3522-4214-ab83-684d76eb1501\") " pod="openshift-image-registry/image-registry-697d97f7c8-rjkp2" Dec 02 22:44:56 crc kubenswrapper[4696]: E1202 22:44:56.007558 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 22:44:56.507542472 +0000 UTC m=+159.388222473 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rjkp2" (UID: "24488e8a-3522-4214-ab83-684d76eb1501") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:56 crc kubenswrapper[4696]: I1202 22:44:56.109439 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 22:44:56 crc kubenswrapper[4696]: E1202 22:44:56.109689 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 22:44:56.609652302 +0000 UTC m=+159.490332303 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:56 crc kubenswrapper[4696]: I1202 22:44:56.109997 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rjkp2\" (UID: \"24488e8a-3522-4214-ab83-684d76eb1501\") " pod="openshift-image-registry/image-registry-697d97f7c8-rjkp2" Dec 02 22:44:56 crc kubenswrapper[4696]: E1202 22:44:56.110410 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 22:44:56.610402424 +0000 UTC m=+159.491082425 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rjkp2" (UID: "24488e8a-3522-4214-ab83-684d76eb1501") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:56 crc kubenswrapper[4696]: I1202 22:44:56.166684 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-qpwvl" Dec 02 22:44:56 crc kubenswrapper[4696]: I1202 22:44:56.180800 4696 patch_prober.go:28] interesting pod/router-default-5444994796-qpwvl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 02 22:44:56 crc kubenswrapper[4696]: [-]has-synced failed: reason withheld Dec 02 22:44:56 crc kubenswrapper[4696]: [+]process-running ok Dec 02 22:44:56 crc kubenswrapper[4696]: healthz check failed Dec 02 22:44:56 crc kubenswrapper[4696]: I1202 22:44:56.180882 4696 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qpwvl" podUID="fb0f1308-0862-4ac2-b741-849ae33c2776" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 02 22:44:56 crc kubenswrapper[4696]: I1202 22:44:56.211002 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 22:44:56 crc kubenswrapper[4696]: E1202 22:44:56.211274 4696 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 22:44:56.711222706 +0000 UTC m=+159.591902707 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:56 crc kubenswrapper[4696]: I1202 22:44:56.211344 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rjkp2\" (UID: \"24488e8a-3522-4214-ab83-684d76eb1501\") " pod="openshift-image-registry/image-registry-697d97f7c8-rjkp2" Dec 02 22:44:56 crc kubenswrapper[4696]: E1202 22:44:56.211680 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 22:44:56.71166703 +0000 UTC m=+159.592347031 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rjkp2" (UID: "24488e8a-3522-4214-ab83-684d76eb1501") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:56 crc kubenswrapper[4696]: I1202 22:44:56.312162 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 22:44:56 crc kubenswrapper[4696]: E1202 22:44:56.312332 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 22:44:56.812306946 +0000 UTC m=+159.692986947 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:56 crc kubenswrapper[4696]: I1202 22:44:56.312562 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rjkp2\" (UID: \"24488e8a-3522-4214-ab83-684d76eb1501\") " pod="openshift-image-registry/image-registry-697d97f7c8-rjkp2" Dec 02 22:44:56 crc kubenswrapper[4696]: E1202 22:44:56.312934 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 22:44:56.812920124 +0000 UTC m=+159.693600125 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rjkp2" (UID: "24488e8a-3522-4214-ab83-684d76eb1501") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:56 crc kubenswrapper[4696]: I1202 22:44:56.414539 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 22:44:56 crc kubenswrapper[4696]: E1202 22:44:56.414721 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 22:44:56.914680544 +0000 UTC m=+159.795360545 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:56 crc kubenswrapper[4696]: I1202 22:44:56.414962 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rjkp2\" (UID: \"24488e8a-3522-4214-ab83-684d76eb1501\") " pod="openshift-image-registry/image-registry-697d97f7c8-rjkp2" Dec 02 22:44:56 crc kubenswrapper[4696]: E1202 22:44:56.415350 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 22:44:56.915339183 +0000 UTC m=+159.796019184 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rjkp2" (UID: "24488e8a-3522-4214-ab83-684d76eb1501") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:56 crc kubenswrapper[4696]: I1202 22:44:56.426527 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-v2rkn" Dec 02 22:44:56 crc kubenswrapper[4696]: I1202 22:44:56.515988 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 22:44:56 crc kubenswrapper[4696]: E1202 22:44:56.516459 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 22:44:57.016443734 +0000 UTC m=+159.897123735 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:56 crc kubenswrapper[4696]: I1202 22:44:56.617969 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rjkp2\" (UID: \"24488e8a-3522-4214-ab83-684d76eb1501\") " pod="openshift-image-registry/image-registry-697d97f7c8-rjkp2" Dec 02 22:44:56 crc kubenswrapper[4696]: E1202 22:44:56.618536 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 22:44:57.118500262 +0000 UTC m=+159.999180263 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rjkp2" (UID: "24488e8a-3522-4214-ab83-684d76eb1501") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:56 crc kubenswrapper[4696]: I1202 22:44:56.719395 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 22:44:56 crc kubenswrapper[4696]: E1202 22:44:56.719580 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 22:44:57.219555521 +0000 UTC m=+160.100235522 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:56 crc kubenswrapper[4696]: I1202 22:44:56.719798 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rjkp2\" (UID: \"24488e8a-3522-4214-ab83-684d76eb1501\") " pod="openshift-image-registry/image-registry-697d97f7c8-rjkp2" Dec 02 22:44:56 crc kubenswrapper[4696]: E1202 22:44:56.720126 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 22:44:57.220118548 +0000 UTC m=+160.100798539 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rjkp2" (UID: "24488e8a-3522-4214-ab83-684d76eb1501") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:56 crc kubenswrapper[4696]: I1202 22:44:56.820891 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 22:44:56 crc kubenswrapper[4696]: E1202 22:44:56.821314 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 22:44:57.32129157 +0000 UTC m=+160.201971571 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:56 crc kubenswrapper[4696]: I1202 22:44:56.850109 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kgqqs" event={"ID":"238dedc6-7624-4ad3-9e79-4d536ac8acda","Type":"ContainerStarted","Data":"1382867d40bcf58211f3cc9ce31d750add6ee9c9386ebb50e28ee9e948b03a9d"} Dec 02 22:44:56 crc kubenswrapper[4696]: I1202 22:44:56.851058 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-984hk" event={"ID":"32ed738d-b1d3-4966-a487-1c2aa92c6f20","Type":"ContainerStarted","Data":"1e6bdb778d0560b9bca7bb1032217cd7ffb2e88f22c2a3a26988c460d0b27d71"} Dec 02 22:44:56 crc kubenswrapper[4696]: I1202 22:44:56.851959 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qghph" event={"ID":"ed9ebc4a-9236-49f7-b365-79a6890e2bc8","Type":"ContainerStarted","Data":"d0035d87df1d6e51627ee719ea00f70da3aa7ffc524c3c190919ebb57a717b6d"} Dec 02 22:44:56 crc kubenswrapper[4696]: I1202 22:44:56.853047 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6vtkj" event={"ID":"94966c3d-5233-480c-a199-6813c47a1e04","Type":"ContainerStarted","Data":"2254efae4dfd38ab6f664327f4a084f8dda71ed51adac2837499e6b2ab669dec"} Dec 02 22:44:56 crc kubenswrapper[4696]: I1202 22:44:56.853647 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6vtkj" Dec 02 22:44:56 crc kubenswrapper[4696]: I1202 22:44:56.855166 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-bhp72" event={"ID":"e9c0bdb8-cf1f-408c-b341-afb3fea179dd","Type":"ContainerStarted","Data":"18649d116856a59cdf7806ca3e66978fc531601df1c5df02694675640239160b"} Dec 02 22:44:56 crc kubenswrapper[4696]: I1202 22:44:56.856528 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-6xbr4" event={"ID":"f72e09f8-e3a5-4434-b1b7-3978e84c472a","Type":"ContainerStarted","Data":"53618d6708e2ad799be2950c1e093db4fc59a1e5b6e2c70c7bd5d8c75902e746"} Dec 02 22:44:56 crc kubenswrapper[4696]: I1202 22:44:56.857515 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-q5df2" event={"ID":"7fe15835-a31c-46df-aef5-21aade83fa88","Type":"ContainerStarted","Data":"87e4658060ee0a4aead9de75870338a572ee96b8361281c2a3751a7af97f5663"} Dec 02 22:44:56 crc kubenswrapper[4696]: I1202 22:44:56.868221 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-58dc7" event={"ID":"bfd55522-63bd-40f3-a429-eb0c85fe5b9c","Type":"ContainerStarted","Data":"bfd10d939d31438de0c09ae8b15fcfe2a6cb519b0b68f9f1c59c70f177372257"} Dec 02 22:44:56 crc kubenswrapper[4696]: I1202 22:44:56.869833 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-5rtdf" event={"ID":"4feef0d1-3fad-4990-8d96-65beb52e89b3","Type":"ContainerStarted","Data":"67005b0f6333fef16ea8e844a00123a8079f6a6860a8f20eb3b896636213e3f5"} Dec 02 22:44:56 crc kubenswrapper[4696]: I1202 22:44:56.877066 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jtktn" event={"ID":"7893eeb2-3d1b-4e3d-8752-438c53238019","Type":"ContainerStarted","Data":"0d78d5d3422457df6fbe2c881b0f09d57fc5e6b5285d0584c7780edaf8108e67"} Dec 02 22:44:56 crc kubenswrapper[4696]: I1202 22:44:56.882193 4696 generic.go:334] "Generic (PLEG): container finished" podID="65e8a28b-6bd6-4100-a3e7-80faf9aaeeef" containerID="ba2992bf060ed354ca9bfa75b836d171ec07c0c17e73ae1445d0b89f9aacb65b" exitCode=0 Dec 02 22:44:56 crc kubenswrapper[4696]: I1202 22:44:56.882277 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411910-cc46z" event={"ID":"65e8a28b-6bd6-4100-a3e7-80faf9aaeeef","Type":"ContainerDied","Data":"ba2992bf060ed354ca9bfa75b836d171ec07c0c17e73ae1445d0b89f9aacb65b"} Dec 02 22:44:56 crc kubenswrapper[4696]: I1202 22:44:56.886492 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6vtkj" podStartSLOduration=136.886471188 podStartE2EDuration="2m16.886471188s" podCreationTimestamp="2025-12-02 22:42:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 22:44:56.882276754 +0000 UTC m=+159.762956825" watchObservedRunningTime="2025-12-02 22:44:56.886471188 +0000 UTC m=+159.767151189" Dec 02 22:44:56 crc kubenswrapper[4696]: I1202 22:44:56.887528 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-k7pzs" event={"ID":"772fa8b4-e1ff-40c2-aaaa-1b5c12143dac","Type":"ContainerStarted","Data":"3db5dcca3e3104a43eda0487a00cc7abc764a4de2663fcf9cf3950a03f463cb1"} Dec 02 22:44:56 crc kubenswrapper[4696]: I1202 22:44:56.922412 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rjkp2\" (UID: \"24488e8a-3522-4214-ab83-684d76eb1501\") " pod="openshift-image-registry/image-registry-697d97f7c8-rjkp2" Dec 02 22:44:56 crc kubenswrapper[4696]: E1202 22:44:56.922823 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 22:44:57.422802713 +0000 UTC m=+160.303482714 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rjkp2" (UID: "24488e8a-3522-4214-ab83-684d76eb1501") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:57 crc kubenswrapper[4696]: I1202 22:44:57.023258 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 22:44:57 crc kubenswrapper[4696]: E1202 22:44:57.023410 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 22:44:57.523385748 +0000 UTC m=+160.404065749 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:57 crc kubenswrapper[4696]: I1202 22:44:57.023610 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rjkp2\" (UID: \"24488e8a-3522-4214-ab83-684d76eb1501\") " pod="openshift-image-registry/image-registry-697d97f7c8-rjkp2" Dec 02 22:44:57 crc kubenswrapper[4696]: E1202 22:44:57.024527 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 22:44:57.524496 +0000 UTC m=+160.405176011 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rjkp2" (UID: "24488e8a-3522-4214-ab83-684d76eb1501") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:57 crc kubenswrapper[4696]: I1202 22:44:57.127451 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 22:44:57 crc kubenswrapper[4696]: E1202 22:44:57.127643 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 22:44:57.62761123 +0000 UTC m=+160.508291231 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:57 crc kubenswrapper[4696]: I1202 22:44:57.127788 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rjkp2\" (UID: \"24488e8a-3522-4214-ab83-684d76eb1501\") " pod="openshift-image-registry/image-registry-697d97f7c8-rjkp2" Dec 02 22:44:57 crc kubenswrapper[4696]: E1202 22:44:57.128209 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 22:44:57.628201328 +0000 UTC m=+160.508881319 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rjkp2" (UID: "24488e8a-3522-4214-ab83-684d76eb1501") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:57 crc kubenswrapper[4696]: I1202 22:44:57.171687 4696 patch_prober.go:28] interesting pod/router-default-5444994796-qpwvl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 02 22:44:57 crc kubenswrapper[4696]: [-]has-synced failed: reason withheld Dec 02 22:44:57 crc kubenswrapper[4696]: [+]process-running ok Dec 02 22:44:57 crc kubenswrapper[4696]: healthz check failed Dec 02 22:44:57 crc kubenswrapper[4696]: I1202 22:44:57.171866 4696 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qpwvl" podUID="fb0f1308-0862-4ac2-b741-849ae33c2776" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 02 22:44:57 crc kubenswrapper[4696]: I1202 22:44:57.229454 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 22:44:57 crc kubenswrapper[4696]: E1202 22:44:57.229635 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-02 22:44:57.729589416 +0000 UTC m=+160.610269417 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:57 crc kubenswrapper[4696]: I1202 22:44:57.230174 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rjkp2\" (UID: \"24488e8a-3522-4214-ab83-684d76eb1501\") " pod="openshift-image-registry/image-registry-697d97f7c8-rjkp2" Dec 02 22:44:57 crc kubenswrapper[4696]: E1202 22:44:57.230627 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 22:44:57.730618797 +0000 UTC m=+160.611298798 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rjkp2" (UID: "24488e8a-3522-4214-ab83-684d76eb1501") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:57 crc kubenswrapper[4696]: I1202 22:44:57.332025 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 22:44:57 crc kubenswrapper[4696]: E1202 22:44:57.332299 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 22:44:57.832258173 +0000 UTC m=+160.712938174 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:57 crc kubenswrapper[4696]: I1202 22:44:57.338033 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rjkp2\" (UID: \"24488e8a-3522-4214-ab83-684d76eb1501\") " pod="openshift-image-registry/image-registry-697d97f7c8-rjkp2" Dec 02 22:44:57 crc kubenswrapper[4696]: E1202 22:44:57.338459 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 22:44:57.838434976 +0000 UTC m=+160.719114977 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rjkp2" (UID: "24488e8a-3522-4214-ab83-684d76eb1501") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:57 crc kubenswrapper[4696]: I1202 22:44:57.439953 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 22:44:57 crc kubenswrapper[4696]: E1202 22:44:57.440160 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 22:44:57.940132413 +0000 UTC m=+160.820812414 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:57 crc kubenswrapper[4696]: I1202 22:44:57.440449 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rjkp2\" (UID: \"24488e8a-3522-4214-ab83-684d76eb1501\") " pod="openshift-image-registry/image-registry-697d97f7c8-rjkp2" Dec 02 22:44:57 crc kubenswrapper[4696]: E1202 22:44:57.440845 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 22:44:57.940835554 +0000 UTC m=+160.821515555 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rjkp2" (UID: "24488e8a-3522-4214-ab83-684d76eb1501") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:57 crc kubenswrapper[4696]: I1202 22:44:57.541654 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 22:44:57 crc kubenswrapper[4696]: E1202 22:44:57.541823 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 22:44:58.041796849 +0000 UTC m=+160.922476860 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:57 crc kubenswrapper[4696]: I1202 22:44:57.542173 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rjkp2\" (UID: \"24488e8a-3522-4214-ab83-684d76eb1501\") " pod="openshift-image-registry/image-registry-697d97f7c8-rjkp2" Dec 02 22:44:57 crc kubenswrapper[4696]: E1202 22:44:57.542626 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 22:44:58.042609423 +0000 UTC m=+160.923289434 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rjkp2" (UID: "24488e8a-3522-4214-ab83-684d76eb1501") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:57 crc kubenswrapper[4696]: I1202 22:44:57.643520 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 22:44:57 crc kubenswrapper[4696]: E1202 22:44:57.643768 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 22:44:58.143717134 +0000 UTC m=+161.024397145 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:57 crc kubenswrapper[4696]: I1202 22:44:57.643959 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rjkp2\" (UID: \"24488e8a-3522-4214-ab83-684d76eb1501\") " pod="openshift-image-registry/image-registry-697d97f7c8-rjkp2" Dec 02 22:44:57 crc kubenswrapper[4696]: E1202 22:44:57.644402 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 22:44:58.144392184 +0000 UTC m=+161.025072185 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rjkp2" (UID: "24488e8a-3522-4214-ab83-684d76eb1501") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:57 crc kubenswrapper[4696]: I1202 22:44:57.745154 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 22:44:57 crc kubenswrapper[4696]: E1202 22:44:57.745361 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 22:44:58.245328589 +0000 UTC m=+161.126008590 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:57 crc kubenswrapper[4696]: I1202 22:44:57.745527 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rjkp2\" (UID: \"24488e8a-3522-4214-ab83-684d76eb1501\") " pod="openshift-image-registry/image-registry-697d97f7c8-rjkp2" Dec 02 22:44:57 crc kubenswrapper[4696]: E1202 22:44:57.745892 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 22:44:58.245885316 +0000 UTC m=+161.126565307 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rjkp2" (UID: "24488e8a-3522-4214-ab83-684d76eb1501") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:57 crc kubenswrapper[4696]: I1202 22:44:57.847185 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 22:44:57 crc kubenswrapper[4696]: E1202 22:44:57.847387 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 22:44:58.347354547 +0000 UTC m=+161.228034548 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:57 crc kubenswrapper[4696]: I1202 22:44:57.847895 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rjkp2\" (UID: \"24488e8a-3522-4214-ab83-684d76eb1501\") " pod="openshift-image-registry/image-registry-697d97f7c8-rjkp2" Dec 02 22:44:57 crc kubenswrapper[4696]: E1202 22:44:57.848274 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 22:44:58.348266434 +0000 UTC m=+161.228946435 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rjkp2" (UID: "24488e8a-3522-4214-ab83-684d76eb1501") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:57 crc kubenswrapper[4696]: I1202 22:44:57.894445 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-4jgzb" event={"ID":"5b22a349-1f5e-49f6-982f-e192aed0933d","Type":"ContainerStarted","Data":"9a9900956a8b9d5722ee2c90dc1fd1f36aac572f39c6dc8feef06ebffd58d1fb"} Dec 02 22:44:57 crc kubenswrapper[4696]: I1202 22:44:57.919905 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-6xbr4" podStartSLOduration=137.919878912 podStartE2EDuration="2m17.919878912s" podCreationTimestamp="2025-12-02 22:42:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 22:44:57.919393687 +0000 UTC m=+160.800073688" watchObservedRunningTime="2025-12-02 22:44:57.919878912 +0000 UTC m=+160.800558913" Dec 02 22:44:57 crc kubenswrapper[4696]: I1202 22:44:57.950386 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 22:44:57 crc kubenswrapper[4696]: E1202 22:44:57.950756 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 22:44:58.450723474 +0000 UTC m=+161.331403475 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:57 crc kubenswrapper[4696]: I1202 22:44:57.951432 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-k7pzs" podStartSLOduration=137.951412714 podStartE2EDuration="2m17.951412714s" podCreationTimestamp="2025-12-02 22:42:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 22:44:57.949948741 +0000 UTC m=+160.830628742" watchObservedRunningTime="2025-12-02 22:44:57.951412714 +0000 UTC m=+160.832092715" Dec 02 22:44:57 crc kubenswrapper[4696]: I1202 22:44:57.979211 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-bhp72" podStartSLOduration=137.979193156 podStartE2EDuration="2m17.979193156s" podCreationTimestamp="2025-12-02 22:42:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 22:44:57.975886378 +0000 UTC m=+160.856566379" watchObservedRunningTime="2025-12-02 22:44:57.979193156 +0000 UTC m=+160.859873157" Dec 02 22:44:57 crc kubenswrapper[4696]: I1202 22:44:57.995300 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-58dc7" podStartSLOduration=137.995276212 podStartE2EDuration="2m17.995276212s" podCreationTimestamp="2025-12-02 22:42:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 22:44:57.992423807 +0000 UTC m=+160.873103808" watchObservedRunningTime="2025-12-02 22:44:57.995276212 +0000 UTC m=+160.875956213" Dec 02 22:44:58 crc kubenswrapper[4696]: I1202 22:44:58.052439 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rjkp2\" (UID: \"24488e8a-3522-4214-ab83-684d76eb1501\") " pod="openshift-image-registry/image-registry-697d97f7c8-rjkp2" Dec 02 22:44:58 crc kubenswrapper[4696]: E1202 22:44:58.054255 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 22:44:58.554237506 +0000 UTC m=+161.434917507 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rjkp2" (UID: "24488e8a-3522-4214-ab83-684d76eb1501") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:58 crc kubenswrapper[4696]: I1202 22:44:58.160595 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 22:44:58 crc kubenswrapper[4696]: E1202 22:44:58.160925 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 22:44:58.66090601 +0000 UTC m=+161.541586011 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:58 crc kubenswrapper[4696]: I1202 22:44:58.175930 4696 patch_prober.go:28] interesting pod/router-default-5444994796-qpwvl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 02 22:44:58 crc kubenswrapper[4696]: [-]has-synced failed: reason withheld Dec 02 22:44:58 crc kubenswrapper[4696]: [+]process-running ok Dec 02 22:44:58 crc kubenswrapper[4696]: healthz check failed Dec 02 22:44:58 crc kubenswrapper[4696]: I1202 22:44:58.176023 4696 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qpwvl" podUID="fb0f1308-0862-4ac2-b741-849ae33c2776" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 02 22:44:58 crc kubenswrapper[4696]: I1202 22:44:58.245839 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 02 22:44:58 crc kubenswrapper[4696]: I1202 22:44:58.247723 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 02 22:44:58 crc kubenswrapper[4696]: I1202 22:44:58.248132 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 02 22:44:58 crc kubenswrapper[4696]: I1202 22:44:58.252534 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Dec 02 22:44:58 crc kubenswrapper[4696]: I1202 22:44:58.252725 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Dec 02 22:44:58 crc kubenswrapper[4696]: I1202 22:44:58.258386 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411910-cc46z" Dec 02 22:44:58 crc kubenswrapper[4696]: I1202 22:44:58.266976 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/65e8a28b-6bd6-4100-a3e7-80faf9aaeeef-secret-volume\") pod \"65e8a28b-6bd6-4100-a3e7-80faf9aaeeef\" (UID: \"65e8a28b-6bd6-4100-a3e7-80faf9aaeeef\") " Dec 02 22:44:58 crc kubenswrapper[4696]: I1202 22:44:58.267069 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ks74t\" (UniqueName: \"kubernetes.io/projected/65e8a28b-6bd6-4100-a3e7-80faf9aaeeef-kube-api-access-ks74t\") pod \"65e8a28b-6bd6-4100-a3e7-80faf9aaeeef\" (UID: \"65e8a28b-6bd6-4100-a3e7-80faf9aaeeef\") " Dec 02 22:44:58 crc kubenswrapper[4696]: I1202 22:44:58.267199 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/65e8a28b-6bd6-4100-a3e7-80faf9aaeeef-config-volume\") pod \"65e8a28b-6bd6-4100-a3e7-80faf9aaeeef\" (UID: \"65e8a28b-6bd6-4100-a3e7-80faf9aaeeef\") " Dec 02 22:44:58 crc kubenswrapper[4696]: I1202 
22:44:58.267340 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3d275d4f-c887-4a23-a84e-e1d7305a5ba0-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"3d275d4f-c887-4a23-a84e-e1d7305a5ba0\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 02 22:44:58 crc kubenswrapper[4696]: I1202 22:44:58.267387 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rjkp2\" (UID: \"24488e8a-3522-4214-ab83-684d76eb1501\") " pod="openshift-image-registry/image-registry-697d97f7c8-rjkp2" Dec 02 22:44:58 crc kubenswrapper[4696]: I1202 22:44:58.267421 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3d275d4f-c887-4a23-a84e-e1d7305a5ba0-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"3d275d4f-c887-4a23-a84e-e1d7305a5ba0\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 02 22:44:58 crc kubenswrapper[4696]: I1202 22:44:58.269415 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65e8a28b-6bd6-4100-a3e7-80faf9aaeeef-config-volume" (OuterVolumeSpecName: "config-volume") pod "65e8a28b-6bd6-4100-a3e7-80faf9aaeeef" (UID: "65e8a28b-6bd6-4100-a3e7-80faf9aaeeef"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:44:58 crc kubenswrapper[4696]: E1202 22:44:58.270020 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-02 22:44:58.769998307 +0000 UTC m=+161.650678308 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rjkp2" (UID: "24488e8a-3522-4214-ab83-684d76eb1501") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:58 crc kubenswrapper[4696]: I1202 22:44:58.286504 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65e8a28b-6bd6-4100-a3e7-80faf9aaeeef-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "65e8a28b-6bd6-4100-a3e7-80faf9aaeeef" (UID: "65e8a28b-6bd6-4100-a3e7-80faf9aaeeef"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:44:58 crc kubenswrapper[4696]: I1202 22:44:58.286600 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65e8a28b-6bd6-4100-a3e7-80faf9aaeeef-kube-api-access-ks74t" (OuterVolumeSpecName: "kube-api-access-ks74t") pod "65e8a28b-6bd6-4100-a3e7-80faf9aaeeef" (UID: "65e8a28b-6bd6-4100-a3e7-80faf9aaeeef"). InnerVolumeSpecName "kube-api-access-ks74t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:44:58 crc kubenswrapper[4696]: I1202 22:44:58.368270 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 22:44:58 crc kubenswrapper[4696]: E1202 22:44:58.368403 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 22:44:58.868384137 +0000 UTC m=+161.749064138 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:58 crc kubenswrapper[4696]: I1202 22:44:58.368459 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3d275d4f-c887-4a23-a84e-e1d7305a5ba0-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"3d275d4f-c887-4a23-a84e-e1d7305a5ba0\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 02 22:44:58 crc kubenswrapper[4696]: I1202 22:44:58.368489 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") 
pod \"image-registry-697d97f7c8-rjkp2\" (UID: \"24488e8a-3522-4214-ab83-684d76eb1501\") " pod="openshift-image-registry/image-registry-697d97f7c8-rjkp2" Dec 02 22:44:58 crc kubenswrapper[4696]: I1202 22:44:58.368514 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3d275d4f-c887-4a23-a84e-e1d7305a5ba0-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"3d275d4f-c887-4a23-a84e-e1d7305a5ba0\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 02 22:44:58 crc kubenswrapper[4696]: I1202 22:44:58.368552 4696 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/65e8a28b-6bd6-4100-a3e7-80faf9aaeeef-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 02 22:44:58 crc kubenswrapper[4696]: I1202 22:44:58.368564 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ks74t\" (UniqueName: \"kubernetes.io/projected/65e8a28b-6bd6-4100-a3e7-80faf9aaeeef-kube-api-access-ks74t\") on node \"crc\" DevicePath \"\"" Dec 02 22:44:58 crc kubenswrapper[4696]: I1202 22:44:58.368572 4696 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/65e8a28b-6bd6-4100-a3e7-80faf9aaeeef-config-volume\") on node \"crc\" DevicePath \"\"" Dec 02 22:44:58 crc kubenswrapper[4696]: I1202 22:44:58.368600 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3d275d4f-c887-4a23-a84e-e1d7305a5ba0-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"3d275d4f-c887-4a23-a84e-e1d7305a5ba0\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 02 22:44:58 crc kubenswrapper[4696]: E1202 22:44:58.368982 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-02 22:44:58.868960964 +0000 UTC m=+161.749640965 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rjkp2" (UID: "24488e8a-3522-4214-ab83-684d76eb1501") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:58 crc kubenswrapper[4696]: I1202 22:44:58.387597 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3d275d4f-c887-4a23-a84e-e1d7305a5ba0-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"3d275d4f-c887-4a23-a84e-e1d7305a5ba0\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 02 22:44:58 crc kubenswrapper[4696]: I1202 22:44:58.469502 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 22:44:58 crc kubenswrapper[4696]: E1202 22:44:58.469785 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 22:44:58.969719554 +0000 UTC m=+161.850399555 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:58 crc kubenswrapper[4696]: I1202 22:44:58.469970 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rjkp2\" (UID: \"24488e8a-3522-4214-ab83-684d76eb1501\") " pod="openshift-image-registry/image-registry-697d97f7c8-rjkp2" Dec 02 22:44:58 crc kubenswrapper[4696]: E1202 22:44:58.470592 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 22:44:58.97058481 +0000 UTC m=+161.851264811 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rjkp2" (UID: "24488e8a-3522-4214-ab83-684d76eb1501") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:58 crc kubenswrapper[4696]: I1202 22:44:58.576429 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 22:44:58 crc kubenswrapper[4696]: E1202 22:44:58.576656 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 22:44:59.076619716 +0000 UTC m=+161.957299717 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:58 crc kubenswrapper[4696]: I1202 22:44:58.577350 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rjkp2\" (UID: \"24488e8a-3522-4214-ab83-684d76eb1501\") " pod="openshift-image-registry/image-registry-697d97f7c8-rjkp2" Dec 02 22:44:58 crc kubenswrapper[4696]: E1202 22:44:58.577848 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 22:44:59.077829902 +0000 UTC m=+161.958509903 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rjkp2" (UID: "24488e8a-3522-4214-ab83-684d76eb1501") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:58 crc kubenswrapper[4696]: I1202 22:44:58.589109 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 02 22:44:58 crc kubenswrapper[4696]: I1202 22:44:58.605109 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8jbfs"] Dec 02 22:44:58 crc kubenswrapper[4696]: E1202 22:44:58.609561 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65e8a28b-6bd6-4100-a3e7-80faf9aaeeef" containerName="collect-profiles" Dec 02 22:44:58 crc kubenswrapper[4696]: I1202 22:44:58.609586 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="65e8a28b-6bd6-4100-a3e7-80faf9aaeeef" containerName="collect-profiles" Dec 02 22:44:58 crc kubenswrapper[4696]: I1202 22:44:58.620401 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="65e8a28b-6bd6-4100-a3e7-80faf9aaeeef" containerName="collect-profiles" Dec 02 22:44:58 crc kubenswrapper[4696]: I1202 22:44:58.621994 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8jbfs" Dec 02 22:44:58 crc kubenswrapper[4696]: I1202 22:44:58.625016 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 02 22:44:58 crc kubenswrapper[4696]: I1202 22:44:58.681354 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 22:44:58 crc kubenswrapper[4696]: E1202 22:44:58.681855 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-02 22:44:59.181835038 +0000 UTC m=+162.062515039 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:58 crc kubenswrapper[4696]: I1202 22:44:58.682051 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8jbfs"] Dec 02 22:44:58 crc kubenswrapper[4696]: I1202 22:44:58.766930 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-9rp8k"] Dec 02 22:44:58 crc kubenswrapper[4696]: I1202 22:44:58.777098 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9rp8k" Dec 02 22:44:58 crc kubenswrapper[4696]: I1202 22:44:58.783331 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 02 22:44:58 crc kubenswrapper[4696]: I1202 22:44:58.784466 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fckxn\" (UniqueName: \"kubernetes.io/projected/aeaa03bb-5cf7-4a89-b182-36358f9e247c-kube-api-access-fckxn\") pod \"community-operators-8jbfs\" (UID: \"aeaa03bb-5cf7-4a89-b182-36358f9e247c\") " pod="openshift-marketplace/community-operators-8jbfs" Dec 02 22:44:58 crc kubenswrapper[4696]: I1202 22:44:58.784501 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca4122e2-7532-4ca9-a111-9097cfae1dde-catalog-content\") pod \"certified-operators-9rp8k\" (UID: \"ca4122e2-7532-4ca9-a111-9097cfae1dde\") " pod="openshift-marketplace/certified-operators-9rp8k" Dec 02 22:44:58 crc kubenswrapper[4696]: I1202 22:44:58.784524 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aeaa03bb-5cf7-4a89-b182-36358f9e247c-catalog-content\") pod \"community-operators-8jbfs\" (UID: \"aeaa03bb-5cf7-4a89-b182-36358f9e247c\") " pod="openshift-marketplace/community-operators-8jbfs" Dec 02 22:44:58 crc kubenswrapper[4696]: I1202 22:44:58.784566 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rjkp2\" (UID: \"24488e8a-3522-4214-ab83-684d76eb1501\") " pod="openshift-image-registry/image-registry-697d97f7c8-rjkp2" Dec 02 22:44:58 
crc kubenswrapper[4696]: I1202 22:44:58.784585 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9rp8k"] Dec 02 22:44:58 crc kubenswrapper[4696]: I1202 22:44:58.784600 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7kzxv\" (UniqueName: \"kubernetes.io/projected/ca4122e2-7532-4ca9-a111-9097cfae1dde-kube-api-access-7kzxv\") pod \"certified-operators-9rp8k\" (UID: \"ca4122e2-7532-4ca9-a111-9097cfae1dde\") " pod="openshift-marketplace/certified-operators-9rp8k" Dec 02 22:44:58 crc kubenswrapper[4696]: I1202 22:44:58.784764 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aeaa03bb-5cf7-4a89-b182-36358f9e247c-utilities\") pod \"community-operators-8jbfs\" (UID: \"aeaa03bb-5cf7-4a89-b182-36358f9e247c\") " pod="openshift-marketplace/community-operators-8jbfs" Dec 02 22:44:58 crc kubenswrapper[4696]: I1202 22:44:58.784800 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca4122e2-7532-4ca9-a111-9097cfae1dde-utilities\") pod \"certified-operators-9rp8k\" (UID: \"ca4122e2-7532-4ca9-a111-9097cfae1dde\") " pod="openshift-marketplace/certified-operators-9rp8k" Dec 02 22:44:58 crc kubenswrapper[4696]: E1202 22:44:58.784900 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 22:44:59.284887916 +0000 UTC m=+162.165567917 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rjkp2" (UID: "24488e8a-3522-4214-ab83-684d76eb1501") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:58 crc kubenswrapper[4696]: I1202 22:44:58.887304 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 22:44:58 crc kubenswrapper[4696]: E1202 22:44:58.887411 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 22:44:59.387390147 +0000 UTC m=+162.268070148 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:58 crc kubenswrapper[4696]: I1202 22:44:58.887728 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aeaa03bb-5cf7-4a89-b182-36358f9e247c-utilities\") pod \"community-operators-8jbfs\" (UID: \"aeaa03bb-5cf7-4a89-b182-36358f9e247c\") " pod="openshift-marketplace/community-operators-8jbfs" Dec 02 22:44:58 crc kubenswrapper[4696]: I1202 22:44:58.887791 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca4122e2-7532-4ca9-a111-9097cfae1dde-utilities\") pod \"certified-operators-9rp8k\" (UID: \"ca4122e2-7532-4ca9-a111-9097cfae1dde\") " pod="openshift-marketplace/certified-operators-9rp8k" Dec 02 22:44:58 crc kubenswrapper[4696]: I1202 22:44:58.887823 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fckxn\" (UniqueName: \"kubernetes.io/projected/aeaa03bb-5cf7-4a89-b182-36358f9e247c-kube-api-access-fckxn\") pod \"community-operators-8jbfs\" (UID: \"aeaa03bb-5cf7-4a89-b182-36358f9e247c\") " pod="openshift-marketplace/community-operators-8jbfs" Dec 02 22:44:58 crc kubenswrapper[4696]: I1202 22:44:58.887849 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca4122e2-7532-4ca9-a111-9097cfae1dde-catalog-content\") pod \"certified-operators-9rp8k\" (UID: \"ca4122e2-7532-4ca9-a111-9097cfae1dde\") " 
pod="openshift-marketplace/certified-operators-9rp8k" Dec 02 22:44:58 crc kubenswrapper[4696]: I1202 22:44:58.887870 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aeaa03bb-5cf7-4a89-b182-36358f9e247c-catalog-content\") pod \"community-operators-8jbfs\" (UID: \"aeaa03bb-5cf7-4a89-b182-36358f9e247c\") " pod="openshift-marketplace/community-operators-8jbfs" Dec 02 22:44:58 crc kubenswrapper[4696]: I1202 22:44:58.887922 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rjkp2\" (UID: \"24488e8a-3522-4214-ab83-684d76eb1501\") " pod="openshift-image-registry/image-registry-697d97f7c8-rjkp2" Dec 02 22:44:58 crc kubenswrapper[4696]: I1202 22:44:58.887965 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7kzxv\" (UniqueName: \"kubernetes.io/projected/ca4122e2-7532-4ca9-a111-9097cfae1dde-kube-api-access-7kzxv\") pod \"certified-operators-9rp8k\" (UID: \"ca4122e2-7532-4ca9-a111-9097cfae1dde\") " pod="openshift-marketplace/certified-operators-9rp8k" Dec 02 22:44:58 crc kubenswrapper[4696]: I1202 22:44:58.889263 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aeaa03bb-5cf7-4a89-b182-36358f9e247c-utilities\") pod \"community-operators-8jbfs\" (UID: \"aeaa03bb-5cf7-4a89-b182-36358f9e247c\") " pod="openshift-marketplace/community-operators-8jbfs" Dec 02 22:44:58 crc kubenswrapper[4696]: I1202 22:44:58.893895 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aeaa03bb-5cf7-4a89-b182-36358f9e247c-catalog-content\") pod \"community-operators-8jbfs\" (UID: 
\"aeaa03bb-5cf7-4a89-b182-36358f9e247c\") " pod="openshift-marketplace/community-operators-8jbfs" Dec 02 22:44:58 crc kubenswrapper[4696]: I1202 22:44:58.894077 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca4122e2-7532-4ca9-a111-9097cfae1dde-utilities\") pod \"certified-operators-9rp8k\" (UID: \"ca4122e2-7532-4ca9-a111-9097cfae1dde\") " pod="openshift-marketplace/certified-operators-9rp8k" Dec 02 22:44:58 crc kubenswrapper[4696]: E1202 22:44:58.894890 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 22:44:59.394869309 +0000 UTC m=+162.275549310 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rjkp2" (UID: "24488e8a-3522-4214-ab83-684d76eb1501") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:58 crc kubenswrapper[4696]: I1202 22:44:58.897156 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca4122e2-7532-4ca9-a111-9097cfae1dde-catalog-content\") pod \"certified-operators-9rp8k\" (UID: \"ca4122e2-7532-4ca9-a111-9097cfae1dde\") " pod="openshift-marketplace/certified-operators-9rp8k" Dec 02 22:44:58 crc kubenswrapper[4696]: I1202 22:44:58.919395 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fckxn\" (UniqueName: \"kubernetes.io/projected/aeaa03bb-5cf7-4a89-b182-36358f9e247c-kube-api-access-fckxn\") pod \"community-operators-8jbfs\" (UID: 
\"aeaa03bb-5cf7-4a89-b182-36358f9e247c\") " pod="openshift-marketplace/community-operators-8jbfs" Dec 02 22:44:58 crc kubenswrapper[4696]: I1202 22:44:58.927002 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7kzxv\" (UniqueName: \"kubernetes.io/projected/ca4122e2-7532-4ca9-a111-9097cfae1dde-kube-api-access-7kzxv\") pod \"certified-operators-9rp8k\" (UID: \"ca4122e2-7532-4ca9-a111-9097cfae1dde\") " pod="openshift-marketplace/certified-operators-9rp8k" Dec 02 22:44:58 crc kubenswrapper[4696]: I1202 22:44:58.935979 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-2jhb9" event={"ID":"e1e19685-0a6c-47e3-b4d6-b98aa6ca63af","Type":"ContainerStarted","Data":"0bd24c3280918d5c1243939430b1f085a6148f89b6a494191917992c1d6598a0"} Dec 02 22:44:58 crc kubenswrapper[4696]: I1202 22:44:58.959183 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qghph" event={"ID":"ed9ebc4a-9236-49f7-b365-79a6890e2bc8","Type":"ContainerStarted","Data":"bf148412a83017cfdd5922c19727159fd4f36c0a5402cb587bb202e7e12bab8c"} Dec 02 22:44:58 crc kubenswrapper[4696]: I1202 22:44:58.965012 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-42w2d"] Dec 02 22:44:58 crc kubenswrapper[4696]: I1202 22:44:58.969557 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-42w2d" Dec 02 22:44:58 crc kubenswrapper[4696]: I1202 22:44:58.972680 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-5rtdf" event={"ID":"4feef0d1-3fad-4990-8d96-65beb52e89b3","Type":"ContainerStarted","Data":"72b79410ce99cb8494c1e44693277b8b9483b5e986e4fb505a5319cfe1297b47"} Dec 02 22:44:58 crc kubenswrapper[4696]: I1202 22:44:58.985280 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411910-cc46z" Dec 02 22:44:58 crc kubenswrapper[4696]: I1202 22:44:58.985334 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411910-cc46z" event={"ID":"65e8a28b-6bd6-4100-a3e7-80faf9aaeeef","Type":"ContainerDied","Data":"4db2e2ba61c8c65f4f97bea26ce7f3ad77410f8ce62d15102f36ac9fe367f5e0"} Dec 02 22:44:58 crc kubenswrapper[4696]: I1202 22:44:58.985381 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4db2e2ba61c8c65f4f97bea26ce7f3ad77410f8ce62d15102f36ac9fe367f5e0" Dec 02 22:44:58 crc kubenswrapper[4696]: I1202 22:44:58.986005 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8jbfs" Dec 02 22:44:58 crc kubenswrapper[4696]: I1202 22:44:58.986485 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-4jgzb" Dec 02 22:44:58 crc kubenswrapper[4696]: I1202 22:44:58.988604 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 22:44:58 crc kubenswrapper[4696]: E1202 22:44:58.988950 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 22:44:59.488931441 +0000 UTC m=+162.369611442 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:58 crc kubenswrapper[4696]: I1202 22:44:58.989412 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-42w2d"] Dec 02 22:44:58 crc kubenswrapper[4696]: I1202 22:44:58.991172 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qghph" podStartSLOduration=138.991145426 podStartE2EDuration="2m18.991145426s" podCreationTimestamp="2025-12-02 22:42:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 22:44:58.979947425 +0000 UTC m=+161.860627426" watchObservedRunningTime="2025-12-02 22:44:58.991145426 +0000 UTC m=+161.871825427" Dec 02 22:44:59 crc kubenswrapper[4696]: I1202 22:44:59.018814 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-5rtdf" podStartSLOduration=139.018768073 podStartE2EDuration="2m19.018768073s" podCreationTimestamp="2025-12-02 22:42:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 22:44:59.014132376 +0000 UTC m=+161.894812377" watchObservedRunningTime="2025-12-02 22:44:59.018768073 +0000 UTC m=+161.899448074" Dec 02 22:44:59 crc kubenswrapper[4696]: I1202 22:44:59.048837 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 02 22:44:59 crc kubenswrapper[4696]: I1202 22:44:59.090729 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4eb44562-c608-428a-8812-428de40cbcde-catalog-content\") pod \"community-operators-42w2d\" (UID: \"4eb44562-c608-428a-8812-428de40cbcde\") " pod="openshift-marketplace/community-operators-42w2d" Dec 02 22:44:59 crc kubenswrapper[4696]: I1202 22:44:59.098844 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmvpk\" (UniqueName: \"kubernetes.io/projected/4eb44562-c608-428a-8812-428de40cbcde-kube-api-access-fmvpk\") pod \"community-operators-42w2d\" (UID: \"4eb44562-c608-428a-8812-428de40cbcde\") " pod="openshift-marketplace/community-operators-42w2d" Dec 02 22:44:59 crc kubenswrapper[4696]: I1202 22:44:59.099118 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rjkp2\" (UID: \"24488e8a-3522-4214-ab83-684d76eb1501\") " pod="openshift-image-registry/image-registry-697d97f7c8-rjkp2" Dec 02 22:44:59 crc kubenswrapper[4696]: I1202 22:44:59.099327 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4eb44562-c608-428a-8812-428de40cbcde-utilities\") pod \"community-operators-42w2d\" (UID: \"4eb44562-c608-428a-8812-428de40cbcde\") " pod="openshift-marketplace/community-operators-42w2d" Dec 02 22:44:59 crc kubenswrapper[4696]: E1202 22:44:59.100765 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: 
nodeName:}" failed. No retries permitted until 2025-12-02 22:44:59.600723067 +0000 UTC m=+162.481403068 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rjkp2" (UID: "24488e8a-3522-4214-ab83-684d76eb1501") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:59 crc kubenswrapper[4696]: I1202 22:44:59.104549 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jtktn" podStartSLOduration=139.10452845 podStartE2EDuration="2m19.10452845s" podCreationTimestamp="2025-12-02 22:42:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 22:44:59.099168901 +0000 UTC m=+161.979848902" watchObservedRunningTime="2025-12-02 22:44:59.10452845 +0000 UTC m=+161.985208451" Dec 02 22:44:59 crc kubenswrapper[4696]: I1202 22:44:59.109220 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9rp8k" Dec 02 22:44:59 crc kubenswrapper[4696]: I1202 22:44:59.135488 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-q5df2" podStartSLOduration=140.131334192 podStartE2EDuration="2m20.131334192s" podCreationTimestamp="2025-12-02 22:42:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 22:44:59.129329253 +0000 UTC m=+162.010009254" watchObservedRunningTime="2025-12-02 22:44:59.131334192 +0000 UTC m=+162.012014193" Dec 02 22:44:59 crc kubenswrapper[4696]: I1202 22:44:59.155953 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-984hk" podStartSLOduration=139.15593144 podStartE2EDuration="2m19.15593144s" podCreationTimestamp="2025-12-02 22:42:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 22:44:59.152122897 +0000 UTC m=+162.032802898" watchObservedRunningTime="2025-12-02 22:44:59.15593144 +0000 UTC m=+162.036611441" Dec 02 22:44:59 crc kubenswrapper[4696]: I1202 22:44:59.169626 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-xhnsb"] Dec 02 22:44:59 crc kubenswrapper[4696]: I1202 22:44:59.171367 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xhnsb" Dec 02 22:44:59 crc kubenswrapper[4696]: I1202 22:44:59.182604 4696 patch_prober.go:28] interesting pod/router-default-5444994796-qpwvl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 02 22:44:59 crc kubenswrapper[4696]: [-]has-synced failed: reason withheld Dec 02 22:44:59 crc kubenswrapper[4696]: [+]process-running ok Dec 02 22:44:59 crc kubenswrapper[4696]: healthz check failed Dec 02 22:44:59 crc kubenswrapper[4696]: I1202 22:44:59.182701 4696 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qpwvl" podUID="fb0f1308-0862-4ac2-b741-849ae33c2776" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 02 22:44:59 crc kubenswrapper[4696]: I1202 22:44:59.183633 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-8fxbs" Dec 02 22:44:59 crc kubenswrapper[4696]: I1202 22:44:59.200542 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 22:44:59 crc kubenswrapper[4696]: E1202 22:44:59.200993 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 22:44:59.700963062 +0000 UTC m=+162.581643063 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:59 crc kubenswrapper[4696]: I1202 22:44:59.201092 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4eb44562-c608-428a-8812-428de40cbcde-catalog-content\") pod \"community-operators-42w2d\" (UID: \"4eb44562-c608-428a-8812-428de40cbcde\") " pod="openshift-marketplace/community-operators-42w2d" Dec 02 22:44:59 crc kubenswrapper[4696]: I1202 22:44:59.201129 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmvpk\" (UniqueName: \"kubernetes.io/projected/4eb44562-c608-428a-8812-428de40cbcde-kube-api-access-fmvpk\") pod \"community-operators-42w2d\" (UID: \"4eb44562-c608-428a-8812-428de40cbcde\") " pod="openshift-marketplace/community-operators-42w2d" Dec 02 22:44:59 crc kubenswrapper[4696]: I1202 22:44:59.201191 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rjkp2\" (UID: \"24488e8a-3522-4214-ab83-684d76eb1501\") " pod="openshift-image-registry/image-registry-697d97f7c8-rjkp2" Dec 02 22:44:59 crc kubenswrapper[4696]: I1202 22:44:59.201301 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4eb44562-c608-428a-8812-428de40cbcde-utilities\") pod \"community-operators-42w2d\" (UID: 
\"4eb44562-c608-428a-8812-428de40cbcde\") " pod="openshift-marketplace/community-operators-42w2d" Dec 02 22:44:59 crc kubenswrapper[4696]: I1202 22:44:59.201404 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b259842-7ad5-413d-a13c-73c0931b6527-catalog-content\") pod \"certified-operators-xhnsb\" (UID: \"4b259842-7ad5-413d-a13c-73c0931b6527\") " pod="openshift-marketplace/certified-operators-xhnsb" Dec 02 22:44:59 crc kubenswrapper[4696]: I1202 22:44:59.201429 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b259842-7ad5-413d-a13c-73c0931b6527-utilities\") pod \"certified-operators-xhnsb\" (UID: \"4b259842-7ad5-413d-a13c-73c0931b6527\") " pod="openshift-marketplace/certified-operators-xhnsb" Dec 02 22:44:59 crc kubenswrapper[4696]: I1202 22:44:59.201505 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wtht\" (UniqueName: \"kubernetes.io/projected/4b259842-7ad5-413d-a13c-73c0931b6527-kube-api-access-4wtht\") pod \"certified-operators-xhnsb\" (UID: \"4b259842-7ad5-413d-a13c-73c0931b6527\") " pod="openshift-marketplace/certified-operators-xhnsb" Dec 02 22:44:59 crc kubenswrapper[4696]: I1202 22:44:59.201767 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4eb44562-c608-428a-8812-428de40cbcde-catalog-content\") pod \"community-operators-42w2d\" (UID: \"4eb44562-c608-428a-8812-428de40cbcde\") " pod="openshift-marketplace/community-operators-42w2d" Dec 02 22:44:59 crc kubenswrapper[4696]: I1202 22:44:59.203480 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4eb44562-c608-428a-8812-428de40cbcde-utilities\") pod 
\"community-operators-42w2d\" (UID: \"4eb44562-c608-428a-8812-428de40cbcde\") " pod="openshift-marketplace/community-operators-42w2d" Dec 02 22:44:59 crc kubenswrapper[4696]: E1202 22:44:59.204036 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 22:44:59.704023682 +0000 UTC m=+162.584703873 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rjkp2" (UID: "24488e8a-3522-4214-ab83-684d76eb1501") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:59 crc kubenswrapper[4696]: I1202 22:44:59.205656 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-8fxbs" Dec 02 22:44:59 crc kubenswrapper[4696]: I1202 22:44:59.215797 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xhnsb"] Dec 02 22:44:59 crc kubenswrapper[4696]: I1202 22:44:59.217443 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kgqqs" podStartSLOduration=139.217422549 podStartE2EDuration="2m19.217422549s" podCreationTimestamp="2025-12-02 22:42:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 22:44:59.185271778 +0000 UTC m=+162.065951779" watchObservedRunningTime="2025-12-02 22:44:59.217422549 +0000 UTC m=+162.098102550" Dec 02 22:44:59 crc kubenswrapper[4696]: I1202 
22:44:59.229162 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmvpk\" (UniqueName: \"kubernetes.io/projected/4eb44562-c608-428a-8812-428de40cbcde-kube-api-access-fmvpk\") pod \"community-operators-42w2d\" (UID: \"4eb44562-c608-428a-8812-428de40cbcde\") " pod="openshift-marketplace/community-operators-42w2d" Dec 02 22:44:59 crc kubenswrapper[4696]: I1202 22:44:59.264210 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-4jgzb" podStartSLOduration=11.264186682 podStartE2EDuration="11.264186682s" podCreationTimestamp="2025-12-02 22:44:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 22:44:59.220723246 +0000 UTC m=+162.101403247" watchObservedRunningTime="2025-12-02 22:44:59.264186682 +0000 UTC m=+162.144866683" Dec 02 22:44:59 crc kubenswrapper[4696]: I1202 22:44:59.303308 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 22:44:59 crc kubenswrapper[4696]: I1202 22:44:59.303506 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b259842-7ad5-413d-a13c-73c0931b6527-catalog-content\") pod \"certified-operators-xhnsb\" (UID: \"4b259842-7ad5-413d-a13c-73c0931b6527\") " pod="openshift-marketplace/certified-operators-xhnsb" Dec 02 22:44:59 crc kubenswrapper[4696]: I1202 22:44:59.303527 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b259842-7ad5-413d-a13c-73c0931b6527-utilities\") pod \"certified-operators-xhnsb\" (UID: 
\"4b259842-7ad5-413d-a13c-73c0931b6527\") " pod="openshift-marketplace/certified-operators-xhnsb" Dec 02 22:44:59 crc kubenswrapper[4696]: I1202 22:44:59.303562 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wtht\" (UniqueName: \"kubernetes.io/projected/4b259842-7ad5-413d-a13c-73c0931b6527-kube-api-access-4wtht\") pod \"certified-operators-xhnsb\" (UID: \"4b259842-7ad5-413d-a13c-73c0931b6527\") " pod="openshift-marketplace/certified-operators-xhnsb" Dec 02 22:44:59 crc kubenswrapper[4696]: E1202 22:44:59.304953 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 22:44:59.804725541 +0000 UTC m=+162.685405542 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:59 crc kubenswrapper[4696]: I1202 22:44:59.305399 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b259842-7ad5-413d-a13c-73c0931b6527-catalog-content\") pod \"certified-operators-xhnsb\" (UID: \"4b259842-7ad5-413d-a13c-73c0931b6527\") " pod="openshift-marketplace/certified-operators-xhnsb" Dec 02 22:44:59 crc kubenswrapper[4696]: I1202 22:44:59.305626 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b259842-7ad5-413d-a13c-73c0931b6527-utilities\") pod 
\"certified-operators-xhnsb\" (UID: \"4b259842-7ad5-413d-a13c-73c0931b6527\") " pod="openshift-marketplace/certified-operators-xhnsb" Dec 02 22:44:59 crc kubenswrapper[4696]: I1202 22:44:59.315718 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-42w2d" Dec 02 22:44:59 crc kubenswrapper[4696]: I1202 22:44:59.351138 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wtht\" (UniqueName: \"kubernetes.io/projected/4b259842-7ad5-413d-a13c-73c0931b6527-kube-api-access-4wtht\") pod \"certified-operators-xhnsb\" (UID: \"4b259842-7ad5-413d-a13c-73c0931b6527\") " pod="openshift-marketplace/certified-operators-xhnsb" Dec 02 22:44:59 crc kubenswrapper[4696]: I1202 22:44:59.395918 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8jbfs"] Dec 02 22:44:59 crc kubenswrapper[4696]: I1202 22:44:59.405192 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rjkp2\" (UID: \"24488e8a-3522-4214-ab83-684d76eb1501\") " pod="openshift-image-registry/image-registry-697d97f7c8-rjkp2" Dec 02 22:44:59 crc kubenswrapper[4696]: E1202 22:44:59.405632 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 22:44:59.905618685 +0000 UTC m=+162.786298676 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rjkp2" (UID: "24488e8a-3522-4214-ab83-684d76eb1501") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:59 crc kubenswrapper[4696]: I1202 22:44:59.442897 4696 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Dec 02 22:44:59 crc kubenswrapper[4696]: I1202 22:44:59.478524 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9rp8k"] Dec 02 22:44:59 crc kubenswrapper[4696]: I1202 22:44:59.502118 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xhnsb" Dec 02 22:44:59 crc kubenswrapper[4696]: I1202 22:44:59.505769 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 22:44:59 crc kubenswrapper[4696]: E1202 22:44:59.506621 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 22:45:00.006593621 +0000 UTC m=+162.887273622 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:59 crc kubenswrapper[4696]: I1202 22:44:59.608331 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rjkp2\" (UID: \"24488e8a-3522-4214-ab83-684d76eb1501\") " pod="openshift-image-registry/image-registry-697d97f7c8-rjkp2" Dec 02 22:44:59 crc kubenswrapper[4696]: E1202 22:44:59.608844 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 22:45:00.108822235 +0000 UTC m=+162.989502236 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rjkp2" (UID: "24488e8a-3522-4214-ab83-684d76eb1501") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:59 crc kubenswrapper[4696]: I1202 22:44:59.676172 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-42w2d"] Dec 02 22:44:59 crc kubenswrapper[4696]: I1202 22:44:59.711721 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 22:44:59 crc kubenswrapper[4696]: E1202 22:44:59.711981 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 22:45:00.211951665 +0000 UTC m=+163.092631666 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:59 crc kubenswrapper[4696]: I1202 22:44:59.712138 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rjkp2\" (UID: \"24488e8a-3522-4214-ab83-684d76eb1501\") " pod="openshift-image-registry/image-registry-697d97f7c8-rjkp2" Dec 02 22:44:59 crc kubenswrapper[4696]: E1202 22:44:59.712540 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 22:45:00.212528912 +0000 UTC m=+163.093208913 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rjkp2" (UID: "24488e8a-3522-4214-ab83-684d76eb1501") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:44:59 crc kubenswrapper[4696]: I1202 22:44:59.766987 4696 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-12-02T22:44:59.442919548Z","Handler":null,"Name":""} Dec 02 22:44:59 crc kubenswrapper[4696]: I1202 22:44:59.798036 4696 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Dec 02 22:44:59 crc kubenswrapper[4696]: I1202 22:44:59.798082 4696 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Dec 02 22:44:59 crc kubenswrapper[4696]: I1202 22:44:59.812806 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 22:44:59 crc kubenswrapper[4696]: I1202 22:44:59.817503 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: 
"8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 02 22:44:59 crc kubenswrapper[4696]: I1202 22:44:59.853924 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xhnsb"] Dec 02 22:44:59 crc kubenswrapper[4696]: I1202 22:44:59.917028 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rjkp2\" (UID: \"24488e8a-3522-4214-ab83-684d76eb1501\") " pod="openshift-image-registry/image-registry-697d97f7c8-rjkp2" Dec 02 22:44:59 crc kubenswrapper[4696]: I1202 22:44:59.921069 4696 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 02 22:44:59 crc kubenswrapper[4696]: I1202 22:44:59.921124 4696 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rjkp2\" (UID: \"24488e8a-3522-4214-ab83-684d76eb1501\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-rjkp2" Dec 02 22:44:59 crc kubenswrapper[4696]: I1202 22:44:59.955166 4696 patch_prober.go:28] interesting pod/downloads-7954f5f757-jhcjw container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Dec 02 22:44:59 crc kubenswrapper[4696]: I1202 22:44:59.955234 4696 
prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-jhcjw" podUID="3f45ac7c-8865-4924-8dbd-5826a21d028e" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Dec 02 22:44:59 crc kubenswrapper[4696]: I1202 22:44:59.955285 4696 patch_prober.go:28] interesting pod/downloads-7954f5f757-jhcjw container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Dec 02 22:44:59 crc kubenswrapper[4696]: I1202 22:44:59.955358 4696 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-jhcjw" podUID="3f45ac7c-8865-4924-8dbd-5826a21d028e" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Dec 02 22:44:59 crc kubenswrapper[4696]: I1202 22:44:59.959562 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rjkp2\" (UID: \"24488e8a-3522-4214-ab83-684d76eb1501\") " pod="openshift-image-registry/image-registry-697d97f7c8-rjkp2" Dec 02 22:45:00 crc kubenswrapper[4696]: I1202 22:45:00.040783 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-2jhb9" event={"ID":"e1e19685-0a6c-47e3-b4d6-b98aa6ca63af","Type":"ContainerStarted","Data":"0c652d3a5535bfd02f4bdb2d816bcf8fef65b388e03e2d44de81c8260f7e7da8"} Dec 02 22:45:00 crc kubenswrapper[4696]: I1202 22:45:00.042690 4696 generic.go:334] "Generic (PLEG): container finished" podID="aeaa03bb-5cf7-4a89-b182-36358f9e247c" 
containerID="dd8851fd25340e2c9f6e441a8a55782e327c78628d5b76c527c76f1f04f2ee77" exitCode=0 Dec 02 22:45:00 crc kubenswrapper[4696]: I1202 22:45:00.042771 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8jbfs" event={"ID":"aeaa03bb-5cf7-4a89-b182-36358f9e247c","Type":"ContainerDied","Data":"dd8851fd25340e2c9f6e441a8a55782e327c78628d5b76c527c76f1f04f2ee77"} Dec 02 22:45:00 crc kubenswrapper[4696]: I1202 22:45:00.042807 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8jbfs" event={"ID":"aeaa03bb-5cf7-4a89-b182-36358f9e247c","Type":"ContainerStarted","Data":"d6bc612d5ace68919cac0c789e069bdeef5ad9824d50aa2d42f68d88b202d6b5"} Dec 02 22:45:00 crc kubenswrapper[4696]: I1202 22:45:00.043917 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9rp8k" event={"ID":"ca4122e2-7532-4ca9-a111-9097cfae1dde","Type":"ContainerStarted","Data":"4d004fcb5320aeceac48622d766b6dc2fd6375afd63fccaa41a693642ec8c5f6"} Dec 02 22:45:00 crc kubenswrapper[4696]: I1202 22:45:00.045311 4696 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 02 22:45:00 crc kubenswrapper[4696]: I1202 22:45:00.045577 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xhnsb" event={"ID":"4b259842-7ad5-413d-a13c-73c0931b6527","Type":"ContainerStarted","Data":"cabcd8086c0440a5c5ce00d2a111e829c6bfc81d0ad64544f155f2ad9b6936e1"} Dec 02 22:45:00 crc kubenswrapper[4696]: I1202 22:45:00.047867 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"3d275d4f-c887-4a23-a84e-e1d7305a5ba0","Type":"ContainerStarted","Data":"4eeaa48e83d6b658e6c289616d2598e8ac151c22ac2dd59881bbc7cbb3d7f920"} Dec 02 22:45:00 crc kubenswrapper[4696]: I1202 22:45:00.047937 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"3d275d4f-c887-4a23-a84e-e1d7305a5ba0","Type":"ContainerStarted","Data":"1048093b38554d535b9527a474c7a13cbf37c9c5b8bf06c39d745107aa396d5a"} Dec 02 22:45:00 crc kubenswrapper[4696]: I1202 22:45:00.050781 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-42w2d" event={"ID":"4eb44562-c608-428a-8812-428de40cbcde","Type":"ContainerStarted","Data":"4e4904b82229c82ddbb633cad2d4bf3dd9db7a2e072421c491b1db4d39adc0c9"} Dec 02 22:45:00 crc kubenswrapper[4696]: I1202 22:45:00.091635 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=2.091606674 podStartE2EDuration="2.091606674s" podCreationTimestamp="2025-12-02 22:44:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 22:45:00.090653766 +0000 UTC m=+162.971333767" watchObservedRunningTime="2025-12-02 22:45:00.091606674 +0000 UTC m=+162.972286675" Dec 02 22:45:00 crc kubenswrapper[4696]: I1202 22:45:00.168402 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-rjkp2" Dec 02 22:45:00 crc kubenswrapper[4696]: I1202 22:45:00.171472 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411925-w4hb7"] Dec 02 22:45:00 crc kubenswrapper[4696]: I1202 22:45:00.172315 4696 patch_prober.go:28] interesting pod/router-default-5444994796-qpwvl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 02 22:45:00 crc kubenswrapper[4696]: [-]has-synced failed: reason withheld Dec 02 22:45:00 crc kubenswrapper[4696]: [+]process-running ok Dec 02 22:45:00 crc kubenswrapper[4696]: healthz check failed Dec 02 22:45:00 crc kubenswrapper[4696]: I1202 22:45:00.172427 4696 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qpwvl" podUID="fb0f1308-0862-4ac2-b741-849ae33c2776" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 02 22:45:00 crc kubenswrapper[4696]: I1202 22:45:00.172362 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411925-w4hb7" Dec 02 22:45:00 crc kubenswrapper[4696]: I1202 22:45:00.180953 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 02 22:45:00 crc kubenswrapper[4696]: I1202 22:45:00.191410 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 02 22:45:00 crc kubenswrapper[4696]: I1202 22:45:00.195135 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411925-w4hb7"] Dec 02 22:45:00 crc kubenswrapper[4696]: I1202 22:45:00.230700 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-f6wj6" Dec 02 22:45:00 crc kubenswrapper[4696]: I1202 22:45:00.230794 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-f6wj6" Dec 02 22:45:00 crc kubenswrapper[4696]: I1202 22:45:00.233722 4696 patch_prober.go:28] interesting pod/console-f9d7485db-f6wj6 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.30:8443/health\": dial tcp 10.217.0.30:8443: connect: connection refused" start-of-body= Dec 02 22:45:00 crc kubenswrapper[4696]: I1202 22:45:00.235925 4696 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-f6wj6" podUID="cab80860-b375-43ce-9df7-16ed59a8247a" containerName="console" probeResult="failure" output="Get \"https://10.217.0.30:8443/health\": dial tcp 10.217.0.30:8443: connect: connection refused" Dec 02 22:45:00 crc kubenswrapper[4696]: I1202 22:45:00.336940 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/60748893-12de-4ee9-9a98-d2e6117d2247-config-volume\") pod \"collect-profiles-29411925-w4hb7\" (UID: \"60748893-12de-4ee9-9a98-d2e6117d2247\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411925-w4hb7" Dec 02 22:45:00 crc kubenswrapper[4696]: I1202 22:45:00.338101 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6bmf\" (UniqueName: \"kubernetes.io/projected/60748893-12de-4ee9-9a98-d2e6117d2247-kube-api-access-g6bmf\") pod \"collect-profiles-29411925-w4hb7\" (UID: \"60748893-12de-4ee9-9a98-d2e6117d2247\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411925-w4hb7" Dec 02 22:45:00 crc kubenswrapper[4696]: I1202 22:45:00.338182 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/60748893-12de-4ee9-9a98-d2e6117d2247-secret-volume\") pod \"collect-profiles-29411925-w4hb7\" (UID: \"60748893-12de-4ee9-9a98-d2e6117d2247\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411925-w4hb7" Dec 02 22:45:00 crc kubenswrapper[4696]: I1202 22:45:00.438912 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/60748893-12de-4ee9-9a98-d2e6117d2247-secret-volume\") pod \"collect-profiles-29411925-w4hb7\" (UID: \"60748893-12de-4ee9-9a98-d2e6117d2247\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411925-w4hb7" Dec 02 22:45:00 crc kubenswrapper[4696]: I1202 22:45:00.439004 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/60748893-12de-4ee9-9a98-d2e6117d2247-config-volume\") pod \"collect-profiles-29411925-w4hb7\" (UID: \"60748893-12de-4ee9-9a98-d2e6117d2247\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411925-w4hb7" Dec 02 22:45:00 crc 
kubenswrapper[4696]: I1202 22:45:00.439069 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6bmf\" (UniqueName: \"kubernetes.io/projected/60748893-12de-4ee9-9a98-d2e6117d2247-kube-api-access-g6bmf\") pod \"collect-profiles-29411925-w4hb7\" (UID: \"60748893-12de-4ee9-9a98-d2e6117d2247\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411925-w4hb7" Dec 02 22:45:00 crc kubenswrapper[4696]: I1202 22:45:00.440128 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/60748893-12de-4ee9-9a98-d2e6117d2247-config-volume\") pod \"collect-profiles-29411925-w4hb7\" (UID: \"60748893-12de-4ee9-9a98-d2e6117d2247\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411925-w4hb7" Dec 02 22:45:00 crc kubenswrapper[4696]: I1202 22:45:00.448261 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/60748893-12de-4ee9-9a98-d2e6117d2247-secret-volume\") pod \"collect-profiles-29411925-w4hb7\" (UID: \"60748893-12de-4ee9-9a98-d2e6117d2247\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411925-w4hb7" Dec 02 22:45:00 crc kubenswrapper[4696]: I1202 22:45:00.452199 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-rjkp2"] Dec 02 22:45:00 crc kubenswrapper[4696]: I1202 22:45:00.460426 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6bmf\" (UniqueName: \"kubernetes.io/projected/60748893-12de-4ee9-9a98-d2e6117d2247-kube-api-access-g6bmf\") pod \"collect-profiles-29411925-w4hb7\" (UID: \"60748893-12de-4ee9-9a98-d2e6117d2247\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411925-w4hb7" Dec 02 22:45:00 crc kubenswrapper[4696]: W1202 22:45:00.480672 4696 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24488e8a_3522_4214_ab83_684d76eb1501.slice/crio-4cb6fb85cb0ed4269de34a8f2e7b3f1f6bc363fe145cffa22b25f8c5e0644b61 WatchSource:0}: Error finding container 4cb6fb85cb0ed4269de34a8f2e7b3f1f6bc363fe145cffa22b25f8c5e0644b61: Status 404 returned error can't find the container with id 4cb6fb85cb0ed4269de34a8f2e7b3f1f6bc363fe145cffa22b25f8c5e0644b61 Dec 02 22:45:00 crc kubenswrapper[4696]: I1202 22:45:00.535271 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411925-w4hb7" Dec 02 22:45:00 crc kubenswrapper[4696]: I1202 22:45:00.564003 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6ccj5"] Dec 02 22:45:00 crc kubenswrapper[4696]: I1202 22:45:00.566801 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6ccj5" Dec 02 22:45:00 crc kubenswrapper[4696]: I1202 22:45:00.569448 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 02 22:45:00 crc kubenswrapper[4696]: I1202 22:45:00.577253 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6ccj5"] Dec 02 22:45:00 crc kubenswrapper[4696]: I1202 22:45:00.646522 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad7e1f5a-e61a-4603-b7d6-a7baccc8c59c-utilities\") pod \"redhat-marketplace-6ccj5\" (UID: \"ad7e1f5a-e61a-4603-b7d6-a7baccc8c59c\") " pod="openshift-marketplace/redhat-marketplace-6ccj5" Dec 02 22:45:00 crc kubenswrapper[4696]: I1202 22:45:00.646574 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvxrp\" (UniqueName: 
\"kubernetes.io/projected/ad7e1f5a-e61a-4603-b7d6-a7baccc8c59c-kube-api-access-vvxrp\") pod \"redhat-marketplace-6ccj5\" (UID: \"ad7e1f5a-e61a-4603-b7d6-a7baccc8c59c\") " pod="openshift-marketplace/redhat-marketplace-6ccj5" Dec 02 22:45:00 crc kubenswrapper[4696]: I1202 22:45:00.646734 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad7e1f5a-e61a-4603-b7d6-a7baccc8c59c-catalog-content\") pod \"redhat-marketplace-6ccj5\" (UID: \"ad7e1f5a-e61a-4603-b7d6-a7baccc8c59c\") " pod="openshift-marketplace/redhat-marketplace-6ccj5" Dec 02 22:45:00 crc kubenswrapper[4696]: I1202 22:45:00.748650 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad7e1f5a-e61a-4603-b7d6-a7baccc8c59c-utilities\") pod \"redhat-marketplace-6ccj5\" (UID: \"ad7e1f5a-e61a-4603-b7d6-a7baccc8c59c\") " pod="openshift-marketplace/redhat-marketplace-6ccj5" Dec 02 22:45:00 crc kubenswrapper[4696]: I1202 22:45:00.749204 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvxrp\" (UniqueName: \"kubernetes.io/projected/ad7e1f5a-e61a-4603-b7d6-a7baccc8c59c-kube-api-access-vvxrp\") pod \"redhat-marketplace-6ccj5\" (UID: \"ad7e1f5a-e61a-4603-b7d6-a7baccc8c59c\") " pod="openshift-marketplace/redhat-marketplace-6ccj5" Dec 02 22:45:00 crc kubenswrapper[4696]: I1202 22:45:00.749231 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad7e1f5a-e61a-4603-b7d6-a7baccc8c59c-catalog-content\") pod \"redhat-marketplace-6ccj5\" (UID: \"ad7e1f5a-e61a-4603-b7d6-a7baccc8c59c\") " pod="openshift-marketplace/redhat-marketplace-6ccj5" Dec 02 22:45:00 crc kubenswrapper[4696]: I1202 22:45:00.749986 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/ad7e1f5a-e61a-4603-b7d6-a7baccc8c59c-utilities\") pod \"redhat-marketplace-6ccj5\" (UID: \"ad7e1f5a-e61a-4603-b7d6-a7baccc8c59c\") " pod="openshift-marketplace/redhat-marketplace-6ccj5" Dec 02 22:45:00 crc kubenswrapper[4696]: I1202 22:45:00.751123 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad7e1f5a-e61a-4603-b7d6-a7baccc8c59c-catalog-content\") pod \"redhat-marketplace-6ccj5\" (UID: \"ad7e1f5a-e61a-4603-b7d6-a7baccc8c59c\") " pod="openshift-marketplace/redhat-marketplace-6ccj5" Dec 02 22:45:00 crc kubenswrapper[4696]: I1202 22:45:00.793145 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvxrp\" (UniqueName: \"kubernetes.io/projected/ad7e1f5a-e61a-4603-b7d6-a7baccc8c59c-kube-api-access-vvxrp\") pod \"redhat-marketplace-6ccj5\" (UID: \"ad7e1f5a-e61a-4603-b7d6-a7baccc8c59c\") " pod="openshift-marketplace/redhat-marketplace-6ccj5" Dec 02 22:45:00 crc kubenswrapper[4696]: I1202 22:45:00.797321 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411925-w4hb7"] Dec 02 22:45:00 crc kubenswrapper[4696]: W1202 22:45:00.808513 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod60748893_12de_4ee9_9a98_d2e6117d2247.slice/crio-eafd90b721caa8bccd1ff7cd22abf7d2a753db9efeab687ecf0605c938774d08 WatchSource:0}: Error finding container eafd90b721caa8bccd1ff7cd22abf7d2a753db9efeab687ecf0605c938774d08: Status 404 returned error can't find the container with id eafd90b721caa8bccd1ff7cd22abf7d2a753db9efeab687ecf0605c938774d08 Dec 02 22:45:00 crc kubenswrapper[4696]: I1202 22:45:00.928966 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6ccj5" Dec 02 22:45:00 crc kubenswrapper[4696]: I1202 22:45:00.976246 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-rwnxp"] Dec 02 22:45:00 crc kubenswrapper[4696]: I1202 22:45:00.977629 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rwnxp" Dec 02 22:45:00 crc kubenswrapper[4696]: I1202 22:45:00.993106 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rwnxp"] Dec 02 22:45:01 crc kubenswrapper[4696]: I1202 22:45:01.052661 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjg6v\" (UniqueName: \"kubernetes.io/projected/a04370b2-4b6e-4eac-9ffe-4f2e7116b03b-kube-api-access-pjg6v\") pod \"redhat-marketplace-rwnxp\" (UID: \"a04370b2-4b6e-4eac-9ffe-4f2e7116b03b\") " pod="openshift-marketplace/redhat-marketplace-rwnxp" Dec 02 22:45:01 crc kubenswrapper[4696]: I1202 22:45:01.053231 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a04370b2-4b6e-4eac-9ffe-4f2e7116b03b-utilities\") pod \"redhat-marketplace-rwnxp\" (UID: \"a04370b2-4b6e-4eac-9ffe-4f2e7116b03b\") " pod="openshift-marketplace/redhat-marketplace-rwnxp" Dec 02 22:45:01 crc kubenswrapper[4696]: I1202 22:45:01.053272 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a04370b2-4b6e-4eac-9ffe-4f2e7116b03b-catalog-content\") pod \"redhat-marketplace-rwnxp\" (UID: \"a04370b2-4b6e-4eac-9ffe-4f2e7116b03b\") " pod="openshift-marketplace/redhat-marketplace-rwnxp" Dec 02 22:45:01 crc kubenswrapper[4696]: I1202 22:45:01.071118 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="hostpath-provisioner/csi-hostpathplugin-2jhb9" event={"ID":"e1e19685-0a6c-47e3-b4d6-b98aa6ca63af","Type":"ContainerStarted","Data":"060a4f4fb571bf2a362365c50960349756cc5aae2e23c9a9f9eeffbd6f4bdbbd"} Dec 02 22:45:01 crc kubenswrapper[4696]: I1202 22:45:01.071172 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-2jhb9" event={"ID":"e1e19685-0a6c-47e3-b4d6-b98aa6ca63af","Type":"ContainerStarted","Data":"0c355811b7adfbf9e01a25671726294097a931f48b01a611dccbaca5952b90e7"} Dec 02 22:45:01 crc kubenswrapper[4696]: I1202 22:45:01.074767 4696 generic.go:334] "Generic (PLEG): container finished" podID="ca4122e2-7532-4ca9-a111-9097cfae1dde" containerID="5dd2f7a25400fe51f3a9f7177f5e92fa558ef0e1d272aca5b69bbccc51f614ec" exitCode=0 Dec 02 22:45:01 crc kubenswrapper[4696]: I1202 22:45:01.074847 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9rp8k" event={"ID":"ca4122e2-7532-4ca9-a111-9097cfae1dde","Type":"ContainerDied","Data":"5dd2f7a25400fe51f3a9f7177f5e92fa558ef0e1d272aca5b69bbccc51f614ec"} Dec 02 22:45:01 crc kubenswrapper[4696]: I1202 22:45:01.080034 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-rjkp2" event={"ID":"24488e8a-3522-4214-ab83-684d76eb1501","Type":"ContainerStarted","Data":"0ee8494271b38f77e7784c7aeccd2db24ace23e233f86fc0a594380408c0289a"} Dec 02 22:45:01 crc kubenswrapper[4696]: I1202 22:45:01.080079 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-rjkp2" event={"ID":"24488e8a-3522-4214-ab83-684d76eb1501","Type":"ContainerStarted","Data":"4cb6fb85cb0ed4269de34a8f2e7b3f1f6bc363fe145cffa22b25f8c5e0644b61"} Dec 02 22:45:01 crc kubenswrapper[4696]: I1202 22:45:01.084885 4696 generic.go:334] "Generic (PLEG): container finished" podID="4b259842-7ad5-413d-a13c-73c0931b6527" 
containerID="1e16d8d26bb7406919209731c1caca523976645570dc7f95a1af9014a627a8e4" exitCode=0 Dec 02 22:45:01 crc kubenswrapper[4696]: I1202 22:45:01.084988 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xhnsb" event={"ID":"4b259842-7ad5-413d-a13c-73c0931b6527","Type":"ContainerDied","Data":"1e16d8d26bb7406919209731c1caca523976645570dc7f95a1af9014a627a8e4"} Dec 02 22:45:01 crc kubenswrapper[4696]: I1202 22:45:01.089574 4696 generic.go:334] "Generic (PLEG): container finished" podID="3d275d4f-c887-4a23-a84e-e1d7305a5ba0" containerID="4eeaa48e83d6b658e6c289616d2598e8ac151c22ac2dd59881bbc7cbb3d7f920" exitCode=0 Dec 02 22:45:01 crc kubenswrapper[4696]: I1202 22:45:01.089617 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"3d275d4f-c887-4a23-a84e-e1d7305a5ba0","Type":"ContainerDied","Data":"4eeaa48e83d6b658e6c289616d2598e8ac151c22ac2dd59881bbc7cbb3d7f920"} Dec 02 22:45:01 crc kubenswrapper[4696]: I1202 22:45:01.095028 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-2jhb9" podStartSLOduration=13.09501261 podStartE2EDuration="13.09501261s" podCreationTimestamp="2025-12-02 22:44:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 22:45:01.091592709 +0000 UTC m=+163.972272710" watchObservedRunningTime="2025-12-02 22:45:01.09501261 +0000 UTC m=+163.975692611" Dec 02 22:45:01 crc kubenswrapper[4696]: I1202 22:45:01.099312 4696 generic.go:334] "Generic (PLEG): container finished" podID="4eb44562-c608-428a-8812-428de40cbcde" containerID="e28fef83921bba493531f6df229961eb19d003df4d89ef0df0de8d6cc1b8a521" exitCode=0 Dec 02 22:45:01 crc kubenswrapper[4696]: I1202 22:45:01.099556 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-42w2d" 
event={"ID":"4eb44562-c608-428a-8812-428de40cbcde","Type":"ContainerDied","Data":"e28fef83921bba493531f6df229961eb19d003df4d89ef0df0de8d6cc1b8a521"} Dec 02 22:45:01 crc kubenswrapper[4696]: I1202 22:45:01.105435 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411925-w4hb7" event={"ID":"60748893-12de-4ee9-9a98-d2e6117d2247","Type":"ContainerStarted","Data":"2a19bd054b1f9dbfbbbdb68ed5d00c5dc33b522a8f762339366f107d93efaf66"} Dec 02 22:45:01 crc kubenswrapper[4696]: I1202 22:45:01.105523 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411925-w4hb7" event={"ID":"60748893-12de-4ee9-9a98-d2e6117d2247","Type":"ContainerStarted","Data":"eafd90b721caa8bccd1ff7cd22abf7d2a753db9efeab687ecf0605c938774d08"} Dec 02 22:45:01 crc kubenswrapper[4696]: I1202 22:45:01.116952 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-rjkp2" podStartSLOduration=141.116934159 podStartE2EDuration="2m21.116934159s" podCreationTimestamp="2025-12-02 22:42:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 22:45:01.115067214 +0000 UTC m=+163.995747215" watchObservedRunningTime="2025-12-02 22:45:01.116934159 +0000 UTC m=+163.997614160" Dec 02 22:45:01 crc kubenswrapper[4696]: I1202 22:45:01.154365 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjg6v\" (UniqueName: \"kubernetes.io/projected/a04370b2-4b6e-4eac-9ffe-4f2e7116b03b-kube-api-access-pjg6v\") pod \"redhat-marketplace-rwnxp\" (UID: \"a04370b2-4b6e-4eac-9ffe-4f2e7116b03b\") " pod="openshift-marketplace/redhat-marketplace-rwnxp" Dec 02 22:45:01 crc kubenswrapper[4696]: I1202 22:45:01.154432 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/a04370b2-4b6e-4eac-9ffe-4f2e7116b03b-utilities\") pod \"redhat-marketplace-rwnxp\" (UID: \"a04370b2-4b6e-4eac-9ffe-4f2e7116b03b\") " pod="openshift-marketplace/redhat-marketplace-rwnxp" Dec 02 22:45:01 crc kubenswrapper[4696]: I1202 22:45:01.154481 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a04370b2-4b6e-4eac-9ffe-4f2e7116b03b-catalog-content\") pod \"redhat-marketplace-rwnxp\" (UID: \"a04370b2-4b6e-4eac-9ffe-4f2e7116b03b\") " pod="openshift-marketplace/redhat-marketplace-rwnxp" Dec 02 22:45:01 crc kubenswrapper[4696]: I1202 22:45:01.156971 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a04370b2-4b6e-4eac-9ffe-4f2e7116b03b-catalog-content\") pod \"redhat-marketplace-rwnxp\" (UID: \"a04370b2-4b6e-4eac-9ffe-4f2e7116b03b\") " pod="openshift-marketplace/redhat-marketplace-rwnxp" Dec 02 22:45:01 crc kubenswrapper[4696]: I1202 22:45:01.157109 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a04370b2-4b6e-4eac-9ffe-4f2e7116b03b-utilities\") pod \"redhat-marketplace-rwnxp\" (UID: \"a04370b2-4b6e-4eac-9ffe-4f2e7116b03b\") " pod="openshift-marketplace/redhat-marketplace-rwnxp" Dec 02 22:45:01 crc kubenswrapper[4696]: I1202 22:45:01.165305 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6ccj5"] Dec 02 22:45:01 crc kubenswrapper[4696]: I1202 22:45:01.170961 4696 patch_prober.go:28] interesting pod/router-default-5444994796-qpwvl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 02 22:45:01 crc kubenswrapper[4696]: [-]has-synced failed: reason withheld Dec 02 22:45:01 crc kubenswrapper[4696]: 
[+]process-running ok Dec 02 22:45:01 crc kubenswrapper[4696]: healthz check failed Dec 02 22:45:01 crc kubenswrapper[4696]: I1202 22:45:01.171005 4696 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qpwvl" podUID="fb0f1308-0862-4ac2-b741-849ae33c2776" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 02 22:45:01 crc kubenswrapper[4696]: W1202 22:45:01.176763 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad7e1f5a_e61a_4603_b7d6_a7baccc8c59c.slice/crio-1e66bc67dae22288e742756e26a2cecee8a1b44cc9ed22bba6df5e39088e6e5e WatchSource:0}: Error finding container 1e66bc67dae22288e742756e26a2cecee8a1b44cc9ed22bba6df5e39088e6e5e: Status 404 returned error can't find the container with id 1e66bc67dae22288e742756e26a2cecee8a1b44cc9ed22bba6df5e39088e6e5e Dec 02 22:45:01 crc kubenswrapper[4696]: I1202 22:45:01.188285 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjg6v\" (UniqueName: \"kubernetes.io/projected/a04370b2-4b6e-4eac-9ffe-4f2e7116b03b-kube-api-access-pjg6v\") pod \"redhat-marketplace-rwnxp\" (UID: \"a04370b2-4b6e-4eac-9ffe-4f2e7116b03b\") " pod="openshift-marketplace/redhat-marketplace-rwnxp" Dec 02 22:45:01 crc kubenswrapper[4696]: I1202 22:45:01.200727 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29411925-w4hb7" podStartSLOduration=1.200703976 podStartE2EDuration="1.200703976s" podCreationTimestamp="2025-12-02 22:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 22:45:01.20047785 +0000 UTC m=+164.081157861" watchObservedRunningTime="2025-12-02 22:45:01.200703976 +0000 UTC m=+164.081383977" Dec 02 22:45:01 crc kubenswrapper[4696]: I1202 22:45:01.292501 4696 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rwnxp" Dec 02 22:45:01 crc kubenswrapper[4696]: I1202 22:45:01.446458 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Dec 02 22:45:01 crc kubenswrapper[4696]: I1202 22:45:01.751250 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rwnxp"] Dec 02 22:45:01 crc kubenswrapper[4696]: I1202 22:45:01.781281 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-kf9wn"] Dec 02 22:45:01 crc kubenswrapper[4696]: I1202 22:45:01.783006 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kf9wn" Dec 02 22:45:01 crc kubenswrapper[4696]: I1202 22:45:01.786015 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 02 22:45:01 crc kubenswrapper[4696]: I1202 22:45:01.786526 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kf9wn"] Dec 02 22:45:01 crc kubenswrapper[4696]: I1202 22:45:01.870828 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2e0ae4b-0dce-4a74-8d3c-05635e86392b-utilities\") pod \"redhat-operators-kf9wn\" (UID: \"e2e0ae4b-0dce-4a74-8d3c-05635e86392b\") " pod="openshift-marketplace/redhat-operators-kf9wn" Dec 02 22:45:01 crc kubenswrapper[4696]: I1202 22:45:01.870907 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jz2mv\" (UniqueName: \"kubernetes.io/projected/e2e0ae4b-0dce-4a74-8d3c-05635e86392b-kube-api-access-jz2mv\") pod \"redhat-operators-kf9wn\" (UID: 
\"e2e0ae4b-0dce-4a74-8d3c-05635e86392b\") " pod="openshift-marketplace/redhat-operators-kf9wn" Dec 02 22:45:01 crc kubenswrapper[4696]: I1202 22:45:01.870946 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2e0ae4b-0dce-4a74-8d3c-05635e86392b-catalog-content\") pod \"redhat-operators-kf9wn\" (UID: \"e2e0ae4b-0dce-4a74-8d3c-05635e86392b\") " pod="openshift-marketplace/redhat-operators-kf9wn" Dec 02 22:45:01 crc kubenswrapper[4696]: I1202 22:45:01.949911 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-984hk" Dec 02 22:45:01 crc kubenswrapper[4696]: I1202 22:45:01.963145 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-984hk" Dec 02 22:45:01 crc kubenswrapper[4696]: I1202 22:45:01.973507 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2e0ae4b-0dce-4a74-8d3c-05635e86392b-utilities\") pod \"redhat-operators-kf9wn\" (UID: \"e2e0ae4b-0dce-4a74-8d3c-05635e86392b\") " pod="openshift-marketplace/redhat-operators-kf9wn" Dec 02 22:45:01 crc kubenswrapper[4696]: I1202 22:45:01.973583 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jz2mv\" (UniqueName: \"kubernetes.io/projected/e2e0ae4b-0dce-4a74-8d3c-05635e86392b-kube-api-access-jz2mv\") pod \"redhat-operators-kf9wn\" (UID: \"e2e0ae4b-0dce-4a74-8d3c-05635e86392b\") " pod="openshift-marketplace/redhat-operators-kf9wn" Dec 02 22:45:01 crc kubenswrapper[4696]: I1202 22:45:01.973634 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2e0ae4b-0dce-4a74-8d3c-05635e86392b-catalog-content\") pod \"redhat-operators-kf9wn\" 
(UID: \"e2e0ae4b-0dce-4a74-8d3c-05635e86392b\") " pod="openshift-marketplace/redhat-operators-kf9wn" Dec 02 22:45:01 crc kubenswrapper[4696]: I1202 22:45:01.974052 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2e0ae4b-0dce-4a74-8d3c-05635e86392b-utilities\") pod \"redhat-operators-kf9wn\" (UID: \"e2e0ae4b-0dce-4a74-8d3c-05635e86392b\") " pod="openshift-marketplace/redhat-operators-kf9wn" Dec 02 22:45:01 crc kubenswrapper[4696]: I1202 22:45:01.974366 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2e0ae4b-0dce-4a74-8d3c-05635e86392b-catalog-content\") pod \"redhat-operators-kf9wn\" (UID: \"e2e0ae4b-0dce-4a74-8d3c-05635e86392b\") " pod="openshift-marketplace/redhat-operators-kf9wn" Dec 02 22:45:02 crc kubenswrapper[4696]: I1202 22:45:02.003405 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jz2mv\" (UniqueName: \"kubernetes.io/projected/e2e0ae4b-0dce-4a74-8d3c-05635e86392b-kube-api-access-jz2mv\") pod \"redhat-operators-kf9wn\" (UID: \"e2e0ae4b-0dce-4a74-8d3c-05635e86392b\") " pod="openshift-marketplace/redhat-operators-kf9wn" Dec 02 22:45:02 crc kubenswrapper[4696]: I1202 22:45:02.127988 4696 generic.go:334] "Generic (PLEG): container finished" podID="60748893-12de-4ee9-9a98-d2e6117d2247" containerID="2a19bd054b1f9dbfbbbdb68ed5d00c5dc33b522a8f762339366f107d93efaf66" exitCode=0 Dec 02 22:45:02 crc kubenswrapper[4696]: I1202 22:45:02.128440 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411925-w4hb7" event={"ID":"60748893-12de-4ee9-9a98-d2e6117d2247","Type":"ContainerDied","Data":"2a19bd054b1f9dbfbbbdb68ed5d00c5dc33b522a8f762339366f107d93efaf66"} Dec 02 22:45:02 crc kubenswrapper[4696]: I1202 22:45:02.132180 4696 generic.go:334] "Generic (PLEG): container finished" 
podID="ad7e1f5a-e61a-4603-b7d6-a7baccc8c59c" containerID="c77ee686adba80ffdb5bd177dcec1bc427f2e187f8c254a50a7e33892e717166" exitCode=0 Dec 02 22:45:02 crc kubenswrapper[4696]: I1202 22:45:02.132281 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6ccj5" event={"ID":"ad7e1f5a-e61a-4603-b7d6-a7baccc8c59c","Type":"ContainerDied","Data":"c77ee686adba80ffdb5bd177dcec1bc427f2e187f8c254a50a7e33892e717166"} Dec 02 22:45:02 crc kubenswrapper[4696]: I1202 22:45:02.132373 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6ccj5" event={"ID":"ad7e1f5a-e61a-4603-b7d6-a7baccc8c59c","Type":"ContainerStarted","Data":"1e66bc67dae22288e742756e26a2cecee8a1b44cc9ed22bba6df5e39088e6e5e"} Dec 02 22:45:02 crc kubenswrapper[4696]: I1202 22:45:02.136017 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rwnxp" event={"ID":"a04370b2-4b6e-4eac-9ffe-4f2e7116b03b","Type":"ContainerStarted","Data":"126c98eaeb38f35cadfb8f8f6fe53f333e95b6181e27b34af760d269d2106706"} Dec 02 22:45:02 crc kubenswrapper[4696]: I1202 22:45:02.137286 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-rjkp2" Dec 02 22:45:02 crc kubenswrapper[4696]: I1202 22:45:02.169249 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-qpwvl" Dec 02 22:45:02 crc kubenswrapper[4696]: I1202 22:45:02.172531 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-w765j"] Dec 02 22:45:02 crc kubenswrapper[4696]: I1202 22:45:02.177492 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-w765j" Dec 02 22:45:02 crc kubenswrapper[4696]: I1202 22:45:02.177767 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-qpwvl" Dec 02 22:45:02 crc kubenswrapper[4696]: I1202 22:45:02.183942 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-w765j"] Dec 02 22:45:02 crc kubenswrapper[4696]: I1202 22:45:02.192108 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kf9wn" Dec 02 22:45:02 crc kubenswrapper[4696]: I1202 22:45:02.283058 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b3474b0-8824-4cb5-8ef3-c459af98ed02-catalog-content\") pod \"redhat-operators-w765j\" (UID: \"2b3474b0-8824-4cb5-8ef3-c459af98ed02\") " pod="openshift-marketplace/redhat-operators-w765j" Dec 02 22:45:02 crc kubenswrapper[4696]: I1202 22:45:02.283630 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b3474b0-8824-4cb5-8ef3-c459af98ed02-utilities\") pod \"redhat-operators-w765j\" (UID: \"2b3474b0-8824-4cb5-8ef3-c459af98ed02\") " pod="openshift-marketplace/redhat-operators-w765j" Dec 02 22:45:02 crc kubenswrapper[4696]: I1202 22:45:02.283658 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gq2cv\" (UniqueName: \"kubernetes.io/projected/2b3474b0-8824-4cb5-8ef3-c459af98ed02-kube-api-access-gq2cv\") pod \"redhat-operators-w765j\" (UID: \"2b3474b0-8824-4cb5-8ef3-c459af98ed02\") " pod="openshift-marketplace/redhat-operators-w765j" Dec 02 22:45:02 crc kubenswrapper[4696]: I1202 22:45:02.385369 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b3474b0-8824-4cb5-8ef3-c459af98ed02-catalog-content\") pod \"redhat-operators-w765j\" (UID: \"2b3474b0-8824-4cb5-8ef3-c459af98ed02\") " pod="openshift-marketplace/redhat-operators-w765j" Dec 02 22:45:02 crc kubenswrapper[4696]: I1202 22:45:02.385476 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b3474b0-8824-4cb5-8ef3-c459af98ed02-utilities\") pod \"redhat-operators-w765j\" (UID: \"2b3474b0-8824-4cb5-8ef3-c459af98ed02\") " pod="openshift-marketplace/redhat-operators-w765j" Dec 02 22:45:02 crc kubenswrapper[4696]: I1202 22:45:02.385505 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gq2cv\" (UniqueName: \"kubernetes.io/projected/2b3474b0-8824-4cb5-8ef3-c459af98ed02-kube-api-access-gq2cv\") pod \"redhat-operators-w765j\" (UID: \"2b3474b0-8824-4cb5-8ef3-c459af98ed02\") " pod="openshift-marketplace/redhat-operators-w765j" Dec 02 22:45:02 crc kubenswrapper[4696]: I1202 22:45:02.386022 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b3474b0-8824-4cb5-8ef3-c459af98ed02-utilities\") pod \"redhat-operators-w765j\" (UID: \"2b3474b0-8824-4cb5-8ef3-c459af98ed02\") " pod="openshift-marketplace/redhat-operators-w765j" Dec 02 22:45:02 crc kubenswrapper[4696]: I1202 22:45:02.386040 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b3474b0-8824-4cb5-8ef3-c459af98ed02-catalog-content\") pod \"redhat-operators-w765j\" (UID: \"2b3474b0-8824-4cb5-8ef3-c459af98ed02\") " pod="openshift-marketplace/redhat-operators-w765j" Dec 02 22:45:02 crc kubenswrapper[4696]: I1202 22:45:02.417043 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gq2cv\" (UniqueName: 
\"kubernetes.io/projected/2b3474b0-8824-4cb5-8ef3-c459af98ed02-kube-api-access-gq2cv\") pod \"redhat-operators-w765j\" (UID: \"2b3474b0-8824-4cb5-8ef3-c459af98ed02\") " pod="openshift-marketplace/redhat-operators-w765j" Dec 02 22:45:02 crc kubenswrapper[4696]: I1202 22:45:02.437177 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 02 22:45:02 crc kubenswrapper[4696]: I1202 22:45:02.510540 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-w765j" Dec 02 22:45:02 crc kubenswrapper[4696]: I1202 22:45:02.524497 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kf9wn"] Dec 02 22:45:02 crc kubenswrapper[4696]: I1202 22:45:02.587278 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3d275d4f-c887-4a23-a84e-e1d7305a5ba0-kubelet-dir\") pod \"3d275d4f-c887-4a23-a84e-e1d7305a5ba0\" (UID: \"3d275d4f-c887-4a23-a84e-e1d7305a5ba0\") " Dec 02 22:45:02 crc kubenswrapper[4696]: I1202 22:45:02.587405 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3d275d4f-c887-4a23-a84e-e1d7305a5ba0-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "3d275d4f-c887-4a23-a84e-e1d7305a5ba0" (UID: "3d275d4f-c887-4a23-a84e-e1d7305a5ba0"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 22:45:02 crc kubenswrapper[4696]: I1202 22:45:02.587446 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3d275d4f-c887-4a23-a84e-e1d7305a5ba0-kube-api-access\") pod \"3d275d4f-c887-4a23-a84e-e1d7305a5ba0\" (UID: \"3d275d4f-c887-4a23-a84e-e1d7305a5ba0\") " Dec 02 22:45:02 crc kubenswrapper[4696]: I1202 22:45:02.587793 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ad00195c-ef4c-4d9b-941c-d01ebc498593-metrics-certs\") pod \"network-metrics-daemon-q9bfc\" (UID: \"ad00195c-ef4c-4d9b-941c-d01ebc498593\") " pod="openshift-multus/network-metrics-daemon-q9bfc" Dec 02 22:45:02 crc kubenswrapper[4696]: I1202 22:45:02.587863 4696 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3d275d4f-c887-4a23-a84e-e1d7305a5ba0-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 02 22:45:02 crc kubenswrapper[4696]: I1202 22:45:02.592877 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d275d4f-c887-4a23-a84e-e1d7305a5ba0-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "3d275d4f-c887-4a23-a84e-e1d7305a5ba0" (UID: "3d275d4f-c887-4a23-a84e-e1d7305a5ba0"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:45:02 crc kubenswrapper[4696]: I1202 22:45:02.620297 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ad00195c-ef4c-4d9b-941c-d01ebc498593-metrics-certs\") pod \"network-metrics-daemon-q9bfc\" (UID: \"ad00195c-ef4c-4d9b-941c-d01ebc498593\") " pod="openshift-multus/network-metrics-daemon-q9bfc" Dec 02 22:45:02 crc kubenswrapper[4696]: I1202 22:45:02.646606 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q9bfc" Dec 02 22:45:02 crc kubenswrapper[4696]: I1202 22:45:02.683587 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 02 22:45:02 crc kubenswrapper[4696]: E1202 22:45:02.683919 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d275d4f-c887-4a23-a84e-e1d7305a5ba0" containerName="pruner" Dec 02 22:45:02 crc kubenswrapper[4696]: I1202 22:45:02.683940 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d275d4f-c887-4a23-a84e-e1d7305a5ba0" containerName="pruner" Dec 02 22:45:02 crc kubenswrapper[4696]: I1202 22:45:02.684054 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d275d4f-c887-4a23-a84e-e1d7305a5ba0" containerName="pruner" Dec 02 22:45:02 crc kubenswrapper[4696]: I1202 22:45:02.684507 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 02 22:45:02 crc kubenswrapper[4696]: I1202 22:45:02.686274 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 02 22:45:02 crc kubenswrapper[4696]: I1202 22:45:02.688380 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 02 22:45:02 crc kubenswrapper[4696]: I1202 22:45:02.689387 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 02 22:45:02 crc kubenswrapper[4696]: I1202 22:45:02.690357 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3d275d4f-c887-4a23-a84e-e1d7305a5ba0-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 02 22:45:02 crc kubenswrapper[4696]: I1202 22:45:02.793622 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/12bfb4e9-8a6d-408f-bd43-922181316393-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"12bfb4e9-8a6d-408f-bd43-922181316393\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 02 22:45:02 crc kubenswrapper[4696]: I1202 22:45:02.793696 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/12bfb4e9-8a6d-408f-bd43-922181316393-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"12bfb4e9-8a6d-408f-bd43-922181316393\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 02 22:45:02 crc kubenswrapper[4696]: I1202 22:45:02.895549 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/12bfb4e9-8a6d-408f-bd43-922181316393-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: 
\"12bfb4e9-8a6d-408f-bd43-922181316393\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 02 22:45:02 crc kubenswrapper[4696]: I1202 22:45:02.895637 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/12bfb4e9-8a6d-408f-bd43-922181316393-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"12bfb4e9-8a6d-408f-bd43-922181316393\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 02 22:45:02 crc kubenswrapper[4696]: I1202 22:45:02.895914 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/12bfb4e9-8a6d-408f-bd43-922181316393-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"12bfb4e9-8a6d-408f-bd43-922181316393\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 02 22:45:02 crc kubenswrapper[4696]: I1202 22:45:02.920649 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/12bfb4e9-8a6d-408f-bd43-922181316393-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"12bfb4e9-8a6d-408f-bd43-922181316393\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 02 22:45:03 crc kubenswrapper[4696]: I1202 22:45:03.012557 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 02 22:45:03 crc kubenswrapper[4696]: I1202 22:45:03.041387 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-w765j"] Dec 02 22:45:03 crc kubenswrapper[4696]: I1202 22:45:03.150435 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-q9bfc"] Dec 02 22:45:03 crc kubenswrapper[4696]: I1202 22:45:03.168927 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kf9wn" event={"ID":"e2e0ae4b-0dce-4a74-8d3c-05635e86392b","Type":"ContainerStarted","Data":"7dacd0610bf08a05e5954fdfe41064852a1012fd08b0cf513909ca664d658b7a"} Dec 02 22:45:03 crc kubenswrapper[4696]: I1202 22:45:03.173359 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"3d275d4f-c887-4a23-a84e-e1d7305a5ba0","Type":"ContainerDied","Data":"1048093b38554d535b9527a474c7a13cbf37c9c5b8bf06c39d745107aa396d5a"} Dec 02 22:45:03 crc kubenswrapper[4696]: I1202 22:45:03.173417 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1048093b38554d535b9527a474c7a13cbf37c9c5b8bf06c39d745107aa396d5a" Dec 02 22:45:03 crc kubenswrapper[4696]: I1202 22:45:03.173511 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 02 22:45:03 crc kubenswrapper[4696]: I1202 22:45:03.181944 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w765j" event={"ID":"2b3474b0-8824-4cb5-8ef3-c459af98ed02","Type":"ContainerStarted","Data":"a4ba8d9163480ed6b9c9feb5630b7d8092b46a5fae8fba759134ca2666e54399"} Dec 02 22:45:03 crc kubenswrapper[4696]: I1202 22:45:03.184048 4696 generic.go:334] "Generic (PLEG): container finished" podID="a04370b2-4b6e-4eac-9ffe-4f2e7116b03b" containerID="cd8f53b1bc280a95b7a1194ce002e99b0fa58ba6c09cd44516167db9ad2c01e1" exitCode=0 Dec 02 22:45:03 crc kubenswrapper[4696]: I1202 22:45:03.184962 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rwnxp" event={"ID":"a04370b2-4b6e-4eac-9ffe-4f2e7116b03b","Type":"ContainerDied","Data":"cd8f53b1bc280a95b7a1194ce002e99b0fa58ba6c09cd44516167db9ad2c01e1"} Dec 02 22:45:03 crc kubenswrapper[4696]: I1202 22:45:03.189555 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-qpwvl" Dec 02 22:45:03 crc kubenswrapper[4696]: I1202 22:45:03.514264 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 02 22:45:03 crc kubenswrapper[4696]: I1202 22:45:03.584011 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411925-w4hb7" Dec 02 22:45:03 crc kubenswrapper[4696]: I1202 22:45:03.620620 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/60748893-12de-4ee9-9a98-d2e6117d2247-config-volume\") pod \"60748893-12de-4ee9-9a98-d2e6117d2247\" (UID: \"60748893-12de-4ee9-9a98-d2e6117d2247\") " Dec 02 22:45:03 crc kubenswrapper[4696]: I1202 22:45:03.620809 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/60748893-12de-4ee9-9a98-d2e6117d2247-secret-volume\") pod \"60748893-12de-4ee9-9a98-d2e6117d2247\" (UID: \"60748893-12de-4ee9-9a98-d2e6117d2247\") " Dec 02 22:45:03 crc kubenswrapper[4696]: I1202 22:45:03.621852 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g6bmf\" (UniqueName: \"kubernetes.io/projected/60748893-12de-4ee9-9a98-d2e6117d2247-kube-api-access-g6bmf\") pod \"60748893-12de-4ee9-9a98-d2e6117d2247\" (UID: \"60748893-12de-4ee9-9a98-d2e6117d2247\") " Dec 02 22:45:03 crc kubenswrapper[4696]: I1202 22:45:03.622005 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60748893-12de-4ee9-9a98-d2e6117d2247-config-volume" (OuterVolumeSpecName: "config-volume") pod "60748893-12de-4ee9-9a98-d2e6117d2247" (UID: "60748893-12de-4ee9-9a98-d2e6117d2247"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:45:03 crc kubenswrapper[4696]: I1202 22:45:03.622476 4696 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/60748893-12de-4ee9-9a98-d2e6117d2247-config-volume\") on node \"crc\" DevicePath \"\"" Dec 02 22:45:03 crc kubenswrapper[4696]: I1202 22:45:03.630963 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60748893-12de-4ee9-9a98-d2e6117d2247-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "60748893-12de-4ee9-9a98-d2e6117d2247" (UID: "60748893-12de-4ee9-9a98-d2e6117d2247"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:45:03 crc kubenswrapper[4696]: I1202 22:45:03.631043 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60748893-12de-4ee9-9a98-d2e6117d2247-kube-api-access-g6bmf" (OuterVolumeSpecName: "kube-api-access-g6bmf") pod "60748893-12de-4ee9-9a98-d2e6117d2247" (UID: "60748893-12de-4ee9-9a98-d2e6117d2247"). InnerVolumeSpecName "kube-api-access-g6bmf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:45:03 crc kubenswrapper[4696]: I1202 22:45:03.734570 4696 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/60748893-12de-4ee9-9a98-d2e6117d2247-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 02 22:45:03 crc kubenswrapper[4696]: I1202 22:45:03.735019 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g6bmf\" (UniqueName: \"kubernetes.io/projected/60748893-12de-4ee9-9a98-d2e6117d2247-kube-api-access-g6bmf\") on node \"crc\" DevicePath \"\"" Dec 02 22:45:04 crc kubenswrapper[4696]: I1202 22:45:04.197038 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-q9bfc" event={"ID":"ad00195c-ef4c-4d9b-941c-d01ebc498593","Type":"ContainerStarted","Data":"23076c181a1c2687c121c7671598f8c0df0d2af8bb2f5cb1f52ecb06c31ad7cf"} Dec 02 22:45:04 crc kubenswrapper[4696]: I1202 22:45:04.202806 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"12bfb4e9-8a6d-408f-bd43-922181316393","Type":"ContainerStarted","Data":"d48207b93bbe3dff05eb30002a361cebf9b2abd034139d82d1fd5fefff328cde"} Dec 02 22:45:04 crc kubenswrapper[4696]: I1202 22:45:04.209209 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411925-w4hb7" event={"ID":"60748893-12de-4ee9-9a98-d2e6117d2247","Type":"ContainerDied","Data":"eafd90b721caa8bccd1ff7cd22abf7d2a753db9efeab687ecf0605c938774d08"} Dec 02 22:45:04 crc kubenswrapper[4696]: I1202 22:45:04.209595 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eafd90b721caa8bccd1ff7cd22abf7d2a753db9efeab687ecf0605c938774d08" Dec 02 22:45:04 crc kubenswrapper[4696]: I1202 22:45:04.209230 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411925-w4hb7" Dec 02 22:45:05 crc kubenswrapper[4696]: I1202 22:45:05.222038 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kf9wn" event={"ID":"e2e0ae4b-0dce-4a74-8d3c-05635e86392b","Type":"ContainerStarted","Data":"3fb805a4b3514f1ca6295aa79978c1e766c581f00ee26ef52f3acb4951b23ed4"} Dec 02 22:45:06 crc kubenswrapper[4696]: I1202 22:45:06.237922 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-q9bfc" event={"ID":"ad00195c-ef4c-4d9b-941c-d01ebc498593","Type":"ContainerStarted","Data":"07e5d214647033cf41b3879dadb2ab39e836e4f0d3c8637064bc12276ae7a27f"} Dec 02 22:45:06 crc kubenswrapper[4696]: I1202 22:45:06.240215 4696 generic.go:334] "Generic (PLEG): container finished" podID="2b3474b0-8824-4cb5-8ef3-c459af98ed02" containerID="ceb50e322ec1ada283024c62c387161b68483a2926fdc0fd659bc03917f6c3bf" exitCode=0 Dec 02 22:45:06 crc kubenswrapper[4696]: I1202 22:45:06.240282 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w765j" event={"ID":"2b3474b0-8824-4cb5-8ef3-c459af98ed02","Type":"ContainerDied","Data":"ceb50e322ec1ada283024c62c387161b68483a2926fdc0fd659bc03917f6c3bf"} Dec 02 22:45:06 crc kubenswrapper[4696]: I1202 22:45:06.242805 4696 generic.go:334] "Generic (PLEG): container finished" podID="e2e0ae4b-0dce-4a74-8d3c-05635e86392b" containerID="3fb805a4b3514f1ca6295aa79978c1e766c581f00ee26ef52f3acb4951b23ed4" exitCode=0 Dec 02 22:45:06 crc kubenswrapper[4696]: I1202 22:45:06.242863 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kf9wn" event={"ID":"e2e0ae4b-0dce-4a74-8d3c-05635e86392b","Type":"ContainerDied","Data":"3fb805a4b3514f1ca6295aa79978c1e766c581f00ee26ef52f3acb4951b23ed4"} Dec 02 22:45:06 crc kubenswrapper[4696]: I1202 22:45:06.858048 4696 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-dns/dns-default-4jgzb" Dec 02 22:45:07 crc kubenswrapper[4696]: I1202 22:45:07.253881 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"12bfb4e9-8a6d-408f-bd43-922181316393","Type":"ContainerStarted","Data":"69ca00fa11a2f9e1f6db371152bf44a3d3d0643d2a28f2b5f5a38543994728ec"} Dec 02 22:45:07 crc kubenswrapper[4696]: I1202 22:45:07.276268 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=5.276241889 podStartE2EDuration="5.276241889s" podCreationTimestamp="2025-12-02 22:45:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 22:45:07.275561929 +0000 UTC m=+170.156241930" watchObservedRunningTime="2025-12-02 22:45:07.276241889 +0000 UTC m=+170.156921890" Dec 02 22:45:08 crc kubenswrapper[4696]: I1202 22:45:08.271601 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-q9bfc" event={"ID":"ad00195c-ef4c-4d9b-941c-d01ebc498593","Type":"ContainerStarted","Data":"a6be2a1e1630e8e6c4be937c5ba77ebecc058d8f3f2afe6c647c0889b40b95f8"} Dec 02 22:45:08 crc kubenswrapper[4696]: I1202 22:45:08.275140 4696 generic.go:334] "Generic (PLEG): container finished" podID="12bfb4e9-8a6d-408f-bd43-922181316393" containerID="69ca00fa11a2f9e1f6db371152bf44a3d3d0643d2a28f2b5f5a38543994728ec" exitCode=0 Dec 02 22:45:08 crc kubenswrapper[4696]: I1202 22:45:08.275171 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"12bfb4e9-8a6d-408f-bd43-922181316393","Type":"ContainerDied","Data":"69ca00fa11a2f9e1f6db371152bf44a3d3d0643d2a28f2b5f5a38543994728ec"} Dec 02 22:45:09 crc kubenswrapper[4696]: I1202 22:45:09.303690 4696 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-multus/network-metrics-daemon-q9bfc" podStartSLOduration=150.303670493 podStartE2EDuration="2m30.303670493s" podCreationTimestamp="2025-12-02 22:42:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 22:45:09.302068765 +0000 UTC m=+172.182748766" watchObservedRunningTime="2025-12-02 22:45:09.303670493 +0000 UTC m=+172.184350494" Dec 02 22:45:09 crc kubenswrapper[4696]: I1202 22:45:09.970845 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-jhcjw" Dec 02 22:45:10 crc kubenswrapper[4696]: I1202 22:45:10.227650 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-f6wj6" Dec 02 22:45:10 crc kubenswrapper[4696]: I1202 22:45:10.231548 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-f6wj6" Dec 02 22:45:15 crc kubenswrapper[4696]: I1202 22:45:15.487947 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 02 22:45:15 crc kubenswrapper[4696]: I1202 22:45:15.540863 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/12bfb4e9-8a6d-408f-bd43-922181316393-kubelet-dir\") pod \"12bfb4e9-8a6d-408f-bd43-922181316393\" (UID: \"12bfb4e9-8a6d-408f-bd43-922181316393\") " Dec 02 22:45:15 crc kubenswrapper[4696]: I1202 22:45:15.540995 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/12bfb4e9-8a6d-408f-bd43-922181316393-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "12bfb4e9-8a6d-408f-bd43-922181316393" (UID: "12bfb4e9-8a6d-408f-bd43-922181316393"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 22:45:15 crc kubenswrapper[4696]: I1202 22:45:15.541592 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/12bfb4e9-8a6d-408f-bd43-922181316393-kube-api-access\") pod \"12bfb4e9-8a6d-408f-bd43-922181316393\" (UID: \"12bfb4e9-8a6d-408f-bd43-922181316393\") " Dec 02 22:45:15 crc kubenswrapper[4696]: I1202 22:45:15.542038 4696 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/12bfb4e9-8a6d-408f-bd43-922181316393-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 02 22:45:15 crc kubenswrapper[4696]: I1202 22:45:15.549924 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12bfb4e9-8a6d-408f-bd43-922181316393-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "12bfb4e9-8a6d-408f-bd43-922181316393" (UID: "12bfb4e9-8a6d-408f-bd43-922181316393"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:45:15 crc kubenswrapper[4696]: I1202 22:45:15.642976 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/12bfb4e9-8a6d-408f-bd43-922181316393-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 02 22:45:16 crc kubenswrapper[4696]: I1202 22:45:16.348657 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"12bfb4e9-8a6d-408f-bd43-922181316393","Type":"ContainerDied","Data":"d48207b93bbe3dff05eb30002a361cebf9b2abd034139d82d1fd5fefff328cde"} Dec 02 22:45:16 crc kubenswrapper[4696]: I1202 22:45:16.348751 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d48207b93bbe3dff05eb30002a361cebf9b2abd034139d82d1fd5fefff328cde" Dec 02 22:45:16 crc kubenswrapper[4696]: I1202 22:45:16.348881 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 02 22:45:20 crc kubenswrapper[4696]: I1202 22:45:20.179671 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-rjkp2" Dec 02 22:45:22 crc kubenswrapper[4696]: I1202 22:45:22.973844 4696 patch_prober.go:28] interesting pod/machine-config-daemon-chq65 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 22:45:22 crc kubenswrapper[4696]: I1202 22:45:22.973945 4696 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Dec 02 22:45:26 crc kubenswrapper[4696]: I1202 22:45:26.880543 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:45:32 crc kubenswrapper[4696]: I1202 22:45:32.283142 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6vtkj" Dec 02 22:45:40 crc kubenswrapper[4696]: I1202 22:45:40.709558 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 02 22:45:40 crc kubenswrapper[4696]: E1202 22:45:40.710563 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12bfb4e9-8a6d-408f-bd43-922181316393" containerName="pruner" Dec 02 22:45:40 crc kubenswrapper[4696]: I1202 22:45:40.710578 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="12bfb4e9-8a6d-408f-bd43-922181316393" containerName="pruner" Dec 02 22:45:40 crc kubenswrapper[4696]: E1202 22:45:40.710589 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60748893-12de-4ee9-9a98-d2e6117d2247" containerName="collect-profiles" Dec 02 22:45:40 crc kubenswrapper[4696]: I1202 22:45:40.710595 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="60748893-12de-4ee9-9a98-d2e6117d2247" containerName="collect-profiles" Dec 02 22:45:40 crc kubenswrapper[4696]: I1202 22:45:40.710707 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="12bfb4e9-8a6d-408f-bd43-922181316393" containerName="pruner" Dec 02 22:45:40 crc kubenswrapper[4696]: I1202 22:45:40.710716 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="60748893-12de-4ee9-9a98-d2e6117d2247" containerName="collect-profiles" Dec 02 22:45:40 crc kubenswrapper[4696]: I1202 22:45:40.711158 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 02 22:45:40 crc kubenswrapper[4696]: I1202 22:45:40.717318 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 02 22:45:40 crc kubenswrapper[4696]: I1202 22:45:40.717380 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 02 22:45:40 crc kubenswrapper[4696]: I1202 22:45:40.718261 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 02 22:45:40 crc kubenswrapper[4696]: I1202 22:45:40.831626 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f338571d-9677-4326-be0b-4608419b96ff-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"f338571d-9677-4326-be0b-4608419b96ff\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 02 22:45:40 crc kubenswrapper[4696]: I1202 22:45:40.832190 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f338571d-9677-4326-be0b-4608419b96ff-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"f338571d-9677-4326-be0b-4608419b96ff\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 02 22:45:40 crc kubenswrapper[4696]: I1202 22:45:40.933302 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f338571d-9677-4326-be0b-4608419b96ff-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"f338571d-9677-4326-be0b-4608419b96ff\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 02 22:45:40 crc kubenswrapper[4696]: I1202 22:45:40.933805 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/f338571d-9677-4326-be0b-4608419b96ff-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"f338571d-9677-4326-be0b-4608419b96ff\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 02 22:45:40 crc kubenswrapper[4696]: I1202 22:45:40.933895 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f338571d-9677-4326-be0b-4608419b96ff-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"f338571d-9677-4326-be0b-4608419b96ff\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 02 22:45:41 crc kubenswrapper[4696]: I1202 22:45:41.035872 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f338571d-9677-4326-be0b-4608419b96ff-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"f338571d-9677-4326-be0b-4608419b96ff\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 02 22:45:41 crc kubenswrapper[4696]: I1202 22:45:41.143549 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 02 22:45:42 crc kubenswrapper[4696]: E1202 22:45:42.409384 4696 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Dec 02 22:45:42 crc kubenswrapper[4696]: E1202 22:45:42.409658 4696 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7kzxv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]Co
ntainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-9rp8k_openshift-marketplace(ca4122e2-7532-4ca9-a111-9097cfae1dde): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 02 22:45:42 crc kubenswrapper[4696]: E1202 22:45:42.411053 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-9rp8k" podUID="ca4122e2-7532-4ca9-a111-9097cfae1dde" Dec 02 22:45:42 crc kubenswrapper[4696]: E1202 22:45:42.673703 4696 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Dec 02 22:45:42 crc kubenswrapper[4696]: E1202 22:45:42.674274 4696 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4wtht,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-xhnsb_openshift-marketplace(4b259842-7ad5-413d-a13c-73c0931b6527): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 02 22:45:42 crc kubenswrapper[4696]: E1202 22:45:42.675525 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-xhnsb" podUID="4b259842-7ad5-413d-a13c-73c0931b6527" Dec 02 22:45:43 crc 
kubenswrapper[4696]: E1202 22:45:43.249753 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-9rp8k" podUID="ca4122e2-7532-4ca9-a111-9097cfae1dde" Dec 02 22:45:43 crc kubenswrapper[4696]: E1202 22:45:43.313950 4696 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Dec 02 22:45:43 crc kubenswrapper[4696]: E1202 22:45:43.314151 4696 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vvxrp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-6ccj5_openshift-marketplace(ad7e1f5a-e61a-4603-b7d6-a7baccc8c59c): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 02 22:45:43 crc kubenswrapper[4696]: E1202 22:45:43.315339 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-6ccj5" podUID="ad7e1f5a-e61a-4603-b7d6-a7baccc8c59c" Dec 02 22:45:45 crc 
kubenswrapper[4696]: E1202 22:45:45.099880 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-6ccj5" podUID="ad7e1f5a-e61a-4603-b7d6-a7baccc8c59c" Dec 02 22:45:45 crc kubenswrapper[4696]: E1202 22:45:45.100790 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-xhnsb" podUID="4b259842-7ad5-413d-a13c-73c0931b6527" Dec 02 22:45:45 crc kubenswrapper[4696]: E1202 22:45:45.119534 4696 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Dec 02 22:45:45 crc kubenswrapper[4696]: E1202 22:45:45.119898 4696 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pjg6v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-rwnxp_openshift-marketplace(a04370b2-4b6e-4eac-9ffe-4f2e7116b03b): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 02 22:45:45 crc kubenswrapper[4696]: E1202 22:45:45.121158 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-rwnxp" podUID="a04370b2-4b6e-4eac-9ffe-4f2e7116b03b" Dec 02 22:45:45 crc 
kubenswrapper[4696]: E1202 22:45:45.353523 4696 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Dec 02 22:45:45 crc kubenswrapper[4696]: E1202 22:45:45.353823 4696 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fmvpk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
community-operators-42w2d_openshift-marketplace(4eb44562-c608-428a-8812-428de40cbcde): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 02 22:45:45 crc kubenswrapper[4696]: E1202 22:45:45.355126 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-42w2d" podUID="4eb44562-c608-428a-8812-428de40cbcde" Dec 02 22:45:45 crc kubenswrapper[4696]: E1202 22:45:45.449814 4696 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Dec 02 22:45:45 crc kubenswrapper[4696]: E1202 22:45:45.450009 4696 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fckxn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-8jbfs_openshift-marketplace(aeaa03bb-5cf7-4a89-b182-36358f9e247c): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 02 22:45:45 crc kubenswrapper[4696]: E1202 22:45:45.451945 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-8jbfs" podUID="aeaa03bb-5cf7-4a89-b182-36358f9e247c" Dec 02 22:45:45 crc 
kubenswrapper[4696]: I1202 22:45:45.500239 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 02 22:45:45 crc kubenswrapper[4696]: I1202 22:45:45.501115 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 02 22:45:45 crc kubenswrapper[4696]: I1202 22:45:45.510146 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 02 22:45:45 crc kubenswrapper[4696]: I1202 22:45:45.617882 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/142e360e-5c5a-42af-b077-f75a807dea45-kubelet-dir\") pod \"installer-9-crc\" (UID: \"142e360e-5c5a-42af-b077-f75a807dea45\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 02 22:45:45 crc kubenswrapper[4696]: I1202 22:45:45.618001 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/142e360e-5c5a-42af-b077-f75a807dea45-var-lock\") pod \"installer-9-crc\" (UID: \"142e360e-5c5a-42af-b077-f75a807dea45\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 02 22:45:45 crc kubenswrapper[4696]: I1202 22:45:45.618039 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/142e360e-5c5a-42af-b077-f75a807dea45-kube-api-access\") pod \"installer-9-crc\" (UID: \"142e360e-5c5a-42af-b077-f75a807dea45\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 02 22:45:45 crc kubenswrapper[4696]: I1202 22:45:45.720476 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/142e360e-5c5a-42af-b077-f75a807dea45-var-lock\") pod \"installer-9-crc\" (UID: \"142e360e-5c5a-42af-b077-f75a807dea45\") " 
pod="openshift-kube-apiserver/installer-9-crc" Dec 02 22:45:45 crc kubenswrapper[4696]: I1202 22:45:45.720538 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/142e360e-5c5a-42af-b077-f75a807dea45-kube-api-access\") pod \"installer-9-crc\" (UID: \"142e360e-5c5a-42af-b077-f75a807dea45\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 02 22:45:45 crc kubenswrapper[4696]: I1202 22:45:45.720623 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/142e360e-5c5a-42af-b077-f75a807dea45-kubelet-dir\") pod \"installer-9-crc\" (UID: \"142e360e-5c5a-42af-b077-f75a807dea45\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 02 22:45:45 crc kubenswrapper[4696]: I1202 22:45:45.720736 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/142e360e-5c5a-42af-b077-f75a807dea45-kubelet-dir\") pod \"installer-9-crc\" (UID: \"142e360e-5c5a-42af-b077-f75a807dea45\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 02 22:45:45 crc kubenswrapper[4696]: I1202 22:45:45.720811 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/142e360e-5c5a-42af-b077-f75a807dea45-var-lock\") pod \"installer-9-crc\" (UID: \"142e360e-5c5a-42af-b077-f75a807dea45\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 02 22:45:45 crc kubenswrapper[4696]: I1202 22:45:45.749638 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/142e360e-5c5a-42af-b077-f75a807dea45-kube-api-access\") pod \"installer-9-crc\" (UID: \"142e360e-5c5a-42af-b077-f75a807dea45\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 02 22:45:45 crc kubenswrapper[4696]: I1202 22:45:45.830352 4696 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 02 22:45:51 crc kubenswrapper[4696]: E1202 22:45:51.032301 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-42w2d" podUID="4eb44562-c608-428a-8812-428de40cbcde" Dec 02 22:45:51 crc kubenswrapper[4696]: E1202 22:45:51.032336 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-rwnxp" podUID="a04370b2-4b6e-4eac-9ffe-4f2e7116b03b" Dec 02 22:45:51 crc kubenswrapper[4696]: E1202 22:45:51.032574 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-8jbfs" podUID="aeaa03bb-5cf7-4a89-b182-36358f9e247c" Dec 02 22:45:51 crc kubenswrapper[4696]: E1202 22:45:51.050368 4696 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Dec 02 22:45:51 crc kubenswrapper[4696]: E1202 22:45:51.056018 4696 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gq2cv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-w765j_openshift-marketplace(2b3474b0-8824-4cb5-8ef3-c459af98ed02): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 02 22:45:51 crc kubenswrapper[4696]: E1202 22:45:51.058111 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-w765j" podUID="2b3474b0-8824-4cb5-8ef3-c459af98ed02" Dec 02 22:45:51 crc 
kubenswrapper[4696]: E1202 22:45:51.091234 4696 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Dec 02 22:45:51 crc kubenswrapper[4696]: E1202 22:45:51.091929 4696 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jz2mv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-operators-kf9wn_openshift-marketplace(e2e0ae4b-0dce-4a74-8d3c-05635e86392b): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 02 22:45:51 crc kubenswrapper[4696]: E1202 22:45:51.095294 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-kf9wn" podUID="e2e0ae4b-0dce-4a74-8d3c-05635e86392b" Dec 02 22:45:51 crc kubenswrapper[4696]: I1202 22:45:51.401249 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 02 22:45:51 crc kubenswrapper[4696]: I1202 22:45:51.570309 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 02 22:45:51 crc kubenswrapper[4696]: I1202 22:45:51.591232 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"142e360e-5c5a-42af-b077-f75a807dea45","Type":"ContainerStarted","Data":"3973f0a34e5336a7a656b4b2cb667c487f41cd75797fb74f986e034691cacfb1"} Dec 02 22:45:51 crc kubenswrapper[4696]: W1202 22:45:51.597398 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podf338571d_9677_4326_be0b_4608419b96ff.slice/crio-4f279fef265666f5f89dad7d6943eb8370ae7ebe0de83b43bd70937f05781fde WatchSource:0}: Error finding container 4f279fef265666f5f89dad7d6943eb8370ae7ebe0de83b43bd70937f05781fde: Status 404 returned error can't find the container with id 4f279fef265666f5f89dad7d6943eb8370ae7ebe0de83b43bd70937f05781fde Dec 02 22:45:52 crc kubenswrapper[4696]: I1202 22:45:52.597979 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" 
event={"ID":"f338571d-9677-4326-be0b-4608419b96ff","Type":"ContainerStarted","Data":"4f279fef265666f5f89dad7d6943eb8370ae7ebe0de83b43bd70937f05781fde"} Dec 02 22:45:52 crc kubenswrapper[4696]: I1202 22:45:52.978044 4696 patch_prober.go:28] interesting pod/machine-config-daemon-chq65 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 22:45:52 crc kubenswrapper[4696]: I1202 22:45:52.978467 4696 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 22:45:52 crc kubenswrapper[4696]: I1202 22:45:52.978580 4696 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-chq65" Dec 02 22:45:52 crc kubenswrapper[4696]: I1202 22:45:52.980009 4696 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fa016430a0a36d0628e0e5b0c4baf2fa724c888116e5d70517c8ffc4be1c37a4"} pod="openshift-machine-config-operator/machine-config-daemon-chq65" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 22:45:52 crc kubenswrapper[4696]: I1202 22:45:52.980239 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" containerName="machine-config-daemon" containerID="cri-o://fa016430a0a36d0628e0e5b0c4baf2fa724c888116e5d70517c8ffc4be1c37a4" gracePeriod=600 Dec 02 22:45:53 crc kubenswrapper[4696]: I1202 22:45:53.605763 
4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"f338571d-9677-4326-be0b-4608419b96ff","Type":"ContainerStarted","Data":"e357987392686bc9e08c07ba9631f47d0944133d85da1be78073884dfd725f1f"} Dec 02 22:45:53 crc kubenswrapper[4696]: I1202 22:45:53.609143 4696 generic.go:334] "Generic (PLEG): container finished" podID="53353260-c7c9-435c-91eb-3d5a1b441c4a" containerID="fa016430a0a36d0628e0e5b0c4baf2fa724c888116e5d70517c8ffc4be1c37a4" exitCode=0 Dec 02 22:45:53 crc kubenswrapper[4696]: I1202 22:45:53.609300 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-chq65" event={"ID":"53353260-c7c9-435c-91eb-3d5a1b441c4a","Type":"ContainerDied","Data":"fa016430a0a36d0628e0e5b0c4baf2fa724c888116e5d70517c8ffc4be1c37a4"} Dec 02 22:45:53 crc kubenswrapper[4696]: I1202 22:45:53.613655 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"142e360e-5c5a-42af-b077-f75a807dea45","Type":"ContainerStarted","Data":"9444a6325c244668ee70866778a5c974341a770133e0a5ccce49012feb6bdd86"} Dec 02 22:45:53 crc kubenswrapper[4696]: I1202 22:45:53.627662 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=13.627633252999999 podStartE2EDuration="13.627633253s" podCreationTimestamp="2025-12-02 22:45:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 22:45:53.625999702 +0000 UTC m=+216.506679743" watchObservedRunningTime="2025-12-02 22:45:53.627633253 +0000 UTC m=+216.508313254" Dec 02 22:45:53 crc kubenswrapper[4696]: I1202 22:45:53.663279 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=8.663242628999999 
podStartE2EDuration="8.663242629s" podCreationTimestamp="2025-12-02 22:45:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 22:45:53.656915612 +0000 UTC m=+216.537595623" watchObservedRunningTime="2025-12-02 22:45:53.663242629 +0000 UTC m=+216.543922660" Dec 02 22:45:55 crc kubenswrapper[4696]: I1202 22:45:55.638061 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-chq65" event={"ID":"53353260-c7c9-435c-91eb-3d5a1b441c4a","Type":"ContainerStarted","Data":"cd2d2fd1bee3bd1f6238f890b6611be55459ebdfd2de430b173a71f76f25b35f"} Dec 02 22:45:55 crc kubenswrapper[4696]: I1202 22:45:55.643180 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9rp8k" event={"ID":"ca4122e2-7532-4ca9-a111-9097cfae1dde","Type":"ContainerStarted","Data":"d4e1e65cdcf7d8d317fed30c24a065b7b4bc93294a70b43e99842b3296d9578f"} Dec 02 22:45:55 crc kubenswrapper[4696]: I1202 22:45:55.647657 4696 generic.go:334] "Generic (PLEG): container finished" podID="f338571d-9677-4326-be0b-4608419b96ff" containerID="e357987392686bc9e08c07ba9631f47d0944133d85da1be78073884dfd725f1f" exitCode=0 Dec 02 22:45:55 crc kubenswrapper[4696]: I1202 22:45:55.647702 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"f338571d-9677-4326-be0b-4608419b96ff","Type":"ContainerDied","Data":"e357987392686bc9e08c07ba9631f47d0944133d85da1be78073884dfd725f1f"} Dec 02 22:45:56 crc kubenswrapper[4696]: I1202 22:45:56.654904 4696 generic.go:334] "Generic (PLEG): container finished" podID="ca4122e2-7532-4ca9-a111-9097cfae1dde" containerID="d4e1e65cdcf7d8d317fed30c24a065b7b4bc93294a70b43e99842b3296d9578f" exitCode=0 Dec 02 22:45:56 crc kubenswrapper[4696]: I1202 22:45:56.654987 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-9rp8k" event={"ID":"ca4122e2-7532-4ca9-a111-9097cfae1dde","Type":"ContainerDied","Data":"d4e1e65cdcf7d8d317fed30c24a065b7b4bc93294a70b43e99842b3296d9578f"} Dec 02 22:45:56 crc kubenswrapper[4696]: I1202 22:45:56.905864 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 02 22:45:57 crc kubenswrapper[4696]: I1202 22:45:57.018627 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f338571d-9677-4326-be0b-4608419b96ff-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "f338571d-9677-4326-be0b-4608419b96ff" (UID: "f338571d-9677-4326-be0b-4608419b96ff"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 22:45:57 crc kubenswrapper[4696]: I1202 22:45:57.018715 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f338571d-9677-4326-be0b-4608419b96ff-kubelet-dir\") pod \"f338571d-9677-4326-be0b-4608419b96ff\" (UID: \"f338571d-9677-4326-be0b-4608419b96ff\") " Dec 02 22:45:57 crc kubenswrapper[4696]: I1202 22:45:57.019101 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f338571d-9677-4326-be0b-4608419b96ff-kube-api-access\") pod \"f338571d-9677-4326-be0b-4608419b96ff\" (UID: \"f338571d-9677-4326-be0b-4608419b96ff\") " Dec 02 22:45:57 crc kubenswrapper[4696]: I1202 22:45:57.021616 4696 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f338571d-9677-4326-be0b-4608419b96ff-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 02 22:45:57 crc kubenswrapper[4696]: I1202 22:45:57.039055 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/f338571d-9677-4326-be0b-4608419b96ff-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "f338571d-9677-4326-be0b-4608419b96ff" (UID: "f338571d-9677-4326-be0b-4608419b96ff"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:45:57 crc kubenswrapper[4696]: I1202 22:45:57.123491 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f338571d-9677-4326-be0b-4608419b96ff-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 02 22:45:57 crc kubenswrapper[4696]: I1202 22:45:57.666715 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"f338571d-9677-4326-be0b-4608419b96ff","Type":"ContainerDied","Data":"4f279fef265666f5f89dad7d6943eb8370ae7ebe0de83b43bd70937f05781fde"} Dec 02 22:45:57 crc kubenswrapper[4696]: I1202 22:45:57.666790 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4f279fef265666f5f89dad7d6943eb8370ae7ebe0de83b43bd70937f05781fde" Dec 02 22:45:57 crc kubenswrapper[4696]: I1202 22:45:57.666862 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 02 22:46:00 crc kubenswrapper[4696]: I1202 22:46:00.688192 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xhnsb" event={"ID":"4b259842-7ad5-413d-a13c-73c0931b6527","Type":"ContainerStarted","Data":"0838dc5ea48f8ffb4ce2a96f34d93cc8c8c5990eb3c65d33967ba59f45131132"} Dec 02 22:46:00 crc kubenswrapper[4696]: I1202 22:46:00.690076 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6ccj5" event={"ID":"ad7e1f5a-e61a-4603-b7d6-a7baccc8c59c","Type":"ContainerStarted","Data":"443573975d1347ecfe45924171eee0a61f1ea9a16ec4fbf869fb3c51d0d942bc"} Dec 02 22:46:00 crc kubenswrapper[4696]: I1202 22:46:00.692282 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9rp8k" event={"ID":"ca4122e2-7532-4ca9-a111-9097cfae1dde","Type":"ContainerStarted","Data":"e0f5091a78ab6c1174ae62be502eff83e913778c3b89ecca13a8c145ee5fb935"} Dec 02 22:46:00 crc kubenswrapper[4696]: I1202 22:46:00.746832 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-9rp8k" podStartSLOduration=3.940469762 podStartE2EDuration="1m2.746807145s" podCreationTimestamp="2025-12-02 22:44:58 +0000 UTC" firstStartedPulling="2025-12-02 22:45:01.076999268 +0000 UTC m=+163.957679269" lastFinishedPulling="2025-12-02 22:45:59.883336651 +0000 UTC m=+222.764016652" observedRunningTime="2025-12-02 22:46:00.745935748 +0000 UTC m=+223.626615769" watchObservedRunningTime="2025-12-02 22:46:00.746807145 +0000 UTC m=+223.627487156" Dec 02 22:46:01 crc kubenswrapper[4696]: I1202 22:46:01.702104 4696 generic.go:334] "Generic (PLEG): container finished" podID="ad7e1f5a-e61a-4603-b7d6-a7baccc8c59c" containerID="443573975d1347ecfe45924171eee0a61f1ea9a16ec4fbf869fb3c51d0d942bc" exitCode=0 Dec 02 22:46:01 crc kubenswrapper[4696]: I1202 
22:46:01.702543 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6ccj5" event={"ID":"ad7e1f5a-e61a-4603-b7d6-a7baccc8c59c","Type":"ContainerDied","Data":"443573975d1347ecfe45924171eee0a61f1ea9a16ec4fbf869fb3c51d0d942bc"} Dec 02 22:46:01 crc kubenswrapper[4696]: I1202 22:46:01.705572 4696 generic.go:334] "Generic (PLEG): container finished" podID="4b259842-7ad5-413d-a13c-73c0931b6527" containerID="0838dc5ea48f8ffb4ce2a96f34d93cc8c8c5990eb3c65d33967ba59f45131132" exitCode=0 Dec 02 22:46:01 crc kubenswrapper[4696]: I1202 22:46:01.705642 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xhnsb" event={"ID":"4b259842-7ad5-413d-a13c-73c0931b6527","Type":"ContainerDied","Data":"0838dc5ea48f8ffb4ce2a96f34d93cc8c8c5990eb3c65d33967ba59f45131132"} Dec 02 22:46:03 crc kubenswrapper[4696]: I1202 22:46:03.724536 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6ccj5" event={"ID":"ad7e1f5a-e61a-4603-b7d6-a7baccc8c59c","Type":"ContainerStarted","Data":"a850e3e799ff54f9b067984e3570cfd3e0a980ba14b8ed80fdccb78cb53fd3e2"} Dec 02 22:46:03 crc kubenswrapper[4696]: I1202 22:46:03.727079 4696 generic.go:334] "Generic (PLEG): container finished" podID="aeaa03bb-5cf7-4a89-b182-36358f9e247c" containerID="39dbeb0169e0a28958240580b82f39ad1f0e29209e5674a4a582ef1526a341da" exitCode=0 Dec 02 22:46:03 crc kubenswrapper[4696]: I1202 22:46:03.727131 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8jbfs" event={"ID":"aeaa03bb-5cf7-4a89-b182-36358f9e247c","Type":"ContainerDied","Data":"39dbeb0169e0a28958240580b82f39ad1f0e29209e5674a4a582ef1526a341da"} Dec 02 22:46:03 crc kubenswrapper[4696]: I1202 22:46:03.735143 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xhnsb" 
event={"ID":"4b259842-7ad5-413d-a13c-73c0931b6527","Type":"ContainerStarted","Data":"5ac32f3527fdb565307e77fc409bc3202b21b1596e22320c4c5ce45f455172c0"} Dec 02 22:46:03 crc kubenswrapper[4696]: I1202 22:46:03.748405 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6ccj5" podStartSLOduration=3.172356306 podStartE2EDuration="1m3.748375076s" podCreationTimestamp="2025-12-02 22:45:00 +0000 UTC" firstStartedPulling="2025-12-02 22:45:02.13400776 +0000 UTC m=+165.014687751" lastFinishedPulling="2025-12-02 22:46:02.71002652 +0000 UTC m=+225.590706521" observedRunningTime="2025-12-02 22:46:03.745102304 +0000 UTC m=+226.625782305" watchObservedRunningTime="2025-12-02 22:46:03.748375076 +0000 UTC m=+226.629055077" Dec 02 22:46:03 crc kubenswrapper[4696]: I1202 22:46:03.768003 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-xhnsb" podStartSLOduration=3.153211733 podStartE2EDuration="1m4.767980695s" podCreationTimestamp="2025-12-02 22:44:59 +0000 UTC" firstStartedPulling="2025-12-02 22:45:01.087771896 +0000 UTC m=+163.968451897" lastFinishedPulling="2025-12-02 22:46:02.702540858 +0000 UTC m=+225.583220859" observedRunningTime="2025-12-02 22:46:03.766229531 +0000 UTC m=+226.646909532" watchObservedRunningTime="2025-12-02 22:46:03.767980695 +0000 UTC m=+226.648660696" Dec 02 22:46:04 crc kubenswrapper[4696]: I1202 22:46:04.741286 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-42w2d" event={"ID":"4eb44562-c608-428a-8812-428de40cbcde","Type":"ContainerStarted","Data":"e180a0236e0fd3d800deac73a4b68e072155b1bc642b53b9b135e6ebdd41ebd5"} Dec 02 22:46:04 crc kubenswrapper[4696]: I1202 22:46:04.744648 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8jbfs" 
event={"ID":"aeaa03bb-5cf7-4a89-b182-36358f9e247c","Type":"ContainerStarted","Data":"3fc9c70881ebfb4d6dfea4a6977a0c29064b36bb1a7296a448ed8b62a3842a68"} Dec 02 22:46:04 crc kubenswrapper[4696]: I1202 22:46:04.789921 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8jbfs" podStartSLOduration=2.549035713 podStartE2EDuration="1m6.78989914s" podCreationTimestamp="2025-12-02 22:44:58 +0000 UTC" firstStartedPulling="2025-12-02 22:45:00.045051737 +0000 UTC m=+162.925731738" lastFinishedPulling="2025-12-02 22:46:04.285915164 +0000 UTC m=+227.166595165" observedRunningTime="2025-12-02 22:46:04.787611909 +0000 UTC m=+227.668291910" watchObservedRunningTime="2025-12-02 22:46:04.78989914 +0000 UTC m=+227.670579141" Dec 02 22:46:05 crc kubenswrapper[4696]: I1202 22:46:05.751481 4696 generic.go:334] "Generic (PLEG): container finished" podID="4eb44562-c608-428a-8812-428de40cbcde" containerID="e180a0236e0fd3d800deac73a4b68e072155b1bc642b53b9b135e6ebdd41ebd5" exitCode=0 Dec 02 22:46:05 crc kubenswrapper[4696]: I1202 22:46:05.751556 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-42w2d" event={"ID":"4eb44562-c608-428a-8812-428de40cbcde","Type":"ContainerDied","Data":"e180a0236e0fd3d800deac73a4b68e072155b1bc642b53b9b135e6ebdd41ebd5"} Dec 02 22:46:08 crc kubenswrapper[4696]: I1202 22:46:08.986572 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8jbfs" Dec 02 22:46:08 crc kubenswrapper[4696]: I1202 22:46:08.988041 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8jbfs" Dec 02 22:46:09 crc kubenswrapper[4696]: I1202 22:46:09.051165 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8jbfs" Dec 02 22:46:09 crc kubenswrapper[4696]: I1202 22:46:09.110201 4696 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-9rp8k" Dec 02 22:46:09 crc kubenswrapper[4696]: I1202 22:46:09.110259 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-9rp8k" Dec 02 22:46:09 crc kubenswrapper[4696]: I1202 22:46:09.151629 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-9rp8k" Dec 02 22:46:09 crc kubenswrapper[4696]: I1202 22:46:09.503764 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-xhnsb" Dec 02 22:46:09 crc kubenswrapper[4696]: I1202 22:46:09.503840 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-xhnsb" Dec 02 22:46:09 crc kubenswrapper[4696]: I1202 22:46:09.558761 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-xhnsb" Dec 02 22:46:09 crc kubenswrapper[4696]: I1202 22:46:09.822538 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-xhnsb" Dec 02 22:46:09 crc kubenswrapper[4696]: I1202 22:46:09.823169 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8jbfs" Dec 02 22:46:09 crc kubenswrapper[4696]: I1202 22:46:09.861407 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-9rp8k" Dec 02 22:46:10 crc kubenswrapper[4696]: I1202 22:46:10.929862 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6ccj5" Dec 02 22:46:10 crc kubenswrapper[4696]: I1202 22:46:10.929931 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-marketplace-6ccj5" Dec 02 22:46:10 crc kubenswrapper[4696]: I1202 22:46:10.970020 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6ccj5" Dec 02 22:46:11 crc kubenswrapper[4696]: I1202 22:46:11.849865 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6ccj5" Dec 02 22:46:12 crc kubenswrapper[4696]: I1202 22:46:12.113061 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xhnsb"] Dec 02 22:46:12 crc kubenswrapper[4696]: I1202 22:46:12.113812 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-xhnsb" podUID="4b259842-7ad5-413d-a13c-73c0931b6527" containerName="registry-server" containerID="cri-o://5ac32f3527fdb565307e77fc409bc3202b21b1596e22320c4c5ce45f455172c0" gracePeriod=2 Dec 02 22:46:19 crc kubenswrapper[4696]: E1202 22:46:19.504246 4696 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5ac32f3527fdb565307e77fc409bc3202b21b1596e22320c4c5ce45f455172c0 is running failed: container process not found" containerID="5ac32f3527fdb565307e77fc409bc3202b21b1596e22320c4c5ce45f455172c0" cmd=["grpc_health_probe","-addr=:50051"] Dec 02 22:46:19 crc kubenswrapper[4696]: E1202 22:46:19.506591 4696 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5ac32f3527fdb565307e77fc409bc3202b21b1596e22320c4c5ce45f455172c0 is running failed: container process not found" containerID="5ac32f3527fdb565307e77fc409bc3202b21b1596e22320c4c5ce45f455172c0" cmd=["grpc_health_probe","-addr=:50051"] Dec 02 22:46:19 crc kubenswrapper[4696]: E1202 22:46:19.507324 4696 log.go:32] "ExecSync cmd from runtime service failed" 
err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5ac32f3527fdb565307e77fc409bc3202b21b1596e22320c4c5ce45f455172c0 is running failed: container process not found" containerID="5ac32f3527fdb565307e77fc409bc3202b21b1596e22320c4c5ce45f455172c0" cmd=["grpc_health_probe","-addr=:50051"] Dec 02 22:46:19 crc kubenswrapper[4696]: E1202 22:46:19.507370 4696 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5ac32f3527fdb565307e77fc409bc3202b21b1596e22320c4c5ce45f455172c0 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/certified-operators-xhnsb" podUID="4b259842-7ad5-413d-a13c-73c0931b6527" containerName="registry-server" Dec 02 22:46:19 crc kubenswrapper[4696]: I1202 22:46:19.831810 4696 generic.go:334] "Generic (PLEG): container finished" podID="4b259842-7ad5-413d-a13c-73c0931b6527" containerID="5ac32f3527fdb565307e77fc409bc3202b21b1596e22320c4c5ce45f455172c0" exitCode=0 Dec 02 22:46:19 crc kubenswrapper[4696]: I1202 22:46:19.831867 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xhnsb" event={"ID":"4b259842-7ad5-413d-a13c-73c0931b6527","Type":"ContainerDied","Data":"5ac32f3527fdb565307e77fc409bc3202b21b1596e22320c4c5ce45f455172c0"} Dec 02 22:46:29 crc kubenswrapper[4696]: E1202 22:46:29.504722 4696 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5ac32f3527fdb565307e77fc409bc3202b21b1596e22320c4c5ce45f455172c0 is running failed: container process not found" containerID="5ac32f3527fdb565307e77fc409bc3202b21b1596e22320c4c5ce45f455172c0" cmd=["grpc_health_probe","-addr=:50051"] Dec 02 22:46:29 crc kubenswrapper[4696]: E1202 22:46:29.506670 4696 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not 
created or running: checking if PID of 5ac32f3527fdb565307e77fc409bc3202b21b1596e22320c4c5ce45f455172c0 is running failed: container process not found" containerID="5ac32f3527fdb565307e77fc409bc3202b21b1596e22320c4c5ce45f455172c0" cmd=["grpc_health_probe","-addr=:50051"] Dec 02 22:46:29 crc kubenswrapper[4696]: E1202 22:46:29.507893 4696 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5ac32f3527fdb565307e77fc409bc3202b21b1596e22320c4c5ce45f455172c0 is running failed: container process not found" containerID="5ac32f3527fdb565307e77fc409bc3202b21b1596e22320c4c5ce45f455172c0" cmd=["grpc_health_probe","-addr=:50051"] Dec 02 22:46:29 crc kubenswrapper[4696]: E1202 22:46:29.507988 4696 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5ac32f3527fdb565307e77fc409bc3202b21b1596e22320c4c5ce45f455172c0 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/certified-operators-xhnsb" podUID="4b259842-7ad5-413d-a13c-73c0931b6527" containerName="registry-server" Dec 02 22:46:30 crc kubenswrapper[4696]: I1202 22:46:30.687448 4696 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 02 22:46:30 crc kubenswrapper[4696]: I1202 22:46:30.688252 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://d3112751f86a62634d02f281d467d3eeb27878e27c245871210cca37eaeb6e53" gracePeriod=15 Dec 02 22:46:30 crc kubenswrapper[4696]: I1202 22:46:30.688466 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" 
containerID="cri-o://9d92ce9d53d9367503c76c5fbb45f340a5ddd93f3c8e1738ac434fe7b0210d1d" gracePeriod=15 Dec 02 22:46:30 crc kubenswrapper[4696]: I1202 22:46:30.688517 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://666f81c81a7cad67c01b2aebf2d05f162c33feea27f64c12f92761d6d2a6056e" gracePeriod=15 Dec 02 22:46:30 crc kubenswrapper[4696]: I1202 22:46:30.688577 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://ce744ea765d5bbcb0a9926f439a0256774c16863344f23ddfb0074b4ccbdb4ad" gracePeriod=15 Dec 02 22:46:30 crc kubenswrapper[4696]: I1202 22:46:30.688674 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://1108d710330b056e2228205b2012df070ffbd660701ddd577eafbf0e99c3bf9f" gracePeriod=15 Dec 02 22:46:30 crc kubenswrapper[4696]: I1202 22:46:30.694508 4696 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 02 22:46:30 crc kubenswrapper[4696]: E1202 22:46:30.694973 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 02 22:46:30 crc kubenswrapper[4696]: I1202 22:46:30.694994 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 02 22:46:30 crc kubenswrapper[4696]: E1202 22:46:30.695011 4696 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 02 22:46:30 crc kubenswrapper[4696]: I1202 22:46:30.695017 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 02 22:46:30 crc kubenswrapper[4696]: E1202 22:46:30.695029 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 02 22:46:30 crc kubenswrapper[4696]: I1202 22:46:30.695038 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 02 22:46:30 crc kubenswrapper[4696]: E1202 22:46:30.695046 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 02 22:46:30 crc kubenswrapper[4696]: I1202 22:46:30.695052 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 02 22:46:30 crc kubenswrapper[4696]: E1202 22:46:30.695214 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 02 22:46:30 crc kubenswrapper[4696]: I1202 22:46:30.695222 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 02 22:46:30 crc kubenswrapper[4696]: E1202 22:46:30.695229 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 02 22:46:30 crc kubenswrapper[4696]: I1202 22:46:30.695237 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 02 22:46:30 crc kubenswrapper[4696]: E1202 22:46:30.695249 4696 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 02 22:46:30 crc kubenswrapper[4696]: I1202 22:46:30.695256 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 02 22:46:30 crc kubenswrapper[4696]: E1202 22:46:30.695264 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f338571d-9677-4326-be0b-4608419b96ff" containerName="pruner" Dec 02 22:46:30 crc kubenswrapper[4696]: I1202 22:46:30.695270 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="f338571d-9677-4326-be0b-4608419b96ff" containerName="pruner" Dec 02 22:46:30 crc kubenswrapper[4696]: I1202 22:46:30.695379 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 02 22:46:30 crc kubenswrapper[4696]: I1202 22:46:30.695395 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="f338571d-9677-4326-be0b-4608419b96ff" containerName="pruner" Dec 02 22:46:30 crc kubenswrapper[4696]: I1202 22:46:30.695405 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 02 22:46:30 crc kubenswrapper[4696]: I1202 22:46:30.695415 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 02 22:46:30 crc kubenswrapper[4696]: I1202 22:46:30.695424 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 02 22:46:30 crc kubenswrapper[4696]: I1202 22:46:30.695430 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 02 22:46:30 crc kubenswrapper[4696]: I1202 22:46:30.695438 4696 
memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 02 22:46:30 crc kubenswrapper[4696]: I1202 22:46:30.697187 4696 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 02 22:46:30 crc kubenswrapper[4696]: I1202 22:46:30.697722 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 22:46:30 crc kubenswrapper[4696]: I1202 22:46:30.702716 4696 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="f4b27818a5e8e43d0dc095d08835c792" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" Dec 02 22:46:30 crc kubenswrapper[4696]: I1202 22:46:30.710308 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-v6k2l"] Dec 02 22:46:30 crc kubenswrapper[4696]: I1202 22:46:30.878647 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 22:46:30 crc kubenswrapper[4696]: I1202 22:46:30.879213 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 22:46:30 crc kubenswrapper[4696]: I1202 22:46:30.879294 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 22:46:30 crc kubenswrapper[4696]: I1202 22:46:30.879335 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 22:46:30 crc kubenswrapper[4696]: I1202 22:46:30.879351 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 22:46:30 crc kubenswrapper[4696]: I1202 22:46:30.879376 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 22:46:30 crc kubenswrapper[4696]: I1202 22:46:30.879399 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 22:46:30 crc kubenswrapper[4696]: I1202 22:46:30.879418 4696 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 22:46:30 crc kubenswrapper[4696]: I1202 22:46:30.981244 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 22:46:30 crc kubenswrapper[4696]: I1202 22:46:30.981340 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 22:46:30 crc kubenswrapper[4696]: I1202 22:46:30.981425 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 22:46:30 crc kubenswrapper[4696]: I1202 22:46:30.981435 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 22:46:30 crc kubenswrapper[4696]: I1202 22:46:30.981515 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" 
(UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 22:46:30 crc kubenswrapper[4696]: I1202 22:46:30.981543 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 22:46:30 crc kubenswrapper[4696]: I1202 22:46:30.981567 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 22:46:30 crc kubenswrapper[4696]: I1202 22:46:30.981597 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 22:46:30 crc kubenswrapper[4696]: I1202 22:46:30.981625 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 22:46:30 crc kubenswrapper[4696]: I1202 22:46:30.981627 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 22:46:30 crc kubenswrapper[4696]: I1202 22:46:30.981664 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 22:46:30 crc kubenswrapper[4696]: I1202 22:46:30.981645 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 22:46:30 crc kubenswrapper[4696]: I1202 22:46:30.981688 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 22:46:30 crc kubenswrapper[4696]: I1202 22:46:30.981761 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 22:46:30 crc kubenswrapper[4696]: I1202 22:46:30.981598 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 22:46:30 crc kubenswrapper[4696]: I1202 22:46:30.981374 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 22:46:34 crc kubenswrapper[4696]: I1202 22:46:34.488343 4696 generic.go:334] "Generic (PLEG): container finished" podID="142e360e-5c5a-42af-b077-f75a807dea45" containerID="9444a6325c244668ee70866778a5c974341a770133e0a5ccce49012feb6bdd86" exitCode=0 Dec 02 22:46:34 crc kubenswrapper[4696]: I1202 22:46:34.488419 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"142e360e-5c5a-42af-b077-f75a807dea45","Type":"ContainerDied","Data":"9444a6325c244668ee70866778a5c974341a770133e0a5ccce49012feb6bdd86"} Dec 02 22:46:34 crc kubenswrapper[4696]: I1202 22:46:34.490157 4696 status_manager.go:851] "Failed to get status for pod" podUID="142e360e-5c5a-42af-b077-f75a807dea45" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:34 crc kubenswrapper[4696]: I1202 22:46:34.491874 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 02 22:46:34 crc kubenswrapper[4696]: I1202 22:46:34.493175 4696 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 02 22:46:34 crc kubenswrapper[4696]: I1202 22:46:34.493803 4696 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="9d92ce9d53d9367503c76c5fbb45f340a5ddd93f3c8e1738ac434fe7b0210d1d" exitCode=0 Dec 02 22:46:34 crc kubenswrapper[4696]: I1202 22:46:34.493826 4696 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="666f81c81a7cad67c01b2aebf2d05f162c33feea27f64c12f92761d6d2a6056e" exitCode=0 Dec 02 22:46:34 crc kubenswrapper[4696]: I1202 22:46:34.493836 4696 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="ce744ea765d5bbcb0a9926f439a0256774c16863344f23ddfb0074b4ccbdb4ad" exitCode=0 Dec 02 22:46:34 crc kubenswrapper[4696]: I1202 22:46:34.493844 4696 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="1108d710330b056e2228205b2012df070ffbd660701ddd577eafbf0e99c3bf9f" exitCode=2 Dec 02 22:46:34 crc kubenswrapper[4696]: I1202 22:46:34.493886 4696 scope.go:117] "RemoveContainer" containerID="6268352b995d99d383ab500b5ccd80b0612cf33716a752aeb1d92cb775a1971e" Dec 02 22:46:35 crc kubenswrapper[4696]: I1202 22:46:35.505049 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 02 22:46:35 crc kubenswrapper[4696]: I1202 22:46:35.506198 4696 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="d3112751f86a62634d02f281d467d3eeb27878e27c245871210cca37eaeb6e53" exitCode=0 Dec 02 22:46:35 crc kubenswrapper[4696]: E1202 22:46:35.789103 4696 kubelet.go:1929] "Failed creating a mirror pod for" err="Post 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.9:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 22:46:35 crc kubenswrapper[4696]: I1202 22:46:35.790596 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 22:46:37 crc kubenswrapper[4696]: I1202 22:46:37.435661 4696 status_manager.go:851] "Failed to get status for pod" podUID="142e360e-5c5a-42af-b077-f75a807dea45" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:37 crc kubenswrapper[4696]: I1202 22:46:37.620807 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 02 22:46:37 crc kubenswrapper[4696]: I1202 22:46:37.622081 4696 status_manager.go:851] "Failed to get status for pod" podUID="142e360e-5c5a-42af-b077-f75a807dea45" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:37 crc kubenswrapper[4696]: I1202 22:46:37.730435 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/142e360e-5c5a-42af-b077-f75a807dea45-kubelet-dir\") pod \"142e360e-5c5a-42af-b077-f75a807dea45\" (UID: \"142e360e-5c5a-42af-b077-f75a807dea45\") " Dec 02 22:46:37 crc kubenswrapper[4696]: I1202 22:46:37.730518 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/142e360e-5c5a-42af-b077-f75a807dea45-var-lock\") pod 
\"142e360e-5c5a-42af-b077-f75a807dea45\" (UID: \"142e360e-5c5a-42af-b077-f75a807dea45\") " Dec 02 22:46:37 crc kubenswrapper[4696]: I1202 22:46:37.730568 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/142e360e-5c5a-42af-b077-f75a807dea45-kube-api-access\") pod \"142e360e-5c5a-42af-b077-f75a807dea45\" (UID: \"142e360e-5c5a-42af-b077-f75a807dea45\") " Dec 02 22:46:37 crc kubenswrapper[4696]: I1202 22:46:37.730670 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/142e360e-5c5a-42af-b077-f75a807dea45-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "142e360e-5c5a-42af-b077-f75a807dea45" (UID: "142e360e-5c5a-42af-b077-f75a807dea45"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 22:46:37 crc kubenswrapper[4696]: I1202 22:46:37.731550 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/142e360e-5c5a-42af-b077-f75a807dea45-var-lock" (OuterVolumeSpecName: "var-lock") pod "142e360e-5c5a-42af-b077-f75a807dea45" (UID: "142e360e-5c5a-42af-b077-f75a807dea45"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 22:46:37 crc kubenswrapper[4696]: I1202 22:46:37.731574 4696 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/142e360e-5c5a-42af-b077-f75a807dea45-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 02 22:46:37 crc kubenswrapper[4696]: I1202 22:46:37.745222 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/142e360e-5c5a-42af-b077-f75a807dea45-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "142e360e-5c5a-42af-b077-f75a807dea45" (UID: "142e360e-5c5a-42af-b077-f75a807dea45"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:46:37 crc kubenswrapper[4696]: I1202 22:46:37.833941 4696 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/142e360e-5c5a-42af-b077-f75a807dea45-var-lock\") on node \"crc\" DevicePath \"\"" Dec 02 22:46:37 crc kubenswrapper[4696]: I1202 22:46:37.834011 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/142e360e-5c5a-42af-b077-f75a807dea45-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 02 22:46:37 crc kubenswrapper[4696]: I1202 22:46:37.841611 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xhnsb" Dec 02 22:46:37 crc kubenswrapper[4696]: I1202 22:46:37.841763 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 02 22:46:37 crc kubenswrapper[4696]: I1202 22:46:37.842706 4696 status_manager.go:851] "Failed to get status for pod" podUID="142e360e-5c5a-42af-b077-f75a807dea45" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:37 crc kubenswrapper[4696]: I1202 22:46:37.843015 4696 status_manager.go:851] "Failed to get status for pod" podUID="4b259842-7ad5-413d-a13c-73c0931b6527" pod="openshift-marketplace/certified-operators-xhnsb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-xhnsb\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:37 crc kubenswrapper[4696]: I1202 22:46:37.843823 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 22:46:37 crc kubenswrapper[4696]: I1202 22:46:37.844312 4696 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:37 crc kubenswrapper[4696]: I1202 22:46:37.845871 4696 status_manager.go:851] "Failed to get status for pod" podUID="142e360e-5c5a-42af-b077-f75a807dea45" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:37 crc kubenswrapper[4696]: I1202 22:46:37.846314 4696 status_manager.go:851] "Failed to get status for pod" podUID="4b259842-7ad5-413d-a13c-73c0931b6527" pod="openshift-marketplace/certified-operators-xhnsb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-xhnsb\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:37 crc kubenswrapper[4696]: E1202 22:46:37.923539 4696 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/events\": dial tcp 38.102.83.9:6443: connect: connection refused" event="&Event{ObjectMeta:{community-operators-42w2d.187d877c87c5bc7c openshift-marketplace 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-marketplace,Name:community-operators-42w2d,UID:4eb44562-c608-428a-8812-428de40cbcde,APIVersion:v1,ResourceVersion:28404,FieldPath:spec.containers{registry-server},},Reason:Pulled,Message:Successfully pulled image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\" in 32.169s (32.169s including waiting). Image size: 907837715 bytes.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-02 22:46:37.922679932 +0000 UTC m=+260.803359933,LastTimestamp:2025-12-02 22:46:37.922679932 +0000 UTC m=+260.803359933,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 02 22:46:37 crc kubenswrapper[4696]: I1202 22:46:37.934968 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 02 22:46:37 crc kubenswrapper[4696]: I1202 22:46:37.935075 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 02 22:46:37 crc kubenswrapper[4696]: I1202 22:46:37.935117 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b259842-7ad5-413d-a13c-73c0931b6527-utilities\") pod \"4b259842-7ad5-413d-a13c-73c0931b6527\" (UID: \"4b259842-7ad5-413d-a13c-73c0931b6527\") " Dec 02 22:46:37 crc kubenswrapper[4696]: I1202 22:46:37.935145 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b259842-7ad5-413d-a13c-73c0931b6527-catalog-content\") pod \"4b259842-7ad5-413d-a13c-73c0931b6527\" (UID: \"4b259842-7ad5-413d-a13c-73c0931b6527\") " Dec 02 22:46:37 crc kubenswrapper[4696]: I1202 22:46:37.935131 4696 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 22:46:37 crc kubenswrapper[4696]: I1202 22:46:37.935241 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4wtht\" (UniqueName: \"kubernetes.io/projected/4b259842-7ad5-413d-a13c-73c0931b6527-kube-api-access-4wtht\") pod \"4b259842-7ad5-413d-a13c-73c0931b6527\" (UID: \"4b259842-7ad5-413d-a13c-73c0931b6527\") " Dec 02 22:46:37 crc kubenswrapper[4696]: I1202 22:46:37.935249 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 22:46:37 crc kubenswrapper[4696]: I1202 22:46:37.935287 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 02 22:46:37 crc kubenswrapper[4696]: I1202 22:46:37.935337 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 22:46:37 crc kubenswrapper[4696]: I1202 22:46:37.935938 4696 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 02 22:46:37 crc kubenswrapper[4696]: I1202 22:46:37.935965 4696 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Dec 02 22:46:37 crc kubenswrapper[4696]: I1202 22:46:37.935974 4696 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 02 22:46:37 crc kubenswrapper[4696]: I1202 22:46:37.936360 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b259842-7ad5-413d-a13c-73c0931b6527-utilities" (OuterVolumeSpecName: "utilities") pod "4b259842-7ad5-413d-a13c-73c0931b6527" (UID: "4b259842-7ad5-413d-a13c-73c0931b6527"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 22:46:37 crc kubenswrapper[4696]: I1202 22:46:37.941405 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b259842-7ad5-413d-a13c-73c0931b6527-kube-api-access-4wtht" (OuterVolumeSpecName: "kube-api-access-4wtht") pod "4b259842-7ad5-413d-a13c-73c0931b6527" (UID: "4b259842-7ad5-413d-a13c-73c0931b6527"). InnerVolumeSpecName "kube-api-access-4wtht". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:46:37 crc kubenswrapper[4696]: I1202 22:46:37.988722 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b259842-7ad5-413d-a13c-73c0931b6527-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4b259842-7ad5-413d-a13c-73c0931b6527" (UID: "4b259842-7ad5-413d-a13c-73c0931b6527"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 22:46:37 crc kubenswrapper[4696]: W1202 22:46:37.990862 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-9d3756650022ecff0cc57a3fcc094db9db1684e28c2dcfd8305a492180867b25 WatchSource:0}: Error finding container 9d3756650022ecff0cc57a3fcc094db9db1684e28c2dcfd8305a492180867b25: Status 404 returned error can't find the container with id 9d3756650022ecff0cc57a3fcc094db9db1684e28c2dcfd8305a492180867b25 Dec 02 22:46:38 crc kubenswrapper[4696]: I1202 22:46:38.037774 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4wtht\" (UniqueName: \"kubernetes.io/projected/4b259842-7ad5-413d-a13c-73c0931b6527-kube-api-access-4wtht\") on node \"crc\" DevicePath \"\"" Dec 02 22:46:38 crc kubenswrapper[4696]: I1202 22:46:38.037809 4696 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b259842-7ad5-413d-a13c-73c0931b6527-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 22:46:38 crc kubenswrapper[4696]: I1202 22:46:38.037822 4696 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b259842-7ad5-413d-a13c-73c0931b6527-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 22:46:38 crc kubenswrapper[4696]: I1202 22:46:38.526722 4696 generic.go:334] "Generic (PLEG): container finished" 
podID="a04370b2-4b6e-4eac-9ffe-4f2e7116b03b" containerID="fd9405589caab0e36654d391421562c91e178f1d4eedf741b443102f64c75725" exitCode=0 Dec 02 22:46:38 crc kubenswrapper[4696]: I1202 22:46:38.526795 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rwnxp" event={"ID":"a04370b2-4b6e-4eac-9ffe-4f2e7116b03b","Type":"ContainerDied","Data":"fd9405589caab0e36654d391421562c91e178f1d4eedf741b443102f64c75725"} Dec 02 22:46:38 crc kubenswrapper[4696]: I1202 22:46:38.528346 4696 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:38 crc kubenswrapper[4696]: I1202 22:46:38.528372 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 02 22:46:38 crc kubenswrapper[4696]: I1202 22:46:38.528771 4696 status_manager.go:851] "Failed to get status for pod" podUID="142e360e-5c5a-42af-b077-f75a807dea45" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:38 crc kubenswrapper[4696]: I1202 22:46:38.528963 4696 status_manager.go:851] "Failed to get status for pod" podUID="a04370b2-4b6e-4eac-9ffe-4f2e7116b03b" pod="openshift-marketplace/redhat-marketplace-rwnxp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-rwnxp\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:38 crc kubenswrapper[4696]: I1202 22:46:38.529135 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" 
event={"ID":"142e360e-5c5a-42af-b077-f75a807dea45","Type":"ContainerDied","Data":"3973f0a34e5336a7a656b4b2cb667c487f41cd75797fb74f986e034691cacfb1"} Dec 02 22:46:38 crc kubenswrapper[4696]: I1202 22:46:38.529167 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3973f0a34e5336a7a656b4b2cb667c487f41cd75797fb74f986e034691cacfb1" Dec 02 22:46:38 crc kubenswrapper[4696]: I1202 22:46:38.529202 4696 status_manager.go:851] "Failed to get status for pod" podUID="4b259842-7ad5-413d-a13c-73c0931b6527" pod="openshift-marketplace/certified-operators-xhnsb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-xhnsb\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:38 crc kubenswrapper[4696]: I1202 22:46:38.534352 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"f0e6c380c1399055c2383fe4c4b02d64629a8b697ff784046c6453b0486dd8da"} Dec 02 22:46:38 crc kubenswrapper[4696]: I1202 22:46:38.534391 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"9d3756650022ecff0cc57a3fcc094db9db1684e28c2dcfd8305a492180867b25"} Dec 02 22:46:38 crc kubenswrapper[4696]: E1202 22:46:38.534842 4696 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.9:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 22:46:38 crc kubenswrapper[4696]: I1202 22:46:38.535381 4696 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:38 crc kubenswrapper[4696]: I1202 22:46:38.536259 4696 status_manager.go:851] "Failed to get status for pod" podUID="142e360e-5c5a-42af-b077-f75a807dea45" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:38 crc kubenswrapper[4696]: I1202 22:46:38.536551 4696 status_manager.go:851] "Failed to get status for pod" podUID="a04370b2-4b6e-4eac-9ffe-4f2e7116b03b" pod="openshift-marketplace/redhat-marketplace-rwnxp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-rwnxp\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:38 crc kubenswrapper[4696]: I1202 22:46:38.537021 4696 status_manager.go:851] "Failed to get status for pod" podUID="4b259842-7ad5-413d-a13c-73c0931b6527" pod="openshift-marketplace/certified-operators-xhnsb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-xhnsb\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:38 crc kubenswrapper[4696]: I1202 22:46:38.540045 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xhnsb" event={"ID":"4b259842-7ad5-413d-a13c-73c0931b6527","Type":"ContainerDied","Data":"cabcd8086c0440a5c5ce00d2a111e829c6bfc81d0ad64544f155f2ad9b6936e1"} Dec 02 22:46:38 crc kubenswrapper[4696]: I1202 22:46:38.540111 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xhnsb" Dec 02 22:46:38 crc kubenswrapper[4696]: I1202 22:46:38.540115 4696 scope.go:117] "RemoveContainer" containerID="5ac32f3527fdb565307e77fc409bc3202b21b1596e22320c4c5ce45f455172c0" Dec 02 22:46:38 crc kubenswrapper[4696]: I1202 22:46:38.541107 4696 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:38 crc kubenswrapper[4696]: I1202 22:46:38.541396 4696 status_manager.go:851] "Failed to get status for pod" podUID="142e360e-5c5a-42af-b077-f75a807dea45" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:38 crc kubenswrapper[4696]: I1202 22:46:38.541682 4696 status_manager.go:851] "Failed to get status for pod" podUID="a04370b2-4b6e-4eac-9ffe-4f2e7116b03b" pod="openshift-marketplace/redhat-marketplace-rwnxp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-rwnxp\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:38 crc kubenswrapper[4696]: I1202 22:46:38.542476 4696 status_manager.go:851] "Failed to get status for pod" podUID="4b259842-7ad5-413d-a13c-73c0931b6527" pod="openshift-marketplace/certified-operators-xhnsb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-xhnsb\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:38 crc kubenswrapper[4696]: I1202 22:46:38.543132 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kf9wn" 
event={"ID":"e2e0ae4b-0dce-4a74-8d3c-05635e86392b","Type":"ContainerStarted","Data":"62be061f45d5db9cf4d7f74440ee4d09c088c74298e8482d8d66d4e35b4ccb9a"} Dec 02 22:46:38 crc kubenswrapper[4696]: I1202 22:46:38.543784 4696 status_manager.go:851] "Failed to get status for pod" podUID="e2e0ae4b-0dce-4a74-8d3c-05635e86392b" pod="openshift-marketplace/redhat-operators-kf9wn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-kf9wn\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:38 crc kubenswrapper[4696]: I1202 22:46:38.544280 4696 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:38 crc kubenswrapper[4696]: I1202 22:46:38.544543 4696 status_manager.go:851] "Failed to get status for pod" podUID="142e360e-5c5a-42af-b077-f75a807dea45" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:38 crc kubenswrapper[4696]: I1202 22:46:38.544884 4696 status_manager.go:851] "Failed to get status for pod" podUID="a04370b2-4b6e-4eac-9ffe-4f2e7116b03b" pod="openshift-marketplace/redhat-marketplace-rwnxp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-rwnxp\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:38 crc kubenswrapper[4696]: I1202 22:46:38.545151 4696 status_manager.go:851] "Failed to get status for pod" podUID="4b259842-7ad5-413d-a13c-73c0931b6527" pod="openshift-marketplace/certified-operators-xhnsb" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-xhnsb\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:38 crc kubenswrapper[4696]: I1202 22:46:38.546174 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 02 22:46:38 crc kubenswrapper[4696]: I1202 22:46:38.547081 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 22:46:38 crc kubenswrapper[4696]: I1202 22:46:38.553369 4696 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:38 crc kubenswrapper[4696]: I1202 22:46:38.553663 4696 status_manager.go:851] "Failed to get status for pod" podUID="142e360e-5c5a-42af-b077-f75a807dea45" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:38 crc kubenswrapper[4696]: I1202 22:46:38.555029 4696 status_manager.go:851] "Failed to get status for pod" podUID="a04370b2-4b6e-4eac-9ffe-4f2e7116b03b" pod="openshift-marketplace/redhat-marketplace-rwnxp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-rwnxp\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:38 crc kubenswrapper[4696]: I1202 22:46:38.555581 4696 status_manager.go:851] "Failed to get status for pod" podUID="4b259842-7ad5-413d-a13c-73c0931b6527" pod="openshift-marketplace/certified-operators-xhnsb" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-xhnsb\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:38 crc kubenswrapper[4696]: I1202 22:46:38.555916 4696 status_manager.go:851] "Failed to get status for pod" podUID="e2e0ae4b-0dce-4a74-8d3c-05635e86392b" pod="openshift-marketplace/redhat-operators-kf9wn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-kf9wn\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:38 crc kubenswrapper[4696]: I1202 22:46:38.557139 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-42w2d" event={"ID":"4eb44562-c608-428a-8812-428de40cbcde","Type":"ContainerStarted","Data":"f81090f4798d9d290574104f77fffb9daf1941b982ee7382a50e74f9a09aa61c"} Dec 02 22:46:38 crc kubenswrapper[4696]: I1202 22:46:38.558056 4696 status_manager.go:851] "Failed to get status for pod" podUID="e2e0ae4b-0dce-4a74-8d3c-05635e86392b" pod="openshift-marketplace/redhat-operators-kf9wn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-kf9wn\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:38 crc kubenswrapper[4696]: I1202 22:46:38.558472 4696 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:38 crc kubenswrapper[4696]: I1202 22:46:38.559876 4696 status_manager.go:851] "Failed to get status for pod" podUID="142e360e-5c5a-42af-b077-f75a807dea45" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 
38.102.83.9:6443: connect: connection refused" Dec 02 22:46:38 crc kubenswrapper[4696]: I1202 22:46:38.559898 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w765j" event={"ID":"2b3474b0-8824-4cb5-8ef3-c459af98ed02","Type":"ContainerStarted","Data":"4682bb317b4efe562916ddffb565817f84219362984a7671695719184cd68de4"} Dec 02 22:46:38 crc kubenswrapper[4696]: I1202 22:46:38.560213 4696 status_manager.go:851] "Failed to get status for pod" podUID="a04370b2-4b6e-4eac-9ffe-4f2e7116b03b" pod="openshift-marketplace/redhat-marketplace-rwnxp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-rwnxp\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:38 crc kubenswrapper[4696]: I1202 22:46:38.560432 4696 status_manager.go:851] "Failed to get status for pod" podUID="4b259842-7ad5-413d-a13c-73c0931b6527" pod="openshift-marketplace/certified-operators-xhnsb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-xhnsb\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:38 crc kubenswrapper[4696]: I1202 22:46:38.560668 4696 status_manager.go:851] "Failed to get status for pod" podUID="4eb44562-c608-428a-8812-428de40cbcde" pod="openshift-marketplace/community-operators-42w2d" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-42w2d\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:38 crc kubenswrapper[4696]: I1202 22:46:38.560953 4696 status_manager.go:851] "Failed to get status for pod" podUID="e2e0ae4b-0dce-4a74-8d3c-05635e86392b" pod="openshift-marketplace/redhat-operators-kf9wn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-kf9wn\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:38 crc kubenswrapper[4696]: I1202 
22:46:38.561147 4696 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:38 crc kubenswrapper[4696]: I1202 22:46:38.561354 4696 status_manager.go:851] "Failed to get status for pod" podUID="142e360e-5c5a-42af-b077-f75a807dea45" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:38 crc kubenswrapper[4696]: I1202 22:46:38.561561 4696 status_manager.go:851] "Failed to get status for pod" podUID="a04370b2-4b6e-4eac-9ffe-4f2e7116b03b" pod="openshift-marketplace/redhat-marketplace-rwnxp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-rwnxp\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:38 crc kubenswrapper[4696]: I1202 22:46:38.561806 4696 status_manager.go:851] "Failed to get status for pod" podUID="4b259842-7ad5-413d-a13c-73c0931b6527" pod="openshift-marketplace/certified-operators-xhnsb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-xhnsb\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:38 crc kubenswrapper[4696]: I1202 22:46:38.562067 4696 status_manager.go:851] "Failed to get status for pod" podUID="2b3474b0-8824-4cb5-8ef3-c459af98ed02" pod="openshift-marketplace/redhat-operators-w765j" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-w765j\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:38 crc kubenswrapper[4696]: I1202 22:46:38.562448 4696 status_manager.go:851] "Failed 
to get status for pod" podUID="4eb44562-c608-428a-8812-428de40cbcde" pod="openshift-marketplace/community-operators-42w2d" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-42w2d\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:38 crc kubenswrapper[4696]: I1202 22:46:38.596521 4696 status_manager.go:851] "Failed to get status for pod" podUID="e2e0ae4b-0dce-4a74-8d3c-05635e86392b" pod="openshift-marketplace/redhat-operators-kf9wn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-kf9wn\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:38 crc kubenswrapper[4696]: I1202 22:46:38.596794 4696 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:38 crc kubenswrapper[4696]: I1202 22:46:38.597147 4696 status_manager.go:851] "Failed to get status for pod" podUID="142e360e-5c5a-42af-b077-f75a807dea45" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:38 crc kubenswrapper[4696]: I1202 22:46:38.597695 4696 status_manager.go:851] "Failed to get status for pod" podUID="a04370b2-4b6e-4eac-9ffe-4f2e7116b03b" pod="openshift-marketplace/redhat-marketplace-rwnxp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-rwnxp\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:38 crc kubenswrapper[4696]: I1202 22:46:38.597957 4696 status_manager.go:851] "Failed to get status for pod" 
podUID="2b3474b0-8824-4cb5-8ef3-c459af98ed02" pod="openshift-marketplace/redhat-operators-w765j" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-w765j\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:38 crc kubenswrapper[4696]: I1202 22:46:38.598118 4696 status_manager.go:851] "Failed to get status for pod" podUID="4b259842-7ad5-413d-a13c-73c0931b6527" pod="openshift-marketplace/certified-operators-xhnsb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-xhnsb\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:38 crc kubenswrapper[4696]: I1202 22:46:38.598263 4696 status_manager.go:851] "Failed to get status for pod" podUID="4eb44562-c608-428a-8812-428de40cbcde" pod="openshift-marketplace/community-operators-42w2d" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-42w2d\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:38 crc kubenswrapper[4696]: I1202 22:46:38.598447 4696 status_manager.go:851] "Failed to get status for pod" podUID="e2e0ae4b-0dce-4a74-8d3c-05635e86392b" pod="openshift-marketplace/redhat-operators-kf9wn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-kf9wn\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:38 crc kubenswrapper[4696]: I1202 22:46:38.598975 4696 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:38 crc kubenswrapper[4696]: I1202 22:46:38.599145 4696 status_manager.go:851] "Failed to get status for pod" 
podUID="142e360e-5c5a-42af-b077-f75a807dea45" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:38 crc kubenswrapper[4696]: I1202 22:46:38.599313 4696 status_manager.go:851] "Failed to get status for pod" podUID="a04370b2-4b6e-4eac-9ffe-4f2e7116b03b" pod="openshift-marketplace/redhat-marketplace-rwnxp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-rwnxp\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:38 crc kubenswrapper[4696]: I1202 22:46:38.599542 4696 status_manager.go:851] "Failed to get status for pod" podUID="4b259842-7ad5-413d-a13c-73c0931b6527" pod="openshift-marketplace/certified-operators-xhnsb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-xhnsb\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:38 crc kubenswrapper[4696]: I1202 22:46:38.599720 4696 status_manager.go:851] "Failed to get status for pod" podUID="2b3474b0-8824-4cb5-8ef3-c459af98ed02" pod="openshift-marketplace/redhat-operators-w765j" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-w765j\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:38 crc kubenswrapper[4696]: I1202 22:46:38.600159 4696 status_manager.go:851] "Failed to get status for pod" podUID="4eb44562-c608-428a-8812-428de40cbcde" pod="openshift-marketplace/community-operators-42w2d" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-42w2d\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:38 crc kubenswrapper[4696]: I1202 22:46:38.610026 4696 scope.go:117] "RemoveContainer" 
containerID="0838dc5ea48f8ffb4ce2a96f34d93cc8c8c5990eb3c65d33967ba59f45131132" Dec 02 22:46:38 crc kubenswrapper[4696]: I1202 22:46:38.633011 4696 scope.go:117] "RemoveContainer" containerID="1e16d8d26bb7406919209731c1caca523976645570dc7f95a1af9014a627a8e4" Dec 02 22:46:38 crc kubenswrapper[4696]: I1202 22:46:38.654910 4696 scope.go:117] "RemoveContainer" containerID="9d92ce9d53d9367503c76c5fbb45f340a5ddd93f3c8e1738ac434fe7b0210d1d" Dec 02 22:46:38 crc kubenswrapper[4696]: I1202 22:46:38.678991 4696 scope.go:117] "RemoveContainer" containerID="666f81c81a7cad67c01b2aebf2d05f162c33feea27f64c12f92761d6d2a6056e" Dec 02 22:46:38 crc kubenswrapper[4696]: I1202 22:46:38.691849 4696 scope.go:117] "RemoveContainer" containerID="ce744ea765d5bbcb0a9926f439a0256774c16863344f23ddfb0074b4ccbdb4ad" Dec 02 22:46:38 crc kubenswrapper[4696]: I1202 22:46:38.703505 4696 scope.go:117] "RemoveContainer" containerID="1108d710330b056e2228205b2012df070ffbd660701ddd577eafbf0e99c3bf9f" Dec 02 22:46:38 crc kubenswrapper[4696]: I1202 22:46:38.721072 4696 scope.go:117] "RemoveContainer" containerID="d3112751f86a62634d02f281d467d3eeb27878e27c245871210cca37eaeb6e53" Dec 02 22:46:38 crc kubenswrapper[4696]: I1202 22:46:38.743293 4696 scope.go:117] "RemoveContainer" containerID="055bef75732eec4908f0baf8a52cd3921fab03cf88e36d2e0d442344fb1af0c5" Dec 02 22:46:39 crc kubenswrapper[4696]: I1202 22:46:39.316309 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-42w2d" Dec 02 22:46:39 crc kubenswrapper[4696]: I1202 22:46:39.316772 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-42w2d" Dec 02 22:46:39 crc kubenswrapper[4696]: I1202 22:46:39.439651 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Dec 02 22:46:39 crc kubenswrapper[4696]: I1202 
22:46:39.567679 4696 generic.go:334] "Generic (PLEG): container finished" podID="e2e0ae4b-0dce-4a74-8d3c-05635e86392b" containerID="62be061f45d5db9cf4d7f74440ee4d09c088c74298e8482d8d66d4e35b4ccb9a" exitCode=0 Dec 02 22:46:39 crc kubenswrapper[4696]: I1202 22:46:39.567730 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kf9wn" event={"ID":"e2e0ae4b-0dce-4a74-8d3c-05635e86392b","Type":"ContainerDied","Data":"62be061f45d5db9cf4d7f74440ee4d09c088c74298e8482d8d66d4e35b4ccb9a"} Dec 02 22:46:39 crc kubenswrapper[4696]: I1202 22:46:39.569750 4696 status_manager.go:851] "Failed to get status for pod" podUID="142e360e-5c5a-42af-b077-f75a807dea45" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:39 crc kubenswrapper[4696]: I1202 22:46:39.570503 4696 status_manager.go:851] "Failed to get status for pod" podUID="a04370b2-4b6e-4eac-9ffe-4f2e7116b03b" pod="openshift-marketplace/redhat-marketplace-rwnxp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-rwnxp\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:39 crc kubenswrapper[4696]: I1202 22:46:39.571222 4696 status_manager.go:851] "Failed to get status for pod" podUID="4b259842-7ad5-413d-a13c-73c0931b6527" pod="openshift-marketplace/certified-operators-xhnsb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-xhnsb\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:39 crc kubenswrapper[4696]: I1202 22:46:39.571488 4696 status_manager.go:851] "Failed to get status for pod" podUID="2b3474b0-8824-4cb5-8ef3-c459af98ed02" pod="openshift-marketplace/redhat-operators-w765j" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-w765j\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:39 crc kubenswrapper[4696]: I1202 22:46:39.572067 4696 status_manager.go:851] "Failed to get status for pod" podUID="4eb44562-c608-428a-8812-428de40cbcde" pod="openshift-marketplace/community-operators-42w2d" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-42w2d\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:39 crc kubenswrapper[4696]: I1202 22:46:39.573824 4696 status_manager.go:851] "Failed to get status for pod" podUID="e2e0ae4b-0dce-4a74-8d3c-05635e86392b" pod="openshift-marketplace/redhat-operators-kf9wn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-kf9wn\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:39 crc kubenswrapper[4696]: E1202 22:46:39.601454 4696 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:39 crc kubenswrapper[4696]: E1202 22:46:39.602191 4696 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:39 crc kubenswrapper[4696]: E1202 22:46:39.602760 4696 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:39 crc kubenswrapper[4696]: E1202 22:46:39.603097 4696 controller.go:195] "Failed to update lease" err="Put 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:39 crc kubenswrapper[4696]: E1202 22:46:39.603389 4696 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:39 crc kubenswrapper[4696]: I1202 22:46:39.603413 4696 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Dec 02 22:46:39 crc kubenswrapper[4696]: E1202 22:46:39.603634 4696 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.9:6443: connect: connection refused" interval="200ms" Dec 02 22:46:39 crc kubenswrapper[4696]: E1202 22:46:39.805126 4696 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.9:6443: connect: connection refused" interval="400ms" Dec 02 22:46:40 crc kubenswrapper[4696]: E1202 22:46:40.206330 4696 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.9:6443: connect: connection refused" interval="800ms" Dec 02 22:46:40 crc kubenswrapper[4696]: E1202 22:46:40.244015 4696 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:46:40Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:46:40Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:46:40Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:46:40Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[],\\\"sizeBytes\\\":1609784638},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1f2a98779239dfafa38d3fb89250a2691f75894c155b5c43fcc421a653bf9273\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:6a549becfb2bf10c272884c5858c442eeaa5b3eb8a726dc460b0a79d0164f7ed\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1204220237},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:3bb6e76bb2fc875de6aae69
09205aad0af8b2a476f3b7e31f64d5ae8e6659572\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:54a0b5857af1053fc62860dff0f0cb8f974ab781ba9fc5722277c34ef2a16b4e\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1201277260},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:e8990432556acad31519b1a73ec32f32d27c2034cf9e5cc4db8980efc7331594\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:ebe9f523f5c211a3a0f2570331dddcd5be15b12c1fecd9b8b121f881bfaad029\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1129027903},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7
c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/o
cp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d
85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\
\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:40 crc kubenswrapper[4696]: E1202 22:46:40.244627 4696 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:40 crc kubenswrapper[4696]: E1202 22:46:40.245131 4696 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:40 crc kubenswrapper[4696]: E1202 22:46:40.245632 4696 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:40 crc kubenswrapper[4696]: E1202 22:46:40.246419 4696 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:40 crc kubenswrapper[4696]: E1202 22:46:40.246662 4696 kubelet_node_status.go:572] "Unable to update node 
status" err="update node status exceeds retry count" Dec 02 22:46:40 crc kubenswrapper[4696]: I1202 22:46:40.351396 4696 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-42w2d" podUID="4eb44562-c608-428a-8812-428de40cbcde" containerName="registry-server" probeResult="failure" output=< Dec 02 22:46:40 crc kubenswrapper[4696]: timeout: failed to connect service ":50051" within 1s Dec 02 22:46:40 crc kubenswrapper[4696]: > Dec 02 22:46:40 crc kubenswrapper[4696]: E1202 22:46:40.477945 4696 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/events\": dial tcp 38.102.83.9:6443: connect: connection refused" event="&Event{ObjectMeta:{community-operators-42w2d.187d877c87c5bc7c openshift-marketplace 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-marketplace,Name:community-operators-42w2d,UID:4eb44562-c608-428a-8812-428de40cbcde,APIVersion:v1,ResourceVersion:28404,FieldPath:spec.containers{registry-server},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\" in 32.169s (32.169s including waiting). 
Image size: 907837715 bytes.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-02 22:46:37.922679932 +0000 UTC m=+260.803359933,LastTimestamp:2025-12-02 22:46:37.922679932 +0000 UTC m=+260.803359933,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 02 22:46:40 crc kubenswrapper[4696]: I1202 22:46:40.587240 4696 generic.go:334] "Generic (PLEG): container finished" podID="2b3474b0-8824-4cb5-8ef3-c459af98ed02" containerID="4682bb317b4efe562916ddffb565817f84219362984a7671695719184cd68de4" exitCode=0 Dec 02 22:46:40 crc kubenswrapper[4696]: I1202 22:46:40.587378 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w765j" event={"ID":"2b3474b0-8824-4cb5-8ef3-c459af98ed02","Type":"ContainerDied","Data":"4682bb317b4efe562916ddffb565817f84219362984a7671695719184cd68de4"} Dec 02 22:46:40 crc kubenswrapper[4696]: I1202 22:46:40.589027 4696 status_manager.go:851] "Failed to get status for pod" podUID="e2e0ae4b-0dce-4a74-8d3c-05635e86392b" pod="openshift-marketplace/redhat-operators-kf9wn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-kf9wn\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:40 crc kubenswrapper[4696]: I1202 22:46:40.589789 4696 status_manager.go:851] "Failed to get status for pod" podUID="142e360e-5c5a-42af-b077-f75a807dea45" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:40 crc kubenswrapper[4696]: I1202 22:46:40.590824 4696 status_manager.go:851] "Failed to get status for pod" podUID="a04370b2-4b6e-4eac-9ffe-4f2e7116b03b" pod="openshift-marketplace/redhat-marketplace-rwnxp" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-rwnxp\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:40 crc kubenswrapper[4696]: I1202 22:46:40.591477 4696 status_manager.go:851] "Failed to get status for pod" podUID="4b259842-7ad5-413d-a13c-73c0931b6527" pod="openshift-marketplace/certified-operators-xhnsb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-xhnsb\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:40 crc kubenswrapper[4696]: I1202 22:46:40.592073 4696 status_manager.go:851] "Failed to get status for pod" podUID="2b3474b0-8824-4cb5-8ef3-c459af98ed02" pod="openshift-marketplace/redhat-operators-w765j" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-w765j\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:40 crc kubenswrapper[4696]: I1202 22:46:40.592596 4696 status_manager.go:851] "Failed to get status for pod" podUID="4eb44562-c608-428a-8812-428de40cbcde" pod="openshift-marketplace/community-operators-42w2d" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-42w2d\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:41 crc kubenswrapper[4696]: E1202 22:46:41.008272 4696 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.9:6443: connect: connection refused" interval="1.6s" Dec 02 22:46:42 crc kubenswrapper[4696]: E1202 22:46:42.609172 4696 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.9:6443: connect: connection refused" 
interval="3.2s" Dec 02 22:46:44 crc kubenswrapper[4696]: I1202 22:46:44.616199 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rwnxp" event={"ID":"a04370b2-4b6e-4eac-9ffe-4f2e7116b03b","Type":"ContainerStarted","Data":"2c20a3dcbcfe62ef47620c53d267eaf34825c5ee0eeaf7e0415ce71886b4a585"} Dec 02 22:46:45 crc kubenswrapper[4696]: E1202 22:46:45.471042 4696 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.9:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-rjkp2" volumeName="registry-storage" Dec 02 22:46:45 crc kubenswrapper[4696]: I1202 22:46:45.630133 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 02 22:46:45 crc kubenswrapper[4696]: I1202 22:46:45.630218 4696 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="67bf899df90f18f36310994af135552aa9758f0460117d62d00fdc1de067ce79" exitCode=1 Dec 02 22:46:45 crc kubenswrapper[4696]: I1202 22:46:45.630316 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"67bf899df90f18f36310994af135552aa9758f0460117d62d00fdc1de067ce79"} Dec 02 22:46:45 crc kubenswrapper[4696]: I1202 22:46:45.631252 4696 scope.go:117] "RemoveContainer" containerID="67bf899df90f18f36310994af135552aa9758f0460117d62d00fdc1de067ce79" Dec 02 22:46:45 crc kubenswrapper[4696]: I1202 22:46:45.631355 4696 status_manager.go:851] "Failed to get 
status for pod" podUID="142e360e-5c5a-42af-b077-f75a807dea45" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:45 crc kubenswrapper[4696]: I1202 22:46:45.631929 4696 status_manager.go:851] "Failed to get status for pod" podUID="a04370b2-4b6e-4eac-9ffe-4f2e7116b03b" pod="openshift-marketplace/redhat-marketplace-rwnxp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-rwnxp\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:45 crc kubenswrapper[4696]: I1202 22:46:45.632476 4696 status_manager.go:851] "Failed to get status for pod" podUID="4b259842-7ad5-413d-a13c-73c0931b6527" pod="openshift-marketplace/certified-operators-xhnsb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-xhnsb\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:45 crc kubenswrapper[4696]: I1202 22:46:45.633292 4696 status_manager.go:851] "Failed to get status for pod" podUID="2b3474b0-8824-4cb5-8ef3-c459af98ed02" pod="openshift-marketplace/redhat-operators-w765j" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-w765j\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:45 crc kubenswrapper[4696]: I1202 22:46:45.633649 4696 status_manager.go:851] "Failed to get status for pod" podUID="4eb44562-c608-428a-8812-428de40cbcde" pod="openshift-marketplace/community-operators-42w2d" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-42w2d\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:45 crc kubenswrapper[4696]: I1202 22:46:45.634149 4696 status_manager.go:851] "Failed to get status for pod" 
podUID="e2e0ae4b-0dce-4a74-8d3c-05635e86392b" pod="openshift-marketplace/redhat-operators-kf9wn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-kf9wn\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:45 crc kubenswrapper[4696]: I1202 22:46:45.635260 4696 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:45 crc kubenswrapper[4696]: I1202 22:46:45.635632 4696 status_manager.go:851] "Failed to get status for pod" podUID="142e360e-5c5a-42af-b077-f75a807dea45" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:45 crc kubenswrapper[4696]: I1202 22:46:45.635980 4696 status_manager.go:851] "Failed to get status for pod" podUID="a04370b2-4b6e-4eac-9ffe-4f2e7116b03b" pod="openshift-marketplace/redhat-marketplace-rwnxp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-rwnxp\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:45 crc kubenswrapper[4696]: I1202 22:46:45.636402 4696 status_manager.go:851] "Failed to get status for pod" podUID="4b259842-7ad5-413d-a13c-73c0931b6527" pod="openshift-marketplace/certified-operators-xhnsb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-xhnsb\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:45 crc kubenswrapper[4696]: I1202 22:46:45.636800 4696 status_manager.go:851] "Failed to get status for pod" 
podUID="2b3474b0-8824-4cb5-8ef3-c459af98ed02" pod="openshift-marketplace/redhat-operators-w765j" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-w765j\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:45 crc kubenswrapper[4696]: I1202 22:46:45.637110 4696 status_manager.go:851] "Failed to get status for pod" podUID="4eb44562-c608-428a-8812-428de40cbcde" pod="openshift-marketplace/community-operators-42w2d" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-42w2d\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:45 crc kubenswrapper[4696]: I1202 22:46:45.637575 4696 status_manager.go:851] "Failed to get status for pod" podUID="e2e0ae4b-0dce-4a74-8d3c-05635e86392b" pod="openshift-marketplace/redhat-operators-kf9wn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-kf9wn\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:45 crc kubenswrapper[4696]: E1202 22:46:45.814678 4696 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.9:6443: connect: connection refused" interval="6.4s" Dec 02 22:46:46 crc kubenswrapper[4696]: I1202 22:46:46.430941 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 22:46:46 crc kubenswrapper[4696]: I1202 22:46:46.432346 4696 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:46 crc kubenswrapper[4696]: I1202 22:46:46.432893 4696 status_manager.go:851] "Failed to get status for pod" podUID="142e360e-5c5a-42af-b077-f75a807dea45" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:46 crc kubenswrapper[4696]: I1202 22:46:46.433549 4696 status_manager.go:851] "Failed to get status for pod" podUID="a04370b2-4b6e-4eac-9ffe-4f2e7116b03b" pod="openshift-marketplace/redhat-marketplace-rwnxp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-rwnxp\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:46 crc kubenswrapper[4696]: I1202 22:46:46.434008 4696 status_manager.go:851] "Failed to get status for pod" podUID="4b259842-7ad5-413d-a13c-73c0931b6527" pod="openshift-marketplace/certified-operators-xhnsb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-xhnsb\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:46 crc kubenswrapper[4696]: I1202 22:46:46.434362 4696 status_manager.go:851] "Failed to get status for pod" podUID="2b3474b0-8824-4cb5-8ef3-c459af98ed02" pod="openshift-marketplace/redhat-operators-w765j" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-w765j\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:46 crc kubenswrapper[4696]: I1202 22:46:46.434689 4696 status_manager.go:851] "Failed to get status for pod" podUID="4eb44562-c608-428a-8812-428de40cbcde" pod="openshift-marketplace/community-operators-42w2d" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-42w2d\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:46 crc kubenswrapper[4696]: I1202 22:46:46.435149 4696 status_manager.go:851] "Failed to get status for pod" podUID="e2e0ae4b-0dce-4a74-8d3c-05635e86392b" pod="openshift-marketplace/redhat-operators-kf9wn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-kf9wn\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:46 crc kubenswrapper[4696]: I1202 22:46:46.449812 4696 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="158d80a5-8d90-46bb-83e7-ab3263f6d28f" Dec 02 22:46:46 crc kubenswrapper[4696]: I1202 22:46:46.449849 4696 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="158d80a5-8d90-46bb-83e7-ab3263f6d28f" Dec 02 22:46:46 crc kubenswrapper[4696]: E1202 22:46:46.450323 4696 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 22:46:46 crc kubenswrapper[4696]: I1202 22:46:46.450959 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 22:46:46 crc kubenswrapper[4696]: W1202 22:46:46.478891 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-f069c8a72083822d7af5a5a65719ab267bbd17fa5fb31c755c658e8aa54c36db WatchSource:0}: Error finding container f069c8a72083822d7af5a5a65719ab267bbd17fa5fb31c755c658e8aa54c36db: Status 404 returned error can't find the container with id f069c8a72083822d7af5a5a65719ab267bbd17fa5fb31c755c658e8aa54c36db Dec 02 22:46:46 crc kubenswrapper[4696]: I1202 22:46:46.656883 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kf9wn" event={"ID":"e2e0ae4b-0dce-4a74-8d3c-05635e86392b","Type":"ContainerStarted","Data":"3f0a2a5970794e019722167e07207f98b22a1820a9257f220a1b273043dd936f"} Dec 02 22:46:46 crc kubenswrapper[4696]: I1202 22:46:46.660493 4696 status_manager.go:851] "Failed to get status for pod" podUID="e2e0ae4b-0dce-4a74-8d3c-05635e86392b" pod="openshift-marketplace/redhat-operators-kf9wn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-kf9wn\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:46 crc kubenswrapper[4696]: I1202 22:46:46.660957 4696 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:46 crc kubenswrapper[4696]: I1202 22:46:46.661248 4696 status_manager.go:851] "Failed to get status for pod" podUID="142e360e-5c5a-42af-b077-f75a807dea45" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:46 crc kubenswrapper[4696]: I1202 22:46:46.661542 4696 status_manager.go:851] "Failed to get status for pod" podUID="a04370b2-4b6e-4eac-9ffe-4f2e7116b03b" pod="openshift-marketplace/redhat-marketplace-rwnxp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-rwnxp\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:46 crc kubenswrapper[4696]: I1202 22:46:46.661871 4696 status_manager.go:851] "Failed to get status for pod" podUID="4b259842-7ad5-413d-a13c-73c0931b6527" pod="openshift-marketplace/certified-operators-xhnsb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-xhnsb\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:46 crc kubenswrapper[4696]: I1202 22:46:46.662138 4696 status_manager.go:851] "Failed to get status for pod" podUID="2b3474b0-8824-4cb5-8ef3-c459af98ed02" pod="openshift-marketplace/redhat-operators-w765j" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-w765j\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:46 crc kubenswrapper[4696]: I1202 22:46:46.662392 4696 status_manager.go:851] "Failed to get status for pod" podUID="4eb44562-c608-428a-8812-428de40cbcde" pod="openshift-marketplace/community-operators-42w2d" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-42w2d\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:46 crc kubenswrapper[4696]: I1202 22:46:46.666493 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 02 
22:46:46 crc kubenswrapper[4696]: I1202 22:46:46.666582 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"ab7fbaab5e3747f4ba2b91ed1e133c23faab1dde758031a8e6230e2c8d8b5ca1"} Dec 02 22:46:46 crc kubenswrapper[4696]: I1202 22:46:46.667851 4696 status_manager.go:851] "Failed to get status for pod" podUID="e2e0ae4b-0dce-4a74-8d3c-05635e86392b" pod="openshift-marketplace/redhat-operators-kf9wn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-kf9wn\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:46 crc kubenswrapper[4696]: I1202 22:46:46.668177 4696 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:46 crc kubenswrapper[4696]: I1202 22:46:46.668518 4696 status_manager.go:851] "Failed to get status for pod" podUID="142e360e-5c5a-42af-b077-f75a807dea45" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:46 crc kubenswrapper[4696]: I1202 22:46:46.668786 4696 status_manager.go:851] "Failed to get status for pod" podUID="a04370b2-4b6e-4eac-9ffe-4f2e7116b03b" pod="openshift-marketplace/redhat-marketplace-rwnxp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-rwnxp\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:46 crc kubenswrapper[4696]: I1202 22:46:46.669309 4696 
status_manager.go:851] "Failed to get status for pod" podUID="4b259842-7ad5-413d-a13c-73c0931b6527" pod="openshift-marketplace/certified-operators-xhnsb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-xhnsb\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:46 crc kubenswrapper[4696]: I1202 22:46:46.669449 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w765j" event={"ID":"2b3474b0-8824-4cb5-8ef3-c459af98ed02","Type":"ContainerStarted","Data":"fa734de885825991e09dfacf903386b718bc853eb4143741612552807fff6ad2"} Dec 02 22:46:46 crc kubenswrapper[4696]: I1202 22:46:46.669937 4696 status_manager.go:851] "Failed to get status for pod" podUID="2b3474b0-8824-4cb5-8ef3-c459af98ed02" pod="openshift-marketplace/redhat-operators-w765j" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-w765j\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:46 crc kubenswrapper[4696]: I1202 22:46:46.670231 4696 status_manager.go:851] "Failed to get status for pod" podUID="4eb44562-c608-428a-8812-428de40cbcde" pod="openshift-marketplace/community-operators-42w2d" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-42w2d\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:46 crc kubenswrapper[4696]: I1202 22:46:46.670484 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"f069c8a72083822d7af5a5a65719ab267bbd17fa5fb31c755c658e8aa54c36db"} Dec 02 22:46:46 crc kubenswrapper[4696]: I1202 22:46:46.670830 4696 status_manager.go:851] "Failed to get status for pod" podUID="e2e0ae4b-0dce-4a74-8d3c-05635e86392b" pod="openshift-marketplace/redhat-operators-kf9wn" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-kf9wn\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:46 crc kubenswrapper[4696]: I1202 22:46:46.671056 4696 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:46 crc kubenswrapper[4696]: I1202 22:46:46.671257 4696 status_manager.go:851] "Failed to get status for pod" podUID="142e360e-5c5a-42af-b077-f75a807dea45" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:46 crc kubenswrapper[4696]: I1202 22:46:46.671751 4696 status_manager.go:851] "Failed to get status for pod" podUID="a04370b2-4b6e-4eac-9ffe-4f2e7116b03b" pod="openshift-marketplace/redhat-marketplace-rwnxp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-rwnxp\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:46 crc kubenswrapper[4696]: I1202 22:46:46.672116 4696 status_manager.go:851] "Failed to get status for pod" podUID="2b3474b0-8824-4cb5-8ef3-c459af98ed02" pod="openshift-marketplace/redhat-operators-w765j" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-w765j\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:46 crc kubenswrapper[4696]: I1202 22:46:46.672413 4696 status_manager.go:851] "Failed to get status for pod" podUID="4b259842-7ad5-413d-a13c-73c0931b6527" pod="openshift-marketplace/certified-operators-xhnsb" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-xhnsb\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:46 crc kubenswrapper[4696]: I1202 22:46:46.672689 4696 status_manager.go:851] "Failed to get status for pod" podUID="4eb44562-c608-428a-8812-428de40cbcde" pod="openshift-marketplace/community-operators-42w2d" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-42w2d\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:47 crc kubenswrapper[4696]: I1202 22:46:47.334587 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 22:46:47 crc kubenswrapper[4696]: I1202 22:46:47.436714 4696 status_manager.go:851] "Failed to get status for pod" podUID="e2e0ae4b-0dce-4a74-8d3c-05635e86392b" pod="openshift-marketplace/redhat-operators-kf9wn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-kf9wn\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:47 crc kubenswrapper[4696]: I1202 22:46:47.436997 4696 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:47 crc kubenswrapper[4696]: I1202 22:46:47.437417 4696 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:47 crc kubenswrapper[4696]: 
I1202 22:46:47.438119 4696 status_manager.go:851] "Failed to get status for pod" podUID="142e360e-5c5a-42af-b077-f75a807dea45" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:47 crc kubenswrapper[4696]: I1202 22:46:47.438366 4696 status_manager.go:851] "Failed to get status for pod" podUID="a04370b2-4b6e-4eac-9ffe-4f2e7116b03b" pod="openshift-marketplace/redhat-marketplace-rwnxp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-rwnxp\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:47 crc kubenswrapper[4696]: I1202 22:46:47.438906 4696 status_manager.go:851] "Failed to get status for pod" podUID="4b259842-7ad5-413d-a13c-73c0931b6527" pod="openshift-marketplace/certified-operators-xhnsb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-xhnsb\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:47 crc kubenswrapper[4696]: I1202 22:46:47.439480 4696 status_manager.go:851] "Failed to get status for pod" podUID="2b3474b0-8824-4cb5-8ef3-c459af98ed02" pod="openshift-marketplace/redhat-operators-w765j" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-w765j\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:47 crc kubenswrapper[4696]: I1202 22:46:47.440228 4696 status_manager.go:851] "Failed to get status for pod" podUID="4eb44562-c608-428a-8812-428de40cbcde" pod="openshift-marketplace/community-operators-42w2d" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-42w2d\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:48 crc kubenswrapper[4696]: I1202 22:46:48.685853 4696 
generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="c360942e4cbdc2a65d7ac41c5dd4f5f970eb52cafe0dc9c58db2d1ae41d73161" exitCode=0 Dec 02 22:46:48 crc kubenswrapper[4696]: I1202 22:46:48.685951 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"c360942e4cbdc2a65d7ac41c5dd4f5f970eb52cafe0dc9c58db2d1ae41d73161"} Dec 02 22:46:48 crc kubenswrapper[4696]: I1202 22:46:48.686507 4696 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="158d80a5-8d90-46bb-83e7-ab3263f6d28f" Dec 02 22:46:48 crc kubenswrapper[4696]: I1202 22:46:48.686571 4696 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="158d80a5-8d90-46bb-83e7-ab3263f6d28f" Dec 02 22:46:48 crc kubenswrapper[4696]: I1202 22:46:48.687342 4696 status_manager.go:851] "Failed to get status for pod" podUID="a04370b2-4b6e-4eac-9ffe-4f2e7116b03b" pod="openshift-marketplace/redhat-marketplace-rwnxp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-rwnxp\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:48 crc kubenswrapper[4696]: E1202 22:46:48.687364 4696 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 22:46:48 crc kubenswrapper[4696]: I1202 22:46:48.687997 4696 status_manager.go:851] "Failed to get status for pod" podUID="2b3474b0-8824-4cb5-8ef3-c459af98ed02" pod="openshift-marketplace/redhat-operators-w765j" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-w765j\": dial tcp 
38.102.83.9:6443: connect: connection refused" Dec 02 22:46:48 crc kubenswrapper[4696]: I1202 22:46:48.688648 4696 status_manager.go:851] "Failed to get status for pod" podUID="4b259842-7ad5-413d-a13c-73c0931b6527" pod="openshift-marketplace/certified-operators-xhnsb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-xhnsb\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:48 crc kubenswrapper[4696]: I1202 22:46:48.689549 4696 status_manager.go:851] "Failed to get status for pod" podUID="4eb44562-c608-428a-8812-428de40cbcde" pod="openshift-marketplace/community-operators-42w2d" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-42w2d\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:48 crc kubenswrapper[4696]: I1202 22:46:48.690077 4696 status_manager.go:851] "Failed to get status for pod" podUID="e2e0ae4b-0dce-4a74-8d3c-05635e86392b" pod="openshift-marketplace/redhat-operators-kf9wn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-kf9wn\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:48 crc kubenswrapper[4696]: I1202 22:46:48.691012 4696 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:48 crc kubenswrapper[4696]: I1202 22:46:48.691549 4696 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 
38.102.83.9:6443: connect: connection refused" Dec 02 22:46:48 crc kubenswrapper[4696]: I1202 22:46:48.691892 4696 status_manager.go:851] "Failed to get status for pod" podUID="142e360e-5c5a-42af-b077-f75a807dea45" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Dec 02 22:46:49 crc kubenswrapper[4696]: I1202 22:46:49.379947 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-42w2d" Dec 02 22:46:49 crc kubenswrapper[4696]: I1202 22:46:49.427182 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-42w2d" Dec 02 22:46:49 crc kubenswrapper[4696]: I1202 22:46:49.696207 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"989fa6c251387fb14ca901e5bb6c04ea0c009f1103b04b1a9cd677b4cc6f98ad"} Dec 02 22:46:49 crc kubenswrapper[4696]: I1202 22:46:49.696251 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"af72161eabbff16bdcdd94075cb6987a14f10f203408e5cf542116cf75e4916e"} Dec 02 22:46:51 crc kubenswrapper[4696]: I1202 22:46:51.293937 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-rwnxp" Dec 02 22:46:51 crc kubenswrapper[4696]: I1202 22:46:51.294484 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-rwnxp" Dec 02 22:46:51 crc kubenswrapper[4696]: I1202 22:46:51.345136 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-marketplace-rwnxp" Dec 02 22:46:51 crc kubenswrapper[4696]: I1202 22:46:51.712459 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"2b8923a2e604ea1585400c6d5daa27c254f72dcb2983058cf7836866b63c1a0b"} Dec 02 22:46:51 crc kubenswrapper[4696]: I1202 22:46:51.763322 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-rwnxp" Dec 02 22:46:52 crc kubenswrapper[4696]: I1202 22:46:52.193346 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-kf9wn" Dec 02 22:46:52 crc kubenswrapper[4696]: I1202 22:46:52.193414 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-kf9wn" Dec 02 22:46:52 crc kubenswrapper[4696]: I1202 22:46:52.237312 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-kf9wn" Dec 02 22:46:52 crc kubenswrapper[4696]: I1202 22:46:52.513186 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-w765j" Dec 02 22:46:52 crc kubenswrapper[4696]: I1202 22:46:52.513776 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-w765j" Dec 02 22:46:52 crc kubenswrapper[4696]: I1202 22:46:52.570835 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-w765j" Dec 02 22:46:52 crc kubenswrapper[4696]: I1202 22:46:52.763381 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-w765j" Dec 02 22:46:52 crc kubenswrapper[4696]: I1202 22:46:52.776099 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-marketplace/redhat-operators-kf9wn" Dec 02 22:46:55 crc kubenswrapper[4696]: I1202 22:46:54.651392 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 22:46:55 crc kubenswrapper[4696]: I1202 22:46:54.659725 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 22:46:57 crc kubenswrapper[4696]: I1202 22:46:55.796355 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-v6k2l" podUID="93e854b6-0bab-4aa3-9d60-97542cd304eb" containerName="oauth-openshift" containerID="cri-o://5d139003bf42d5bd3f336efdcae1c11aedc0f7421f3d1df24f7fbcd28e477113" gracePeriod=15 Dec 02 22:46:57 crc kubenswrapper[4696]: I1202 22:46:56.752840 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"cd63355cdef47d376d5ccf82b30979c24dd97f432d9f8b161f00b0824c92d7fc"} Dec 02 22:46:57 crc kubenswrapper[4696]: I1202 22:46:57.343668 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 22:46:57 crc kubenswrapper[4696]: I1202 22:46:57.587243 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-v6k2l" Dec 02 22:46:57 crc kubenswrapper[4696]: I1202 22:46:57.734982 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/93e854b6-0bab-4aa3-9d60-97542cd304eb-v4-0-config-system-service-ca\") pod \"93e854b6-0bab-4aa3-9d60-97542cd304eb\" (UID: \"93e854b6-0bab-4aa3-9d60-97542cd304eb\") " Dec 02 22:46:57 crc kubenswrapper[4696]: I1202 22:46:57.735071 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/93e854b6-0bab-4aa3-9d60-97542cd304eb-v4-0-config-user-idp-0-file-data\") pod \"93e854b6-0bab-4aa3-9d60-97542cd304eb\" (UID: \"93e854b6-0bab-4aa3-9d60-97542cd304eb\") " Dec 02 22:46:57 crc kubenswrapper[4696]: I1202 22:46:57.735142 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/93e854b6-0bab-4aa3-9d60-97542cd304eb-v4-0-config-system-cliconfig\") pod \"93e854b6-0bab-4aa3-9d60-97542cd304eb\" (UID: \"93e854b6-0bab-4aa3-9d60-97542cd304eb\") " Dec 02 22:46:57 crc kubenswrapper[4696]: I1202 22:46:57.735181 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/93e854b6-0bab-4aa3-9d60-97542cd304eb-v4-0-config-system-router-certs\") pod \"93e854b6-0bab-4aa3-9d60-97542cd304eb\" (UID: \"93e854b6-0bab-4aa3-9d60-97542cd304eb\") " Dec 02 22:46:57 crc kubenswrapper[4696]: I1202 22:46:57.735240 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/93e854b6-0bab-4aa3-9d60-97542cd304eb-audit-policies\") pod \"93e854b6-0bab-4aa3-9d60-97542cd304eb\" (UID: \"93e854b6-0bab-4aa3-9d60-97542cd304eb\") " Dec 02 22:46:57 crc 
kubenswrapper[4696]: I1202 22:46:57.735287 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/93e854b6-0bab-4aa3-9d60-97542cd304eb-v4-0-config-system-serving-cert\") pod \"93e854b6-0bab-4aa3-9d60-97542cd304eb\" (UID: \"93e854b6-0bab-4aa3-9d60-97542cd304eb\") " Dec 02 22:46:57 crc kubenswrapper[4696]: I1202 22:46:57.735342 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/93e854b6-0bab-4aa3-9d60-97542cd304eb-v4-0-config-user-template-provider-selection\") pod \"93e854b6-0bab-4aa3-9d60-97542cd304eb\" (UID: \"93e854b6-0bab-4aa3-9d60-97542cd304eb\") " Dec 02 22:46:57 crc kubenswrapper[4696]: I1202 22:46:57.735415 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/93e854b6-0bab-4aa3-9d60-97542cd304eb-v4-0-config-user-template-error\") pod \"93e854b6-0bab-4aa3-9d60-97542cd304eb\" (UID: \"93e854b6-0bab-4aa3-9d60-97542cd304eb\") " Dec 02 22:46:57 crc kubenswrapper[4696]: I1202 22:46:57.735453 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/93e854b6-0bab-4aa3-9d60-97542cd304eb-v4-0-config-system-trusted-ca-bundle\") pod \"93e854b6-0bab-4aa3-9d60-97542cd304eb\" (UID: \"93e854b6-0bab-4aa3-9d60-97542cd304eb\") " Dec 02 22:46:57 crc kubenswrapper[4696]: I1202 22:46:57.735484 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/93e854b6-0bab-4aa3-9d60-97542cd304eb-audit-dir\") pod \"93e854b6-0bab-4aa3-9d60-97542cd304eb\" (UID: \"93e854b6-0bab-4aa3-9d60-97542cd304eb\") " Dec 02 22:46:57 crc kubenswrapper[4696]: I1202 22:46:57.735529 4696 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/93e854b6-0bab-4aa3-9d60-97542cd304eb-v4-0-config-user-template-login\") pod \"93e854b6-0bab-4aa3-9d60-97542cd304eb\" (UID: \"93e854b6-0bab-4aa3-9d60-97542cd304eb\") " Dec 02 22:46:57 crc kubenswrapper[4696]: I1202 22:46:57.735593 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rkss8\" (UniqueName: \"kubernetes.io/projected/93e854b6-0bab-4aa3-9d60-97542cd304eb-kube-api-access-rkss8\") pod \"93e854b6-0bab-4aa3-9d60-97542cd304eb\" (UID: \"93e854b6-0bab-4aa3-9d60-97542cd304eb\") " Dec 02 22:46:57 crc kubenswrapper[4696]: I1202 22:46:57.735634 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/93e854b6-0bab-4aa3-9d60-97542cd304eb-v4-0-config-system-ocp-branding-template\") pod \"93e854b6-0bab-4aa3-9d60-97542cd304eb\" (UID: \"93e854b6-0bab-4aa3-9d60-97542cd304eb\") " Dec 02 22:46:57 crc kubenswrapper[4696]: I1202 22:46:57.735672 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/93e854b6-0bab-4aa3-9d60-97542cd304eb-v4-0-config-system-session\") pod \"93e854b6-0bab-4aa3-9d60-97542cd304eb\" (UID: \"93e854b6-0bab-4aa3-9d60-97542cd304eb\") " Dec 02 22:46:57 crc kubenswrapper[4696]: I1202 22:46:57.736250 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/93e854b6-0bab-4aa3-9d60-97542cd304eb-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "93e854b6-0bab-4aa3-9d60-97542cd304eb" (UID: "93e854b6-0bab-4aa3-9d60-97542cd304eb"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 22:46:57 crc kubenswrapper[4696]: I1202 22:46:57.736342 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93e854b6-0bab-4aa3-9d60-97542cd304eb-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "93e854b6-0bab-4aa3-9d60-97542cd304eb" (UID: "93e854b6-0bab-4aa3-9d60-97542cd304eb"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:46:57 crc kubenswrapper[4696]: I1202 22:46:57.736510 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93e854b6-0bab-4aa3-9d60-97542cd304eb-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "93e854b6-0bab-4aa3-9d60-97542cd304eb" (UID: "93e854b6-0bab-4aa3-9d60-97542cd304eb"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:46:57 crc kubenswrapper[4696]: I1202 22:46:57.736787 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93e854b6-0bab-4aa3-9d60-97542cd304eb-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "93e854b6-0bab-4aa3-9d60-97542cd304eb" (UID: "93e854b6-0bab-4aa3-9d60-97542cd304eb"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:46:57 crc kubenswrapper[4696]: I1202 22:46:57.737087 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93e854b6-0bab-4aa3-9d60-97542cd304eb-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "93e854b6-0bab-4aa3-9d60-97542cd304eb" (UID: "93e854b6-0bab-4aa3-9d60-97542cd304eb"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:46:57 crc kubenswrapper[4696]: I1202 22:46:57.743198 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93e854b6-0bab-4aa3-9d60-97542cd304eb-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "93e854b6-0bab-4aa3-9d60-97542cd304eb" (UID: "93e854b6-0bab-4aa3-9d60-97542cd304eb"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:46:57 crc kubenswrapper[4696]: I1202 22:46:57.747969 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93e854b6-0bab-4aa3-9d60-97542cd304eb-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "93e854b6-0bab-4aa3-9d60-97542cd304eb" (UID: "93e854b6-0bab-4aa3-9d60-97542cd304eb"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:46:57 crc kubenswrapper[4696]: I1202 22:46:57.748419 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93e854b6-0bab-4aa3-9d60-97542cd304eb-kube-api-access-rkss8" (OuterVolumeSpecName: "kube-api-access-rkss8") pod "93e854b6-0bab-4aa3-9d60-97542cd304eb" (UID: "93e854b6-0bab-4aa3-9d60-97542cd304eb"). InnerVolumeSpecName "kube-api-access-rkss8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:46:57 crc kubenswrapper[4696]: I1202 22:46:57.748454 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93e854b6-0bab-4aa3-9d60-97542cd304eb-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "93e854b6-0bab-4aa3-9d60-97542cd304eb" (UID: "93e854b6-0bab-4aa3-9d60-97542cd304eb"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:46:57 crc kubenswrapper[4696]: I1202 22:46:57.749672 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93e854b6-0bab-4aa3-9d60-97542cd304eb-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "93e854b6-0bab-4aa3-9d60-97542cd304eb" (UID: "93e854b6-0bab-4aa3-9d60-97542cd304eb"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:46:57 crc kubenswrapper[4696]: I1202 22:46:57.751698 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93e854b6-0bab-4aa3-9d60-97542cd304eb-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "93e854b6-0bab-4aa3-9d60-97542cd304eb" (UID: "93e854b6-0bab-4aa3-9d60-97542cd304eb"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:46:57 crc kubenswrapper[4696]: I1202 22:46:57.752120 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93e854b6-0bab-4aa3-9d60-97542cd304eb-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "93e854b6-0bab-4aa3-9d60-97542cd304eb" (UID: "93e854b6-0bab-4aa3-9d60-97542cd304eb"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:46:57 crc kubenswrapper[4696]: I1202 22:46:57.752388 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93e854b6-0bab-4aa3-9d60-97542cd304eb-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "93e854b6-0bab-4aa3-9d60-97542cd304eb" (UID: "93e854b6-0bab-4aa3-9d60-97542cd304eb"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:46:57 crc kubenswrapper[4696]: I1202 22:46:57.752717 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93e854b6-0bab-4aa3-9d60-97542cd304eb-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "93e854b6-0bab-4aa3-9d60-97542cd304eb" (UID: "93e854b6-0bab-4aa3-9d60-97542cd304eb"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:46:57 crc kubenswrapper[4696]: I1202 22:46:57.763375 4696 generic.go:334] "Generic (PLEG): container finished" podID="93e854b6-0bab-4aa3-9d60-97542cd304eb" containerID="5d139003bf42d5bd3f336efdcae1c11aedc0f7421f3d1df24f7fbcd28e477113" exitCode=0 Dec 02 22:46:57 crc kubenswrapper[4696]: I1202 22:46:57.763445 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-v6k2l" event={"ID":"93e854b6-0bab-4aa3-9d60-97542cd304eb","Type":"ContainerDied","Data":"5d139003bf42d5bd3f336efdcae1c11aedc0f7421f3d1df24f7fbcd28e477113"} Dec 02 22:46:57 crc kubenswrapper[4696]: I1202 22:46:57.763489 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-v6k2l" event={"ID":"93e854b6-0bab-4aa3-9d60-97542cd304eb","Type":"ContainerDied","Data":"3122f045d76f7c887bb00ef8223891509c09466698ff514bbf0dbe6e8e3ccf56"} Dec 02 22:46:57 crc kubenswrapper[4696]: I1202 22:46:57.763515 4696 scope.go:117] "RemoveContainer" containerID="5d139003bf42d5bd3f336efdcae1c11aedc0f7421f3d1df24f7fbcd28e477113" Dec 02 22:46:57 crc kubenswrapper[4696]: I1202 22:46:57.763652 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-v6k2l" Dec 02 22:46:57 crc kubenswrapper[4696]: I1202 22:46:57.838609 4696 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/93e854b6-0bab-4aa3-9d60-97542cd304eb-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 02 22:46:57 crc kubenswrapper[4696]: I1202 22:46:57.838656 4696 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/93e854b6-0bab-4aa3-9d60-97542cd304eb-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 02 22:46:57 crc kubenswrapper[4696]: I1202 22:46:57.838673 4696 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/93e854b6-0bab-4aa3-9d60-97542cd304eb-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 02 22:46:57 crc kubenswrapper[4696]: I1202 22:46:57.838723 4696 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/93e854b6-0bab-4aa3-9d60-97542cd304eb-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 02 22:46:57 crc kubenswrapper[4696]: I1202 22:46:57.838780 4696 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/93e854b6-0bab-4aa3-9d60-97542cd304eb-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 02 22:46:57 crc kubenswrapper[4696]: I1202 22:46:57.838797 4696 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/93e854b6-0bab-4aa3-9d60-97542cd304eb-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 22:46:57 crc kubenswrapper[4696]: I1202 22:46:57.838814 4696 reconciler_common.go:293] "Volume detached for volume 
\"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/93e854b6-0bab-4aa3-9d60-97542cd304eb-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 02 22:46:57 crc kubenswrapper[4696]: I1202 22:46:57.838829 4696 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/93e854b6-0bab-4aa3-9d60-97542cd304eb-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 02 22:46:57 crc kubenswrapper[4696]: I1202 22:46:57.838842 4696 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/93e854b6-0bab-4aa3-9d60-97542cd304eb-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 22:46:57 crc kubenswrapper[4696]: I1202 22:46:57.838856 4696 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/93e854b6-0bab-4aa3-9d60-97542cd304eb-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 02 22:46:57 crc kubenswrapper[4696]: I1202 22:46:57.838869 4696 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/93e854b6-0bab-4aa3-9d60-97542cd304eb-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 02 22:46:57 crc kubenswrapper[4696]: I1202 22:46:57.838882 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rkss8\" (UniqueName: \"kubernetes.io/projected/93e854b6-0bab-4aa3-9d60-97542cd304eb-kube-api-access-rkss8\") on node \"crc\" DevicePath \"\"" Dec 02 22:46:57 crc kubenswrapper[4696]: I1202 22:46:57.838896 4696 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/93e854b6-0bab-4aa3-9d60-97542cd304eb-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 02 22:46:57 crc 
kubenswrapper[4696]: I1202 22:46:57.838909 4696 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/93e854b6-0bab-4aa3-9d60-97542cd304eb-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 02 22:46:58 crc kubenswrapper[4696]: I1202 22:46:58.476111 4696 scope.go:117] "RemoveContainer" containerID="5d139003bf42d5bd3f336efdcae1c11aedc0f7421f3d1df24f7fbcd28e477113" Dec 02 22:46:58 crc kubenswrapper[4696]: E1202 22:46:58.476877 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d139003bf42d5bd3f336efdcae1c11aedc0f7421f3d1df24f7fbcd28e477113\": container with ID starting with 5d139003bf42d5bd3f336efdcae1c11aedc0f7421f3d1df24f7fbcd28e477113 not found: ID does not exist" containerID="5d139003bf42d5bd3f336efdcae1c11aedc0f7421f3d1df24f7fbcd28e477113" Dec 02 22:46:58 crc kubenswrapper[4696]: I1202 22:46:58.476929 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d139003bf42d5bd3f336efdcae1c11aedc0f7421f3d1df24f7fbcd28e477113"} err="failed to get container status \"5d139003bf42d5bd3f336efdcae1c11aedc0f7421f3d1df24f7fbcd28e477113\": rpc error: code = NotFound desc = could not find container \"5d139003bf42d5bd3f336efdcae1c11aedc0f7421f3d1df24f7fbcd28e477113\": container with ID starting with 5d139003bf42d5bd3f336efdcae1c11aedc0f7421f3d1df24f7fbcd28e477113 not found: ID does not exist" Dec 02 22:46:58 crc kubenswrapper[4696]: I1202 22:46:58.773232 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"d728de5c569f15d225d223ef3f9263c17488615423f8adc6fc180998d05d8632"} Dec 02 22:46:59 crc kubenswrapper[4696]: I1202 22:46:59.780646 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 22:46:59 crc kubenswrapper[4696]: I1202 22:46:59.780774 4696 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="158d80a5-8d90-46bb-83e7-ab3263f6d28f" Dec 02 22:46:59 crc kubenswrapper[4696]: I1202 22:46:59.780812 4696 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="158d80a5-8d90-46bb-83e7-ab3263f6d28f" Dec 02 22:46:59 crc kubenswrapper[4696]: I1202 22:46:59.789880 4696 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 22:46:59 crc kubenswrapper[4696]: I1202 22:46:59.801109 4696 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"158d80a5-8d90-46bb-83e7-ab3263f6d28f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"message\\\":\\\"containers with unready status: [kube-apiserver 
kube-apiserver-check-endpoints]\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af72161eabbff16bdcdd94075cb6987a14f10f203408e5cf542116cf75e4916e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b8923a2e604ea1585400c6d5daa27c254f72dcb2983058cf7836866b63c1a0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://989fa6c251387fb14ca901e5bb6c04ea0c009f1103b04b1a9cd677b4cc6f98ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355
e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d728de5c569f15d225d223ef3f9263c17488615423f8adc6fc180998d05d8632\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd63355cdef47d376d5ccf82b30979c24dd97f432d9f8b161f00b0824c92d7fc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"stat
e\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:46:56Z\\\"}}}],\\\"phase\\\":\\\"Running\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": pods \"kube-apiserver-crc\" not found" Dec 02 22:47:00 crc kubenswrapper[4696]: I1202 22:47:00.786372 4696 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="158d80a5-8d90-46bb-83e7-ab3263f6d28f" Dec 02 22:47:00 crc kubenswrapper[4696]: I1202 22:47:00.786404 4696 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="158d80a5-8d90-46bb-83e7-ab3263f6d28f" Dec 02 22:47:01 crc kubenswrapper[4696]: I1202 22:47:01.451954 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 22:47:01 crc kubenswrapper[4696]: I1202 22:47:01.452014 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 22:47:01 crc kubenswrapper[4696]: I1202 22:47:01.459263 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 22:47:01 crc kubenswrapper[4696]: I1202 22:47:01.462591 4696 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="21ac084a-4de8-45d3-9d35-e3444351199b" Dec 02 22:47:01 crc kubenswrapper[4696]: I1202 22:47:01.793258 4696 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="158d80a5-8d90-46bb-83e7-ab3263f6d28f" Dec 02 22:47:01 crc kubenswrapper[4696]: I1202 22:47:01.793305 4696 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="158d80a5-8d90-46bb-83e7-ab3263f6d28f" Dec 02 22:47:01 crc kubenswrapper[4696]: I1202 22:47:01.800603 4696 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 22:47:02 crc kubenswrapper[4696]: I1202 22:47:02.799251 4696 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="158d80a5-8d90-46bb-83e7-ab3263f6d28f" Dec 02 22:47:02 crc kubenswrapper[4696]: I1202 22:47:02.800049 4696 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="158d80a5-8d90-46bb-83e7-ab3263f6d28f" Dec 02 22:47:03 crc kubenswrapper[4696]: I1202 22:47:03.808936 4696 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="158d80a5-8d90-46bb-83e7-ab3263f6d28f" Dec 02 22:47:03 crc kubenswrapper[4696]: I1202 22:47:03.808994 4696 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="158d80a5-8d90-46bb-83e7-ab3263f6d28f" Dec 02 22:47:05 crc kubenswrapper[4696]: I1202 22:47:05.647049 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 02 22:47:05 crc kubenswrapper[4696]: I1202 22:47:05.924777 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 02 22:47:06 crc kubenswrapper[4696]: I1202 22:47:06.178229 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 02 22:47:06 crc kubenswrapper[4696]: I1202 22:47:06.544672 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 02 22:47:06 crc kubenswrapper[4696]: I1202 22:47:06.863864 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 02 22:47:06 crc kubenswrapper[4696]: I1202 22:47:06.960005 4696 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 02 22:47:06 crc kubenswrapper[4696]: I1202 22:47:06.971421 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 02 22:47:07 crc kubenswrapper[4696]: I1202 22:47:07.141801 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 02 22:47:07 crc kubenswrapper[4696]: I1202 22:47:07.275047 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 02 22:47:07 crc kubenswrapper[4696]: I1202 22:47:07.481312 4696 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="21ac084a-4de8-45d3-9d35-e3444351199b" Dec 02 22:47:07 crc kubenswrapper[4696]: I1202 22:47:07.519885 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 02 22:47:07 crc kubenswrapper[4696]: I1202 22:47:07.523294 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 02 22:47:07 crc kubenswrapper[4696]: I1202 22:47:07.805582 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 02 22:47:08 crc kubenswrapper[4696]: I1202 22:47:08.007000 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 02 22:47:08 crc kubenswrapper[4696]: I1202 22:47:08.124832 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 02 22:47:08 crc kubenswrapper[4696]: I1202 22:47:08.283986 4696 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-multus"/"kube-root-ca.crt" Dec 02 22:47:08 crc kubenswrapper[4696]: I1202 22:47:08.413184 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 02 22:47:08 crc kubenswrapper[4696]: I1202 22:47:08.725676 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 02 22:47:09 crc kubenswrapper[4696]: I1202 22:47:09.053733 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 02 22:47:09 crc kubenswrapper[4696]: I1202 22:47:09.088120 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 02 22:47:09 crc kubenswrapper[4696]: I1202 22:47:09.231353 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 02 22:47:09 crc kubenswrapper[4696]: I1202 22:47:09.330985 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 02 22:47:09 crc kubenswrapper[4696]: I1202 22:47:09.374908 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 02 22:47:09 crc kubenswrapper[4696]: I1202 22:47:09.584933 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 02 22:47:09 crc kubenswrapper[4696]: I1202 22:47:09.697248 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 02 22:47:09 crc kubenswrapper[4696]: I1202 22:47:09.839323 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 02 22:47:09 crc kubenswrapper[4696]: I1202 
22:47:09.954809 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 02 22:47:10 crc kubenswrapper[4696]: I1202 22:47:10.085548 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 02 22:47:10 crc kubenswrapper[4696]: I1202 22:47:10.111115 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 02 22:47:10 crc kubenswrapper[4696]: I1202 22:47:10.181390 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 02 22:47:10 crc kubenswrapper[4696]: I1202 22:47:10.210001 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 02 22:47:10 crc kubenswrapper[4696]: I1202 22:47:10.230362 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 02 22:47:10 crc kubenswrapper[4696]: I1202 22:47:10.269718 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 02 22:47:10 crc kubenswrapper[4696]: I1202 22:47:10.390285 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 02 22:47:10 crc kubenswrapper[4696]: I1202 22:47:10.457460 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 02 22:47:10 crc kubenswrapper[4696]: I1202 22:47:10.466334 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 02 22:47:10 crc kubenswrapper[4696]: I1202 22:47:10.538518 4696 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 02 22:47:10 crc kubenswrapper[4696]: I1202 22:47:10.562197 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 02 22:47:10 crc kubenswrapper[4696]: I1202 22:47:10.892635 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 02 22:47:10 crc kubenswrapper[4696]: I1202 22:47:10.903365 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 02 22:47:10 crc kubenswrapper[4696]: I1202 22:47:10.914229 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 02 22:47:10 crc kubenswrapper[4696]: I1202 22:47:10.948451 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 02 22:47:11 crc kubenswrapper[4696]: I1202 22:47:11.069725 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 02 22:47:11 crc kubenswrapper[4696]: I1202 22:47:11.140436 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Dec 02 22:47:11 crc kubenswrapper[4696]: I1202 22:47:11.345347 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 02 22:47:11 crc kubenswrapper[4696]: I1202 22:47:11.478278 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 02 22:47:11 crc kubenswrapper[4696]: I1202 22:47:11.506180 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 02 22:47:11 crc 
kubenswrapper[4696]: I1202 22:47:11.516514 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 02 22:47:11 crc kubenswrapper[4696]: I1202 22:47:11.547206 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 02 22:47:11 crc kubenswrapper[4696]: I1202 22:47:11.581576 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 02 22:47:11 crc kubenswrapper[4696]: I1202 22:47:11.740384 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 02 22:47:11 crc kubenswrapper[4696]: I1202 22:47:11.806698 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 02 22:47:11 crc kubenswrapper[4696]: I1202 22:47:11.835688 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 02 22:47:11 crc kubenswrapper[4696]: I1202 22:47:11.846516 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 02 22:47:11 crc kubenswrapper[4696]: I1202 22:47:11.901839 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 02 22:47:11 crc kubenswrapper[4696]: I1202 22:47:11.907756 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 02 22:47:12 crc kubenswrapper[4696]: I1202 22:47:12.014055 4696 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 02 22:47:12 crc kubenswrapper[4696]: I1202 22:47:12.050603 4696 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 02 22:47:12 crc kubenswrapper[4696]: I1202 22:47:12.087435 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 02 22:47:12 crc kubenswrapper[4696]: I1202 22:47:12.101961 4696 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 02 22:47:12 crc kubenswrapper[4696]: I1202 22:47:12.330121 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 02 22:47:12 crc kubenswrapper[4696]: I1202 22:47:12.372184 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 02 22:47:12 crc kubenswrapper[4696]: I1202 22:47:12.386545 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 02 22:47:12 crc kubenswrapper[4696]: I1202 22:47:12.483051 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 02 22:47:12 crc kubenswrapper[4696]: I1202 22:47:12.496896 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 02 22:47:12 crc kubenswrapper[4696]: I1202 22:47:12.561861 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 02 22:47:12 crc kubenswrapper[4696]: I1202 22:47:12.598955 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 02 22:47:12 crc kubenswrapper[4696]: I1202 22:47:12.693482 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 02 22:47:12 crc kubenswrapper[4696]: I1202 22:47:12.907277 4696 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 02 22:47:13 crc kubenswrapper[4696]: I1202 22:47:13.014772 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 02 22:47:13 crc kubenswrapper[4696]: I1202 22:47:13.103334 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 02 22:47:13 crc kubenswrapper[4696]: I1202 22:47:13.172409 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 02 22:47:13 crc kubenswrapper[4696]: I1202 22:47:13.273397 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 02 22:47:13 crc kubenswrapper[4696]: I1202 22:47:13.458987 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 02 22:47:13 crc kubenswrapper[4696]: I1202 22:47:13.480193 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 02 22:47:13 crc kubenswrapper[4696]: I1202 22:47:13.520013 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 02 22:47:13 crc kubenswrapper[4696]: I1202 22:47:13.623918 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 02 22:47:13 crc kubenswrapper[4696]: I1202 22:47:13.635890 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 02 22:47:13 crc kubenswrapper[4696]: I1202 22:47:13.684540 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 02 22:47:13 crc kubenswrapper[4696]: I1202 22:47:13.687859 
4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 02 22:47:13 crc kubenswrapper[4696]: I1202 22:47:13.825971 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 02 22:47:13 crc kubenswrapper[4696]: I1202 22:47:13.861607 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 02 22:47:14 crc kubenswrapper[4696]: I1202 22:47:14.123606 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 02 22:47:14 crc kubenswrapper[4696]: I1202 22:47:14.237334 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 02 22:47:14 crc kubenswrapper[4696]: I1202 22:47:14.334714 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 02 22:47:14 crc kubenswrapper[4696]: I1202 22:47:14.392482 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 02 22:47:14 crc kubenswrapper[4696]: I1202 22:47:14.403148 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 02 22:47:14 crc kubenswrapper[4696]: I1202 22:47:14.424024 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 02 22:47:14 crc kubenswrapper[4696]: I1202 22:47:14.435261 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 02 22:47:14 crc kubenswrapper[4696]: I1202 22:47:14.473979 4696 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 02 22:47:14 crc kubenswrapper[4696]: I1202 22:47:14.486785 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 02 22:47:14 crc kubenswrapper[4696]: I1202 22:47:14.728579 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 02 22:47:14 crc kubenswrapper[4696]: I1202 22:47:14.869294 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 02 22:47:14 crc kubenswrapper[4696]: I1202 22:47:14.885554 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 02 22:47:15 crc kubenswrapper[4696]: I1202 22:47:15.032837 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 02 22:47:15 crc kubenswrapper[4696]: I1202 22:47:15.034038 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 02 22:47:15 crc kubenswrapper[4696]: I1202 22:47:15.076157 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 02 22:47:15 crc kubenswrapper[4696]: I1202 22:47:15.361987 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 02 22:47:15 crc kubenswrapper[4696]: I1202 22:47:15.376392 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 02 22:47:15 crc kubenswrapper[4696]: I1202 22:47:15.406282 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 02 22:47:15 crc kubenswrapper[4696]: I1202 22:47:15.517906 4696 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 02 22:47:15 crc kubenswrapper[4696]: I1202 22:47:15.535792 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 02 22:47:15 crc kubenswrapper[4696]: I1202 22:47:15.754907 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 02 22:47:15 crc kubenswrapper[4696]: I1202 22:47:15.762256 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 02 22:47:15 crc kubenswrapper[4696]: I1202 22:47:15.771478 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 02 22:47:15 crc kubenswrapper[4696]: I1202 22:47:15.817459 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 02 22:47:15 crc kubenswrapper[4696]: I1202 22:47:15.845463 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 02 22:47:15 crc kubenswrapper[4696]: I1202 22:47:15.868188 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 02 22:47:15 crc kubenswrapper[4696]: I1202 22:47:15.971786 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 02 22:47:16 crc kubenswrapper[4696]: I1202 22:47:16.023812 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 02 22:47:16 crc kubenswrapper[4696]: I1202 22:47:16.128087 4696 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-apiserver"/"config" Dec 02 22:47:16 crc kubenswrapper[4696]: I1202 22:47:16.186495 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 02 22:47:16 crc kubenswrapper[4696]: I1202 22:47:16.260713 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 02 22:47:16 crc kubenswrapper[4696]: I1202 22:47:16.353393 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 02 22:47:16 crc kubenswrapper[4696]: I1202 22:47:16.429725 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 02 22:47:16 crc kubenswrapper[4696]: I1202 22:47:16.462604 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 02 22:47:16 crc kubenswrapper[4696]: I1202 22:47:16.463497 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 02 22:47:16 crc kubenswrapper[4696]: I1202 22:47:16.542428 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 02 22:47:16 crc kubenswrapper[4696]: I1202 22:47:16.596131 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 02 22:47:16 crc kubenswrapper[4696]: I1202 22:47:16.631514 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 02 22:47:16 crc kubenswrapper[4696]: I1202 22:47:16.665018 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 02 22:47:16 crc kubenswrapper[4696]: I1202 22:47:16.757084 4696 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 02 22:47:16 crc kubenswrapper[4696]: I1202 22:47:16.785480 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 02 22:47:16 crc kubenswrapper[4696]: I1202 22:47:16.835107 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 02 22:47:17 crc kubenswrapper[4696]: I1202 22:47:17.095586 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 02 22:47:17 crc kubenswrapper[4696]: I1202 22:47:17.116620 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 02 22:47:17 crc kubenswrapper[4696]: I1202 22:47:17.223581 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 02 22:47:17 crc kubenswrapper[4696]: I1202 22:47:17.231867 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 02 22:47:17 crc kubenswrapper[4696]: I1202 22:47:17.255227 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 02 22:47:17 crc kubenswrapper[4696]: I1202 22:47:17.274088 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 02 22:47:17 crc kubenswrapper[4696]: I1202 22:47:17.281587 4696 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials Dec 02 22:47:17 crc kubenswrapper[4696]: I1202 22:47:17.469995 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 02 22:47:17 crc kubenswrapper[4696]: I1202 22:47:17.536562 4696 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 02 22:47:17 crc kubenswrapper[4696]: I1202 22:47:17.699109 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 02 22:47:17 crc kubenswrapper[4696]: I1202 22:47:17.699853 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 02 22:47:17 crc kubenswrapper[4696]: I1202 22:47:17.813419 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 02 22:47:17 crc kubenswrapper[4696]: I1202 22:47:17.968074 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 02 22:47:18 crc kubenswrapper[4696]: I1202 22:47:18.124207 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 02 22:47:18 crc kubenswrapper[4696]: I1202 22:47:18.166614 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 02 22:47:18 crc kubenswrapper[4696]: I1202 22:47:18.189623 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 02 22:47:18 crc kubenswrapper[4696]: I1202 22:47:18.226632 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 02 22:47:18 crc kubenswrapper[4696]: I1202 22:47:18.276530 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 02 22:47:18 crc kubenswrapper[4696]: I1202 22:47:18.340138 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 02 22:47:18 crc 
kubenswrapper[4696]: I1202 22:47:18.406525 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 02 22:47:18 crc kubenswrapper[4696]: I1202 22:47:18.592406 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 02 22:47:18 crc kubenswrapper[4696]: I1202 22:47:18.654197 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 02 22:47:18 crc kubenswrapper[4696]: I1202 22:47:18.728858 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 02 22:47:18 crc kubenswrapper[4696]: I1202 22:47:18.743178 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 02 22:47:18 crc kubenswrapper[4696]: I1202 22:47:18.851914 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 02 22:47:18 crc kubenswrapper[4696]: I1202 22:47:18.899268 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 02 22:47:18 crc kubenswrapper[4696]: I1202 22:47:18.934628 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 02 22:47:18 crc kubenswrapper[4696]: I1202 22:47:18.984701 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 02 22:47:19 crc kubenswrapper[4696]: I1202 22:47:19.058552 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 02 22:47:19 crc kubenswrapper[4696]: I1202 22:47:19.086508 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 02 22:47:19 crc kubenswrapper[4696]: 
I1202 22:47:19.096713 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 02 22:47:19 crc kubenswrapper[4696]: I1202 22:47:19.179386 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 02 22:47:19 crc kubenswrapper[4696]: I1202 22:47:19.242232 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 02 22:47:19 crc kubenswrapper[4696]: I1202 22:47:19.244296 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 02 22:47:19 crc kubenswrapper[4696]: I1202 22:47:19.562321 4696 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 02 22:47:19 crc kubenswrapper[4696]: I1202 22:47:19.667773 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 02 22:47:19 crc kubenswrapper[4696]: I1202 22:47:19.787550 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 02 22:47:19 crc kubenswrapper[4696]: I1202 22:47:19.847634 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 02 22:47:20 crc kubenswrapper[4696]: I1202 22:47:20.030761 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 02 22:47:20 crc kubenswrapper[4696]: I1202 22:47:20.101077 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 02 22:47:20 crc kubenswrapper[4696]: I1202 22:47:20.276530 4696 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 02 22:47:20 crc kubenswrapper[4696]: I1202 22:47:20.389545 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 02 22:47:20 crc kubenswrapper[4696]: I1202 22:47:20.398853 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 02 22:47:20 crc kubenswrapper[4696]: I1202 22:47:20.422400 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 02 22:47:20 crc kubenswrapper[4696]: I1202 22:47:20.610870 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 02 22:47:20 crc kubenswrapper[4696]: I1202 22:47:20.780293 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 02 22:47:20 crc kubenswrapper[4696]: I1202 22:47:20.869250 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 02 22:47:20 crc kubenswrapper[4696]: I1202 22:47:20.917501 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 02 22:47:21 crc kubenswrapper[4696]: I1202 22:47:21.032624 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 02 22:47:21 crc kubenswrapper[4696]: I1202 22:47:21.418497 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 02 22:47:22 crc kubenswrapper[4696]: I1202 22:47:22.322647 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 02 22:47:27 crc 
kubenswrapper[4696]: I1202 22:47:27.461728 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 02 22:47:27 crc kubenswrapper[4696]: I1202 22:47:27.493196 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 02 22:47:31 crc kubenswrapper[4696]: I1202 22:47:31.559920 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 02 22:47:32 crc kubenswrapper[4696]: I1202 22:47:32.006902 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 02 22:47:33 crc kubenswrapper[4696]: I1202 22:47:33.008499 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 02 22:47:33 crc kubenswrapper[4696]: I1202 22:47:33.217088 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 02 22:47:35 crc kubenswrapper[4696]: I1202 22:47:35.292350 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 02 22:47:35 crc kubenswrapper[4696]: I1202 22:47:35.767776 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 02 22:47:36 crc kubenswrapper[4696]: I1202 22:47:36.669472 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 02 22:47:37 crc kubenswrapper[4696]: I1202 22:47:37.552487 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 02 22:47:37 crc kubenswrapper[4696]: I1202 22:47:37.842213 4696 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 02 22:47:37 crc kubenswrapper[4696]: I1202 22:47:37.894156 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 02 22:47:38 crc kubenswrapper[4696]: I1202 22:47:38.317387 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 02 22:47:38 crc kubenswrapper[4696]: I1202 22:47:38.490843 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 02 22:47:38 crc kubenswrapper[4696]: I1202 22:47:38.846928 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 02 22:47:39 crc kubenswrapper[4696]: I1202 22:47:39.078500 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 02 22:47:39 crc kubenswrapper[4696]: I1202 22:47:39.689785 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 02 22:47:40 crc kubenswrapper[4696]: I1202 22:47:40.536386 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 02 22:47:40 crc kubenswrapper[4696]: I1202 22:47:40.597155 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 02 22:47:40 crc kubenswrapper[4696]: I1202 22:47:40.947017 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 02 22:47:41 crc kubenswrapper[4696]: I1202 22:47:41.893631 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 02 22:47:42 crc kubenswrapper[4696]: I1202 22:47:42.135534 4696 reflector.go:368] Caches populated for *v1.Secret 
from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 02 22:47:42 crc kubenswrapper[4696]: I1202 22:47:42.164986 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 02 22:47:43 crc kubenswrapper[4696]: I1202 22:47:43.212475 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 02 22:47:43 crc kubenswrapper[4696]: I1202 22:47:43.712612 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 02 22:47:43 crc kubenswrapper[4696]: I1202 22:47:43.850656 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 02 22:47:43 crc kubenswrapper[4696]: I1202 22:47:43.876513 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 02 22:47:44 crc kubenswrapper[4696]: I1202 22:47:44.458288 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 02 22:47:44 crc kubenswrapper[4696]: I1202 22:47:44.579459 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 02 22:47:44 crc kubenswrapper[4696]: I1202 22:47:44.713031 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 02 22:47:44 crc kubenswrapper[4696]: I1202 22:47:44.734281 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 02 22:47:44 crc kubenswrapper[4696]: I1202 22:47:44.895585 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 02 22:47:44 crc 
kubenswrapper[4696]: I1202 22:47:44.940220 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 02 22:47:45 crc kubenswrapper[4696]: I1202 22:47:45.288525 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 02 22:47:45 crc kubenswrapper[4696]: I1202 22:47:45.364216 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 02 22:47:45 crc kubenswrapper[4696]: I1202 22:47:45.689308 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 02 22:47:45 crc kubenswrapper[4696]: I1202 22:47:45.850816 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 02 22:47:46 crc kubenswrapper[4696]: I1202 22:47:46.021213 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 02 22:47:46 crc kubenswrapper[4696]: I1202 22:47:46.203982 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 02 22:47:46 crc kubenswrapper[4696]: I1202 22:47:46.861442 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 02 22:47:46 crc kubenswrapper[4696]: I1202 22:47:46.883883 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 02 22:47:47 crc kubenswrapper[4696]: I1202 22:47:47.328873 4696 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 02 22:47:47 crc kubenswrapper[4696]: I1202 22:47:47.843469 4696 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 02 22:47:47 crc kubenswrapper[4696]: I1202 22:47:47.951294 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 02 22:47:47 crc kubenswrapper[4696]: I1202 22:47:47.959734 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 02 22:47:48 crc kubenswrapper[4696]: I1202 22:47:48.308285 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 02 22:47:49 crc kubenswrapper[4696]: I1202 22:47:49.046711 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 02 22:47:49 crc kubenswrapper[4696]: I1202 22:47:49.272733 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 02 22:47:49 crc kubenswrapper[4696]: I1202 22:47:49.609890 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 02 22:47:50 crc kubenswrapper[4696]: I1202 22:47:50.180222 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 02 22:47:50 crc kubenswrapper[4696]: I1202 22:47:50.306359 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 02 22:47:50 crc kubenswrapper[4696]: I1202 22:47:50.993629 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 02 22:47:51 crc kubenswrapper[4696]: I1202 22:47:51.012622 4696 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 02 22:47:51 crc kubenswrapper[4696]: I1202 22:47:51.366450 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 02 22:47:51 crc kubenswrapper[4696]: I1202 22:47:51.533882 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 02 22:47:52 crc kubenswrapper[4696]: I1202 22:47:52.217496 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 02 22:47:53 crc kubenswrapper[4696]: I1202 22:47:53.145983 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 02 22:47:53 crc kubenswrapper[4696]: I1202 22:47:53.228294 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 02 22:47:54 crc kubenswrapper[4696]: I1202 22:47:54.714904 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 02 22:47:55 crc kubenswrapper[4696]: I1202 22:47:55.038407 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 02 22:47:58 crc kubenswrapper[4696]: I1202 22:47:58.871300 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 02 22:47:59 crc kubenswrapper[4696]: I1202 22:47:59.098457 4696 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 02 22:47:59 crc kubenswrapper[4696]: I1202 22:47:59.102038 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-kf9wn" podStartSLOduration=80.016249039 
podStartE2EDuration="2m58.102001081s" podCreationTimestamp="2025-12-02 22:45:01 +0000 UTC" firstStartedPulling="2025-12-02 22:45:07.25597284 +0000 UTC m=+170.136652841" lastFinishedPulling="2025-12-02 22:46:45.341724882 +0000 UTC m=+268.222404883" observedRunningTime="2025-12-02 22:46:55.486246787 +0000 UTC m=+278.366926798" watchObservedRunningTime="2025-12-02 22:47:59.102001081 +0000 UTC m=+341.982681122" Dec 02 22:47:59 crc kubenswrapper[4696]: I1202 22:47:59.103129 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-rwnxp" podStartSLOduration=78.982553768 podStartE2EDuration="2m59.103115604s" podCreationTimestamp="2025-12-02 22:45:00 +0000 UTC" firstStartedPulling="2025-12-02 22:45:03.367686038 +0000 UTC m=+166.248366039" lastFinishedPulling="2025-12-02 22:46:43.488247864 +0000 UTC m=+266.368927875" observedRunningTime="2025-12-02 22:46:55.590715594 +0000 UTC m=+278.471395595" watchObservedRunningTime="2025-12-02 22:47:59.103115604 +0000 UTC m=+341.983795645" Dec 02 22:47:59 crc kubenswrapper[4696]: I1202 22:47:59.105053 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-42w2d" podStartSLOduration=84.285110948 podStartE2EDuration="3m1.105043471s" podCreationTimestamp="2025-12-02 22:44:58 +0000 UTC" firstStartedPulling="2025-12-02 22:45:01.102732399 +0000 UTC m=+163.983412400" lastFinishedPulling="2025-12-02 22:46:37.922664922 +0000 UTC m=+260.803344923" observedRunningTime="2025-12-02 22:46:55.67132151 +0000 UTC m=+278.552001511" watchObservedRunningTime="2025-12-02 22:47:59.105043471 +0000 UTC m=+341.985723512" Dec 02 22:47:59 crc kubenswrapper[4696]: I1202 22:47:59.108685 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-w765j" podStartSLOduration=78.997783336 podStartE2EDuration="2m57.108675078s" podCreationTimestamp="2025-12-02 22:45:02 +0000 UTC" 
firstStartedPulling="2025-12-02 22:45:07.256099634 +0000 UTC m=+170.136779635" lastFinishedPulling="2025-12-02 22:46:45.366991356 +0000 UTC m=+268.247671377" observedRunningTime="2025-12-02 22:46:55.612428002 +0000 UTC m=+278.493108003" watchObservedRunningTime="2025-12-02 22:47:59.108675078 +0000 UTC m=+341.989355119" Dec 02 22:47:59 crc kubenswrapper[4696]: I1202 22:47:59.109513 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-authentication/oauth-openshift-558db77b4-v6k2l","openshift-marketplace/certified-operators-xhnsb"] Dec 02 22:47:59 crc kubenswrapper[4696]: I1202 22:47:59.109606 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-798f497965-q798w","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 02 22:47:59 crc kubenswrapper[4696]: E1202 22:47:59.110016 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b259842-7ad5-413d-a13c-73c0931b6527" containerName="registry-server" Dec 02 22:47:59 crc kubenswrapper[4696]: I1202 22:47:59.110049 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b259842-7ad5-413d-a13c-73c0931b6527" containerName="registry-server" Dec 02 22:47:59 crc kubenswrapper[4696]: E1202 22:47:59.110074 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93e854b6-0bab-4aa3-9d60-97542cd304eb" containerName="oauth-openshift" Dec 02 22:47:59 crc kubenswrapper[4696]: I1202 22:47:59.110091 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="93e854b6-0bab-4aa3-9d60-97542cd304eb" containerName="oauth-openshift" Dec 02 22:47:59 crc kubenswrapper[4696]: E1202 22:47:59.110109 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b259842-7ad5-413d-a13c-73c0931b6527" containerName="extract-utilities" Dec 02 22:47:59 crc kubenswrapper[4696]: I1202 22:47:59.110123 4696 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="4b259842-7ad5-413d-a13c-73c0931b6527" containerName="extract-utilities" Dec 02 22:47:59 crc kubenswrapper[4696]: E1202 22:47:59.110162 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="142e360e-5c5a-42af-b077-f75a807dea45" containerName="installer" Dec 02 22:47:59 crc kubenswrapper[4696]: I1202 22:47:59.110177 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="142e360e-5c5a-42af-b077-f75a807dea45" containerName="installer" Dec 02 22:47:59 crc kubenswrapper[4696]: E1202 22:47:59.110202 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b259842-7ad5-413d-a13c-73c0931b6527" containerName="extract-content" Dec 02 22:47:59 crc kubenswrapper[4696]: I1202 22:47:59.110215 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b259842-7ad5-413d-a13c-73c0931b6527" containerName="extract-content" Dec 02 22:47:59 crc kubenswrapper[4696]: I1202 22:47:59.110462 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="93e854b6-0bab-4aa3-9d60-97542cd304eb" containerName="oauth-openshift" Dec 02 22:47:59 crc kubenswrapper[4696]: I1202 22:47:59.110487 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="142e360e-5c5a-42af-b077-f75a807dea45" containerName="installer" Dec 02 22:47:59 crc kubenswrapper[4696]: I1202 22:47:59.110485 4696 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="158d80a5-8d90-46bb-83e7-ab3263f6d28f" Dec 02 22:47:59 crc kubenswrapper[4696]: I1202 22:47:59.110532 4696 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="158d80a5-8d90-46bb-83e7-ab3263f6d28f" Dec 02 22:47:59 crc kubenswrapper[4696]: I1202 22:47:59.110514 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b259842-7ad5-413d-a13c-73c0931b6527" containerName="registry-server" Dec 02 22:47:59 crc kubenswrapper[4696]: I1202 22:47:59.111535 4696 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-marketplace/redhat-marketplace-rwnxp","openshift-marketplace/community-operators-42w2d","openshift-marketplace/redhat-operators-w765j"] Dec 02 22:47:59 crc kubenswrapper[4696]: I1202 22:47:59.111858 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-798f497965-q798w" Dec 02 22:47:59 crc kubenswrapper[4696]: I1202 22:47:59.112079 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-w765j" podUID="2b3474b0-8824-4cb5-8ef3-c459af98ed02" containerName="registry-server" containerID="cri-o://fa734de885825991e09dfacf903386b718bc853eb4143741612552807fff6ad2" gracePeriod=2 Dec 02 22:47:59 crc kubenswrapper[4696]: I1202 22:47:59.113884 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-42w2d" podUID="4eb44562-c608-428a-8812-428de40cbcde" containerName="registry-server" containerID="cri-o://f81090f4798d9d290574104f77fffb9daf1941b982ee7382a50e74f9a09aa61c" gracePeriod=2 Dec 02 22:47:59 crc kubenswrapper[4696]: I1202 22:47:59.114411 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-rwnxp" podUID="a04370b2-4b6e-4eac-9ffe-4f2e7116b03b" containerName="registry-server" containerID="cri-o://2c20a3dcbcfe62ef47620c53d267eaf34825c5ee0eeaf7e0415ce71886b4a585" gracePeriod=2 Dec 02 22:47:59 crc kubenswrapper[4696]: I1202 22:47:59.118523 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 22:47:59 crc kubenswrapper[4696]: I1202 22:47:59.127663 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 02 22:47:59 crc kubenswrapper[4696]: I1202 22:47:59.129710 4696 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 02 22:47:59 crc kubenswrapper[4696]: I1202 22:47:59.129855 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 02 22:47:59 crc kubenswrapper[4696]: I1202 22:47:59.129894 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 02 22:47:59 crc kubenswrapper[4696]: I1202 22:47:59.129984 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 02 22:47:59 crc kubenswrapper[4696]: I1202 22:47:59.130315 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 02 22:47:59 crc kubenswrapper[4696]: I1202 22:47:59.130392 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 02 22:47:59 crc kubenswrapper[4696]: I1202 22:47:59.132142 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 02 22:47:59 crc kubenswrapper[4696]: I1202 22:47:59.135484 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 02 22:47:59 crc kubenswrapper[4696]: I1202 22:47:59.138233 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 02 22:47:59 crc kubenswrapper[4696]: I1202 22:47:59.138436 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 02 22:47:59 crc kubenswrapper[4696]: I1202 22:47:59.140007 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 02 22:47:59 crc kubenswrapper[4696]: 
I1202 22:47:59.146538 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 02 22:47:59 crc kubenswrapper[4696]: I1202 22:47:59.150149 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 02 22:47:59 crc kubenswrapper[4696]: I1202 22:47:59.168042 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=1.168009153 podStartE2EDuration="1.168009153s" podCreationTimestamp="2025-12-02 22:47:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 22:47:59.158518463 +0000 UTC m=+342.039198504" watchObservedRunningTime="2025-12-02 22:47:59.168009153 +0000 UTC m=+342.048689194" Dec 02 22:47:59 crc kubenswrapper[4696]: I1202 22:47:59.175378 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 02 22:47:59 crc kubenswrapper[4696]: I1202 22:47:59.185890 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=60.185864198 podStartE2EDuration="1m0.185864198s" podCreationTimestamp="2025-12-02 22:46:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 22:47:59.182281172 +0000 UTC m=+342.062961183" watchObservedRunningTime="2025-12-02 22:47:59.185864198 +0000 UTC m=+342.066544209" Dec 02 22:47:59 crc kubenswrapper[4696]: I1202 22:47:59.205817 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/57660974-4307-4c4b-a2de-20a42cd3f268-audit-policies\") pod 
\"oauth-openshift-798f497965-q798w\" (UID: \"57660974-4307-4c4b-a2de-20a42cd3f268\") " pod="openshift-authentication/oauth-openshift-798f497965-q798w" Dec 02 22:47:59 crc kubenswrapper[4696]: I1202 22:47:59.205924 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/57660974-4307-4c4b-a2de-20a42cd3f268-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-798f497965-q798w\" (UID: \"57660974-4307-4c4b-a2de-20a42cd3f268\") " pod="openshift-authentication/oauth-openshift-798f497965-q798w" Dec 02 22:47:59 crc kubenswrapper[4696]: I1202 22:47:59.205981 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/57660974-4307-4c4b-a2de-20a42cd3f268-v4-0-config-system-router-certs\") pod \"oauth-openshift-798f497965-q798w\" (UID: \"57660974-4307-4c4b-a2de-20a42cd3f268\") " pod="openshift-authentication/oauth-openshift-798f497965-q798w" Dec 02 22:47:59 crc kubenswrapper[4696]: I1202 22:47:59.206194 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/57660974-4307-4c4b-a2de-20a42cd3f268-v4-0-config-system-cliconfig\") pod \"oauth-openshift-798f497965-q798w\" (UID: \"57660974-4307-4c4b-a2de-20a42cd3f268\") " pod="openshift-authentication/oauth-openshift-798f497965-q798w" Dec 02 22:47:59 crc kubenswrapper[4696]: I1202 22:47:59.206298 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/57660974-4307-4c4b-a2de-20a42cd3f268-v4-0-config-system-serving-cert\") pod \"oauth-openshift-798f497965-q798w\" (UID: \"57660974-4307-4c4b-a2de-20a42cd3f268\") " 
pod="openshift-authentication/oauth-openshift-798f497965-q798w" Dec 02 22:47:59 crc kubenswrapper[4696]: I1202 22:47:59.206343 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/57660974-4307-4c4b-a2de-20a42cd3f268-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-798f497965-q798w\" (UID: \"57660974-4307-4c4b-a2de-20a42cd3f268\") " pod="openshift-authentication/oauth-openshift-798f497965-q798w" Dec 02 22:47:59 crc kubenswrapper[4696]: I1202 22:47:59.206382 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/57660974-4307-4c4b-a2de-20a42cd3f268-audit-dir\") pod \"oauth-openshift-798f497965-q798w\" (UID: \"57660974-4307-4c4b-a2de-20a42cd3f268\") " pod="openshift-authentication/oauth-openshift-798f497965-q798w" Dec 02 22:47:59 crc kubenswrapper[4696]: I1202 22:47:59.206429 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/57660974-4307-4c4b-a2de-20a42cd3f268-v4-0-config-system-session\") pod \"oauth-openshift-798f497965-q798w\" (UID: \"57660974-4307-4c4b-a2de-20a42cd3f268\") " pod="openshift-authentication/oauth-openshift-798f497965-q798w" Dec 02 22:47:59 crc kubenswrapper[4696]: I1202 22:47:59.206598 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/57660974-4307-4c4b-a2de-20a42cd3f268-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-798f497965-q798w\" (UID: \"57660974-4307-4c4b-a2de-20a42cd3f268\") " pod="openshift-authentication/oauth-openshift-798f497965-q798w" Dec 02 22:47:59 crc kubenswrapper[4696]: I1202 22:47:59.206679 4696 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/57660974-4307-4c4b-a2de-20a42cd3f268-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-798f497965-q798w\" (UID: \"57660974-4307-4c4b-a2de-20a42cd3f268\") " pod="openshift-authentication/oauth-openshift-798f497965-q798w" Dec 02 22:47:59 crc kubenswrapper[4696]: I1202 22:47:59.206796 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/57660974-4307-4c4b-a2de-20a42cd3f268-v4-0-config-user-template-error\") pod \"oauth-openshift-798f497965-q798w\" (UID: \"57660974-4307-4c4b-a2de-20a42cd3f268\") " pod="openshift-authentication/oauth-openshift-798f497965-q798w" Dec 02 22:47:59 crc kubenswrapper[4696]: I1202 22:47:59.206853 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/57660974-4307-4c4b-a2de-20a42cd3f268-v4-0-config-system-service-ca\") pod \"oauth-openshift-798f497965-q798w\" (UID: \"57660974-4307-4c4b-a2de-20a42cd3f268\") " pod="openshift-authentication/oauth-openshift-798f497965-q798w" Dec 02 22:47:59 crc kubenswrapper[4696]: I1202 22:47:59.206992 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/57660974-4307-4c4b-a2de-20a42cd3f268-v4-0-config-user-template-login\") pod \"oauth-openshift-798f497965-q798w\" (UID: \"57660974-4307-4c4b-a2de-20a42cd3f268\") " pod="openshift-authentication/oauth-openshift-798f497965-q798w" Dec 02 22:47:59 crc kubenswrapper[4696]: I1202 22:47:59.207044 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smq6n\" (UniqueName: 
\"kubernetes.io/projected/57660974-4307-4c4b-a2de-20a42cd3f268-kube-api-access-smq6n\") pod \"oauth-openshift-798f497965-q798w\" (UID: \"57660974-4307-4c4b-a2de-20a42cd3f268\") " pod="openshift-authentication/oauth-openshift-798f497965-q798w" Dec 02 22:47:59 crc kubenswrapper[4696]: I1202 22:47:59.308790 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/57660974-4307-4c4b-a2de-20a42cd3f268-audit-policies\") pod \"oauth-openshift-798f497965-q798w\" (UID: \"57660974-4307-4c4b-a2de-20a42cd3f268\") " pod="openshift-authentication/oauth-openshift-798f497965-q798w" Dec 02 22:47:59 crc kubenswrapper[4696]: I1202 22:47:59.308848 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/57660974-4307-4c4b-a2de-20a42cd3f268-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-798f497965-q798w\" (UID: \"57660974-4307-4c4b-a2de-20a42cd3f268\") " pod="openshift-authentication/oauth-openshift-798f497965-q798w" Dec 02 22:47:59 crc kubenswrapper[4696]: I1202 22:47:59.308877 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/57660974-4307-4c4b-a2de-20a42cd3f268-v4-0-config-system-router-certs\") pod \"oauth-openshift-798f497965-q798w\" (UID: \"57660974-4307-4c4b-a2de-20a42cd3f268\") " pod="openshift-authentication/oauth-openshift-798f497965-q798w" Dec 02 22:47:59 crc kubenswrapper[4696]: I1202 22:47:59.308910 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/57660974-4307-4c4b-a2de-20a42cd3f268-v4-0-config-system-cliconfig\") pod \"oauth-openshift-798f497965-q798w\" (UID: \"57660974-4307-4c4b-a2de-20a42cd3f268\") " pod="openshift-authentication/oauth-openshift-798f497965-q798w" 
Dec 02 22:47:59 crc kubenswrapper[4696]: I1202 22:47:59.308930 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/57660974-4307-4c4b-a2de-20a42cd3f268-v4-0-config-system-serving-cert\") pod \"oauth-openshift-798f497965-q798w\" (UID: \"57660974-4307-4c4b-a2de-20a42cd3f268\") " pod="openshift-authentication/oauth-openshift-798f497965-q798w" Dec 02 22:47:59 crc kubenswrapper[4696]: I1202 22:47:59.308950 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/57660974-4307-4c4b-a2de-20a42cd3f268-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-798f497965-q798w\" (UID: \"57660974-4307-4c4b-a2de-20a42cd3f268\") " pod="openshift-authentication/oauth-openshift-798f497965-q798w" Dec 02 22:47:59 crc kubenswrapper[4696]: I1202 22:47:59.308968 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/57660974-4307-4c4b-a2de-20a42cd3f268-audit-dir\") pod \"oauth-openshift-798f497965-q798w\" (UID: \"57660974-4307-4c4b-a2de-20a42cd3f268\") " pod="openshift-authentication/oauth-openshift-798f497965-q798w" Dec 02 22:47:59 crc kubenswrapper[4696]: I1202 22:47:59.308985 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/57660974-4307-4c4b-a2de-20a42cd3f268-v4-0-config-system-session\") pod \"oauth-openshift-798f497965-q798w\" (UID: \"57660974-4307-4c4b-a2de-20a42cd3f268\") " pod="openshift-authentication/oauth-openshift-798f497965-q798w" Dec 02 22:47:59 crc kubenswrapper[4696]: I1202 22:47:59.309007 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/57660974-4307-4c4b-a2de-20a42cd3f268-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-798f497965-q798w\" (UID: \"57660974-4307-4c4b-a2de-20a42cd3f268\") " pod="openshift-authentication/oauth-openshift-798f497965-q798w" Dec 02 22:47:59 crc kubenswrapper[4696]: I1202 22:47:59.309026 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/57660974-4307-4c4b-a2de-20a42cd3f268-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-798f497965-q798w\" (UID: \"57660974-4307-4c4b-a2de-20a42cd3f268\") " pod="openshift-authentication/oauth-openshift-798f497965-q798w" Dec 02 22:47:59 crc kubenswrapper[4696]: I1202 22:47:59.309050 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/57660974-4307-4c4b-a2de-20a42cd3f268-v4-0-config-user-template-error\") pod \"oauth-openshift-798f497965-q798w\" (UID: \"57660974-4307-4c4b-a2de-20a42cd3f268\") " pod="openshift-authentication/oauth-openshift-798f497965-q798w" Dec 02 22:47:59 crc kubenswrapper[4696]: I1202 22:47:59.309088 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/57660974-4307-4c4b-a2de-20a42cd3f268-v4-0-config-system-service-ca\") pod \"oauth-openshift-798f497965-q798w\" (UID: \"57660974-4307-4c4b-a2de-20a42cd3f268\") " pod="openshift-authentication/oauth-openshift-798f497965-q798w" Dec 02 22:47:59 crc kubenswrapper[4696]: I1202 22:47:59.309119 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/57660974-4307-4c4b-a2de-20a42cd3f268-v4-0-config-user-template-login\") pod \"oauth-openshift-798f497965-q798w\" (UID: \"57660974-4307-4c4b-a2de-20a42cd3f268\") " 
pod="openshift-authentication/oauth-openshift-798f497965-q798w" Dec 02 22:47:59 crc kubenswrapper[4696]: I1202 22:47:59.309138 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-smq6n\" (UniqueName: \"kubernetes.io/projected/57660974-4307-4c4b-a2de-20a42cd3f268-kube-api-access-smq6n\") pod \"oauth-openshift-798f497965-q798w\" (UID: \"57660974-4307-4c4b-a2de-20a42cd3f268\") " pod="openshift-authentication/oauth-openshift-798f497965-q798w" Dec 02 22:47:59 crc kubenswrapper[4696]: I1202 22:47:59.309936 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/57660974-4307-4c4b-a2de-20a42cd3f268-audit-policies\") pod \"oauth-openshift-798f497965-q798w\" (UID: \"57660974-4307-4c4b-a2de-20a42cd3f268\") " pod="openshift-authentication/oauth-openshift-798f497965-q798w" Dec 02 22:47:59 crc kubenswrapper[4696]: I1202 22:47:59.310468 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/57660974-4307-4c4b-a2de-20a42cd3f268-audit-dir\") pod \"oauth-openshift-798f497965-q798w\" (UID: \"57660974-4307-4c4b-a2de-20a42cd3f268\") " pod="openshift-authentication/oauth-openshift-798f497965-q798w" Dec 02 22:47:59 crc kubenswrapper[4696]: I1202 22:47:59.311282 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/57660974-4307-4c4b-a2de-20a42cd3f268-v4-0-config-system-service-ca\") pod \"oauth-openshift-798f497965-q798w\" (UID: \"57660974-4307-4c4b-a2de-20a42cd3f268\") " pod="openshift-authentication/oauth-openshift-798f497965-q798w" Dec 02 22:47:59 crc kubenswrapper[4696]: I1202 22:47:59.311602 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/57660974-4307-4c4b-a2de-20a42cd3f268-v4-0-config-system-cliconfig\") 
pod \"oauth-openshift-798f497965-q798w\" (UID: \"57660974-4307-4c4b-a2de-20a42cd3f268\") " pod="openshift-authentication/oauth-openshift-798f497965-q798w" Dec 02 22:47:59 crc kubenswrapper[4696]: I1202 22:47:59.311502 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/57660974-4307-4c4b-a2de-20a42cd3f268-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-798f497965-q798w\" (UID: \"57660974-4307-4c4b-a2de-20a42cd3f268\") " pod="openshift-authentication/oauth-openshift-798f497965-q798w" Dec 02 22:47:59 crc kubenswrapper[4696]: I1202 22:47:59.316191 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/57660974-4307-4c4b-a2de-20a42cd3f268-v4-0-config-user-template-error\") pod \"oauth-openshift-798f497965-q798w\" (UID: \"57660974-4307-4c4b-a2de-20a42cd3f268\") " pod="openshift-authentication/oauth-openshift-798f497965-q798w" Dec 02 22:47:59 crc kubenswrapper[4696]: I1202 22:47:59.316449 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/57660974-4307-4c4b-a2de-20a42cd3f268-v4-0-config-user-template-login\") pod \"oauth-openshift-798f497965-q798w\" (UID: \"57660974-4307-4c4b-a2de-20a42cd3f268\") " pod="openshift-authentication/oauth-openshift-798f497965-q798w" Dec 02 22:47:59 crc kubenswrapper[4696]: I1202 22:47:59.316653 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/57660974-4307-4c4b-a2de-20a42cd3f268-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-798f497965-q798w\" (UID: \"57660974-4307-4c4b-a2de-20a42cd3f268\") " pod="openshift-authentication/oauth-openshift-798f497965-q798w" Dec 02 22:47:59 crc kubenswrapper[4696]: E1202 22:47:59.316818 
4696 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f81090f4798d9d290574104f77fffb9daf1941b982ee7382a50e74f9a09aa61c is running failed: container process not found" containerID="f81090f4798d9d290574104f77fffb9daf1941b982ee7382a50e74f9a09aa61c" cmd=["grpc_health_probe","-addr=:50051"] Dec 02 22:47:59 crc kubenswrapper[4696]: I1202 22:47:59.317008 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/57660974-4307-4c4b-a2de-20a42cd3f268-v4-0-config-system-serving-cert\") pod \"oauth-openshift-798f497965-q798w\" (UID: \"57660974-4307-4c4b-a2de-20a42cd3f268\") " pod="openshift-authentication/oauth-openshift-798f497965-q798w" Dec 02 22:47:59 crc kubenswrapper[4696]: I1202 22:47:59.317052 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/57660974-4307-4c4b-a2de-20a42cd3f268-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-798f497965-q798w\" (UID: \"57660974-4307-4c4b-a2de-20a42cd3f268\") " pod="openshift-authentication/oauth-openshift-798f497965-q798w" Dec 02 22:47:59 crc kubenswrapper[4696]: I1202 22:47:59.317280 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/57660974-4307-4c4b-a2de-20a42cd3f268-v4-0-config-system-session\") pod \"oauth-openshift-798f497965-q798w\" (UID: \"57660974-4307-4c4b-a2de-20a42cd3f268\") " pod="openshift-authentication/oauth-openshift-798f497965-q798w" Dec 02 22:47:59 crc kubenswrapper[4696]: E1202 22:47:59.317432 4696 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f81090f4798d9d290574104f77fffb9daf1941b982ee7382a50e74f9a09aa61c is running failed: container 
process not found" containerID="f81090f4798d9d290574104f77fffb9daf1941b982ee7382a50e74f9a09aa61c" cmd=["grpc_health_probe","-addr=:50051"] Dec 02 22:47:59 crc kubenswrapper[4696]: E1202 22:47:59.317730 4696 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f81090f4798d9d290574104f77fffb9daf1941b982ee7382a50e74f9a09aa61c is running failed: container process not found" containerID="f81090f4798d9d290574104f77fffb9daf1941b982ee7382a50e74f9a09aa61c" cmd=["grpc_health_probe","-addr=:50051"] Dec 02 22:47:59 crc kubenswrapper[4696]: E1202 22:47:59.317788 4696 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f81090f4798d9d290574104f77fffb9daf1941b982ee7382a50e74f9a09aa61c is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/community-operators-42w2d" podUID="4eb44562-c608-428a-8812-428de40cbcde" containerName="registry-server" Dec 02 22:47:59 crc kubenswrapper[4696]: I1202 22:47:59.318998 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/57660974-4307-4c4b-a2de-20a42cd3f268-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-798f497965-q798w\" (UID: \"57660974-4307-4c4b-a2de-20a42cd3f268\") " pod="openshift-authentication/oauth-openshift-798f497965-q798w" Dec 02 22:47:59 crc kubenswrapper[4696]: I1202 22:47:59.320599 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/57660974-4307-4c4b-a2de-20a42cd3f268-v4-0-config-system-router-certs\") pod \"oauth-openshift-798f497965-q798w\" (UID: \"57660974-4307-4c4b-a2de-20a42cd3f268\") " pod="openshift-authentication/oauth-openshift-798f497965-q798w" Dec 02 22:47:59 crc kubenswrapper[4696]: I1202 
22:47:59.331655 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-smq6n\" (UniqueName: \"kubernetes.io/projected/57660974-4307-4c4b-a2de-20a42cd3f268-kube-api-access-smq6n\") pod \"oauth-openshift-798f497965-q798w\" (UID: \"57660974-4307-4c4b-a2de-20a42cd3f268\") " pod="openshift-authentication/oauth-openshift-798f497965-q798w" Dec 02 22:47:59 crc kubenswrapper[4696]: I1202 22:47:59.438663 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b259842-7ad5-413d-a13c-73c0931b6527" path="/var/lib/kubelet/pods/4b259842-7ad5-413d-a13c-73c0931b6527/volumes" Dec 02 22:47:59 crc kubenswrapper[4696]: I1202 22:47:59.439617 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93e854b6-0bab-4aa3-9d60-97542cd304eb" path="/var/lib/kubelet/pods/93e854b6-0bab-4aa3-9d60-97542cd304eb/volumes" Dec 02 22:47:59 crc kubenswrapper[4696]: I1202 22:47:59.470235 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-798f497965-q798w" Dec 02 22:47:59 crc kubenswrapper[4696]: I1202 22:47:59.747205 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-798f497965-q798w"] Dec 02 22:48:00 crc kubenswrapper[4696]: I1202 22:48:00.191187 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-w765j" Dec 02 22:48:00 crc kubenswrapper[4696]: I1202 22:48:00.208649 4696 generic.go:334] "Generic (PLEG): container finished" podID="4eb44562-c608-428a-8812-428de40cbcde" containerID="f81090f4798d9d290574104f77fffb9daf1941b982ee7382a50e74f9a09aa61c" exitCode=0 Dec 02 22:48:00 crc kubenswrapper[4696]: I1202 22:48:00.208747 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-42w2d" event={"ID":"4eb44562-c608-428a-8812-428de40cbcde","Type":"ContainerDied","Data":"f81090f4798d9d290574104f77fffb9daf1941b982ee7382a50e74f9a09aa61c"} Dec 02 22:48:00 crc kubenswrapper[4696]: I1202 22:48:00.222041 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b3474b0-8824-4cb5-8ef3-c459af98ed02-utilities\") pod \"2b3474b0-8824-4cb5-8ef3-c459af98ed02\" (UID: \"2b3474b0-8824-4cb5-8ef3-c459af98ed02\") " Dec 02 22:48:00 crc kubenswrapper[4696]: I1202 22:48:00.222226 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b3474b0-8824-4cb5-8ef3-c459af98ed02-catalog-content\") pod \"2b3474b0-8824-4cb5-8ef3-c459af98ed02\" (UID: \"2b3474b0-8824-4cb5-8ef3-c459af98ed02\") " Dec 02 22:48:00 crc kubenswrapper[4696]: I1202 22:48:00.222261 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gq2cv\" (UniqueName: \"kubernetes.io/projected/2b3474b0-8824-4cb5-8ef3-c459af98ed02-kube-api-access-gq2cv\") pod \"2b3474b0-8824-4cb5-8ef3-c459af98ed02\" (UID: \"2b3474b0-8824-4cb5-8ef3-c459af98ed02\") " Dec 02 22:48:00 crc kubenswrapper[4696]: I1202 22:48:00.223167 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b3474b0-8824-4cb5-8ef3-c459af98ed02-utilities" (OuterVolumeSpecName: "utilities") pod 
"2b3474b0-8824-4cb5-8ef3-c459af98ed02" (UID: "2b3474b0-8824-4cb5-8ef3-c459af98ed02"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 22:48:00 crc kubenswrapper[4696]: I1202 22:48:00.228323 4696 generic.go:334] "Generic (PLEG): container finished" podID="2b3474b0-8824-4cb5-8ef3-c459af98ed02" containerID="fa734de885825991e09dfacf903386b718bc853eb4143741612552807fff6ad2" exitCode=0 Dec 02 22:48:00 crc kubenswrapper[4696]: I1202 22:48:00.228413 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w765j" event={"ID":"2b3474b0-8824-4cb5-8ef3-c459af98ed02","Type":"ContainerDied","Data":"fa734de885825991e09dfacf903386b718bc853eb4143741612552807fff6ad2"} Dec 02 22:48:00 crc kubenswrapper[4696]: I1202 22:48:00.228444 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w765j" event={"ID":"2b3474b0-8824-4cb5-8ef3-c459af98ed02","Type":"ContainerDied","Data":"a4ba8d9163480ed6b9c9feb5630b7d8092b46a5fae8fba759134ca2666e54399"} Dec 02 22:48:00 crc kubenswrapper[4696]: I1202 22:48:00.228463 4696 scope.go:117] "RemoveContainer" containerID="fa734de885825991e09dfacf903386b718bc853eb4143741612552807fff6ad2" Dec 02 22:48:00 crc kubenswrapper[4696]: I1202 22:48:00.229218 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-w765j" Dec 02 22:48:00 crc kubenswrapper[4696]: I1202 22:48:00.229575 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rwnxp" Dec 02 22:48:00 crc kubenswrapper[4696]: I1202 22:48:00.235056 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-798f497965-q798w" event={"ID":"57660974-4307-4c4b-a2de-20a42cd3f268","Type":"ContainerStarted","Data":"ad002e7c5b53d8aa8f10476abeb01861888e4ab369d339c2844dc880344fffd9"} Dec 02 22:48:00 crc kubenswrapper[4696]: I1202 22:48:00.235102 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-798f497965-q798w" event={"ID":"57660974-4307-4c4b-a2de-20a42cd3f268","Type":"ContainerStarted","Data":"ed028de0ba4ff24a5c25f19a5dfe2d81f4b7823aa79b0bdbd4621c2ae3331327"} Dec 02 22:48:00 crc kubenswrapper[4696]: I1202 22:48:00.237427 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-798f497965-q798w" Dec 02 22:48:00 crc kubenswrapper[4696]: I1202 22:48:00.243301 4696 generic.go:334] "Generic (PLEG): container finished" podID="a04370b2-4b6e-4eac-9ffe-4f2e7116b03b" containerID="2c20a3dcbcfe62ef47620c53d267eaf34825c5ee0eeaf7e0415ce71886b4a585" exitCode=0 Dec 02 22:48:00 crc kubenswrapper[4696]: I1202 22:48:00.243358 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rwnxp" event={"ID":"a04370b2-4b6e-4eac-9ffe-4f2e7116b03b","Type":"ContainerDied","Data":"2c20a3dcbcfe62ef47620c53d267eaf34825c5ee0eeaf7e0415ce71886b4a585"} Dec 02 22:48:00 crc kubenswrapper[4696]: I1202 22:48:00.243459 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rwnxp" Dec 02 22:48:00 crc kubenswrapper[4696]: I1202 22:48:00.245804 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b3474b0-8824-4cb5-8ef3-c459af98ed02-kube-api-access-gq2cv" (OuterVolumeSpecName: "kube-api-access-gq2cv") pod "2b3474b0-8824-4cb5-8ef3-c459af98ed02" (UID: "2b3474b0-8824-4cb5-8ef3-c459af98ed02"). InnerVolumeSpecName "kube-api-access-gq2cv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:48:00 crc kubenswrapper[4696]: I1202 22:48:00.278314 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-798f497965-q798w" podStartSLOduration=90.278283543 podStartE2EDuration="1m30.278283543s" podCreationTimestamp="2025-12-02 22:46:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 22:48:00.275221863 +0000 UTC m=+343.155901864" watchObservedRunningTime="2025-12-02 22:48:00.278283543 +0000 UTC m=+343.158963544" Dec 02 22:48:00 crc kubenswrapper[4696]: I1202 22:48:00.290238 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-42w2d" Dec 02 22:48:00 crc kubenswrapper[4696]: I1202 22:48:00.293690 4696 scope.go:117] "RemoveContainer" containerID="4682bb317b4efe562916ddffb565817f84219362984a7671695719184cd68de4" Dec 02 22:48:00 crc kubenswrapper[4696]: I1202 22:48:00.318691 4696 scope.go:117] "RemoveContainer" containerID="ceb50e322ec1ada283024c62c387161b68483a2926fdc0fd659bc03917f6c3bf" Dec 02 22:48:00 crc kubenswrapper[4696]: I1202 22:48:00.323393 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a04370b2-4b6e-4eac-9ffe-4f2e7116b03b-catalog-content\") pod \"a04370b2-4b6e-4eac-9ffe-4f2e7116b03b\" (UID: \"a04370b2-4b6e-4eac-9ffe-4f2e7116b03b\") " Dec 02 22:48:00 crc kubenswrapper[4696]: I1202 22:48:00.323503 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjg6v\" (UniqueName: \"kubernetes.io/projected/a04370b2-4b6e-4eac-9ffe-4f2e7116b03b-kube-api-access-pjg6v\") pod \"a04370b2-4b6e-4eac-9ffe-4f2e7116b03b\" (UID: \"a04370b2-4b6e-4eac-9ffe-4f2e7116b03b\") " Dec 02 22:48:00 crc kubenswrapper[4696]: I1202 22:48:00.323581 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4eb44562-c608-428a-8812-428de40cbcde-utilities\") pod \"4eb44562-c608-428a-8812-428de40cbcde\" (UID: \"4eb44562-c608-428a-8812-428de40cbcde\") " Dec 02 22:48:00 crc kubenswrapper[4696]: I1202 22:48:00.323630 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4eb44562-c608-428a-8812-428de40cbcde-catalog-content\") pod \"4eb44562-c608-428a-8812-428de40cbcde\" (UID: \"4eb44562-c608-428a-8812-428de40cbcde\") " Dec 02 22:48:00 crc kubenswrapper[4696]: I1202 22:48:00.323690 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a04370b2-4b6e-4eac-9ffe-4f2e7116b03b-utilities\") pod \"a04370b2-4b6e-4eac-9ffe-4f2e7116b03b\" (UID: \"a04370b2-4b6e-4eac-9ffe-4f2e7116b03b\") " Dec 02 22:48:00 crc kubenswrapper[4696]: I1202 22:48:00.323727 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fmvpk\" (UniqueName: \"kubernetes.io/projected/4eb44562-c608-428a-8812-428de40cbcde-kube-api-access-fmvpk\") pod \"4eb44562-c608-428a-8812-428de40cbcde\" (UID: \"4eb44562-c608-428a-8812-428de40cbcde\") " Dec 02 22:48:00 crc kubenswrapper[4696]: I1202 22:48:00.324325 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gq2cv\" (UniqueName: \"kubernetes.io/projected/2b3474b0-8824-4cb5-8ef3-c459af98ed02-kube-api-access-gq2cv\") on node \"crc\" DevicePath \"\"" Dec 02 22:48:00 crc kubenswrapper[4696]: I1202 22:48:00.324359 4696 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b3474b0-8824-4cb5-8ef3-c459af98ed02-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 22:48:00 crc kubenswrapper[4696]: I1202 22:48:00.326429 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a04370b2-4b6e-4eac-9ffe-4f2e7116b03b-utilities" (OuterVolumeSpecName: "utilities") pod "a04370b2-4b6e-4eac-9ffe-4f2e7116b03b" (UID: "a04370b2-4b6e-4eac-9ffe-4f2e7116b03b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 22:48:00 crc kubenswrapper[4696]: I1202 22:48:00.327012 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4eb44562-c608-428a-8812-428de40cbcde-utilities" (OuterVolumeSpecName: "utilities") pod "4eb44562-c608-428a-8812-428de40cbcde" (UID: "4eb44562-c608-428a-8812-428de40cbcde"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 22:48:00 crc kubenswrapper[4696]: I1202 22:48:00.327235 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a04370b2-4b6e-4eac-9ffe-4f2e7116b03b-kube-api-access-pjg6v" (OuterVolumeSpecName: "kube-api-access-pjg6v") pod "a04370b2-4b6e-4eac-9ffe-4f2e7116b03b" (UID: "a04370b2-4b6e-4eac-9ffe-4f2e7116b03b"). InnerVolumeSpecName "kube-api-access-pjg6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:48:00 crc kubenswrapper[4696]: I1202 22:48:00.331632 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4eb44562-c608-428a-8812-428de40cbcde-kube-api-access-fmvpk" (OuterVolumeSpecName: "kube-api-access-fmvpk") pod "4eb44562-c608-428a-8812-428de40cbcde" (UID: "4eb44562-c608-428a-8812-428de40cbcde"). InnerVolumeSpecName "kube-api-access-fmvpk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:48:00 crc kubenswrapper[4696]: I1202 22:48:00.349790 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a04370b2-4b6e-4eac-9ffe-4f2e7116b03b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a04370b2-4b6e-4eac-9ffe-4f2e7116b03b" (UID: "a04370b2-4b6e-4eac-9ffe-4f2e7116b03b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 02 22:48:00 crc kubenswrapper[4696]: I1202 22:48:00.364434 4696 scope.go:117] "RemoveContainer" containerID="fa734de885825991e09dfacf903386b718bc853eb4143741612552807fff6ad2"
Dec 02 22:48:00 crc kubenswrapper[4696]: E1202 22:48:00.365067 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa734de885825991e09dfacf903386b718bc853eb4143741612552807fff6ad2\": container with ID starting with fa734de885825991e09dfacf903386b718bc853eb4143741612552807fff6ad2 not found: ID does not exist" containerID="fa734de885825991e09dfacf903386b718bc853eb4143741612552807fff6ad2"
Dec 02 22:48:00 crc kubenswrapper[4696]: I1202 22:48:00.365107 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa734de885825991e09dfacf903386b718bc853eb4143741612552807fff6ad2"} err="failed to get container status \"fa734de885825991e09dfacf903386b718bc853eb4143741612552807fff6ad2\": rpc error: code = NotFound desc = could not find container \"fa734de885825991e09dfacf903386b718bc853eb4143741612552807fff6ad2\": container with ID starting with fa734de885825991e09dfacf903386b718bc853eb4143741612552807fff6ad2 not found: ID does not exist"
Dec 02 22:48:00 crc kubenswrapper[4696]: I1202 22:48:00.365136 4696 scope.go:117] "RemoveContainer" containerID="4682bb317b4efe562916ddffb565817f84219362984a7671695719184cd68de4"
Dec 02 22:48:00 crc kubenswrapper[4696]: E1202 22:48:00.365469 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4682bb317b4efe562916ddffb565817f84219362984a7671695719184cd68de4\": container with ID starting with 4682bb317b4efe562916ddffb565817f84219362984a7671695719184cd68de4 not found: ID does not exist" containerID="4682bb317b4efe562916ddffb565817f84219362984a7671695719184cd68de4"
Dec 02 22:48:00 crc kubenswrapper[4696]: I1202 22:48:00.365525 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4682bb317b4efe562916ddffb565817f84219362984a7671695719184cd68de4"} err="failed to get container status \"4682bb317b4efe562916ddffb565817f84219362984a7671695719184cd68de4\": rpc error: code = NotFound desc = could not find container \"4682bb317b4efe562916ddffb565817f84219362984a7671695719184cd68de4\": container with ID starting with 4682bb317b4efe562916ddffb565817f84219362984a7671695719184cd68de4 not found: ID does not exist"
Dec 02 22:48:00 crc kubenswrapper[4696]: I1202 22:48:00.365562 4696 scope.go:117] "RemoveContainer" containerID="ceb50e322ec1ada283024c62c387161b68483a2926fdc0fd659bc03917f6c3bf"
Dec 02 22:48:00 crc kubenswrapper[4696]: E1202 22:48:00.365906 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ceb50e322ec1ada283024c62c387161b68483a2926fdc0fd659bc03917f6c3bf\": container with ID starting with ceb50e322ec1ada283024c62c387161b68483a2926fdc0fd659bc03917f6c3bf not found: ID does not exist" containerID="ceb50e322ec1ada283024c62c387161b68483a2926fdc0fd659bc03917f6c3bf"
Dec 02 22:48:00 crc kubenswrapper[4696]: I1202 22:48:00.365933 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ceb50e322ec1ada283024c62c387161b68483a2926fdc0fd659bc03917f6c3bf"} err="failed to get container status \"ceb50e322ec1ada283024c62c387161b68483a2926fdc0fd659bc03917f6c3bf\": rpc error: code = NotFound desc = could not find container \"ceb50e322ec1ada283024c62c387161b68483a2926fdc0fd659bc03917f6c3bf\": container with ID starting with ceb50e322ec1ada283024c62c387161b68483a2926fdc0fd659bc03917f6c3bf not found: ID does not exist"
Dec 02 22:48:00 crc kubenswrapper[4696]: I1202 22:48:00.365950 4696 scope.go:117] "RemoveContainer" containerID="2c20a3dcbcfe62ef47620c53d267eaf34825c5ee0eeaf7e0415ce71886b4a585"
Dec 02 22:48:00 crc kubenswrapper[4696]: I1202 22:48:00.373270 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b3474b0-8824-4cb5-8ef3-c459af98ed02-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2b3474b0-8824-4cb5-8ef3-c459af98ed02" (UID: "2b3474b0-8824-4cb5-8ef3-c459af98ed02"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 02 22:48:00 crc kubenswrapper[4696]: I1202 22:48:00.379909 4696 scope.go:117] "RemoveContainer" containerID="fd9405589caab0e36654d391421562c91e178f1d4eedf741b443102f64c75725"
Dec 02 22:48:00 crc kubenswrapper[4696]: I1202 22:48:00.389568 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4eb44562-c608-428a-8812-428de40cbcde-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4eb44562-c608-428a-8812-428de40cbcde" (UID: "4eb44562-c608-428a-8812-428de40cbcde"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 02 22:48:00 crc kubenswrapper[4696]: I1202 22:48:00.395969 4696 scope.go:117] "RemoveContainer" containerID="cd8f53b1bc280a95b7a1194ce002e99b0fa58ba6c09cd44516167db9ad2c01e1"
Dec 02 22:48:00 crc kubenswrapper[4696]: I1202 22:48:00.425502 4696 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a04370b2-4b6e-4eac-9ffe-4f2e7116b03b-utilities\") on node \"crc\" DevicePath \"\""
Dec 02 22:48:00 crc kubenswrapper[4696]: I1202 22:48:00.425546 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fmvpk\" (UniqueName: \"kubernetes.io/projected/4eb44562-c608-428a-8812-428de40cbcde-kube-api-access-fmvpk\") on node \"crc\" DevicePath \"\""
Dec 02 22:48:00 crc kubenswrapper[4696]: I1202 22:48:00.425576 4696 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a04370b2-4b6e-4eac-9ffe-4f2e7116b03b-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 02 22:48:00 crc kubenswrapper[4696]: I1202 22:48:00.425587 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjg6v\" (UniqueName: \"kubernetes.io/projected/a04370b2-4b6e-4eac-9ffe-4f2e7116b03b-kube-api-access-pjg6v\") on node \"crc\" DevicePath \"\""
Dec 02 22:48:00 crc kubenswrapper[4696]: I1202 22:48:00.425599 4696 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b3474b0-8824-4cb5-8ef3-c459af98ed02-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 02 22:48:00 crc kubenswrapper[4696]: I1202 22:48:00.425611 4696 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4eb44562-c608-428a-8812-428de40cbcde-utilities\") on node \"crc\" DevicePath \"\""
Dec 02 22:48:00 crc kubenswrapper[4696]: I1202 22:48:00.425620 4696 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4eb44562-c608-428a-8812-428de40cbcde-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 02 22:48:00 crc kubenswrapper[4696]: I1202 22:48:00.565576 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-w765j"]
Dec 02 22:48:00 crc kubenswrapper[4696]: I1202 22:48:00.569653 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-w765j"]
Dec 02 22:48:00 crc kubenswrapper[4696]: I1202 22:48:00.587262 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rwnxp"]
Dec 02 22:48:00 crc kubenswrapper[4696]: I1202 22:48:00.593870 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-rwnxp"]
Dec 02 22:48:00 crc kubenswrapper[4696]: I1202 22:48:00.729011 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Dec 02 22:48:00 crc kubenswrapper[4696]: I1202 22:48:00.801800 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-798f497965-q798w"
Dec 02 22:48:01 crc kubenswrapper[4696]: I1202 22:48:01.027793 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Dec 02 22:48:01 crc kubenswrapper[4696]: I1202 22:48:01.256656 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-42w2d" event={"ID":"4eb44562-c608-428a-8812-428de40cbcde","Type":"ContainerDied","Data":"4e4904b82229c82ddbb633cad2d4bf3dd9db7a2e072421c491b1db4d39adc0c9"}
Dec 02 22:48:01 crc kubenswrapper[4696]: I1202 22:48:01.257015 4696 scope.go:117] "RemoveContainer" containerID="f81090f4798d9d290574104f77fffb9daf1941b982ee7382a50e74f9a09aa61c"
Dec 02 22:48:01 crc kubenswrapper[4696]: I1202 22:48:01.257215 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-42w2d"
Dec 02 22:48:01 crc kubenswrapper[4696]: I1202 22:48:01.287661 4696 scope.go:117] "RemoveContainer" containerID="e180a0236e0fd3d800deac73a4b68e072155b1bc642b53b9b135e6ebdd41ebd5"
Dec 02 22:48:01 crc kubenswrapper[4696]: I1202 22:48:01.296669 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-42w2d"]
Dec 02 22:48:01 crc kubenswrapper[4696]: I1202 22:48:01.300501 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-42w2d"]
Dec 02 22:48:01 crc kubenswrapper[4696]: I1202 22:48:01.328046 4696 scope.go:117] "RemoveContainer" containerID="e28fef83921bba493531f6df229961eb19d003df4d89ef0df0de8d6cc1b8a521"
Dec 02 22:48:01 crc kubenswrapper[4696]: I1202 22:48:01.438776 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b3474b0-8824-4cb5-8ef3-c459af98ed02" path="/var/lib/kubelet/pods/2b3474b0-8824-4cb5-8ef3-c459af98ed02/volumes"
Dec 02 22:48:01 crc kubenswrapper[4696]: I1202 22:48:01.439632 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4eb44562-c608-428a-8812-428de40cbcde" path="/var/lib/kubelet/pods/4eb44562-c608-428a-8812-428de40cbcde/volumes"
Dec 02 22:48:01 crc kubenswrapper[4696]: I1202 22:48:01.440253 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a04370b2-4b6e-4eac-9ffe-4f2e7116b03b" path="/var/lib/kubelet/pods/a04370b2-4b6e-4eac-9ffe-4f2e7116b03b/volumes"
Dec 02 22:48:05 crc kubenswrapper[4696]: I1202 22:48:05.147582 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-p44hz"]
Dec 02 22:48:05 crc kubenswrapper[4696]: I1202 22:48:05.148508 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-p44hz" podUID="8ef3caa5-75cd-444b-aa84-9116ea9ce1cd" containerName="controller-manager" containerID="cri-o://5c5d2571cbf6118626ac9e73d31b8eb8974f0ebfb36b72119faace2881a0e442" gracePeriod=30
Dec 02 22:48:05 crc kubenswrapper[4696]: I1202 22:48:05.239869 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ph95v"]
Dec 02 22:48:05 crc kubenswrapper[4696]: I1202 22:48:05.240131 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ph95v" podUID="2c9a9ec1-4fa9-4d41-84e2-08a822f546c2" containerName="route-controller-manager" containerID="cri-o://690c855a032ba55ea7b7e1f0d54f787b71f42f38d5b836500381207da8fbf237" gracePeriod=30
Dec 02 22:48:06 crc kubenswrapper[4696]: I1202 22:48:06.295508 4696 generic.go:334] "Generic (PLEG): container finished" podID="8ef3caa5-75cd-444b-aa84-9116ea9ce1cd" containerID="5c5d2571cbf6118626ac9e73d31b8eb8974f0ebfb36b72119faace2881a0e442" exitCode=0
Dec 02 22:48:06 crc kubenswrapper[4696]: I1202 22:48:06.295624 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-p44hz" event={"ID":"8ef3caa5-75cd-444b-aa84-9116ea9ce1cd","Type":"ContainerDied","Data":"5c5d2571cbf6118626ac9e73d31b8eb8974f0ebfb36b72119faace2881a0e442"}
Dec 02 22:48:06 crc kubenswrapper[4696]: I1202 22:48:06.298567 4696 generic.go:334] "Generic (PLEG): container finished" podID="2c9a9ec1-4fa9-4d41-84e2-08a822f546c2" containerID="690c855a032ba55ea7b7e1f0d54f787b71f42f38d5b836500381207da8fbf237" exitCode=0
Dec 02 22:48:06 crc kubenswrapper[4696]: I1202 22:48:06.298621 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ph95v" event={"ID":"2c9a9ec1-4fa9-4d41-84e2-08a822f546c2","Type":"ContainerDied","Data":"690c855a032ba55ea7b7e1f0d54f787b71f42f38d5b836500381207da8fbf237"}
Dec 02 22:48:06 crc kubenswrapper[4696]: I1202 22:48:06.373684 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-p44hz"
Dec 02 22:48:06 crc kubenswrapper[4696]: I1202 22:48:06.423956 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8ef3caa5-75cd-444b-aa84-9116ea9ce1cd-serving-cert\") pod \"8ef3caa5-75cd-444b-aa84-9116ea9ce1cd\" (UID: \"8ef3caa5-75cd-444b-aa84-9116ea9ce1cd\") "
Dec 02 22:48:06 crc kubenswrapper[4696]: I1202 22:48:06.424044 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rfwtq\" (UniqueName: \"kubernetes.io/projected/8ef3caa5-75cd-444b-aa84-9116ea9ce1cd-kube-api-access-rfwtq\") pod \"8ef3caa5-75cd-444b-aa84-9116ea9ce1cd\" (UID: \"8ef3caa5-75cd-444b-aa84-9116ea9ce1cd\") "
Dec 02 22:48:06 crc kubenswrapper[4696]: I1202 22:48:06.424076 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8ef3caa5-75cd-444b-aa84-9116ea9ce1cd-proxy-ca-bundles\") pod \"8ef3caa5-75cd-444b-aa84-9116ea9ce1cd\" (UID: \"8ef3caa5-75cd-444b-aa84-9116ea9ce1cd\") "
Dec 02 22:48:06 crc kubenswrapper[4696]: I1202 22:48:06.424105 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ef3caa5-75cd-444b-aa84-9116ea9ce1cd-config\") pod \"8ef3caa5-75cd-444b-aa84-9116ea9ce1cd\" (UID: \"8ef3caa5-75cd-444b-aa84-9116ea9ce1cd\") "
Dec 02 22:48:06 crc kubenswrapper[4696]: I1202 22:48:06.424138 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8ef3caa5-75cd-444b-aa84-9116ea9ce1cd-client-ca\") pod \"8ef3caa5-75cd-444b-aa84-9116ea9ce1cd\" (UID: \"8ef3caa5-75cd-444b-aa84-9116ea9ce1cd\") "
Dec 02 22:48:06 crc kubenswrapper[4696]: I1202 22:48:06.425398 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ef3caa5-75cd-444b-aa84-9116ea9ce1cd-client-ca" (OuterVolumeSpecName: "client-ca") pod "8ef3caa5-75cd-444b-aa84-9116ea9ce1cd" (UID: "8ef3caa5-75cd-444b-aa84-9116ea9ce1cd"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 22:48:06 crc kubenswrapper[4696]: I1202 22:48:06.425434 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ef3caa5-75cd-444b-aa84-9116ea9ce1cd-config" (OuterVolumeSpecName: "config") pod "8ef3caa5-75cd-444b-aa84-9116ea9ce1cd" (UID: "8ef3caa5-75cd-444b-aa84-9116ea9ce1cd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 22:48:06 crc kubenswrapper[4696]: I1202 22:48:06.425410 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ef3caa5-75cd-444b-aa84-9116ea9ce1cd-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "8ef3caa5-75cd-444b-aa84-9116ea9ce1cd" (UID: "8ef3caa5-75cd-444b-aa84-9116ea9ce1cd"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 22:48:06 crc kubenswrapper[4696]: I1202 22:48:06.436804 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ef3caa5-75cd-444b-aa84-9116ea9ce1cd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8ef3caa5-75cd-444b-aa84-9116ea9ce1cd" (UID: "8ef3caa5-75cd-444b-aa84-9116ea9ce1cd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 22:48:06 crc kubenswrapper[4696]: I1202 22:48:06.437192 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ef3caa5-75cd-444b-aa84-9116ea9ce1cd-kube-api-access-rfwtq" (OuterVolumeSpecName: "kube-api-access-rfwtq") pod "8ef3caa5-75cd-444b-aa84-9116ea9ce1cd" (UID: "8ef3caa5-75cd-444b-aa84-9116ea9ce1cd"). InnerVolumeSpecName "kube-api-access-rfwtq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 22:48:06 crc kubenswrapper[4696]: I1202 22:48:06.526579 4696 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8ef3caa5-75cd-444b-aa84-9116ea9ce1cd-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 02 22:48:06 crc kubenswrapper[4696]: I1202 22:48:06.526613 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rfwtq\" (UniqueName: \"kubernetes.io/projected/8ef3caa5-75cd-444b-aa84-9116ea9ce1cd-kube-api-access-rfwtq\") on node \"crc\" DevicePath \"\""
Dec 02 22:48:06 crc kubenswrapper[4696]: I1202 22:48:06.526631 4696 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8ef3caa5-75cd-444b-aa84-9116ea9ce1cd-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Dec 02 22:48:06 crc kubenswrapper[4696]: I1202 22:48:06.526645 4696 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ef3caa5-75cd-444b-aa84-9116ea9ce1cd-config\") on node \"crc\" DevicePath \"\""
Dec 02 22:48:06 crc kubenswrapper[4696]: I1202 22:48:06.526656 4696 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8ef3caa5-75cd-444b-aa84-9116ea9ce1cd-client-ca\") on node \"crc\" DevicePath \"\""
Dec 02 22:48:06 crc kubenswrapper[4696]: I1202 22:48:06.535222 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ph95v"
Dec 02 22:48:06 crc kubenswrapper[4696]: I1202 22:48:06.627279 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c9a9ec1-4fa9-4d41-84e2-08a822f546c2-config\") pod \"2c9a9ec1-4fa9-4d41-84e2-08a822f546c2\" (UID: \"2c9a9ec1-4fa9-4d41-84e2-08a822f546c2\") "
Dec 02 22:48:06 crc kubenswrapper[4696]: I1202 22:48:06.627379 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2wmm4\" (UniqueName: \"kubernetes.io/projected/2c9a9ec1-4fa9-4d41-84e2-08a822f546c2-kube-api-access-2wmm4\") pod \"2c9a9ec1-4fa9-4d41-84e2-08a822f546c2\" (UID: \"2c9a9ec1-4fa9-4d41-84e2-08a822f546c2\") "
Dec 02 22:48:06 crc kubenswrapper[4696]: I1202 22:48:06.627410 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2c9a9ec1-4fa9-4d41-84e2-08a822f546c2-serving-cert\") pod \"2c9a9ec1-4fa9-4d41-84e2-08a822f546c2\" (UID: \"2c9a9ec1-4fa9-4d41-84e2-08a822f546c2\") "
Dec 02 22:48:06 crc kubenswrapper[4696]: I1202 22:48:06.627451 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2c9a9ec1-4fa9-4d41-84e2-08a822f546c2-client-ca\") pod \"2c9a9ec1-4fa9-4d41-84e2-08a822f546c2\" (UID: \"2c9a9ec1-4fa9-4d41-84e2-08a822f546c2\") "
Dec 02 22:48:06 crc kubenswrapper[4696]: I1202 22:48:06.628558 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c9a9ec1-4fa9-4d41-84e2-08a822f546c2-client-ca" (OuterVolumeSpecName: "client-ca") pod "2c9a9ec1-4fa9-4d41-84e2-08a822f546c2" (UID: "2c9a9ec1-4fa9-4d41-84e2-08a822f546c2"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 22:48:06 crc kubenswrapper[4696]: I1202 22:48:06.628600 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c9a9ec1-4fa9-4d41-84e2-08a822f546c2-config" (OuterVolumeSpecName: "config") pod "2c9a9ec1-4fa9-4d41-84e2-08a822f546c2" (UID: "2c9a9ec1-4fa9-4d41-84e2-08a822f546c2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 22:48:06 crc kubenswrapper[4696]: I1202 22:48:06.633112 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c9a9ec1-4fa9-4d41-84e2-08a822f546c2-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "2c9a9ec1-4fa9-4d41-84e2-08a822f546c2" (UID: "2c9a9ec1-4fa9-4d41-84e2-08a822f546c2"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 22:48:06 crc kubenswrapper[4696]: I1202 22:48:06.634190 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c9a9ec1-4fa9-4d41-84e2-08a822f546c2-kube-api-access-2wmm4" (OuterVolumeSpecName: "kube-api-access-2wmm4") pod "2c9a9ec1-4fa9-4d41-84e2-08a822f546c2" (UID: "2c9a9ec1-4fa9-4d41-84e2-08a822f546c2"). InnerVolumeSpecName "kube-api-access-2wmm4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 22:48:06 crc kubenswrapper[4696]: I1202 22:48:06.729103 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2wmm4\" (UniqueName: \"kubernetes.io/projected/2c9a9ec1-4fa9-4d41-84e2-08a822f546c2-kube-api-access-2wmm4\") on node \"crc\" DevicePath \"\""
Dec 02 22:48:06 crc kubenswrapper[4696]: I1202 22:48:06.729160 4696 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2c9a9ec1-4fa9-4d41-84e2-08a822f546c2-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 02 22:48:06 crc kubenswrapper[4696]: I1202 22:48:06.729178 4696 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2c9a9ec1-4fa9-4d41-84e2-08a822f546c2-client-ca\") on node \"crc\" DevicePath \"\""
Dec 02 22:48:06 crc kubenswrapper[4696]: I1202 22:48:06.729195 4696 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c9a9ec1-4fa9-4d41-84e2-08a822f546c2-config\") on node \"crc\" DevicePath \"\""
Dec 02 22:48:06 crc kubenswrapper[4696]: I1202 22:48:06.938886 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6445d45f8-frc69"]
Dec 02 22:48:06 crc kubenswrapper[4696]: E1202 22:48:06.939438 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a04370b2-4b6e-4eac-9ffe-4f2e7116b03b" containerName="registry-server"
Dec 02 22:48:06 crc kubenswrapper[4696]: I1202 22:48:06.939470 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="a04370b2-4b6e-4eac-9ffe-4f2e7116b03b" containerName="registry-server"
Dec 02 22:48:06 crc kubenswrapper[4696]: E1202 22:48:06.939504 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b3474b0-8824-4cb5-8ef3-c459af98ed02" containerName="registry-server"
Dec 02 22:48:06 crc kubenswrapper[4696]: I1202 22:48:06.939522 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b3474b0-8824-4cb5-8ef3-c459af98ed02" containerName="registry-server"
Dec 02 22:48:06 crc kubenswrapper[4696]: E1202 22:48:06.939556 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b3474b0-8824-4cb5-8ef3-c459af98ed02" containerName="extract-utilities"
Dec 02 22:48:06 crc kubenswrapper[4696]: I1202 22:48:06.939577 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b3474b0-8824-4cb5-8ef3-c459af98ed02" containerName="extract-utilities"
Dec 02 22:48:06 crc kubenswrapper[4696]: E1202 22:48:06.939624 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c9a9ec1-4fa9-4d41-84e2-08a822f546c2" containerName="route-controller-manager"
Dec 02 22:48:06 crc kubenswrapper[4696]: I1202 22:48:06.939644 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c9a9ec1-4fa9-4d41-84e2-08a822f546c2" containerName="route-controller-manager"
Dec 02 22:48:06 crc kubenswrapper[4696]: E1202 22:48:06.939675 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4eb44562-c608-428a-8812-428de40cbcde" containerName="extract-utilities"
Dec 02 22:48:06 crc kubenswrapper[4696]: I1202 22:48:06.939691 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="4eb44562-c608-428a-8812-428de40cbcde" containerName="extract-utilities"
Dec 02 22:48:06 crc kubenswrapper[4696]: E1202 22:48:06.939707 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b3474b0-8824-4cb5-8ef3-c459af98ed02" containerName="extract-content"
Dec 02 22:48:06 crc kubenswrapper[4696]: I1202 22:48:06.939723 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b3474b0-8824-4cb5-8ef3-c459af98ed02" containerName="extract-content"
Dec 02 22:48:06 crc kubenswrapper[4696]: E1202 22:48:06.939774 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4eb44562-c608-428a-8812-428de40cbcde" containerName="registry-server"
Dec 02 22:48:06 crc kubenswrapper[4696]: I1202 22:48:06.939808 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="4eb44562-c608-428a-8812-428de40cbcde" containerName="registry-server"
Dec 02 22:48:06 crc kubenswrapper[4696]: E1202 22:48:06.939838 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ef3caa5-75cd-444b-aa84-9116ea9ce1cd" containerName="controller-manager"
Dec 02 22:48:06 crc kubenswrapper[4696]: I1202 22:48:06.939856 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ef3caa5-75cd-444b-aa84-9116ea9ce1cd" containerName="controller-manager"
Dec 02 22:48:06 crc kubenswrapper[4696]: E1202 22:48:06.939875 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a04370b2-4b6e-4eac-9ffe-4f2e7116b03b" containerName="extract-content"
Dec 02 22:48:06 crc kubenswrapper[4696]: I1202 22:48:06.939893 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="a04370b2-4b6e-4eac-9ffe-4f2e7116b03b" containerName="extract-content"
Dec 02 22:48:06 crc kubenswrapper[4696]: E1202 22:48:06.939920 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a04370b2-4b6e-4eac-9ffe-4f2e7116b03b" containerName="extract-utilities"
Dec 02 22:48:06 crc kubenswrapper[4696]: I1202 22:48:06.939937 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="a04370b2-4b6e-4eac-9ffe-4f2e7116b03b" containerName="extract-utilities"
Dec 02 22:48:06 crc kubenswrapper[4696]: E1202 22:48:06.939978 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4eb44562-c608-428a-8812-428de40cbcde" containerName="extract-content"
Dec 02 22:48:06 crc kubenswrapper[4696]: I1202 22:48:06.939994 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="4eb44562-c608-428a-8812-428de40cbcde" containerName="extract-content"
Dec 02 22:48:06 crc kubenswrapper[4696]: I1202 22:48:06.940252 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="4eb44562-c608-428a-8812-428de40cbcde" containerName="registry-server"
Dec 02 22:48:06 crc kubenswrapper[4696]: I1202 22:48:06.940281 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b3474b0-8824-4cb5-8ef3-c459af98ed02" containerName="registry-server"
Dec 02 22:48:06 crc kubenswrapper[4696]: I1202 22:48:06.940328 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c9a9ec1-4fa9-4d41-84e2-08a822f546c2" containerName="route-controller-manager"
Dec 02 22:48:06 crc kubenswrapper[4696]: I1202 22:48:06.940354 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ef3caa5-75cd-444b-aa84-9116ea9ce1cd" containerName="controller-manager"
Dec 02 22:48:06 crc kubenswrapper[4696]: I1202 22:48:06.940379 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="a04370b2-4b6e-4eac-9ffe-4f2e7116b03b" containerName="registry-server"
Dec 02 22:48:06 crc kubenswrapper[4696]: I1202 22:48:06.941479 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6445d45f8-frc69"
Dec 02 22:48:06 crc kubenswrapper[4696]: I1202 22:48:06.979833 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6445d45f8-frc69"]
Dec 02 22:48:07 crc kubenswrapper[4696]: I1202 22:48:07.034062 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncqrh\" (UniqueName: \"kubernetes.io/projected/460d4318-07d3-4949-b542-3d4f8d53241c-kube-api-access-ncqrh\") pod \"route-controller-manager-6445d45f8-frc69\" (UID: \"460d4318-07d3-4949-b542-3d4f8d53241c\") " pod="openshift-route-controller-manager/route-controller-manager-6445d45f8-frc69"
Dec 02 22:48:07 crc kubenswrapper[4696]: I1202 22:48:07.034140 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/460d4318-07d3-4949-b542-3d4f8d53241c-client-ca\") pod \"route-controller-manager-6445d45f8-frc69\" (UID: \"460d4318-07d3-4949-b542-3d4f8d53241c\") " pod="openshift-route-controller-manager/route-controller-manager-6445d45f8-frc69"
Dec 02 22:48:07 crc kubenswrapper[4696]: I1202 22:48:07.034334 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/460d4318-07d3-4949-b542-3d4f8d53241c-config\") pod \"route-controller-manager-6445d45f8-frc69\" (UID: \"460d4318-07d3-4949-b542-3d4f8d53241c\") " pod="openshift-route-controller-manager/route-controller-manager-6445d45f8-frc69"
Dec 02 22:48:07 crc kubenswrapper[4696]: I1202 22:48:07.034422 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/460d4318-07d3-4949-b542-3d4f8d53241c-serving-cert\") pod \"route-controller-manager-6445d45f8-frc69\" (UID: \"460d4318-07d3-4949-b542-3d4f8d53241c\") " pod="openshift-route-controller-manager/route-controller-manager-6445d45f8-frc69"
Dec 02 22:48:07 crc kubenswrapper[4696]: I1202 22:48:07.135685 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ncqrh\" (UniqueName: \"kubernetes.io/projected/460d4318-07d3-4949-b542-3d4f8d53241c-kube-api-access-ncqrh\") pod \"route-controller-manager-6445d45f8-frc69\" (UID: \"460d4318-07d3-4949-b542-3d4f8d53241c\") " pod="openshift-route-controller-manager/route-controller-manager-6445d45f8-frc69"
Dec 02 22:48:07 crc kubenswrapper[4696]: I1202 22:48:07.135799 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/460d4318-07d3-4949-b542-3d4f8d53241c-client-ca\") pod \"route-controller-manager-6445d45f8-frc69\" (UID: \"460d4318-07d3-4949-b542-3d4f8d53241c\") " pod="openshift-route-controller-manager/route-controller-manager-6445d45f8-frc69"
Dec 02 22:48:07 crc kubenswrapper[4696]: I1202 22:48:07.135879 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/460d4318-07d3-4949-b542-3d4f8d53241c-config\") pod \"route-controller-manager-6445d45f8-frc69\" (UID: \"460d4318-07d3-4949-b542-3d4f8d53241c\") " pod="openshift-route-controller-manager/route-controller-manager-6445d45f8-frc69"
Dec 02 22:48:07 crc kubenswrapper[4696]: I1202 22:48:07.135912 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/460d4318-07d3-4949-b542-3d4f8d53241c-serving-cert\") pod \"route-controller-manager-6445d45f8-frc69\" (UID: \"460d4318-07d3-4949-b542-3d4f8d53241c\") " pod="openshift-route-controller-manager/route-controller-manager-6445d45f8-frc69"
Dec 02 22:48:07 crc kubenswrapper[4696]: I1202 22:48:07.137922 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/460d4318-07d3-4949-b542-3d4f8d53241c-client-ca\") pod \"route-controller-manager-6445d45f8-frc69\" (UID: \"460d4318-07d3-4949-b542-3d4f8d53241c\") " pod="openshift-route-controller-manager/route-controller-manager-6445d45f8-frc69"
Dec 02 22:48:07 crc kubenswrapper[4696]: I1202 22:48:07.138496 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/460d4318-07d3-4949-b542-3d4f8d53241c-config\") pod \"route-controller-manager-6445d45f8-frc69\" (UID: \"460d4318-07d3-4949-b542-3d4f8d53241c\") " pod="openshift-route-controller-manager/route-controller-manager-6445d45f8-frc69"
Dec 02 22:48:07 crc kubenswrapper[4696]: I1202 22:48:07.142633 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/460d4318-07d3-4949-b542-3d4f8d53241c-serving-cert\") pod \"route-controller-manager-6445d45f8-frc69\" (UID: \"460d4318-07d3-4949-b542-3d4f8d53241c\") " pod="openshift-route-controller-manager/route-controller-manager-6445d45f8-frc69"
Dec 02 22:48:07 crc kubenswrapper[4696]: I1202 22:48:07.161623 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncqrh\" (UniqueName: \"kubernetes.io/projected/460d4318-07d3-4949-b542-3d4f8d53241c-kube-api-access-ncqrh\") pod \"route-controller-manager-6445d45f8-frc69\" (UID: \"460d4318-07d3-4949-b542-3d4f8d53241c\") " pod="openshift-route-controller-manager/route-controller-manager-6445d45f8-frc69"
Dec 02 22:48:07 crc kubenswrapper[4696]: I1202 22:48:07.273099 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6445d45f8-frc69"
Dec 02 22:48:07 crc kubenswrapper[4696]: I1202 22:48:07.308185 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-p44hz"
Dec 02 22:48:07 crc kubenswrapper[4696]: I1202 22:48:07.308235 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-p44hz" event={"ID":"8ef3caa5-75cd-444b-aa84-9116ea9ce1cd","Type":"ContainerDied","Data":"1d776ec15daed1cd0442bd7aa55aa6096bbcc314c3e60ce5e0bdc4f985470464"}
Dec 02 22:48:07 crc kubenswrapper[4696]: I1202 22:48:07.308365 4696 scope.go:117] "RemoveContainer" containerID="5c5d2571cbf6118626ac9e73d31b8eb8974f0ebfb36b72119faace2881a0e442"
Dec 02 22:48:07 crc kubenswrapper[4696]: I1202 22:48:07.314962 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ph95v"
Dec 02 22:48:07 crc kubenswrapper[4696]: I1202 22:48:07.318003 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ph95v" event={"ID":"2c9a9ec1-4fa9-4d41-84e2-08a822f546c2","Type":"ContainerDied","Data":"701323f2e949f39f8cec2f8adec773d55414c1c1d863d4da1224add9cce92b9a"}
Dec 02 22:48:07 crc kubenswrapper[4696]: I1202 22:48:07.345295 4696 scope.go:117] "RemoveContainer" containerID="690c855a032ba55ea7b7e1f0d54f787b71f42f38d5b836500381207da8fbf237"
Dec 02 22:48:07 crc kubenswrapper[4696]: I1202 22:48:07.352484 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-p44hz"]
Dec 02 22:48:07 crc kubenswrapper[4696]: I1202 22:48:07.355486 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-p44hz"]
Dec 02 22:48:07 crc kubenswrapper[4696]: I1202 22:48:07.394249 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ph95v"]
Dec 02 22:48:07 crc kubenswrapper[4696]: I1202 22:48:07.399486 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ph95v"]
Dec 02 22:48:07 crc kubenswrapper[4696]: I1202 22:48:07.443040 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c9a9ec1-4fa9-4d41-84e2-08a822f546c2" path="/var/lib/kubelet/pods/2c9a9ec1-4fa9-4d41-84e2-08a822f546c2/volumes"
Dec 02 22:48:07 crc kubenswrapper[4696]: I1202 22:48:07.443970 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ef3caa5-75cd-444b-aa84-9116ea9ce1cd" path="/var/lib/kubelet/pods/8ef3caa5-75cd-444b-aa84-9116ea9ce1cd/volumes"
Dec 02 22:48:07 crc kubenswrapper[4696]: I1202 22:48:07.444395 4696 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Dec 02 22:48:07 crc kubenswrapper[4696]: I1202 22:48:07.444640 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://f0e6c380c1399055c2383fe4c4b02d64629a8b697ff784046c6453b0486dd8da" gracePeriod=5
Dec 02 22:48:07 crc kubenswrapper[4696]: I1202 22:48:07.539146 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-d48c458cb-mhg55"]
Dec 02 22:48:07 crc kubenswrapper[4696]: E1202 22:48:07.539431 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Dec 02 22:48:07 crc kubenswrapper[4696]: I1202 22:48:07.539456 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Dec 02 22:48:07 crc kubenswrapper[4696]: I1202 22:48:07.539572 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Dec 02 22:48:07 crc kubenswrapper[4696]: I1202 22:48:07.540065 4696 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-controller-manager/controller-manager-d48c458cb-mhg55" Dec 02 22:48:07 crc kubenswrapper[4696]: I1202 22:48:07.543401 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6445d45f8-frc69"] Dec 02 22:48:07 crc kubenswrapper[4696]: I1202 22:48:07.544088 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 02 22:48:07 crc kubenswrapper[4696]: I1202 22:48:07.544183 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 02 22:48:07 crc kubenswrapper[4696]: I1202 22:48:07.544430 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 02 22:48:07 crc kubenswrapper[4696]: I1202 22:48:07.544645 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 02 22:48:07 crc kubenswrapper[4696]: I1202 22:48:07.545728 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 02 22:48:07 crc kubenswrapper[4696]: I1202 22:48:07.545943 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 02 22:48:07 crc kubenswrapper[4696]: I1202 22:48:07.545943 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-d48c458cb-mhg55"] Dec 02 22:48:07 crc kubenswrapper[4696]: I1202 22:48:07.554083 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 02 22:48:07 crc kubenswrapper[4696]: I1202 22:48:07.646977 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/b60ca874-ef00-4e4a-8b2e-26a1581d4c86-proxy-ca-bundles\") pod \"controller-manager-d48c458cb-mhg55\" (UID: \"b60ca874-ef00-4e4a-8b2e-26a1581d4c86\") " pod="openshift-controller-manager/controller-manager-d48c458cb-mhg55" Dec 02 22:48:07 crc kubenswrapper[4696]: I1202 22:48:07.647024 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b60ca874-ef00-4e4a-8b2e-26a1581d4c86-serving-cert\") pod \"controller-manager-d48c458cb-mhg55\" (UID: \"b60ca874-ef00-4e4a-8b2e-26a1581d4c86\") " pod="openshift-controller-manager/controller-manager-d48c458cb-mhg55" Dec 02 22:48:07 crc kubenswrapper[4696]: I1202 22:48:07.647061 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b60ca874-ef00-4e4a-8b2e-26a1581d4c86-client-ca\") pod \"controller-manager-d48c458cb-mhg55\" (UID: \"b60ca874-ef00-4e4a-8b2e-26a1581d4c86\") " pod="openshift-controller-manager/controller-manager-d48c458cb-mhg55" Dec 02 22:48:07 crc kubenswrapper[4696]: I1202 22:48:07.647261 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96qt2\" (UniqueName: \"kubernetes.io/projected/b60ca874-ef00-4e4a-8b2e-26a1581d4c86-kube-api-access-96qt2\") pod \"controller-manager-d48c458cb-mhg55\" (UID: \"b60ca874-ef00-4e4a-8b2e-26a1581d4c86\") " pod="openshift-controller-manager/controller-manager-d48c458cb-mhg55" Dec 02 22:48:07 crc kubenswrapper[4696]: I1202 22:48:07.647316 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b60ca874-ef00-4e4a-8b2e-26a1581d4c86-config\") pod \"controller-manager-d48c458cb-mhg55\" (UID: \"b60ca874-ef00-4e4a-8b2e-26a1581d4c86\") " pod="openshift-controller-manager/controller-manager-d48c458cb-mhg55" Dec 02 22:48:07 
crc kubenswrapper[4696]: I1202 22:48:07.749374 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b60ca874-ef00-4e4a-8b2e-26a1581d4c86-proxy-ca-bundles\") pod \"controller-manager-d48c458cb-mhg55\" (UID: \"b60ca874-ef00-4e4a-8b2e-26a1581d4c86\") " pod="openshift-controller-manager/controller-manager-d48c458cb-mhg55" Dec 02 22:48:07 crc kubenswrapper[4696]: I1202 22:48:07.749443 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b60ca874-ef00-4e4a-8b2e-26a1581d4c86-serving-cert\") pod \"controller-manager-d48c458cb-mhg55\" (UID: \"b60ca874-ef00-4e4a-8b2e-26a1581d4c86\") " pod="openshift-controller-manager/controller-manager-d48c458cb-mhg55" Dec 02 22:48:07 crc kubenswrapper[4696]: I1202 22:48:07.749483 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b60ca874-ef00-4e4a-8b2e-26a1581d4c86-client-ca\") pod \"controller-manager-d48c458cb-mhg55\" (UID: \"b60ca874-ef00-4e4a-8b2e-26a1581d4c86\") " pod="openshift-controller-manager/controller-manager-d48c458cb-mhg55" Dec 02 22:48:07 crc kubenswrapper[4696]: I1202 22:48:07.749522 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96qt2\" (UniqueName: \"kubernetes.io/projected/b60ca874-ef00-4e4a-8b2e-26a1581d4c86-kube-api-access-96qt2\") pod \"controller-manager-d48c458cb-mhg55\" (UID: \"b60ca874-ef00-4e4a-8b2e-26a1581d4c86\") " pod="openshift-controller-manager/controller-manager-d48c458cb-mhg55" Dec 02 22:48:07 crc kubenswrapper[4696]: I1202 22:48:07.749545 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b60ca874-ef00-4e4a-8b2e-26a1581d4c86-config\") pod \"controller-manager-d48c458cb-mhg55\" (UID: \"b60ca874-ef00-4e4a-8b2e-26a1581d4c86\") " 
pod="openshift-controller-manager/controller-manager-d48c458cb-mhg55" Dec 02 22:48:07 crc kubenswrapper[4696]: I1202 22:48:07.750597 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b60ca874-ef00-4e4a-8b2e-26a1581d4c86-client-ca\") pod \"controller-manager-d48c458cb-mhg55\" (UID: \"b60ca874-ef00-4e4a-8b2e-26a1581d4c86\") " pod="openshift-controller-manager/controller-manager-d48c458cb-mhg55" Dec 02 22:48:07 crc kubenswrapper[4696]: I1202 22:48:07.750665 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b60ca874-ef00-4e4a-8b2e-26a1581d4c86-proxy-ca-bundles\") pod \"controller-manager-d48c458cb-mhg55\" (UID: \"b60ca874-ef00-4e4a-8b2e-26a1581d4c86\") " pod="openshift-controller-manager/controller-manager-d48c458cb-mhg55" Dec 02 22:48:07 crc kubenswrapper[4696]: I1202 22:48:07.751176 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b60ca874-ef00-4e4a-8b2e-26a1581d4c86-config\") pod \"controller-manager-d48c458cb-mhg55\" (UID: \"b60ca874-ef00-4e4a-8b2e-26a1581d4c86\") " pod="openshift-controller-manager/controller-manager-d48c458cb-mhg55" Dec 02 22:48:07 crc kubenswrapper[4696]: I1202 22:48:07.766961 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b60ca874-ef00-4e4a-8b2e-26a1581d4c86-serving-cert\") pod \"controller-manager-d48c458cb-mhg55\" (UID: \"b60ca874-ef00-4e4a-8b2e-26a1581d4c86\") " pod="openshift-controller-manager/controller-manager-d48c458cb-mhg55" Dec 02 22:48:07 crc kubenswrapper[4696]: I1202 22:48:07.776160 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96qt2\" (UniqueName: \"kubernetes.io/projected/b60ca874-ef00-4e4a-8b2e-26a1581d4c86-kube-api-access-96qt2\") pod \"controller-manager-d48c458cb-mhg55\" 
(UID: \"b60ca874-ef00-4e4a-8b2e-26a1581d4c86\") " pod="openshift-controller-manager/controller-manager-d48c458cb-mhg55" Dec 02 22:48:07 crc kubenswrapper[4696]: I1202 22:48:07.874987 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-d48c458cb-mhg55" Dec 02 22:48:08 crc kubenswrapper[4696]: I1202 22:48:08.324467 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6445d45f8-frc69" event={"ID":"460d4318-07d3-4949-b542-3d4f8d53241c","Type":"ContainerStarted","Data":"f4baaa8846a576411f59b95aafb211245abb4e697eedf31005bfd64c412fab85"} Dec 02 22:48:08 crc kubenswrapper[4696]: I1202 22:48:08.325078 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6445d45f8-frc69" Dec 02 22:48:08 crc kubenswrapper[4696]: I1202 22:48:08.325096 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6445d45f8-frc69" event={"ID":"460d4318-07d3-4949-b542-3d4f8d53241c","Type":"ContainerStarted","Data":"813e810bb5d549c1028527ab7c83701f62df83ad1fabf9285a9b74f6cfa742b1"} Dec 02 22:48:08 crc kubenswrapper[4696]: I1202 22:48:08.332222 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6445d45f8-frc69" Dec 02 22:48:08 crc kubenswrapper[4696]: I1202 22:48:08.357040 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6445d45f8-frc69" podStartSLOduration=2.357014564 podStartE2EDuration="2.357014564s" podCreationTimestamp="2025-12-02 22:48:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 22:48:08.356273632 +0000 UTC m=+351.236953633" 
watchObservedRunningTime="2025-12-02 22:48:08.357014564 +0000 UTC m=+351.237694565" Dec 02 22:48:08 crc kubenswrapper[4696]: I1202 22:48:08.385097 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-d48c458cb-mhg55"] Dec 02 22:48:09 crc kubenswrapper[4696]: I1202 22:48:09.354716 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-d48c458cb-mhg55" event={"ID":"b60ca874-ef00-4e4a-8b2e-26a1581d4c86","Type":"ContainerStarted","Data":"d093b727e4acd202062f725755d73a84976d16c248b9fa6208a19a9a8ff5741d"} Dec 02 22:48:09 crc kubenswrapper[4696]: I1202 22:48:09.355451 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-d48c458cb-mhg55" Dec 02 22:48:09 crc kubenswrapper[4696]: I1202 22:48:09.355476 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-d48c458cb-mhg55" event={"ID":"b60ca874-ef00-4e4a-8b2e-26a1581d4c86","Type":"ContainerStarted","Data":"dc16537831c631e17f6ee0e9b7d331f95fd2e8342c6a1d195a2da2c90c7b9639"} Dec 02 22:48:09 crc kubenswrapper[4696]: I1202 22:48:09.361411 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-d48c458cb-mhg55" Dec 02 22:48:09 crc kubenswrapper[4696]: I1202 22:48:09.410482 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-d48c458cb-mhg55" podStartSLOduration=4.410440412 podStartE2EDuration="4.410440412s" podCreationTimestamp="2025-12-02 22:48:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 22:48:09.381353907 +0000 UTC m=+352.262033958" watchObservedRunningTime="2025-12-02 22:48:09.410440412 +0000 UTC m=+352.291120453" Dec 02 22:48:11 crc kubenswrapper[4696]: 
I1202 22:48:11.650323 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9rp8k"] Dec 02 22:48:11 crc kubenswrapper[4696]: I1202 22:48:11.651170 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-9rp8k" podUID="ca4122e2-7532-4ca9-a111-9097cfae1dde" containerName="registry-server" containerID="cri-o://e0f5091a78ab6c1174ae62be502eff83e913778c3b89ecca13a8c145ee5fb935" gracePeriod=30 Dec 02 22:48:11 crc kubenswrapper[4696]: I1202 22:48:11.659084 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8jbfs"] Dec 02 22:48:11 crc kubenswrapper[4696]: I1202 22:48:11.659358 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-8jbfs" podUID="aeaa03bb-5cf7-4a89-b182-36358f9e247c" containerName="registry-server" containerID="cri-o://3fc9c70881ebfb4d6dfea4a6977a0c29064b36bb1a7296a448ed8b62a3842a68" gracePeriod=30 Dec 02 22:48:11 crc kubenswrapper[4696]: I1202 22:48:11.672265 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-dtrjk"] Dec 02 22:48:11 crc kubenswrapper[4696]: I1202 22:48:11.672588 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-dtrjk" podUID="db82b3f2-cd94-42ef-a662-8a4b4c8fac85" containerName="marketplace-operator" containerID="cri-o://d82a972304308e4f9d55201e1859c3750e48880e70bdff16ec8f4aa7e92a6812" gracePeriod=30 Dec 02 22:48:11 crc kubenswrapper[4696]: I1202 22:48:11.708342 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6ccj5"] Dec 02 22:48:11 crc kubenswrapper[4696]: I1202 22:48:11.709235 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-6ccj5" 
podUID="ad7e1f5a-e61a-4603-b7d6-a7baccc8c59c" containerName="registry-server" containerID="cri-o://a850e3e799ff54f9b067984e3570cfd3e0a980ba14b8ed80fdccb78cb53fd3e2" gracePeriod=30 Dec 02 22:48:11 crc kubenswrapper[4696]: I1202 22:48:11.725183 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kf9wn"] Dec 02 22:48:11 crc kubenswrapper[4696]: I1202 22:48:11.725626 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-kf9wn" podUID="e2e0ae4b-0dce-4a74-8d3c-05635e86392b" containerName="registry-server" containerID="cri-o://3f0a2a5970794e019722167e07207f98b22a1820a9257f220a1b273043dd936f" gracePeriod=30 Dec 02 22:48:11 crc kubenswrapper[4696]: I1202 22:48:11.740587 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-fqzv2"] Dec 02 22:48:11 crc kubenswrapper[4696]: I1202 22:48:11.742887 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-fqzv2" Dec 02 22:48:11 crc kubenswrapper[4696]: I1202 22:48:11.745369 4696 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-dtrjk container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.41:8080/healthz\": dial tcp 10.217.0.41:8080: connect: connection refused" start-of-body= Dec 02 22:48:11 crc kubenswrapper[4696]: I1202 22:48:11.745415 4696 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-dtrjk" podUID="db82b3f2-cd94-42ef-a662-8a4b4c8fac85" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.41:8080/healthz\": dial tcp 10.217.0.41:8080: connect: connection refused" Dec 02 22:48:11 crc kubenswrapper[4696]: I1202 22:48:11.746506 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-fqzv2"] Dec 02 22:48:11 crc kubenswrapper[4696]: I1202 22:48:11.821635 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/09917c11-8312-4f5a-9597-ad0570d0aeb0-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-fqzv2\" (UID: \"09917c11-8312-4f5a-9597-ad0570d0aeb0\") " pod="openshift-marketplace/marketplace-operator-79b997595-fqzv2" Dec 02 22:48:11 crc kubenswrapper[4696]: I1202 22:48:11.822215 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/09917c11-8312-4f5a-9597-ad0570d0aeb0-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-fqzv2\" (UID: \"09917c11-8312-4f5a-9597-ad0570d0aeb0\") " pod="openshift-marketplace/marketplace-operator-79b997595-fqzv2" Dec 02 22:48:11 crc kubenswrapper[4696]: I1202 
22:48:11.822278 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkjp2\" (UniqueName: \"kubernetes.io/projected/09917c11-8312-4f5a-9597-ad0570d0aeb0-kube-api-access-qkjp2\") pod \"marketplace-operator-79b997595-fqzv2\" (UID: \"09917c11-8312-4f5a-9597-ad0570d0aeb0\") " pod="openshift-marketplace/marketplace-operator-79b997595-fqzv2" Dec 02 22:48:11 crc kubenswrapper[4696]: I1202 22:48:11.924022 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/09917c11-8312-4f5a-9597-ad0570d0aeb0-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-fqzv2\" (UID: \"09917c11-8312-4f5a-9597-ad0570d0aeb0\") " pod="openshift-marketplace/marketplace-operator-79b997595-fqzv2" Dec 02 22:48:11 crc kubenswrapper[4696]: I1202 22:48:11.924078 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/09917c11-8312-4f5a-9597-ad0570d0aeb0-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-fqzv2\" (UID: \"09917c11-8312-4f5a-9597-ad0570d0aeb0\") " pod="openshift-marketplace/marketplace-operator-79b997595-fqzv2" Dec 02 22:48:11 crc kubenswrapper[4696]: I1202 22:48:11.924125 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkjp2\" (UniqueName: \"kubernetes.io/projected/09917c11-8312-4f5a-9597-ad0570d0aeb0-kube-api-access-qkjp2\") pod \"marketplace-operator-79b997595-fqzv2\" (UID: \"09917c11-8312-4f5a-9597-ad0570d0aeb0\") " pod="openshift-marketplace/marketplace-operator-79b997595-fqzv2" Dec 02 22:48:11 crc kubenswrapper[4696]: I1202 22:48:11.926143 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/09917c11-8312-4f5a-9597-ad0570d0aeb0-marketplace-trusted-ca\") pod 
\"marketplace-operator-79b997595-fqzv2\" (UID: \"09917c11-8312-4f5a-9597-ad0570d0aeb0\") " pod="openshift-marketplace/marketplace-operator-79b997595-fqzv2" Dec 02 22:48:11 crc kubenswrapper[4696]: I1202 22:48:11.932531 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/09917c11-8312-4f5a-9597-ad0570d0aeb0-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-fqzv2\" (UID: \"09917c11-8312-4f5a-9597-ad0570d0aeb0\") " pod="openshift-marketplace/marketplace-operator-79b997595-fqzv2" Dec 02 22:48:11 crc kubenswrapper[4696]: I1202 22:48:11.942122 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkjp2\" (UniqueName: \"kubernetes.io/projected/09917c11-8312-4f5a-9597-ad0570d0aeb0-kube-api-access-qkjp2\") pod \"marketplace-operator-79b997595-fqzv2\" (UID: \"09917c11-8312-4f5a-9597-ad0570d0aeb0\") " pod="openshift-marketplace/marketplace-operator-79b997595-fqzv2" Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.149804 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-fqzv2" Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.155916 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8jbfs" Dec 02 22:48:12 crc kubenswrapper[4696]: E1202 22:48:12.194170 4696 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3f0a2a5970794e019722167e07207f98b22a1820a9257f220a1b273043dd936f is running failed: container process not found" containerID="3f0a2a5970794e019722167e07207f98b22a1820a9257f220a1b273043dd936f" cmd=["grpc_health_probe","-addr=:50051"] Dec 02 22:48:12 crc kubenswrapper[4696]: E1202 22:48:12.194557 4696 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3f0a2a5970794e019722167e07207f98b22a1820a9257f220a1b273043dd936f is running failed: container process not found" containerID="3f0a2a5970794e019722167e07207f98b22a1820a9257f220a1b273043dd936f" cmd=["grpc_health_probe","-addr=:50051"] Dec 02 22:48:12 crc kubenswrapper[4696]: E1202 22:48:12.194849 4696 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3f0a2a5970794e019722167e07207f98b22a1820a9257f220a1b273043dd936f is running failed: container process not found" containerID="3f0a2a5970794e019722167e07207f98b22a1820a9257f220a1b273043dd936f" cmd=["grpc_health_probe","-addr=:50051"] Dec 02 22:48:12 crc kubenswrapper[4696]: E1202 22:48:12.194880 4696 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3f0a2a5970794e019722167e07207f98b22a1820a9257f220a1b273043dd936f is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-operators-kf9wn" podUID="e2e0ae4b-0dce-4a74-8d3c-05635e86392b" containerName="registry-server" Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.236096 4696 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aeaa03bb-5cf7-4a89-b182-36358f9e247c-utilities\") pod \"aeaa03bb-5cf7-4a89-b182-36358f9e247c\" (UID: \"aeaa03bb-5cf7-4a89-b182-36358f9e247c\") " Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.236183 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fckxn\" (UniqueName: \"kubernetes.io/projected/aeaa03bb-5cf7-4a89-b182-36358f9e247c-kube-api-access-fckxn\") pod \"aeaa03bb-5cf7-4a89-b182-36358f9e247c\" (UID: \"aeaa03bb-5cf7-4a89-b182-36358f9e247c\") " Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.236233 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aeaa03bb-5cf7-4a89-b182-36358f9e247c-catalog-content\") pod \"aeaa03bb-5cf7-4a89-b182-36358f9e247c\" (UID: \"aeaa03bb-5cf7-4a89-b182-36358f9e247c\") " Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.237920 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aeaa03bb-5cf7-4a89-b182-36358f9e247c-utilities" (OuterVolumeSpecName: "utilities") pod "aeaa03bb-5cf7-4a89-b182-36358f9e247c" (UID: "aeaa03bb-5cf7-4a89-b182-36358f9e247c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.242229 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aeaa03bb-5cf7-4a89-b182-36358f9e247c-kube-api-access-fckxn" (OuterVolumeSpecName: "kube-api-access-fckxn") pod "aeaa03bb-5cf7-4a89-b182-36358f9e247c" (UID: "aeaa03bb-5cf7-4a89-b182-36358f9e247c"). InnerVolumeSpecName "kube-api-access-fckxn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.252200 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-dtrjk" Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.297072 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9rp8k" Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.298301 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aeaa03bb-5cf7-4a89-b182-36358f9e247c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "aeaa03bb-5cf7-4a89-b182-36358f9e247c" (UID: "aeaa03bb-5cf7-4a89-b182-36358f9e247c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.301013 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kf9wn" Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.304389 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6ccj5" Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.336859 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/db82b3f2-cd94-42ef-a662-8a4b4c8fac85-marketplace-operator-metrics\") pod \"db82b3f2-cd94-42ef-a662-8a4b4c8fac85\" (UID: \"db82b3f2-cd94-42ef-a662-8a4b4c8fac85\") " Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.337033 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8gkqm\" (UniqueName: \"kubernetes.io/projected/db82b3f2-cd94-42ef-a662-8a4b4c8fac85-kube-api-access-8gkqm\") pod \"db82b3f2-cd94-42ef-a662-8a4b4c8fac85\" (UID: \"db82b3f2-cd94-42ef-a662-8a4b4c8fac85\") " Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.337062 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/db82b3f2-cd94-42ef-a662-8a4b4c8fac85-marketplace-trusted-ca\") pod \"db82b3f2-cd94-42ef-a662-8a4b4c8fac85\" (UID: \"db82b3f2-cd94-42ef-a662-8a4b4c8fac85\") " Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.337285 4696 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aeaa03bb-5cf7-4a89-b182-36358f9e247c-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.337300 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fckxn\" (UniqueName: \"kubernetes.io/projected/aeaa03bb-5cf7-4a89-b182-36358f9e247c-kube-api-access-fckxn\") on node \"crc\" DevicePath \"\"" Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.337309 4696 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aeaa03bb-5cf7-4a89-b182-36358f9e247c-catalog-content\") on node \"crc\" DevicePath 
\"\"" Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.338103 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db82b3f2-cd94-42ef-a662-8a4b4c8fac85-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "db82b3f2-cd94-42ef-a662-8a4b4c8fac85" (UID: "db82b3f2-cd94-42ef-a662-8a4b4c8fac85"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.344470 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db82b3f2-cd94-42ef-a662-8a4b4c8fac85-kube-api-access-8gkqm" (OuterVolumeSpecName: "kube-api-access-8gkqm") pod "db82b3f2-cd94-42ef-a662-8a4b4c8fac85" (UID: "db82b3f2-cd94-42ef-a662-8a4b4c8fac85"). InnerVolumeSpecName "kube-api-access-8gkqm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.345202 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db82b3f2-cd94-42ef-a662-8a4b4c8fac85-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "db82b3f2-cd94-42ef-a662-8a4b4c8fac85" (UID: "db82b3f2-cd94-42ef-a662-8a4b4c8fac85"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.384341 4696 generic.go:334] "Generic (PLEG): container finished" podID="aeaa03bb-5cf7-4a89-b182-36358f9e247c" containerID="3fc9c70881ebfb4d6dfea4a6977a0c29064b36bb1a7296a448ed8b62a3842a68" exitCode=0 Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.384429 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8jbfs" event={"ID":"aeaa03bb-5cf7-4a89-b182-36358f9e247c","Type":"ContainerDied","Data":"3fc9c70881ebfb4d6dfea4a6977a0c29064b36bb1a7296a448ed8b62a3842a68"} Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.384469 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8jbfs" event={"ID":"aeaa03bb-5cf7-4a89-b182-36358f9e247c","Type":"ContainerDied","Data":"d6bc612d5ace68919cac0c789e069bdeef5ad9824d50aa2d42f68d88b202d6b5"} Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.384499 4696 scope.go:117] "RemoveContainer" containerID="3fc9c70881ebfb4d6dfea4a6977a0c29064b36bb1a7296a448ed8b62a3842a68" Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.384619 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8jbfs" Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.392709 4696 generic.go:334] "Generic (PLEG): container finished" podID="ca4122e2-7532-4ca9-a111-9097cfae1dde" containerID="e0f5091a78ab6c1174ae62be502eff83e913778c3b89ecca13a8c145ee5fb935" exitCode=0 Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.392826 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9rp8k" event={"ID":"ca4122e2-7532-4ca9-a111-9097cfae1dde","Type":"ContainerDied","Data":"e0f5091a78ab6c1174ae62be502eff83e913778c3b89ecca13a8c145ee5fb935"} Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.392866 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9rp8k" event={"ID":"ca4122e2-7532-4ca9-a111-9097cfae1dde","Type":"ContainerDied","Data":"4d004fcb5320aeceac48622d766b6dc2fd6375afd63fccaa41a693642ec8c5f6"} Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.392945 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9rp8k" Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.400529 4696 generic.go:334] "Generic (PLEG): container finished" podID="db82b3f2-cd94-42ef-a662-8a4b4c8fac85" containerID="d82a972304308e4f9d55201e1859c3750e48880e70bdff16ec8f4aa7e92a6812" exitCode=0 Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.400646 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-dtrjk" event={"ID":"db82b3f2-cd94-42ef-a662-8a4b4c8fac85","Type":"ContainerDied","Data":"d82a972304308e4f9d55201e1859c3750e48880e70bdff16ec8f4aa7e92a6812"} Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.400794 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-dtrjk" Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.405951 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-dtrjk" event={"ID":"db82b3f2-cd94-42ef-a662-8a4b4c8fac85","Type":"ContainerDied","Data":"12099e6fd7c79da47a5cdbe285bb618c85b43504b6c2e2eaf3bda96deae11f2e"} Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.408227 4696 scope.go:117] "RemoveContainer" containerID="39dbeb0169e0a28958240580b82f39ad1f0e29209e5674a4a582ef1526a341da" Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.425643 4696 generic.go:334] "Generic (PLEG): container finished" podID="e2e0ae4b-0dce-4a74-8d3c-05635e86392b" containerID="3f0a2a5970794e019722167e07207f98b22a1820a9257f220a1b273043dd936f" exitCode=0 Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.425819 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kf9wn" event={"ID":"e2e0ae4b-0dce-4a74-8d3c-05635e86392b","Type":"ContainerDied","Data":"3f0a2a5970794e019722167e07207f98b22a1820a9257f220a1b273043dd936f"} Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.425859 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kf9wn" event={"ID":"e2e0ae4b-0dce-4a74-8d3c-05635e86392b","Type":"ContainerDied","Data":"7dacd0610bf08a05e5954fdfe41064852a1012fd08b0cf513909ca664d658b7a"} Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.425931 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kf9wn" Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.432179 4696 generic.go:334] "Generic (PLEG): container finished" podID="ad7e1f5a-e61a-4603-b7d6-a7baccc8c59c" containerID="a850e3e799ff54f9b067984e3570cfd3e0a980ba14b8ed80fdccb78cb53fd3e2" exitCode=0 Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.432323 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6ccj5" event={"ID":"ad7e1f5a-e61a-4603-b7d6-a7baccc8c59c","Type":"ContainerDied","Data":"a850e3e799ff54f9b067984e3570cfd3e0a980ba14b8ed80fdccb78cb53fd3e2"} Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.432395 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6ccj5" event={"ID":"ad7e1f5a-e61a-4603-b7d6-a7baccc8c59c","Type":"ContainerDied","Data":"1e66bc67dae22288e742756e26a2cecee8a1b44cc9ed22bba6df5e39088e6e5e"} Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.432580 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6ccj5" Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.440029 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vvxrp\" (UniqueName: \"kubernetes.io/projected/ad7e1f5a-e61a-4603-b7d6-a7baccc8c59c-kube-api-access-vvxrp\") pod \"ad7e1f5a-e61a-4603-b7d6-a7baccc8c59c\" (UID: \"ad7e1f5a-e61a-4603-b7d6-a7baccc8c59c\") " Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.440106 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2e0ae4b-0dce-4a74-8d3c-05635e86392b-utilities\") pod \"e2e0ae4b-0dce-4a74-8d3c-05635e86392b\" (UID: \"e2e0ae4b-0dce-4a74-8d3c-05635e86392b\") " Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.440139 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca4122e2-7532-4ca9-a111-9097cfae1dde-catalog-content\") pod \"ca4122e2-7532-4ca9-a111-9097cfae1dde\" (UID: \"ca4122e2-7532-4ca9-a111-9097cfae1dde\") " Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.440193 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2e0ae4b-0dce-4a74-8d3c-05635e86392b-catalog-content\") pod \"e2e0ae4b-0dce-4a74-8d3c-05635e86392b\" (UID: \"e2e0ae4b-0dce-4a74-8d3c-05635e86392b\") " Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.440216 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7kzxv\" (UniqueName: \"kubernetes.io/projected/ca4122e2-7532-4ca9-a111-9097cfae1dde-kube-api-access-7kzxv\") pod \"ca4122e2-7532-4ca9-a111-9097cfae1dde\" (UID: \"ca4122e2-7532-4ca9-a111-9097cfae1dde\") " Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.440354 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca4122e2-7532-4ca9-a111-9097cfae1dde-utilities\") pod \"ca4122e2-7532-4ca9-a111-9097cfae1dde\" (UID: \"ca4122e2-7532-4ca9-a111-9097cfae1dde\") " Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.440507 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jz2mv\" (UniqueName: \"kubernetes.io/projected/e2e0ae4b-0dce-4a74-8d3c-05635e86392b-kube-api-access-jz2mv\") pod \"e2e0ae4b-0dce-4a74-8d3c-05635e86392b\" (UID: \"e2e0ae4b-0dce-4a74-8d3c-05635e86392b\") " Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.440559 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad7e1f5a-e61a-4603-b7d6-a7baccc8c59c-catalog-content\") pod \"ad7e1f5a-e61a-4603-b7d6-a7baccc8c59c\" (UID: \"ad7e1f5a-e61a-4603-b7d6-a7baccc8c59c\") " Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.440596 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad7e1f5a-e61a-4603-b7d6-a7baccc8c59c-utilities\") pod \"ad7e1f5a-e61a-4603-b7d6-a7baccc8c59c\" (UID: \"ad7e1f5a-e61a-4603-b7d6-a7baccc8c59c\") " Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.440907 4696 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/db82b3f2-cd94-42ef-a662-8a4b4c8fac85-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.440931 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8gkqm\" (UniqueName: \"kubernetes.io/projected/db82b3f2-cd94-42ef-a662-8a4b4c8fac85-kube-api-access-8gkqm\") on node \"crc\" DevicePath \"\"" Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.440947 4696 reconciler_common.go:293] "Volume detached for volume 
\"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/db82b3f2-cd94-42ef-a662-8a4b4c8fac85-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.442225 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad7e1f5a-e61a-4603-b7d6-a7baccc8c59c-utilities" (OuterVolumeSpecName: "utilities") pod "ad7e1f5a-e61a-4603-b7d6-a7baccc8c59c" (UID: "ad7e1f5a-e61a-4603-b7d6-a7baccc8c59c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.442713 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad7e1f5a-e61a-4603-b7d6-a7baccc8c59c-kube-api-access-vvxrp" (OuterVolumeSpecName: "kube-api-access-vvxrp") pod "ad7e1f5a-e61a-4603-b7d6-a7baccc8c59c" (UID: "ad7e1f5a-e61a-4603-b7d6-a7baccc8c59c"). InnerVolumeSpecName "kube-api-access-vvxrp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.442819 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2e0ae4b-0dce-4a74-8d3c-05635e86392b-utilities" (OuterVolumeSpecName: "utilities") pod "e2e0ae4b-0dce-4a74-8d3c-05635e86392b" (UID: "e2e0ae4b-0dce-4a74-8d3c-05635e86392b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.443206 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca4122e2-7532-4ca9-a111-9097cfae1dde-utilities" (OuterVolumeSpecName: "utilities") pod "ca4122e2-7532-4ca9-a111-9097cfae1dde" (UID: "ca4122e2-7532-4ca9-a111-9097cfae1dde"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.444533 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8jbfs"] Dec 02 22:48:12 crc kubenswrapper[4696]: W1202 22:48:12.450269 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09917c11_8312_4f5a_9597_ad0570d0aeb0.slice/crio-d45b70c8e5cc5a92ae0a424ebcc0b11567e1039e56fe668006bc223ea3e652b1 WatchSource:0}: Error finding container d45b70c8e5cc5a92ae0a424ebcc0b11567e1039e56fe668006bc223ea3e652b1: Status 404 returned error can't find the container with id d45b70c8e5cc5a92ae0a424ebcc0b11567e1039e56fe668006bc223ea3e652b1 Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.453881 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2e0ae4b-0dce-4a74-8d3c-05635e86392b-kube-api-access-jz2mv" (OuterVolumeSpecName: "kube-api-access-jz2mv") pod "e2e0ae4b-0dce-4a74-8d3c-05635e86392b" (UID: "e2e0ae4b-0dce-4a74-8d3c-05635e86392b"). InnerVolumeSpecName "kube-api-access-jz2mv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.456998 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-8jbfs"] Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.460674 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-fqzv2"] Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.462678 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca4122e2-7532-4ca9-a111-9097cfae1dde-kube-api-access-7kzxv" (OuterVolumeSpecName: "kube-api-access-7kzxv") pod "ca4122e2-7532-4ca9-a111-9097cfae1dde" (UID: "ca4122e2-7532-4ca9-a111-9097cfae1dde"). InnerVolumeSpecName "kube-api-access-7kzxv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.462927 4696 scope.go:117] "RemoveContainer" containerID="dd8851fd25340e2c9f6e441a8a55782e327c78628d5b76c527c76f1f04f2ee77" Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.463803 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad7e1f5a-e61a-4603-b7d6-a7baccc8c59c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ad7e1f5a-e61a-4603-b7d6-a7baccc8c59c" (UID: "ad7e1f5a-e61a-4603-b7d6-a7baccc8c59c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.464008 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-dtrjk"] Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.466906 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-dtrjk"] Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.490562 4696 scope.go:117] "RemoveContainer" containerID="3fc9c70881ebfb4d6dfea4a6977a0c29064b36bb1a7296a448ed8b62a3842a68" Dec 02 22:48:12 crc kubenswrapper[4696]: E1202 22:48:12.494918 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3fc9c70881ebfb4d6dfea4a6977a0c29064b36bb1a7296a448ed8b62a3842a68\": container with ID starting with 3fc9c70881ebfb4d6dfea4a6977a0c29064b36bb1a7296a448ed8b62a3842a68 not found: ID does not exist" containerID="3fc9c70881ebfb4d6dfea4a6977a0c29064b36bb1a7296a448ed8b62a3842a68" Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.494981 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3fc9c70881ebfb4d6dfea4a6977a0c29064b36bb1a7296a448ed8b62a3842a68"} err="failed to get container status 
\"3fc9c70881ebfb4d6dfea4a6977a0c29064b36bb1a7296a448ed8b62a3842a68\": rpc error: code = NotFound desc = could not find container \"3fc9c70881ebfb4d6dfea4a6977a0c29064b36bb1a7296a448ed8b62a3842a68\": container with ID starting with 3fc9c70881ebfb4d6dfea4a6977a0c29064b36bb1a7296a448ed8b62a3842a68 not found: ID does not exist" Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.495017 4696 scope.go:117] "RemoveContainer" containerID="39dbeb0169e0a28958240580b82f39ad1f0e29209e5674a4a582ef1526a341da" Dec 02 22:48:12 crc kubenswrapper[4696]: E1202 22:48:12.497902 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39dbeb0169e0a28958240580b82f39ad1f0e29209e5674a4a582ef1526a341da\": container with ID starting with 39dbeb0169e0a28958240580b82f39ad1f0e29209e5674a4a582ef1526a341da not found: ID does not exist" containerID="39dbeb0169e0a28958240580b82f39ad1f0e29209e5674a4a582ef1526a341da" Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.497973 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39dbeb0169e0a28958240580b82f39ad1f0e29209e5674a4a582ef1526a341da"} err="failed to get container status \"39dbeb0169e0a28958240580b82f39ad1f0e29209e5674a4a582ef1526a341da\": rpc error: code = NotFound desc = could not find container \"39dbeb0169e0a28958240580b82f39ad1f0e29209e5674a4a582ef1526a341da\": container with ID starting with 39dbeb0169e0a28958240580b82f39ad1f0e29209e5674a4a582ef1526a341da not found: ID does not exist" Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.498020 4696 scope.go:117] "RemoveContainer" containerID="dd8851fd25340e2c9f6e441a8a55782e327c78628d5b76c527c76f1f04f2ee77" Dec 02 22:48:12 crc kubenswrapper[4696]: E1202 22:48:12.498435 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"dd8851fd25340e2c9f6e441a8a55782e327c78628d5b76c527c76f1f04f2ee77\": container with ID starting with dd8851fd25340e2c9f6e441a8a55782e327c78628d5b76c527c76f1f04f2ee77 not found: ID does not exist" containerID="dd8851fd25340e2c9f6e441a8a55782e327c78628d5b76c527c76f1f04f2ee77" Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.498656 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd8851fd25340e2c9f6e441a8a55782e327c78628d5b76c527c76f1f04f2ee77"} err="failed to get container status \"dd8851fd25340e2c9f6e441a8a55782e327c78628d5b76c527c76f1f04f2ee77\": rpc error: code = NotFound desc = could not find container \"dd8851fd25340e2c9f6e441a8a55782e327c78628d5b76c527c76f1f04f2ee77\": container with ID starting with dd8851fd25340e2c9f6e441a8a55782e327c78628d5b76c527c76f1f04f2ee77 not found: ID does not exist" Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.498789 4696 scope.go:117] "RemoveContainer" containerID="e0f5091a78ab6c1174ae62be502eff83e913778c3b89ecca13a8c145ee5fb935" Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.508059 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca4122e2-7532-4ca9-a111-9097cfae1dde-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ca4122e2-7532-4ca9-a111-9097cfae1dde" (UID: "ca4122e2-7532-4ca9-a111-9097cfae1dde"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.543105 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jz2mv\" (UniqueName: \"kubernetes.io/projected/e2e0ae4b-0dce-4a74-8d3c-05635e86392b-kube-api-access-jz2mv\") on node \"crc\" DevicePath \"\"" Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.543507 4696 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad7e1f5a-e61a-4603-b7d6-a7baccc8c59c-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.543554 4696 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad7e1f5a-e61a-4603-b7d6-a7baccc8c59c-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.543575 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vvxrp\" (UniqueName: \"kubernetes.io/projected/ad7e1f5a-e61a-4603-b7d6-a7baccc8c59c-kube-api-access-vvxrp\") on node \"crc\" DevicePath \"\"" Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.543593 4696 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2e0ae4b-0dce-4a74-8d3c-05635e86392b-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.543607 4696 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca4122e2-7532-4ca9-a111-9097cfae1dde-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.543620 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7kzxv\" (UniqueName: \"kubernetes.io/projected/ca4122e2-7532-4ca9-a111-9097cfae1dde-kube-api-access-7kzxv\") on node \"crc\" DevicePath \"\"" Dec 02 22:48:12 crc kubenswrapper[4696]: 
I1202 22:48:12.543633 4696 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca4122e2-7532-4ca9-a111-9097cfae1dde-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.560422 4696 scope.go:117] "RemoveContainer" containerID="d4e1e65cdcf7d8d317fed30c24a065b7b4bc93294a70b43e99842b3296d9578f" Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.579520 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.579618 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.597682 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2e0ae4b-0dce-4a74-8d3c-05635e86392b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e2e0ae4b-0dce-4a74-8d3c-05635e86392b" (UID: "e2e0ae4b-0dce-4a74-8d3c-05635e86392b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.600186 4696 scope.go:117] "RemoveContainer" containerID="5dd2f7a25400fe51f3a9f7177f5e92fa558ef0e1d272aca5b69bbccc51f614ec" Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.623347 4696 scope.go:117] "RemoveContainer" containerID="e0f5091a78ab6c1174ae62be502eff83e913778c3b89ecca13a8c145ee5fb935" Dec 02 22:48:12 crc kubenswrapper[4696]: E1202 22:48:12.624065 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0f5091a78ab6c1174ae62be502eff83e913778c3b89ecca13a8c145ee5fb935\": container with ID starting with e0f5091a78ab6c1174ae62be502eff83e913778c3b89ecca13a8c145ee5fb935 not found: ID does not exist" containerID="e0f5091a78ab6c1174ae62be502eff83e913778c3b89ecca13a8c145ee5fb935" Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.624117 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0f5091a78ab6c1174ae62be502eff83e913778c3b89ecca13a8c145ee5fb935"} err="failed to get container status \"e0f5091a78ab6c1174ae62be502eff83e913778c3b89ecca13a8c145ee5fb935\": rpc error: code = NotFound desc = could not find container \"e0f5091a78ab6c1174ae62be502eff83e913778c3b89ecca13a8c145ee5fb935\": container with ID starting with e0f5091a78ab6c1174ae62be502eff83e913778c3b89ecca13a8c145ee5fb935 not found: ID does not exist" Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.624155 4696 scope.go:117] "RemoveContainer" containerID="d4e1e65cdcf7d8d317fed30c24a065b7b4bc93294a70b43e99842b3296d9578f" Dec 02 22:48:12 crc kubenswrapper[4696]: E1202 22:48:12.624774 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4e1e65cdcf7d8d317fed30c24a065b7b4bc93294a70b43e99842b3296d9578f\": container with ID starting with 
d4e1e65cdcf7d8d317fed30c24a065b7b4bc93294a70b43e99842b3296d9578f not found: ID does not exist" containerID="d4e1e65cdcf7d8d317fed30c24a065b7b4bc93294a70b43e99842b3296d9578f" Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.624899 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4e1e65cdcf7d8d317fed30c24a065b7b4bc93294a70b43e99842b3296d9578f"} err="failed to get container status \"d4e1e65cdcf7d8d317fed30c24a065b7b4bc93294a70b43e99842b3296d9578f\": rpc error: code = NotFound desc = could not find container \"d4e1e65cdcf7d8d317fed30c24a065b7b4bc93294a70b43e99842b3296d9578f\": container with ID starting with d4e1e65cdcf7d8d317fed30c24a065b7b4bc93294a70b43e99842b3296d9578f not found: ID does not exist" Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.625006 4696 scope.go:117] "RemoveContainer" containerID="5dd2f7a25400fe51f3a9f7177f5e92fa558ef0e1d272aca5b69bbccc51f614ec" Dec 02 22:48:12 crc kubenswrapper[4696]: E1202 22:48:12.625620 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5dd2f7a25400fe51f3a9f7177f5e92fa558ef0e1d272aca5b69bbccc51f614ec\": container with ID starting with 5dd2f7a25400fe51f3a9f7177f5e92fa558ef0e1d272aca5b69bbccc51f614ec not found: ID does not exist" containerID="5dd2f7a25400fe51f3a9f7177f5e92fa558ef0e1d272aca5b69bbccc51f614ec" Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.625681 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5dd2f7a25400fe51f3a9f7177f5e92fa558ef0e1d272aca5b69bbccc51f614ec"} err="failed to get container status \"5dd2f7a25400fe51f3a9f7177f5e92fa558ef0e1d272aca5b69bbccc51f614ec\": rpc error: code = NotFound desc = could not find container \"5dd2f7a25400fe51f3a9f7177f5e92fa558ef0e1d272aca5b69bbccc51f614ec\": container with ID starting with 5dd2f7a25400fe51f3a9f7177f5e92fa558ef0e1d272aca5b69bbccc51f614ec not found: ID does not 
exist" Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.625708 4696 scope.go:117] "RemoveContainer" containerID="d82a972304308e4f9d55201e1859c3750e48880e70bdff16ec8f4aa7e92a6812" Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.650117 4696 scope.go:117] "RemoveContainer" containerID="d82a972304308e4f9d55201e1859c3750e48880e70bdff16ec8f4aa7e92a6812" Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.650507 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.650616 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 02 22:48:12 crc kubenswrapper[4696]: E1202 22:48:12.650634 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d82a972304308e4f9d55201e1859c3750e48880e70bdff16ec8f4aa7e92a6812\": container with ID starting with d82a972304308e4f9d55201e1859c3750e48880e70bdff16ec8f4aa7e92a6812 not found: ID does not exist" containerID="d82a972304308e4f9d55201e1859c3750e48880e70bdff16ec8f4aa7e92a6812" Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.650660 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.650685 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.650677 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d82a972304308e4f9d55201e1859c3750e48880e70bdff16ec8f4aa7e92a6812"} err="failed to get container status \"d82a972304308e4f9d55201e1859c3750e48880e70bdff16ec8f4aa7e92a6812\": rpc error: code = NotFound desc = could not find container \"d82a972304308e4f9d55201e1859c3750e48880e70bdff16ec8f4aa7e92a6812\": container with ID starting with d82a972304308e4f9d55201e1859c3750e48880e70bdff16ec8f4aa7e92a6812 not found: ID does not exist" Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.650730 4696 scope.go:117] "RemoveContainer" containerID="3f0a2a5970794e019722167e07207f98b22a1820a9257f220a1b273043dd936f" Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.650799 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.650960 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.651015 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.651037 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.651110 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.651177 4696 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.651193 4696 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.651207 4696 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2e0ae4b-0dce-4a74-8d3c-05635e86392b-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.651219 4696 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.662430 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.672140 4696 scope.go:117] "RemoveContainer" containerID="62be061f45d5db9cf4d7f74440ee4d09c088c74298e8482d8d66d4e35b4ccb9a" Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.696895 4696 scope.go:117] "RemoveContainer" containerID="3fb805a4b3514f1ca6295aa79978c1e766c581f00ee26ef52f3acb4951b23ed4" Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.713114 4696 scope.go:117] "RemoveContainer" containerID="3f0a2a5970794e019722167e07207f98b22a1820a9257f220a1b273043dd936f" Dec 02 22:48:12 crc kubenswrapper[4696]: E1202 22:48:12.714069 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f0a2a5970794e019722167e07207f98b22a1820a9257f220a1b273043dd936f\": container with ID starting with 3f0a2a5970794e019722167e07207f98b22a1820a9257f220a1b273043dd936f not found: ID does not exist" containerID="3f0a2a5970794e019722167e07207f98b22a1820a9257f220a1b273043dd936f" Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.714117 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f0a2a5970794e019722167e07207f98b22a1820a9257f220a1b273043dd936f"} err="failed to get container status \"3f0a2a5970794e019722167e07207f98b22a1820a9257f220a1b273043dd936f\": rpc error: code = NotFound desc = could not find container \"3f0a2a5970794e019722167e07207f98b22a1820a9257f220a1b273043dd936f\": container with ID starting with 3f0a2a5970794e019722167e07207f98b22a1820a9257f220a1b273043dd936f not found: ID does not exist" Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.714154 4696 scope.go:117] "RemoveContainer" containerID="62be061f45d5db9cf4d7f74440ee4d09c088c74298e8482d8d66d4e35b4ccb9a" Dec 02 22:48:12 crc kubenswrapper[4696]: E1202 22:48:12.714513 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could 
not find container \"62be061f45d5db9cf4d7f74440ee4d09c088c74298e8482d8d66d4e35b4ccb9a\": container with ID starting with 62be061f45d5db9cf4d7f74440ee4d09c088c74298e8482d8d66d4e35b4ccb9a not found: ID does not exist" containerID="62be061f45d5db9cf4d7f74440ee4d09c088c74298e8482d8d66d4e35b4ccb9a" Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.714559 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62be061f45d5db9cf4d7f74440ee4d09c088c74298e8482d8d66d4e35b4ccb9a"} err="failed to get container status \"62be061f45d5db9cf4d7f74440ee4d09c088c74298e8482d8d66d4e35b4ccb9a\": rpc error: code = NotFound desc = could not find container \"62be061f45d5db9cf4d7f74440ee4d09c088c74298e8482d8d66d4e35b4ccb9a\": container with ID starting with 62be061f45d5db9cf4d7f74440ee4d09c088c74298e8482d8d66d4e35b4ccb9a not found: ID does not exist" Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.714589 4696 scope.go:117] "RemoveContainer" containerID="3fb805a4b3514f1ca6295aa79978c1e766c581f00ee26ef52f3acb4951b23ed4" Dec 02 22:48:12 crc kubenswrapper[4696]: E1202 22:48:12.714864 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3fb805a4b3514f1ca6295aa79978c1e766c581f00ee26ef52f3acb4951b23ed4\": container with ID starting with 3fb805a4b3514f1ca6295aa79978c1e766c581f00ee26ef52f3acb4951b23ed4 not found: ID does not exist" containerID="3fb805a4b3514f1ca6295aa79978c1e766c581f00ee26ef52f3acb4951b23ed4" Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.714890 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3fb805a4b3514f1ca6295aa79978c1e766c581f00ee26ef52f3acb4951b23ed4"} err="failed to get container status \"3fb805a4b3514f1ca6295aa79978c1e766c581f00ee26ef52f3acb4951b23ed4\": rpc error: code = NotFound desc = could not find container \"3fb805a4b3514f1ca6295aa79978c1e766c581f00ee26ef52f3acb4951b23ed4\": 
container with ID starting with 3fb805a4b3514f1ca6295aa79978c1e766c581f00ee26ef52f3acb4951b23ed4 not found: ID does not exist" Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.714907 4696 scope.go:117] "RemoveContainer" containerID="a850e3e799ff54f9b067984e3570cfd3e0a980ba14b8ed80fdccb78cb53fd3e2" Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.725334 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9rp8k"] Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.728220 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-9rp8k"] Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.738174 4696 scope.go:117] "RemoveContainer" containerID="443573975d1347ecfe45924171eee0a61f1ea9a16ec4fbf869fb3c51d0d942bc" Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.752587 4696 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.752632 4696 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.757427 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kf9wn"] Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.765226 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-kf9wn"] Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.771183 4696 scope.go:117] "RemoveContainer" containerID="c77ee686adba80ffdb5bd177dcec1bc427f2e187f8c254a50a7e33892e717166" Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.795225 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-6ccj5"] Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.799411 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-6ccj5"] Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.821112 4696 scope.go:117] "RemoveContainer" containerID="a850e3e799ff54f9b067984e3570cfd3e0a980ba14b8ed80fdccb78cb53fd3e2" Dec 02 22:48:12 crc kubenswrapper[4696]: E1202 22:48:12.821711 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a850e3e799ff54f9b067984e3570cfd3e0a980ba14b8ed80fdccb78cb53fd3e2\": container with ID starting with a850e3e799ff54f9b067984e3570cfd3e0a980ba14b8ed80fdccb78cb53fd3e2 not found: ID does not exist" containerID="a850e3e799ff54f9b067984e3570cfd3e0a980ba14b8ed80fdccb78cb53fd3e2" Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.821783 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a850e3e799ff54f9b067984e3570cfd3e0a980ba14b8ed80fdccb78cb53fd3e2"} err="failed to get container status \"a850e3e799ff54f9b067984e3570cfd3e0a980ba14b8ed80fdccb78cb53fd3e2\": rpc error: code = NotFound desc = could not find container \"a850e3e799ff54f9b067984e3570cfd3e0a980ba14b8ed80fdccb78cb53fd3e2\": container with ID starting with a850e3e799ff54f9b067984e3570cfd3e0a980ba14b8ed80fdccb78cb53fd3e2 not found: ID does not exist" Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.821822 4696 scope.go:117] "RemoveContainer" containerID="443573975d1347ecfe45924171eee0a61f1ea9a16ec4fbf869fb3c51d0d942bc" Dec 02 22:48:12 crc kubenswrapper[4696]: E1202 22:48:12.822439 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"443573975d1347ecfe45924171eee0a61f1ea9a16ec4fbf869fb3c51d0d942bc\": container with ID starting with 443573975d1347ecfe45924171eee0a61f1ea9a16ec4fbf869fb3c51d0d942bc 
not found: ID does not exist" containerID="443573975d1347ecfe45924171eee0a61f1ea9a16ec4fbf869fb3c51d0d942bc" Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.822506 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"443573975d1347ecfe45924171eee0a61f1ea9a16ec4fbf869fb3c51d0d942bc"} err="failed to get container status \"443573975d1347ecfe45924171eee0a61f1ea9a16ec4fbf869fb3c51d0d942bc\": rpc error: code = NotFound desc = could not find container \"443573975d1347ecfe45924171eee0a61f1ea9a16ec4fbf869fb3c51d0d942bc\": container with ID starting with 443573975d1347ecfe45924171eee0a61f1ea9a16ec4fbf869fb3c51d0d942bc not found: ID does not exist" Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.822560 4696 scope.go:117] "RemoveContainer" containerID="c77ee686adba80ffdb5bd177dcec1bc427f2e187f8c254a50a7e33892e717166" Dec 02 22:48:12 crc kubenswrapper[4696]: E1202 22:48:12.823218 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c77ee686adba80ffdb5bd177dcec1bc427f2e187f8c254a50a7e33892e717166\": container with ID starting with c77ee686adba80ffdb5bd177dcec1bc427f2e187f8c254a50a7e33892e717166 not found: ID does not exist" containerID="c77ee686adba80ffdb5bd177dcec1bc427f2e187f8c254a50a7e33892e717166" Dec 02 22:48:12 crc kubenswrapper[4696]: I1202 22:48:12.823249 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c77ee686adba80ffdb5bd177dcec1bc427f2e187f8c254a50a7e33892e717166"} err="failed to get container status \"c77ee686adba80ffdb5bd177dcec1bc427f2e187f8c254a50a7e33892e717166\": rpc error: code = NotFound desc = could not find container \"c77ee686adba80ffdb5bd177dcec1bc427f2e187f8c254a50a7e33892e717166\": container with ID starting with c77ee686adba80ffdb5bd177dcec1bc427f2e187f8c254a50a7e33892e717166 not found: ID does not exist" Dec 02 22:48:13 crc kubenswrapper[4696]: I1202 
22:48:13.456126 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad7e1f5a-e61a-4603-b7d6-a7baccc8c59c" path="/var/lib/kubelet/pods/ad7e1f5a-e61a-4603-b7d6-a7baccc8c59c/volumes" Dec 02 22:48:13 crc kubenswrapper[4696]: I1202 22:48:13.457621 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aeaa03bb-5cf7-4a89-b182-36358f9e247c" path="/var/lib/kubelet/pods/aeaa03bb-5cf7-4a89-b182-36358f9e247c/volumes" Dec 02 22:48:13 crc kubenswrapper[4696]: I1202 22:48:13.458591 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca4122e2-7532-4ca9-a111-9097cfae1dde" path="/var/lib/kubelet/pods/ca4122e2-7532-4ca9-a111-9097cfae1dde/volumes" Dec 02 22:48:13 crc kubenswrapper[4696]: I1202 22:48:13.462414 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db82b3f2-cd94-42ef-a662-8a4b4c8fac85" path="/var/lib/kubelet/pods/db82b3f2-cd94-42ef-a662-8a4b4c8fac85/volumes" Dec 02 22:48:13 crc kubenswrapper[4696]: I1202 22:48:13.463667 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2e0ae4b-0dce-4a74-8d3c-05635e86392b" path="/var/lib/kubelet/pods/e2e0ae4b-0dce-4a74-8d3c-05635e86392b/volumes" Dec 02 22:48:13 crc kubenswrapper[4696]: I1202 22:48:13.464512 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Dec 02 22:48:13 crc kubenswrapper[4696]: I1202 22:48:13.465187 4696 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Dec 02 22:48:13 crc kubenswrapper[4696]: I1202 22:48:13.468109 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 02 22:48:13 crc kubenswrapper[4696]: I1202 22:48:13.468237 4696 generic.go:334] "Generic (PLEG): container finished" 
podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="f0e6c380c1399055c2383fe4c4b02d64629a8b697ff784046c6453b0486dd8da" exitCode=137 Dec 02 22:48:13 crc kubenswrapper[4696]: I1202 22:48:13.468530 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 22:48:13 crc kubenswrapper[4696]: I1202 22:48:13.485623 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-fqzv2" Dec 02 22:48:13 crc kubenswrapper[4696]: I1202 22:48:13.485755 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-fqzv2" Dec 02 22:48:13 crc kubenswrapper[4696]: I1202 22:48:13.485780 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-fqzv2" event={"ID":"09917c11-8312-4f5a-9597-ad0570d0aeb0","Type":"ContainerStarted","Data":"873003eab999f3c87d438830324a9d7c97512fbc47945300a3a10e75b3effe1c"} Dec 02 22:48:13 crc kubenswrapper[4696]: I1202 22:48:13.485812 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-fqzv2" event={"ID":"09917c11-8312-4f5a-9597-ad0570d0aeb0","Type":"ContainerStarted","Data":"d45b70c8e5cc5a92ae0a424ebcc0b11567e1039e56fe668006bc223ea3e652b1"} Dec 02 22:48:13 crc kubenswrapper[4696]: I1202 22:48:13.485865 4696 scope.go:117] "RemoveContainer" containerID="f0e6c380c1399055c2383fe4c4b02d64629a8b697ff784046c6453b0486dd8da" Dec 02 22:48:13 crc kubenswrapper[4696]: I1202 22:48:13.489026 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 02 22:48:13 crc kubenswrapper[4696]: I1202 22:48:13.489072 4696 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" 
mirrorPodUID="79ebfd3a-5a9e-4a60-b0d7-52be6d0b540a" Dec 02 22:48:13 crc kubenswrapper[4696]: I1202 22:48:13.491955 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-fqzv2" podStartSLOduration=2.491944451 podStartE2EDuration="2.491944451s" podCreationTimestamp="2025-12-02 22:48:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 22:48:13.484678897 +0000 UTC m=+356.365358898" watchObservedRunningTime="2025-12-02 22:48:13.491944451 +0000 UTC m=+356.372624452" Dec 02 22:48:13 crc kubenswrapper[4696]: I1202 22:48:13.492672 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 02 22:48:13 crc kubenswrapper[4696]: I1202 22:48:13.492720 4696 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="79ebfd3a-5a9e-4a60-b0d7-52be6d0b540a" Dec 02 22:48:13 crc kubenswrapper[4696]: I1202 22:48:13.515534 4696 scope.go:117] "RemoveContainer" containerID="f0e6c380c1399055c2383fe4c4b02d64629a8b697ff784046c6453b0486dd8da" Dec 02 22:48:13 crc kubenswrapper[4696]: E1202 22:48:13.516160 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0e6c380c1399055c2383fe4c4b02d64629a8b697ff784046c6453b0486dd8da\": container with ID starting with f0e6c380c1399055c2383fe4c4b02d64629a8b697ff784046c6453b0486dd8da not found: ID does not exist" containerID="f0e6c380c1399055c2383fe4c4b02d64629a8b697ff784046c6453b0486dd8da" Dec 02 22:48:13 crc kubenswrapper[4696]: I1202 22:48:13.516197 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0e6c380c1399055c2383fe4c4b02d64629a8b697ff784046c6453b0486dd8da"} err="failed to get container status 
\"f0e6c380c1399055c2383fe4c4b02d64629a8b697ff784046c6453b0486dd8da\": rpc error: code = NotFound desc = could not find container \"f0e6c380c1399055c2383fe4c4b02d64629a8b697ff784046c6453b0486dd8da\": container with ID starting with f0e6c380c1399055c2383fe4c4b02d64629a8b697ff784046c6453b0486dd8da not found: ID does not exist" Dec 02 22:48:14 crc kubenswrapper[4696]: I1202 22:48:14.936419 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-wq2sm"] Dec 02 22:48:14 crc kubenswrapper[4696]: E1202 22:48:14.937067 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2e0ae4b-0dce-4a74-8d3c-05635e86392b" containerName="extract-utilities" Dec 02 22:48:14 crc kubenswrapper[4696]: I1202 22:48:14.937081 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2e0ae4b-0dce-4a74-8d3c-05635e86392b" containerName="extract-utilities" Dec 02 22:48:14 crc kubenswrapper[4696]: E1202 22:48:14.937092 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad7e1f5a-e61a-4603-b7d6-a7baccc8c59c" containerName="extract-utilities" Dec 02 22:48:14 crc kubenswrapper[4696]: I1202 22:48:14.937098 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad7e1f5a-e61a-4603-b7d6-a7baccc8c59c" containerName="extract-utilities" Dec 02 22:48:14 crc kubenswrapper[4696]: E1202 22:48:14.937105 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aeaa03bb-5cf7-4a89-b182-36358f9e247c" containerName="extract-content" Dec 02 22:48:14 crc kubenswrapper[4696]: I1202 22:48:14.937111 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="aeaa03bb-5cf7-4a89-b182-36358f9e247c" containerName="extract-content" Dec 02 22:48:14 crc kubenswrapper[4696]: E1202 22:48:14.937120 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca4122e2-7532-4ca9-a111-9097cfae1dde" containerName="registry-server" Dec 02 22:48:14 crc kubenswrapper[4696]: I1202 22:48:14.937125 4696 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="ca4122e2-7532-4ca9-a111-9097cfae1dde" containerName="registry-server" Dec 02 22:48:14 crc kubenswrapper[4696]: E1202 22:48:14.937137 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad7e1f5a-e61a-4603-b7d6-a7baccc8c59c" containerName="extract-content" Dec 02 22:48:14 crc kubenswrapper[4696]: I1202 22:48:14.937142 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad7e1f5a-e61a-4603-b7d6-a7baccc8c59c" containerName="extract-content" Dec 02 22:48:14 crc kubenswrapper[4696]: E1202 22:48:14.937154 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aeaa03bb-5cf7-4a89-b182-36358f9e247c" containerName="registry-server" Dec 02 22:48:14 crc kubenswrapper[4696]: I1202 22:48:14.937159 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="aeaa03bb-5cf7-4a89-b182-36358f9e247c" containerName="registry-server" Dec 02 22:48:14 crc kubenswrapper[4696]: E1202 22:48:14.937167 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca4122e2-7532-4ca9-a111-9097cfae1dde" containerName="extract-content" Dec 02 22:48:14 crc kubenswrapper[4696]: I1202 22:48:14.937172 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca4122e2-7532-4ca9-a111-9097cfae1dde" containerName="extract-content" Dec 02 22:48:14 crc kubenswrapper[4696]: E1202 22:48:14.937181 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db82b3f2-cd94-42ef-a662-8a4b4c8fac85" containerName="marketplace-operator" Dec 02 22:48:14 crc kubenswrapper[4696]: I1202 22:48:14.937186 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="db82b3f2-cd94-42ef-a662-8a4b4c8fac85" containerName="marketplace-operator" Dec 02 22:48:14 crc kubenswrapper[4696]: E1202 22:48:14.937195 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca4122e2-7532-4ca9-a111-9097cfae1dde" containerName="extract-utilities" Dec 02 22:48:14 crc kubenswrapper[4696]: I1202 22:48:14.937200 4696 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="ca4122e2-7532-4ca9-a111-9097cfae1dde" containerName="extract-utilities" Dec 02 22:48:14 crc kubenswrapper[4696]: E1202 22:48:14.937210 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad7e1f5a-e61a-4603-b7d6-a7baccc8c59c" containerName="registry-server" Dec 02 22:48:14 crc kubenswrapper[4696]: I1202 22:48:14.937215 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad7e1f5a-e61a-4603-b7d6-a7baccc8c59c" containerName="registry-server" Dec 02 22:48:14 crc kubenswrapper[4696]: E1202 22:48:14.937222 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aeaa03bb-5cf7-4a89-b182-36358f9e247c" containerName="extract-utilities" Dec 02 22:48:14 crc kubenswrapper[4696]: I1202 22:48:14.937228 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="aeaa03bb-5cf7-4a89-b182-36358f9e247c" containerName="extract-utilities" Dec 02 22:48:14 crc kubenswrapper[4696]: E1202 22:48:14.937236 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2e0ae4b-0dce-4a74-8d3c-05635e86392b" containerName="extract-content" Dec 02 22:48:14 crc kubenswrapper[4696]: I1202 22:48:14.937241 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2e0ae4b-0dce-4a74-8d3c-05635e86392b" containerName="extract-content" Dec 02 22:48:14 crc kubenswrapper[4696]: E1202 22:48:14.937250 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2e0ae4b-0dce-4a74-8d3c-05635e86392b" containerName="registry-server" Dec 02 22:48:14 crc kubenswrapper[4696]: I1202 22:48:14.937256 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2e0ae4b-0dce-4a74-8d3c-05635e86392b" containerName="registry-server" Dec 02 22:48:14 crc kubenswrapper[4696]: I1202 22:48:14.937339 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad7e1f5a-e61a-4603-b7d6-a7baccc8c59c" containerName="registry-server" Dec 02 22:48:14 crc kubenswrapper[4696]: I1202 22:48:14.937348 4696 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="e2e0ae4b-0dce-4a74-8d3c-05635e86392b" containerName="registry-server" Dec 02 22:48:14 crc kubenswrapper[4696]: I1202 22:48:14.937354 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="aeaa03bb-5cf7-4a89-b182-36358f9e247c" containerName="registry-server" Dec 02 22:48:14 crc kubenswrapper[4696]: I1202 22:48:14.937362 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca4122e2-7532-4ca9-a111-9097cfae1dde" containerName="registry-server" Dec 02 22:48:14 crc kubenswrapper[4696]: I1202 22:48:14.937373 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="db82b3f2-cd94-42ef-a662-8a4b4c8fac85" containerName="marketplace-operator" Dec 02 22:48:14 crc kubenswrapper[4696]: I1202 22:48:14.942349 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wq2sm" Dec 02 22:48:14 crc kubenswrapper[4696]: I1202 22:48:14.945061 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 02 22:48:14 crc kubenswrapper[4696]: I1202 22:48:14.952781 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wq2sm"] Dec 02 22:48:14 crc kubenswrapper[4696]: I1202 22:48:14.987537 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74551327-8467-4679-9951-5dd7042e2a45-utilities\") pod \"certified-operators-wq2sm\" (UID: \"74551327-8467-4679-9951-5dd7042e2a45\") " pod="openshift-marketplace/certified-operators-wq2sm" Dec 02 22:48:14 crc kubenswrapper[4696]: I1202 22:48:14.987609 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnlj5\" (UniqueName: \"kubernetes.io/projected/74551327-8467-4679-9951-5dd7042e2a45-kube-api-access-hnlj5\") pod 
\"certified-operators-wq2sm\" (UID: \"74551327-8467-4679-9951-5dd7042e2a45\") " pod="openshift-marketplace/certified-operators-wq2sm" Dec 02 22:48:14 crc kubenswrapper[4696]: I1202 22:48:14.987714 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74551327-8467-4679-9951-5dd7042e2a45-catalog-content\") pod \"certified-operators-wq2sm\" (UID: \"74551327-8467-4679-9951-5dd7042e2a45\") " pod="openshift-marketplace/certified-operators-wq2sm" Dec 02 22:48:15 crc kubenswrapper[4696]: I1202 22:48:15.089619 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74551327-8467-4679-9951-5dd7042e2a45-catalog-content\") pod \"certified-operators-wq2sm\" (UID: \"74551327-8467-4679-9951-5dd7042e2a45\") " pod="openshift-marketplace/certified-operators-wq2sm" Dec 02 22:48:15 crc kubenswrapper[4696]: I1202 22:48:15.089866 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74551327-8467-4679-9951-5dd7042e2a45-utilities\") pod \"certified-operators-wq2sm\" (UID: \"74551327-8467-4679-9951-5dd7042e2a45\") " pod="openshift-marketplace/certified-operators-wq2sm" Dec 02 22:48:15 crc kubenswrapper[4696]: I1202 22:48:15.089907 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hnlj5\" (UniqueName: \"kubernetes.io/projected/74551327-8467-4679-9951-5dd7042e2a45-kube-api-access-hnlj5\") pod \"certified-operators-wq2sm\" (UID: \"74551327-8467-4679-9951-5dd7042e2a45\") " pod="openshift-marketplace/certified-operators-wq2sm" Dec 02 22:48:15 crc kubenswrapper[4696]: I1202 22:48:15.090344 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74551327-8467-4679-9951-5dd7042e2a45-catalog-content\") pod 
\"certified-operators-wq2sm\" (UID: \"74551327-8467-4679-9951-5dd7042e2a45\") " pod="openshift-marketplace/certified-operators-wq2sm" Dec 02 22:48:15 crc kubenswrapper[4696]: I1202 22:48:15.090507 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74551327-8467-4679-9951-5dd7042e2a45-utilities\") pod \"certified-operators-wq2sm\" (UID: \"74551327-8467-4679-9951-5dd7042e2a45\") " pod="openshift-marketplace/certified-operators-wq2sm" Dec 02 22:48:15 crc kubenswrapper[4696]: I1202 22:48:15.119190 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnlj5\" (UniqueName: \"kubernetes.io/projected/74551327-8467-4679-9951-5dd7042e2a45-kube-api-access-hnlj5\") pod \"certified-operators-wq2sm\" (UID: \"74551327-8467-4679-9951-5dd7042e2a45\") " pod="openshift-marketplace/certified-operators-wq2sm" Dec 02 22:48:15 crc kubenswrapper[4696]: I1202 22:48:15.134376 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-lq2n7"] Dec 02 22:48:15 crc kubenswrapper[4696]: I1202 22:48:15.135538 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lq2n7" Dec 02 22:48:15 crc kubenswrapper[4696]: I1202 22:48:15.137410 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 02 22:48:15 crc kubenswrapper[4696]: I1202 22:48:15.144331 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lq2n7"] Dec 02 22:48:15 crc kubenswrapper[4696]: I1202 22:48:15.191240 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/288c2740-d410-436e-a43f-e9522208e1f1-utilities\") pod \"community-operators-lq2n7\" (UID: \"288c2740-d410-436e-a43f-e9522208e1f1\") " pod="openshift-marketplace/community-operators-lq2n7" Dec 02 22:48:15 crc kubenswrapper[4696]: I1202 22:48:15.191299 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/288c2740-d410-436e-a43f-e9522208e1f1-catalog-content\") pod \"community-operators-lq2n7\" (UID: \"288c2740-d410-436e-a43f-e9522208e1f1\") " pod="openshift-marketplace/community-operators-lq2n7" Dec 02 22:48:15 crc kubenswrapper[4696]: I1202 22:48:15.191326 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9wv2\" (UniqueName: \"kubernetes.io/projected/288c2740-d410-436e-a43f-e9522208e1f1-kube-api-access-d9wv2\") pod \"community-operators-lq2n7\" (UID: \"288c2740-d410-436e-a43f-e9522208e1f1\") " pod="openshift-marketplace/community-operators-lq2n7" Dec 02 22:48:15 crc kubenswrapper[4696]: I1202 22:48:15.259638 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wq2sm" Dec 02 22:48:15 crc kubenswrapper[4696]: I1202 22:48:15.292332 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/288c2740-d410-436e-a43f-e9522208e1f1-utilities\") pod \"community-operators-lq2n7\" (UID: \"288c2740-d410-436e-a43f-e9522208e1f1\") " pod="openshift-marketplace/community-operators-lq2n7" Dec 02 22:48:15 crc kubenswrapper[4696]: I1202 22:48:15.292467 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/288c2740-d410-436e-a43f-e9522208e1f1-catalog-content\") pod \"community-operators-lq2n7\" (UID: \"288c2740-d410-436e-a43f-e9522208e1f1\") " pod="openshift-marketplace/community-operators-lq2n7" Dec 02 22:48:15 crc kubenswrapper[4696]: I1202 22:48:15.292509 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9wv2\" (UniqueName: \"kubernetes.io/projected/288c2740-d410-436e-a43f-e9522208e1f1-kube-api-access-d9wv2\") pod \"community-operators-lq2n7\" (UID: \"288c2740-d410-436e-a43f-e9522208e1f1\") " pod="openshift-marketplace/community-operators-lq2n7" Dec 02 22:48:15 crc kubenswrapper[4696]: I1202 22:48:15.293372 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/288c2740-d410-436e-a43f-e9522208e1f1-utilities\") pod \"community-operators-lq2n7\" (UID: \"288c2740-d410-436e-a43f-e9522208e1f1\") " pod="openshift-marketplace/community-operators-lq2n7" Dec 02 22:48:15 crc kubenswrapper[4696]: I1202 22:48:15.293814 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/288c2740-d410-436e-a43f-e9522208e1f1-catalog-content\") pod \"community-operators-lq2n7\" (UID: \"288c2740-d410-436e-a43f-e9522208e1f1\") " 
pod="openshift-marketplace/community-operators-lq2n7" Dec 02 22:48:15 crc kubenswrapper[4696]: I1202 22:48:15.315872 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9wv2\" (UniqueName: \"kubernetes.io/projected/288c2740-d410-436e-a43f-e9522208e1f1-kube-api-access-d9wv2\") pod \"community-operators-lq2n7\" (UID: \"288c2740-d410-436e-a43f-e9522208e1f1\") " pod="openshift-marketplace/community-operators-lq2n7" Dec 02 22:48:15 crc kubenswrapper[4696]: I1202 22:48:15.470844 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lq2n7" Dec 02 22:48:15 crc kubenswrapper[4696]: I1202 22:48:15.695229 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wq2sm"] Dec 02 22:48:15 crc kubenswrapper[4696]: W1202 22:48:15.699749 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74551327_8467_4679_9951_5dd7042e2a45.slice/crio-dea34be8e64b4cb61d214814f53f47e2cbd6258daa78dc3f8a2f4435eb3a68c3 WatchSource:0}: Error finding container dea34be8e64b4cb61d214814f53f47e2cbd6258daa78dc3f8a2f4435eb3a68c3: Status 404 returned error can't find the container with id dea34be8e64b4cb61d214814f53f47e2cbd6258daa78dc3f8a2f4435eb3a68c3 Dec 02 22:48:15 crc kubenswrapper[4696]: I1202 22:48:15.938608 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lq2n7"] Dec 02 22:48:15 crc kubenswrapper[4696]: W1202 22:48:15.946202 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod288c2740_d410_436e_a43f_e9522208e1f1.slice/crio-7d4c728412ec81c567981a319a7bc31a33764fc2cdd6f5c60d482f98752ec1c7 WatchSource:0}: Error finding container 7d4c728412ec81c567981a319a7bc31a33764fc2cdd6f5c60d482f98752ec1c7: Status 404 returned error can't find the container 
with id 7d4c728412ec81c567981a319a7bc31a33764fc2cdd6f5c60d482f98752ec1c7 Dec 02 22:48:16 crc kubenswrapper[4696]: I1202 22:48:16.520758 4696 generic.go:334] "Generic (PLEG): container finished" podID="288c2740-d410-436e-a43f-e9522208e1f1" containerID="48ae76df5d8067bb088371d2be83b594d857aa74539d4beb59b4a7345cc8ceb3" exitCode=0 Dec 02 22:48:16 crc kubenswrapper[4696]: I1202 22:48:16.520877 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lq2n7" event={"ID":"288c2740-d410-436e-a43f-e9522208e1f1","Type":"ContainerDied","Data":"48ae76df5d8067bb088371d2be83b594d857aa74539d4beb59b4a7345cc8ceb3"} Dec 02 22:48:16 crc kubenswrapper[4696]: I1202 22:48:16.520916 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lq2n7" event={"ID":"288c2740-d410-436e-a43f-e9522208e1f1","Type":"ContainerStarted","Data":"7d4c728412ec81c567981a319a7bc31a33764fc2cdd6f5c60d482f98752ec1c7"} Dec 02 22:48:16 crc kubenswrapper[4696]: I1202 22:48:16.523233 4696 generic.go:334] "Generic (PLEG): container finished" podID="74551327-8467-4679-9951-5dd7042e2a45" containerID="a6d91e8716d9357f93df509434f1a102097bd0f30fdd52524d4a0583031eaff9" exitCode=0 Dec 02 22:48:16 crc kubenswrapper[4696]: I1202 22:48:16.523272 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wq2sm" event={"ID":"74551327-8467-4679-9951-5dd7042e2a45","Type":"ContainerDied","Data":"a6d91e8716d9357f93df509434f1a102097bd0f30fdd52524d4a0583031eaff9"} Dec 02 22:48:16 crc kubenswrapper[4696]: I1202 22:48:16.523302 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wq2sm" event={"ID":"74551327-8467-4679-9951-5dd7042e2a45","Type":"ContainerStarted","Data":"dea34be8e64b4cb61d214814f53f47e2cbd6258daa78dc3f8a2f4435eb3a68c3"} Dec 02 22:48:17 crc kubenswrapper[4696]: I1202 22:48:17.345376 4696 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-marketplace-tnm82"] Dec 02 22:48:17 crc kubenswrapper[4696]: I1202 22:48:17.346663 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tnm82" Dec 02 22:48:17 crc kubenswrapper[4696]: I1202 22:48:17.350083 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 02 22:48:17 crc kubenswrapper[4696]: I1202 22:48:17.381234 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tnm82"] Dec 02 22:48:17 crc kubenswrapper[4696]: I1202 22:48:17.439470 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f3c3ea23-4c15-4817-a777-29afe63f580f-catalog-content\") pod \"redhat-marketplace-tnm82\" (UID: \"f3c3ea23-4c15-4817-a777-29afe63f580f\") " pod="openshift-marketplace/redhat-marketplace-tnm82" Dec 02 22:48:17 crc kubenswrapper[4696]: I1202 22:48:17.439523 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f3c3ea23-4c15-4817-a777-29afe63f580f-utilities\") pod \"redhat-marketplace-tnm82\" (UID: \"f3c3ea23-4c15-4817-a777-29afe63f580f\") " pod="openshift-marketplace/redhat-marketplace-tnm82" Dec 02 22:48:17 crc kubenswrapper[4696]: I1202 22:48:17.439551 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dm9n\" (UniqueName: \"kubernetes.io/projected/f3c3ea23-4c15-4817-a777-29afe63f580f-kube-api-access-7dm9n\") pod \"redhat-marketplace-tnm82\" (UID: \"f3c3ea23-4c15-4817-a777-29afe63f580f\") " pod="openshift-marketplace/redhat-marketplace-tnm82" Dec 02 22:48:17 crc kubenswrapper[4696]: I1202 22:48:17.534758 4696 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-operators-nl95f"] Dec 02 22:48:17 crc kubenswrapper[4696]: I1202 22:48:17.537588 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nl95f" Dec 02 22:48:17 crc kubenswrapper[4696]: I1202 22:48:17.540217 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 02 22:48:17 crc kubenswrapper[4696]: I1202 22:48:17.541646 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f3c3ea23-4c15-4817-a777-29afe63f580f-catalog-content\") pod \"redhat-marketplace-tnm82\" (UID: \"f3c3ea23-4c15-4817-a777-29afe63f580f\") " pod="openshift-marketplace/redhat-marketplace-tnm82" Dec 02 22:48:17 crc kubenswrapper[4696]: I1202 22:48:17.541717 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f3c3ea23-4c15-4817-a777-29afe63f580f-utilities\") pod \"redhat-marketplace-tnm82\" (UID: \"f3c3ea23-4c15-4817-a777-29afe63f580f\") " pod="openshift-marketplace/redhat-marketplace-tnm82" Dec 02 22:48:17 crc kubenswrapper[4696]: I1202 22:48:17.541770 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7dm9n\" (UniqueName: \"kubernetes.io/projected/f3c3ea23-4c15-4817-a777-29afe63f580f-kube-api-access-7dm9n\") pod \"redhat-marketplace-tnm82\" (UID: \"f3c3ea23-4c15-4817-a777-29afe63f580f\") " pod="openshift-marketplace/redhat-marketplace-tnm82" Dec 02 22:48:17 crc kubenswrapper[4696]: I1202 22:48:17.542230 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f3c3ea23-4c15-4817-a777-29afe63f580f-catalog-content\") pod \"redhat-marketplace-tnm82\" (UID: \"f3c3ea23-4c15-4817-a777-29afe63f580f\") " 
pod="openshift-marketplace/redhat-marketplace-tnm82" Dec 02 22:48:17 crc kubenswrapper[4696]: I1202 22:48:17.542517 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f3c3ea23-4c15-4817-a777-29afe63f580f-utilities\") pod \"redhat-marketplace-tnm82\" (UID: \"f3c3ea23-4c15-4817-a777-29afe63f580f\") " pod="openshift-marketplace/redhat-marketplace-tnm82" Dec 02 22:48:17 crc kubenswrapper[4696]: I1202 22:48:17.548255 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nl95f"] Dec 02 22:48:17 crc kubenswrapper[4696]: I1202 22:48:17.569807 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dm9n\" (UniqueName: \"kubernetes.io/projected/f3c3ea23-4c15-4817-a777-29afe63f580f-kube-api-access-7dm9n\") pod \"redhat-marketplace-tnm82\" (UID: \"f3c3ea23-4c15-4817-a777-29afe63f580f\") " pod="openshift-marketplace/redhat-marketplace-tnm82" Dec 02 22:48:17 crc kubenswrapper[4696]: I1202 22:48:17.642830 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cae1c862-711b-4fe3-b6a3-f2fefa39b14c-utilities\") pod \"redhat-operators-nl95f\" (UID: \"cae1c862-711b-4fe3-b6a3-f2fefa39b14c\") " pod="openshift-marketplace/redhat-operators-nl95f" Dec 02 22:48:17 crc kubenswrapper[4696]: I1202 22:48:17.642869 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cae1c862-711b-4fe3-b6a3-f2fefa39b14c-catalog-content\") pod \"redhat-operators-nl95f\" (UID: \"cae1c862-711b-4fe3-b6a3-f2fefa39b14c\") " pod="openshift-marketplace/redhat-operators-nl95f" Dec 02 22:48:17 crc kubenswrapper[4696]: I1202 22:48:17.642916 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nd9c\" 
(UniqueName: \"kubernetes.io/projected/cae1c862-711b-4fe3-b6a3-f2fefa39b14c-kube-api-access-7nd9c\") pod \"redhat-operators-nl95f\" (UID: \"cae1c862-711b-4fe3-b6a3-f2fefa39b14c\") " pod="openshift-marketplace/redhat-operators-nl95f" Dec 02 22:48:17 crc kubenswrapper[4696]: I1202 22:48:17.671602 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 02 22:48:17 crc kubenswrapper[4696]: I1202 22:48:17.679238 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tnm82" Dec 02 22:48:17 crc kubenswrapper[4696]: I1202 22:48:17.743804 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cae1c862-711b-4fe3-b6a3-f2fefa39b14c-utilities\") pod \"redhat-operators-nl95f\" (UID: \"cae1c862-711b-4fe3-b6a3-f2fefa39b14c\") " pod="openshift-marketplace/redhat-operators-nl95f" Dec 02 22:48:17 crc kubenswrapper[4696]: I1202 22:48:17.743871 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cae1c862-711b-4fe3-b6a3-f2fefa39b14c-catalog-content\") pod \"redhat-operators-nl95f\" (UID: \"cae1c862-711b-4fe3-b6a3-f2fefa39b14c\") " pod="openshift-marketplace/redhat-operators-nl95f" Dec 02 22:48:17 crc kubenswrapper[4696]: I1202 22:48:17.743934 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7nd9c\" (UniqueName: \"kubernetes.io/projected/cae1c862-711b-4fe3-b6a3-f2fefa39b14c-kube-api-access-7nd9c\") pod \"redhat-operators-nl95f\" (UID: \"cae1c862-711b-4fe3-b6a3-f2fefa39b14c\") " pod="openshift-marketplace/redhat-operators-nl95f" Dec 02 22:48:17 crc kubenswrapper[4696]: I1202 22:48:17.744458 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/cae1c862-711b-4fe3-b6a3-f2fefa39b14c-utilities\") pod \"redhat-operators-nl95f\" (UID: \"cae1c862-711b-4fe3-b6a3-f2fefa39b14c\") " pod="openshift-marketplace/redhat-operators-nl95f" Dec 02 22:48:17 crc kubenswrapper[4696]: I1202 22:48:17.744560 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cae1c862-711b-4fe3-b6a3-f2fefa39b14c-catalog-content\") pod \"redhat-operators-nl95f\" (UID: \"cae1c862-711b-4fe3-b6a3-f2fefa39b14c\") " pod="openshift-marketplace/redhat-operators-nl95f" Dec 02 22:48:17 crc kubenswrapper[4696]: I1202 22:48:17.761037 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nd9c\" (UniqueName: \"kubernetes.io/projected/cae1c862-711b-4fe3-b6a3-f2fefa39b14c-kube-api-access-7nd9c\") pod \"redhat-operators-nl95f\" (UID: \"cae1c862-711b-4fe3-b6a3-f2fefa39b14c\") " pod="openshift-marketplace/redhat-operators-nl95f" Dec 02 22:48:17 crc kubenswrapper[4696]: I1202 22:48:17.867553 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nl95f" Dec 02 22:48:18 crc kubenswrapper[4696]: I1202 22:48:18.136825 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tnm82"] Dec 02 22:48:18 crc kubenswrapper[4696]: W1202 22:48:18.140930 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf3c3ea23_4c15_4817_a777_29afe63f580f.slice/crio-239939e9fc32374ad9616a65c2c08bdfb7d16860ca2cd8d617ba186428d0424d WatchSource:0}: Error finding container 239939e9fc32374ad9616a65c2c08bdfb7d16860ca2cd8d617ba186428d0424d: Status 404 returned error can't find the container with id 239939e9fc32374ad9616a65c2c08bdfb7d16860ca2cd8d617ba186428d0424d Dec 02 22:48:18 crc kubenswrapper[4696]: I1202 22:48:18.277615 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nl95f"] Dec 02 22:48:18 crc kubenswrapper[4696]: W1202 22:48:18.468995 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcae1c862_711b_4fe3_b6a3_f2fefa39b14c.slice/crio-963966c0c2d36875911c572a756058b963dbe4ec7b109efc50ab3ffb9337ffcf WatchSource:0}: Error finding container 963966c0c2d36875911c572a756058b963dbe4ec7b109efc50ab3ffb9337ffcf: Status 404 returned error can't find the container with id 963966c0c2d36875911c572a756058b963dbe4ec7b109efc50ab3ffb9337ffcf Dec 02 22:48:18 crc kubenswrapper[4696]: I1202 22:48:18.542879 4696 generic.go:334] "Generic (PLEG): container finished" podID="74551327-8467-4679-9951-5dd7042e2a45" containerID="243aeb43434e8a1fc0a9cb6f4f1a1591ac6d4856161b3b4400f2cdd8b80aeeba" exitCode=0 Dec 02 22:48:18 crc kubenswrapper[4696]: I1202 22:48:18.542944 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wq2sm" 
event={"ID":"74551327-8467-4679-9951-5dd7042e2a45","Type":"ContainerDied","Data":"243aeb43434e8a1fc0a9cb6f4f1a1591ac6d4856161b3b4400f2cdd8b80aeeba"} Dec 02 22:48:18 crc kubenswrapper[4696]: I1202 22:48:18.544885 4696 generic.go:334] "Generic (PLEG): container finished" podID="f3c3ea23-4c15-4817-a777-29afe63f580f" containerID="88de65f6e4a5af95b77326a4d970f45944871d9861b5b97e0a25d0b9295d1e34" exitCode=0 Dec 02 22:48:18 crc kubenswrapper[4696]: I1202 22:48:18.545531 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tnm82" event={"ID":"f3c3ea23-4c15-4817-a777-29afe63f580f","Type":"ContainerDied","Data":"88de65f6e4a5af95b77326a4d970f45944871d9861b5b97e0a25d0b9295d1e34"} Dec 02 22:48:18 crc kubenswrapper[4696]: I1202 22:48:18.545602 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tnm82" event={"ID":"f3c3ea23-4c15-4817-a777-29afe63f580f","Type":"ContainerStarted","Data":"239939e9fc32374ad9616a65c2c08bdfb7d16860ca2cd8d617ba186428d0424d"} Dec 02 22:48:18 crc kubenswrapper[4696]: I1202 22:48:18.550363 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nl95f" event={"ID":"cae1c862-711b-4fe3-b6a3-f2fefa39b14c","Type":"ContainerStarted","Data":"963966c0c2d36875911c572a756058b963dbe4ec7b109efc50ab3ffb9337ffcf"} Dec 02 22:48:18 crc kubenswrapper[4696]: I1202 22:48:18.553172 4696 generic.go:334] "Generic (PLEG): container finished" podID="288c2740-d410-436e-a43f-e9522208e1f1" containerID="3002ad4c35fcd9a4425145eae97029a4b8ca1351e7c2985d1ab52a5d50648504" exitCode=0 Dec 02 22:48:18 crc kubenswrapper[4696]: I1202 22:48:18.553233 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lq2n7" event={"ID":"288c2740-d410-436e-a43f-e9522208e1f1","Type":"ContainerDied","Data":"3002ad4c35fcd9a4425145eae97029a4b8ca1351e7c2985d1ab52a5d50648504"} Dec 02 22:48:19 crc kubenswrapper[4696]: I1202 
22:48:19.567464 4696 generic.go:334] "Generic (PLEG): container finished" podID="cae1c862-711b-4fe3-b6a3-f2fefa39b14c" containerID="9020626f2cda61237810bfcf77060f02b6c4417d7b8f5d8ca5b442d1a3a9fb20" exitCode=0 Dec 02 22:48:19 crc kubenswrapper[4696]: I1202 22:48:19.567692 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nl95f" event={"ID":"cae1c862-711b-4fe3-b6a3-f2fefa39b14c","Type":"ContainerDied","Data":"9020626f2cda61237810bfcf77060f02b6c4417d7b8f5d8ca5b442d1a3a9fb20"} Dec 02 22:48:20 crc kubenswrapper[4696]: I1202 22:48:20.578771 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lq2n7" event={"ID":"288c2740-d410-436e-a43f-e9522208e1f1","Type":"ContainerStarted","Data":"fbffd72278f622673f0272223a9955c388a7aabfb566179329a3dfd23d3d4674"} Dec 02 22:48:20 crc kubenswrapper[4696]: I1202 22:48:20.582443 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wq2sm" event={"ID":"74551327-8467-4679-9951-5dd7042e2a45","Type":"ContainerStarted","Data":"098cd0737caf6cd157a25d6946571fc02d5de679b35fd76eaac7705e1944f572"} Dec 02 22:48:20 crc kubenswrapper[4696]: I1202 22:48:20.609345 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-lq2n7" podStartSLOduration=3.206047284 podStartE2EDuration="5.609318589s" podCreationTimestamp="2025-12-02 22:48:15 +0000 UTC" firstStartedPulling="2025-12-02 22:48:16.523859813 +0000 UTC m=+359.404539814" lastFinishedPulling="2025-12-02 22:48:18.927131118 +0000 UTC m=+361.807811119" observedRunningTime="2025-12-02 22:48:20.604858337 +0000 UTC m=+363.485538338" watchObservedRunningTime="2025-12-02 22:48:20.609318589 +0000 UTC m=+363.489998590" Dec 02 22:48:20 crc kubenswrapper[4696]: I1202 22:48:20.628890 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-wq2sm" 
podStartSLOduration=4.251463459 podStartE2EDuration="6.628865993s" podCreationTimestamp="2025-12-02 22:48:14 +0000 UTC" firstStartedPulling="2025-12-02 22:48:16.534236939 +0000 UTC m=+359.414916940" lastFinishedPulling="2025-12-02 22:48:18.911639473 +0000 UTC m=+361.792319474" observedRunningTime="2025-12-02 22:48:20.625113093 +0000 UTC m=+363.505793094" watchObservedRunningTime="2025-12-02 22:48:20.628865993 +0000 UTC m=+363.509545994" Dec 02 22:48:21 crc kubenswrapper[4696]: I1202 22:48:21.590108 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tnm82" event={"ID":"f3c3ea23-4c15-4817-a777-29afe63f580f","Type":"ContainerStarted","Data":"364eaad5265dc8bb488c074f6614c3ccfe3ba91ca269b240806d960403463dc1"} Dec 02 22:48:21 crc kubenswrapper[4696]: I1202 22:48:21.592245 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nl95f" event={"ID":"cae1c862-711b-4fe3-b6a3-f2fefa39b14c","Type":"ContainerStarted","Data":"0b5ddb74d82ef2fe66a796b12e28bfc903aaa761cf61bddc9c78ec451ce2cf3a"} Dec 02 22:48:21 crc kubenswrapper[4696]: E1202 22:48:21.683818 4696 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf3c3ea23_4c15_4817_a777_29afe63f580f.slice/crio-364eaad5265dc8bb488c074f6614c3ccfe3ba91ca269b240806d960403463dc1.scope\": RecentStats: unable to find data in memory cache]" Dec 02 22:48:22 crc kubenswrapper[4696]: I1202 22:48:22.602866 4696 generic.go:334] "Generic (PLEG): container finished" podID="f3c3ea23-4c15-4817-a777-29afe63f580f" containerID="364eaad5265dc8bb488c074f6614c3ccfe3ba91ca269b240806d960403463dc1" exitCode=0 Dec 02 22:48:22 crc kubenswrapper[4696]: I1202 22:48:22.602956 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tnm82" 
event={"ID":"f3c3ea23-4c15-4817-a777-29afe63f580f","Type":"ContainerDied","Data":"364eaad5265dc8bb488c074f6614c3ccfe3ba91ca269b240806d960403463dc1"} Dec 02 22:48:22 crc kubenswrapper[4696]: I1202 22:48:22.605625 4696 generic.go:334] "Generic (PLEG): container finished" podID="cae1c862-711b-4fe3-b6a3-f2fefa39b14c" containerID="0b5ddb74d82ef2fe66a796b12e28bfc903aaa761cf61bddc9c78ec451ce2cf3a" exitCode=0 Dec 02 22:48:22 crc kubenswrapper[4696]: I1202 22:48:22.605698 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nl95f" event={"ID":"cae1c862-711b-4fe3-b6a3-f2fefa39b14c","Type":"ContainerDied","Data":"0b5ddb74d82ef2fe66a796b12e28bfc903aaa761cf61bddc9c78ec451ce2cf3a"} Dec 02 22:48:22 crc kubenswrapper[4696]: I1202 22:48:22.974128 4696 patch_prober.go:28] interesting pod/machine-config-daemon-chq65 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 22:48:22 crc kubenswrapper[4696]: I1202 22:48:22.974205 4696 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 22:48:25 crc kubenswrapper[4696]: I1202 22:48:25.260814 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-wq2sm" Dec 02 22:48:25 crc kubenswrapper[4696]: I1202 22:48:25.262546 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-wq2sm" Dec 02 22:48:25 crc kubenswrapper[4696]: I1202 22:48:25.310924 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/certified-operators-wq2sm" Dec 02 22:48:25 crc kubenswrapper[4696]: I1202 22:48:25.471400 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-lq2n7" Dec 02 22:48:25 crc kubenswrapper[4696]: I1202 22:48:25.471466 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-lq2n7" Dec 02 22:48:25 crc kubenswrapper[4696]: I1202 22:48:25.522015 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-lq2n7" Dec 02 22:48:25 crc kubenswrapper[4696]: I1202 22:48:25.629323 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nl95f" event={"ID":"cae1c862-711b-4fe3-b6a3-f2fefa39b14c","Type":"ContainerStarted","Data":"0d82281eefcdc42191804a5f770352a969ed09d5bc9823e765101d31934761d2"} Dec 02 22:48:25 crc kubenswrapper[4696]: I1202 22:48:25.661131 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-nl95f" podStartSLOduration=4.665788287 podStartE2EDuration="8.661109221s" podCreationTimestamp="2025-12-02 22:48:17 +0000 UTC" firstStartedPulling="2025-12-02 22:48:19.569929772 +0000 UTC m=+362.450609823" lastFinishedPulling="2025-12-02 22:48:23.565250746 +0000 UTC m=+366.445930757" observedRunningTime="2025-12-02 22:48:25.659483943 +0000 UTC m=+368.540163964" watchObservedRunningTime="2025-12-02 22:48:25.661109221 +0000 UTC m=+368.541789222" Dec 02 22:48:25 crc kubenswrapper[4696]: I1202 22:48:25.682173 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-wq2sm" Dec 02 22:48:25 crc kubenswrapper[4696]: I1202 22:48:25.689149 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-lq2n7" Dec 02 22:48:26 crc kubenswrapper[4696]: 
I1202 22:48:26.638498 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tnm82" event={"ID":"f3c3ea23-4c15-4817-a777-29afe63f580f","Type":"ContainerStarted","Data":"9ca36a35dc4f5834f37fc5ac5e3ba82c53c3f03d57a956dca26dbc9a59291f0c"} Dec 02 22:48:26 crc kubenswrapper[4696]: I1202 22:48:26.659719 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-tnm82" podStartSLOduration=3.125535711 podStartE2EDuration="9.659701908s" podCreationTimestamp="2025-12-02 22:48:17 +0000 UTC" firstStartedPulling="2025-12-02 22:48:18.54737829 +0000 UTC m=+361.428058291" lastFinishedPulling="2025-12-02 22:48:25.081544497 +0000 UTC m=+367.962224488" observedRunningTime="2025-12-02 22:48:26.658708938 +0000 UTC m=+369.539388949" watchObservedRunningTime="2025-12-02 22:48:26.659701908 +0000 UTC m=+369.540381919" Dec 02 22:48:27 crc kubenswrapper[4696]: I1202 22:48:27.679357 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-tnm82" Dec 02 22:48:27 crc kubenswrapper[4696]: I1202 22:48:27.679413 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-tnm82" Dec 02 22:48:27 crc kubenswrapper[4696]: I1202 22:48:27.729408 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-tnm82" Dec 02 22:48:27 crc kubenswrapper[4696]: I1202 22:48:27.868773 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-nl95f" Dec 02 22:48:27 crc kubenswrapper[4696]: I1202 22:48:27.868832 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-nl95f" Dec 02 22:48:28 crc kubenswrapper[4696]: I1202 22:48:28.910268 4696 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/redhat-operators-nl95f" podUID="cae1c862-711b-4fe3-b6a3-f2fefa39b14c" containerName="registry-server" probeResult="failure" output=< Dec 02 22:48:28 crc kubenswrapper[4696]: timeout: failed to connect service ":50051" within 1s Dec 02 22:48:28 crc kubenswrapper[4696]: > Dec 02 22:48:37 crc kubenswrapper[4696]: I1202 22:48:37.755575 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-tnm82" Dec 02 22:48:37 crc kubenswrapper[4696]: I1202 22:48:37.916216 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-nl95f" Dec 02 22:48:37 crc kubenswrapper[4696]: I1202 22:48:37.967703 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-nl95f" Dec 02 22:48:52 crc kubenswrapper[4696]: I1202 22:48:52.974694 4696 patch_prober.go:28] interesting pod/machine-config-daemon-chq65 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 22:48:52 crc kubenswrapper[4696]: I1202 22:48:52.977403 4696 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 22:48:54 crc kubenswrapper[4696]: I1202 22:48:54.489148 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-zw9xh"] Dec 02 22:48:54 crc kubenswrapper[4696]: I1202 22:48:54.490388 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-zw9xh" Dec 02 22:48:54 crc kubenswrapper[4696]: I1202 22:48:54.509777 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-zw9xh"] Dec 02 22:48:54 crc kubenswrapper[4696]: I1202 22:48:54.645698 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/74d0eb5c-275a-4636-8e6d-423036fe6cac-trusted-ca\") pod \"image-registry-66df7c8f76-zw9xh\" (UID: \"74d0eb5c-275a-4636-8e6d-423036fe6cac\") " pod="openshift-image-registry/image-registry-66df7c8f76-zw9xh" Dec 02 22:48:54 crc kubenswrapper[4696]: I1202 22:48:54.645898 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/74d0eb5c-275a-4636-8e6d-423036fe6cac-registry-certificates\") pod \"image-registry-66df7c8f76-zw9xh\" (UID: \"74d0eb5c-275a-4636-8e6d-423036fe6cac\") " pod="openshift-image-registry/image-registry-66df7c8f76-zw9xh" Dec 02 22:48:54 crc kubenswrapper[4696]: I1202 22:48:54.646006 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/74d0eb5c-275a-4636-8e6d-423036fe6cac-registry-tls\") pod \"image-registry-66df7c8f76-zw9xh\" (UID: \"74d0eb5c-275a-4636-8e6d-423036fe6cac\") " pod="openshift-image-registry/image-registry-66df7c8f76-zw9xh" Dec 02 22:48:54 crc kubenswrapper[4696]: I1202 22:48:54.646072 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/74d0eb5c-275a-4636-8e6d-423036fe6cac-installation-pull-secrets\") pod \"image-registry-66df7c8f76-zw9xh\" (UID: \"74d0eb5c-275a-4636-8e6d-423036fe6cac\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-zw9xh" Dec 02 22:48:54 crc kubenswrapper[4696]: I1202 22:48:54.646110 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/74d0eb5c-275a-4636-8e6d-423036fe6cac-ca-trust-extracted\") pod \"image-registry-66df7c8f76-zw9xh\" (UID: \"74d0eb5c-275a-4636-8e6d-423036fe6cac\") " pod="openshift-image-registry/image-registry-66df7c8f76-zw9xh" Dec 02 22:48:54 crc kubenswrapper[4696]: I1202 22:48:54.646265 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-zw9xh\" (UID: \"74d0eb5c-275a-4636-8e6d-423036fe6cac\") " pod="openshift-image-registry/image-registry-66df7c8f76-zw9xh" Dec 02 22:48:54 crc kubenswrapper[4696]: I1202 22:48:54.646419 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/74d0eb5c-275a-4636-8e6d-423036fe6cac-bound-sa-token\") pod \"image-registry-66df7c8f76-zw9xh\" (UID: \"74d0eb5c-275a-4636-8e6d-423036fe6cac\") " pod="openshift-image-registry/image-registry-66df7c8f76-zw9xh" Dec 02 22:48:54 crc kubenswrapper[4696]: I1202 22:48:54.646567 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p958g\" (UniqueName: \"kubernetes.io/projected/74d0eb5c-275a-4636-8e6d-423036fe6cac-kube-api-access-p958g\") pod \"image-registry-66df7c8f76-zw9xh\" (UID: \"74d0eb5c-275a-4636-8e6d-423036fe6cac\") " pod="openshift-image-registry/image-registry-66df7c8f76-zw9xh" Dec 02 22:48:54 crc kubenswrapper[4696]: I1202 22:48:54.674241 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-zw9xh\" (UID: \"74d0eb5c-275a-4636-8e6d-423036fe6cac\") " pod="openshift-image-registry/image-registry-66df7c8f76-zw9xh" Dec 02 22:48:54 crc kubenswrapper[4696]: I1202 22:48:54.749954 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/74d0eb5c-275a-4636-8e6d-423036fe6cac-installation-pull-secrets\") pod \"image-registry-66df7c8f76-zw9xh\" (UID: \"74d0eb5c-275a-4636-8e6d-423036fe6cac\") " pod="openshift-image-registry/image-registry-66df7c8f76-zw9xh" Dec 02 22:48:54 crc kubenswrapper[4696]: I1202 22:48:54.750484 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/74d0eb5c-275a-4636-8e6d-423036fe6cac-ca-trust-extracted\") pod \"image-registry-66df7c8f76-zw9xh\" (UID: \"74d0eb5c-275a-4636-8e6d-423036fe6cac\") " pod="openshift-image-registry/image-registry-66df7c8f76-zw9xh" Dec 02 22:48:54 crc kubenswrapper[4696]: I1202 22:48:54.750554 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/74d0eb5c-275a-4636-8e6d-423036fe6cac-bound-sa-token\") pod \"image-registry-66df7c8f76-zw9xh\" (UID: \"74d0eb5c-275a-4636-8e6d-423036fe6cac\") " pod="openshift-image-registry/image-registry-66df7c8f76-zw9xh" Dec 02 22:48:54 crc kubenswrapper[4696]: I1202 22:48:54.750628 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p958g\" (UniqueName: \"kubernetes.io/projected/74d0eb5c-275a-4636-8e6d-423036fe6cac-kube-api-access-p958g\") pod \"image-registry-66df7c8f76-zw9xh\" (UID: \"74d0eb5c-275a-4636-8e6d-423036fe6cac\") " pod="openshift-image-registry/image-registry-66df7c8f76-zw9xh" Dec 02 
22:48:54 crc kubenswrapper[4696]: I1202 22:48:54.750692 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/74d0eb5c-275a-4636-8e6d-423036fe6cac-trusted-ca\") pod \"image-registry-66df7c8f76-zw9xh\" (UID: \"74d0eb5c-275a-4636-8e6d-423036fe6cac\") " pod="openshift-image-registry/image-registry-66df7c8f76-zw9xh" Dec 02 22:48:54 crc kubenswrapper[4696]: I1202 22:48:54.750789 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/74d0eb5c-275a-4636-8e6d-423036fe6cac-registry-certificates\") pod \"image-registry-66df7c8f76-zw9xh\" (UID: \"74d0eb5c-275a-4636-8e6d-423036fe6cac\") " pod="openshift-image-registry/image-registry-66df7c8f76-zw9xh" Dec 02 22:48:54 crc kubenswrapper[4696]: I1202 22:48:54.750837 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/74d0eb5c-275a-4636-8e6d-423036fe6cac-registry-tls\") pod \"image-registry-66df7c8f76-zw9xh\" (UID: \"74d0eb5c-275a-4636-8e6d-423036fe6cac\") " pod="openshift-image-registry/image-registry-66df7c8f76-zw9xh" Dec 02 22:48:54 crc kubenswrapper[4696]: I1202 22:48:54.751901 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/74d0eb5c-275a-4636-8e6d-423036fe6cac-ca-trust-extracted\") pod \"image-registry-66df7c8f76-zw9xh\" (UID: \"74d0eb5c-275a-4636-8e6d-423036fe6cac\") " pod="openshift-image-registry/image-registry-66df7c8f76-zw9xh" Dec 02 22:48:54 crc kubenswrapper[4696]: I1202 22:48:54.753051 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/74d0eb5c-275a-4636-8e6d-423036fe6cac-registry-certificates\") pod \"image-registry-66df7c8f76-zw9xh\" (UID: \"74d0eb5c-275a-4636-8e6d-423036fe6cac\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-zw9xh" Dec 02 22:48:54 crc kubenswrapper[4696]: I1202 22:48:54.753124 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/74d0eb5c-275a-4636-8e6d-423036fe6cac-trusted-ca\") pod \"image-registry-66df7c8f76-zw9xh\" (UID: \"74d0eb5c-275a-4636-8e6d-423036fe6cac\") " pod="openshift-image-registry/image-registry-66df7c8f76-zw9xh" Dec 02 22:48:54 crc kubenswrapper[4696]: I1202 22:48:54.758504 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/74d0eb5c-275a-4636-8e6d-423036fe6cac-registry-tls\") pod \"image-registry-66df7c8f76-zw9xh\" (UID: \"74d0eb5c-275a-4636-8e6d-423036fe6cac\") " pod="openshift-image-registry/image-registry-66df7c8f76-zw9xh" Dec 02 22:48:54 crc kubenswrapper[4696]: I1202 22:48:54.760253 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/74d0eb5c-275a-4636-8e6d-423036fe6cac-installation-pull-secrets\") pod \"image-registry-66df7c8f76-zw9xh\" (UID: \"74d0eb5c-275a-4636-8e6d-423036fe6cac\") " pod="openshift-image-registry/image-registry-66df7c8f76-zw9xh" Dec 02 22:48:54 crc kubenswrapper[4696]: I1202 22:48:54.775555 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p958g\" (UniqueName: \"kubernetes.io/projected/74d0eb5c-275a-4636-8e6d-423036fe6cac-kube-api-access-p958g\") pod \"image-registry-66df7c8f76-zw9xh\" (UID: \"74d0eb5c-275a-4636-8e6d-423036fe6cac\") " pod="openshift-image-registry/image-registry-66df7c8f76-zw9xh" Dec 02 22:48:54 crc kubenswrapper[4696]: I1202 22:48:54.775864 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/74d0eb5c-275a-4636-8e6d-423036fe6cac-bound-sa-token\") pod \"image-registry-66df7c8f76-zw9xh\" (UID: 
\"74d0eb5c-275a-4636-8e6d-423036fe6cac\") " pod="openshift-image-registry/image-registry-66df7c8f76-zw9xh" Dec 02 22:48:54 crc kubenswrapper[4696]: I1202 22:48:54.859894 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-zw9xh" Dec 02 22:48:55 crc kubenswrapper[4696]: I1202 22:48:55.371769 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-zw9xh"] Dec 02 22:48:55 crc kubenswrapper[4696]: I1202 22:48:55.810320 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-zw9xh" event={"ID":"74d0eb5c-275a-4636-8e6d-423036fe6cac","Type":"ContainerStarted","Data":"8f8251e7a970b16403b923b62f30306d94f802f8ff4a1dad345cec7e38ba51c0"} Dec 02 22:48:55 crc kubenswrapper[4696]: I1202 22:48:55.810861 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-zw9xh" Dec 02 22:48:55 crc kubenswrapper[4696]: I1202 22:48:55.810875 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-zw9xh" event={"ID":"74d0eb5c-275a-4636-8e6d-423036fe6cac","Type":"ContainerStarted","Data":"6ac5b45422df089c163608fc2f282b030d9d361384fcf9a831e039da67a9bd21"} Dec 02 22:48:55 crc kubenswrapper[4696]: I1202 22:48:55.832857 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-zw9xh" podStartSLOduration=1.8328352460000001 podStartE2EDuration="1.832835246s" podCreationTimestamp="2025-12-02 22:48:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 22:48:55.828326523 +0000 UTC m=+398.709006524" watchObservedRunningTime="2025-12-02 22:48:55.832835246 +0000 UTC m=+398.713515247" Dec 02 22:49:05 crc kubenswrapper[4696]: I1202 
22:49:05.161182 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-d48c458cb-mhg55"] Dec 02 22:49:05 crc kubenswrapper[4696]: I1202 22:49:05.162299 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-d48c458cb-mhg55" podUID="b60ca874-ef00-4e4a-8b2e-26a1581d4c86" containerName="controller-manager" containerID="cri-o://d093b727e4acd202062f725755d73a84976d16c248b9fa6208a19a9a8ff5741d" gracePeriod=30 Dec 02 22:49:05 crc kubenswrapper[4696]: I1202 22:49:05.893447 4696 generic.go:334] "Generic (PLEG): container finished" podID="b60ca874-ef00-4e4a-8b2e-26a1581d4c86" containerID="d093b727e4acd202062f725755d73a84976d16c248b9fa6208a19a9a8ff5741d" exitCode=0 Dec 02 22:49:05 crc kubenswrapper[4696]: I1202 22:49:05.893539 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-d48c458cb-mhg55" event={"ID":"b60ca874-ef00-4e4a-8b2e-26a1581d4c86","Type":"ContainerDied","Data":"d093b727e4acd202062f725755d73a84976d16c248b9fa6208a19a9a8ff5741d"} Dec 02 22:49:06 crc kubenswrapper[4696]: I1202 22:49:06.166868 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-d48c458cb-mhg55" Dec 02 22:49:06 crc kubenswrapper[4696]: I1202 22:49:06.244730 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b60ca874-ef00-4e4a-8b2e-26a1581d4c86-config\") pod \"b60ca874-ef00-4e4a-8b2e-26a1581d4c86\" (UID: \"b60ca874-ef00-4e4a-8b2e-26a1581d4c86\") " Dec 02 22:49:06 crc kubenswrapper[4696]: I1202 22:49:06.244862 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b60ca874-ef00-4e4a-8b2e-26a1581d4c86-proxy-ca-bundles\") pod \"b60ca874-ef00-4e4a-8b2e-26a1581d4c86\" (UID: \"b60ca874-ef00-4e4a-8b2e-26a1581d4c86\") " Dec 02 22:49:06 crc kubenswrapper[4696]: I1202 22:49:06.244986 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b60ca874-ef00-4e4a-8b2e-26a1581d4c86-serving-cert\") pod \"b60ca874-ef00-4e4a-8b2e-26a1581d4c86\" (UID: \"b60ca874-ef00-4e4a-8b2e-26a1581d4c86\") " Dec 02 22:49:06 crc kubenswrapper[4696]: I1202 22:49:06.245058 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-96qt2\" (UniqueName: \"kubernetes.io/projected/b60ca874-ef00-4e4a-8b2e-26a1581d4c86-kube-api-access-96qt2\") pod \"b60ca874-ef00-4e4a-8b2e-26a1581d4c86\" (UID: \"b60ca874-ef00-4e4a-8b2e-26a1581d4c86\") " Dec 02 22:49:06 crc kubenswrapper[4696]: I1202 22:49:06.245137 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b60ca874-ef00-4e4a-8b2e-26a1581d4c86-client-ca\") pod \"b60ca874-ef00-4e4a-8b2e-26a1581d4c86\" (UID: \"b60ca874-ef00-4e4a-8b2e-26a1581d4c86\") " Dec 02 22:49:06 crc kubenswrapper[4696]: I1202 22:49:06.247180 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/b60ca874-ef00-4e4a-8b2e-26a1581d4c86-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "b60ca874-ef00-4e4a-8b2e-26a1581d4c86" (UID: "b60ca874-ef00-4e4a-8b2e-26a1581d4c86"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:49:06 crc kubenswrapper[4696]: I1202 22:49:06.247820 4696 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b60ca874-ef00-4e4a-8b2e-26a1581d4c86-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 02 22:49:06 crc kubenswrapper[4696]: I1202 22:49:06.248267 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b60ca874-ef00-4e4a-8b2e-26a1581d4c86-config" (OuterVolumeSpecName: "config") pod "b60ca874-ef00-4e4a-8b2e-26a1581d4c86" (UID: "b60ca874-ef00-4e4a-8b2e-26a1581d4c86"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:49:06 crc kubenswrapper[4696]: I1202 22:49:06.248648 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b60ca874-ef00-4e4a-8b2e-26a1581d4c86-client-ca" (OuterVolumeSpecName: "client-ca") pod "b60ca874-ef00-4e4a-8b2e-26a1581d4c86" (UID: "b60ca874-ef00-4e4a-8b2e-26a1581d4c86"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:49:06 crc kubenswrapper[4696]: I1202 22:49:06.253815 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b60ca874-ef00-4e4a-8b2e-26a1581d4c86-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "b60ca874-ef00-4e4a-8b2e-26a1581d4c86" (UID: "b60ca874-ef00-4e4a-8b2e-26a1581d4c86"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:49:06 crc kubenswrapper[4696]: I1202 22:49:06.255002 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b60ca874-ef00-4e4a-8b2e-26a1581d4c86-kube-api-access-96qt2" (OuterVolumeSpecName: "kube-api-access-96qt2") pod "b60ca874-ef00-4e4a-8b2e-26a1581d4c86" (UID: "b60ca874-ef00-4e4a-8b2e-26a1581d4c86"). InnerVolumeSpecName "kube-api-access-96qt2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:49:06 crc kubenswrapper[4696]: I1202 22:49:06.349472 4696 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b60ca874-ef00-4e4a-8b2e-26a1581d4c86-config\") on node \"crc\" DevicePath \"\"" Dec 02 22:49:06 crc kubenswrapper[4696]: I1202 22:49:06.349537 4696 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b60ca874-ef00-4e4a-8b2e-26a1581d4c86-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 22:49:06 crc kubenswrapper[4696]: I1202 22:49:06.349560 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-96qt2\" (UniqueName: \"kubernetes.io/projected/b60ca874-ef00-4e4a-8b2e-26a1581d4c86-kube-api-access-96qt2\") on node \"crc\" DevicePath \"\"" Dec 02 22:49:06 crc kubenswrapper[4696]: I1202 22:49:06.349582 4696 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b60ca874-ef00-4e4a-8b2e-26a1581d4c86-client-ca\") on node \"crc\" DevicePath \"\"" Dec 02 22:49:06 crc kubenswrapper[4696]: I1202 22:49:06.602633 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7695c7677d-2tsxm"] Dec 02 22:49:06 crc kubenswrapper[4696]: E1202 22:49:06.603149 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b60ca874-ef00-4e4a-8b2e-26a1581d4c86" containerName="controller-manager" Dec 02 22:49:06 crc 
kubenswrapper[4696]: I1202 22:49:06.603178 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="b60ca874-ef00-4e4a-8b2e-26a1581d4c86" containerName="controller-manager" Dec 02 22:49:06 crc kubenswrapper[4696]: I1202 22:49:06.603364 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="b60ca874-ef00-4e4a-8b2e-26a1581d4c86" containerName="controller-manager" Dec 02 22:49:06 crc kubenswrapper[4696]: I1202 22:49:06.605109 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7695c7677d-2tsxm" Dec 02 22:49:06 crc kubenswrapper[4696]: I1202 22:49:06.617685 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7695c7677d-2tsxm"] Dec 02 22:49:06 crc kubenswrapper[4696]: I1202 22:49:06.654192 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8mps\" (UniqueName: \"kubernetes.io/projected/040f0cf2-b3a6-4696-a7f4-5e12108183a3-kube-api-access-j8mps\") pod \"controller-manager-7695c7677d-2tsxm\" (UID: \"040f0cf2-b3a6-4696-a7f4-5e12108183a3\") " pod="openshift-controller-manager/controller-manager-7695c7677d-2tsxm" Dec 02 22:49:06 crc kubenswrapper[4696]: I1202 22:49:06.654313 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/040f0cf2-b3a6-4696-a7f4-5e12108183a3-proxy-ca-bundles\") pod \"controller-manager-7695c7677d-2tsxm\" (UID: \"040f0cf2-b3a6-4696-a7f4-5e12108183a3\") " pod="openshift-controller-manager/controller-manager-7695c7677d-2tsxm" Dec 02 22:49:06 crc kubenswrapper[4696]: I1202 22:49:06.654383 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/040f0cf2-b3a6-4696-a7f4-5e12108183a3-config\") pod \"controller-manager-7695c7677d-2tsxm\" (UID: 
\"040f0cf2-b3a6-4696-a7f4-5e12108183a3\") " pod="openshift-controller-manager/controller-manager-7695c7677d-2tsxm" Dec 02 22:49:06 crc kubenswrapper[4696]: I1202 22:49:06.654489 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/040f0cf2-b3a6-4696-a7f4-5e12108183a3-serving-cert\") pod \"controller-manager-7695c7677d-2tsxm\" (UID: \"040f0cf2-b3a6-4696-a7f4-5e12108183a3\") " pod="openshift-controller-manager/controller-manager-7695c7677d-2tsxm" Dec 02 22:49:06 crc kubenswrapper[4696]: I1202 22:49:06.654654 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/040f0cf2-b3a6-4696-a7f4-5e12108183a3-client-ca\") pod \"controller-manager-7695c7677d-2tsxm\" (UID: \"040f0cf2-b3a6-4696-a7f4-5e12108183a3\") " pod="openshift-controller-manager/controller-manager-7695c7677d-2tsxm" Dec 02 22:49:06 crc kubenswrapper[4696]: I1202 22:49:06.756508 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/040f0cf2-b3a6-4696-a7f4-5e12108183a3-serving-cert\") pod \"controller-manager-7695c7677d-2tsxm\" (UID: \"040f0cf2-b3a6-4696-a7f4-5e12108183a3\") " pod="openshift-controller-manager/controller-manager-7695c7677d-2tsxm" Dec 02 22:49:06 crc kubenswrapper[4696]: I1202 22:49:06.756801 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/040f0cf2-b3a6-4696-a7f4-5e12108183a3-client-ca\") pod \"controller-manager-7695c7677d-2tsxm\" (UID: \"040f0cf2-b3a6-4696-a7f4-5e12108183a3\") " pod="openshift-controller-manager/controller-manager-7695c7677d-2tsxm" Dec 02 22:49:06 crc kubenswrapper[4696]: I1202 22:49:06.756911 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j8mps\" (UniqueName: 
\"kubernetes.io/projected/040f0cf2-b3a6-4696-a7f4-5e12108183a3-kube-api-access-j8mps\") pod \"controller-manager-7695c7677d-2tsxm\" (UID: \"040f0cf2-b3a6-4696-a7f4-5e12108183a3\") " pod="openshift-controller-manager/controller-manager-7695c7677d-2tsxm" Dec 02 22:49:06 crc kubenswrapper[4696]: I1202 22:49:06.756965 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/040f0cf2-b3a6-4696-a7f4-5e12108183a3-proxy-ca-bundles\") pod \"controller-manager-7695c7677d-2tsxm\" (UID: \"040f0cf2-b3a6-4696-a7f4-5e12108183a3\") " pod="openshift-controller-manager/controller-manager-7695c7677d-2tsxm" Dec 02 22:49:06 crc kubenswrapper[4696]: I1202 22:49:06.756998 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/040f0cf2-b3a6-4696-a7f4-5e12108183a3-config\") pod \"controller-manager-7695c7677d-2tsxm\" (UID: \"040f0cf2-b3a6-4696-a7f4-5e12108183a3\") " pod="openshift-controller-manager/controller-manager-7695c7677d-2tsxm" Dec 02 22:49:06 crc kubenswrapper[4696]: I1202 22:49:06.760216 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/040f0cf2-b3a6-4696-a7f4-5e12108183a3-proxy-ca-bundles\") pod \"controller-manager-7695c7677d-2tsxm\" (UID: \"040f0cf2-b3a6-4696-a7f4-5e12108183a3\") " pod="openshift-controller-manager/controller-manager-7695c7677d-2tsxm" Dec 02 22:49:06 crc kubenswrapper[4696]: I1202 22:49:06.761259 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/040f0cf2-b3a6-4696-a7f4-5e12108183a3-client-ca\") pod \"controller-manager-7695c7677d-2tsxm\" (UID: \"040f0cf2-b3a6-4696-a7f4-5e12108183a3\") " pod="openshift-controller-manager/controller-manager-7695c7677d-2tsxm" Dec 02 22:49:06 crc kubenswrapper[4696]: I1202 22:49:06.763327 4696 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/040f0cf2-b3a6-4696-a7f4-5e12108183a3-config\") pod \"controller-manager-7695c7677d-2tsxm\" (UID: \"040f0cf2-b3a6-4696-a7f4-5e12108183a3\") " pod="openshift-controller-manager/controller-manager-7695c7677d-2tsxm" Dec 02 22:49:06 crc kubenswrapper[4696]: I1202 22:49:06.763589 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/040f0cf2-b3a6-4696-a7f4-5e12108183a3-serving-cert\") pod \"controller-manager-7695c7677d-2tsxm\" (UID: \"040f0cf2-b3a6-4696-a7f4-5e12108183a3\") " pod="openshift-controller-manager/controller-manager-7695c7677d-2tsxm" Dec 02 22:49:06 crc kubenswrapper[4696]: I1202 22:49:06.787059 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8mps\" (UniqueName: \"kubernetes.io/projected/040f0cf2-b3a6-4696-a7f4-5e12108183a3-kube-api-access-j8mps\") pod \"controller-manager-7695c7677d-2tsxm\" (UID: \"040f0cf2-b3a6-4696-a7f4-5e12108183a3\") " pod="openshift-controller-manager/controller-manager-7695c7677d-2tsxm" Dec 02 22:49:06 crc kubenswrapper[4696]: I1202 22:49:06.904732 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-d48c458cb-mhg55" event={"ID":"b60ca874-ef00-4e4a-8b2e-26a1581d4c86","Type":"ContainerDied","Data":"dc16537831c631e17f6ee0e9b7d331f95fd2e8342c6a1d195a2da2c90c7b9639"} Dec 02 22:49:06 crc kubenswrapper[4696]: I1202 22:49:06.904859 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-d48c458cb-mhg55" Dec 02 22:49:06 crc kubenswrapper[4696]: I1202 22:49:06.904881 4696 scope.go:117] "RemoveContainer" containerID="d093b727e4acd202062f725755d73a84976d16c248b9fa6208a19a9a8ff5741d" Dec 02 22:49:06 crc kubenswrapper[4696]: I1202 22:49:06.943290 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7695c7677d-2tsxm" Dec 02 22:49:06 crc kubenswrapper[4696]: I1202 22:49:06.965271 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-d48c458cb-mhg55"] Dec 02 22:49:06 crc kubenswrapper[4696]: I1202 22:49:06.971294 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-d48c458cb-mhg55"] Dec 02 22:49:07 crc kubenswrapper[4696]: I1202 22:49:07.252478 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7695c7677d-2tsxm"] Dec 02 22:49:07 crc kubenswrapper[4696]: W1202 22:49:07.255051 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod040f0cf2_b3a6_4696_a7f4_5e12108183a3.slice/crio-ed497568a243d37744227701d5b69288049e6b80c5b64eabbe222a0d51c89aa6 WatchSource:0}: Error finding container ed497568a243d37744227701d5b69288049e6b80c5b64eabbe222a0d51c89aa6: Status 404 returned error can't find the container with id ed497568a243d37744227701d5b69288049e6b80c5b64eabbe222a0d51c89aa6 Dec 02 22:49:07 crc kubenswrapper[4696]: I1202 22:49:07.444922 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b60ca874-ef00-4e4a-8b2e-26a1581d4c86" path="/var/lib/kubelet/pods/b60ca874-ef00-4e4a-8b2e-26a1581d4c86/volumes" Dec 02 22:49:07 crc kubenswrapper[4696]: I1202 22:49:07.914756 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-7695c7677d-2tsxm" event={"ID":"040f0cf2-b3a6-4696-a7f4-5e12108183a3","Type":"ContainerStarted","Data":"8b85673262be88d52ddd866b0da20c42df51fe1376a29f03d8abf5ba41536a8e"} Dec 02 22:49:07 crc kubenswrapper[4696]: I1202 22:49:07.914826 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7695c7677d-2tsxm" event={"ID":"040f0cf2-b3a6-4696-a7f4-5e12108183a3","Type":"ContainerStarted","Data":"ed497568a243d37744227701d5b69288049e6b80c5b64eabbe222a0d51c89aa6"} Dec 02 22:49:07 crc kubenswrapper[4696]: I1202 22:49:07.915337 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7695c7677d-2tsxm" Dec 02 22:49:07 crc kubenswrapper[4696]: I1202 22:49:07.924423 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7695c7677d-2tsxm" Dec 02 22:49:07 crc kubenswrapper[4696]: I1202 22:49:07.946551 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7695c7677d-2tsxm" podStartSLOduration=2.946527184 podStartE2EDuration="2.946527184s" podCreationTimestamp="2025-12-02 22:49:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 22:49:07.94435395 +0000 UTC m=+410.825033961" watchObservedRunningTime="2025-12-02 22:49:07.946527184 +0000 UTC m=+410.827207195" Dec 02 22:49:14 crc kubenswrapper[4696]: I1202 22:49:14.869197 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-zw9xh" Dec 02 22:49:14 crc kubenswrapper[4696]: I1202 22:49:14.951911 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-rjkp2"] Dec 02 22:49:22 crc kubenswrapper[4696]: I1202 
22:49:22.973685 4696 patch_prober.go:28] interesting pod/machine-config-daemon-chq65 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 22:49:22 crc kubenswrapper[4696]: I1202 22:49:22.974560 4696 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 22:49:22 crc kubenswrapper[4696]: I1202 22:49:22.974653 4696 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-chq65" Dec 02 22:49:22 crc kubenswrapper[4696]: I1202 22:49:22.975783 4696 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cd2d2fd1bee3bd1f6238f890b6611be55459ebdfd2de430b173a71f76f25b35f"} pod="openshift-machine-config-operator/machine-config-daemon-chq65" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 22:49:22 crc kubenswrapper[4696]: I1202 22:49:22.975960 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" containerName="machine-config-daemon" containerID="cri-o://cd2d2fd1bee3bd1f6238f890b6611be55459ebdfd2de430b173a71f76f25b35f" gracePeriod=600 Dec 02 22:49:24 crc kubenswrapper[4696]: I1202 22:49:24.044397 4696 generic.go:334] "Generic (PLEG): container finished" podID="53353260-c7c9-435c-91eb-3d5a1b441c4a" containerID="cd2d2fd1bee3bd1f6238f890b6611be55459ebdfd2de430b173a71f76f25b35f" exitCode=0 Dec 02 
22:49:24 crc kubenswrapper[4696]: I1202 22:49:24.044542 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-chq65" event={"ID":"53353260-c7c9-435c-91eb-3d5a1b441c4a","Type":"ContainerDied","Data":"cd2d2fd1bee3bd1f6238f890b6611be55459ebdfd2de430b173a71f76f25b35f"} Dec 02 22:49:24 crc kubenswrapper[4696]: I1202 22:49:24.045120 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-chq65" event={"ID":"53353260-c7c9-435c-91eb-3d5a1b441c4a","Type":"ContainerStarted","Data":"3abfb6374bbfb811db57d1f3b4095d464fff02776e083b36d869e7869f2cbc02"} Dec 02 22:49:24 crc kubenswrapper[4696]: I1202 22:49:24.045236 4696 scope.go:117] "RemoveContainer" containerID="fa016430a0a36d0628e0e5b0c4baf2fa724c888116e5d70517c8ffc4be1c37a4" Dec 02 22:49:40 crc kubenswrapper[4696]: I1202 22:49:40.010244 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-rjkp2" podUID="24488e8a-3522-4214-ab83-684d76eb1501" containerName="registry" containerID="cri-o://0ee8494271b38f77e7784c7aeccd2db24ace23e233f86fc0a594380408c0289a" gracePeriod=30 Dec 02 22:49:40 crc kubenswrapper[4696]: I1202 22:49:40.170267 4696 patch_prober.go:28] interesting pod/image-registry-697d97f7c8-rjkp2 container/registry namespace/openshift-image-registry: Readiness probe status=failure output="Get \"https://10.217.0.31:5000/healthz\": dial tcp 10.217.0.31:5000: connect: connection refused" start-of-body= Dec 02 22:49:40 crc kubenswrapper[4696]: I1202 22:49:40.170928 4696 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-697d97f7c8-rjkp2" podUID="24488e8a-3522-4214-ab83-684d76eb1501" containerName="registry" probeResult="failure" output="Get \"https://10.217.0.31:5000/healthz\": dial tcp 10.217.0.31:5000: connect: connection refused" Dec 02 22:49:41 crc kubenswrapper[4696]: I1202 
22:49:41.168212 4696 generic.go:334] "Generic (PLEG): container finished" podID="24488e8a-3522-4214-ab83-684d76eb1501" containerID="0ee8494271b38f77e7784c7aeccd2db24ace23e233f86fc0a594380408c0289a" exitCode=0 Dec 02 22:49:41 crc kubenswrapper[4696]: I1202 22:49:41.168395 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-rjkp2" event={"ID":"24488e8a-3522-4214-ab83-684d76eb1501","Type":"ContainerDied","Data":"0ee8494271b38f77e7784c7aeccd2db24ace23e233f86fc0a594380408c0289a"} Dec 02 22:49:41 crc kubenswrapper[4696]: I1202 22:49:41.168688 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-rjkp2" event={"ID":"24488e8a-3522-4214-ab83-684d76eb1501","Type":"ContainerDied","Data":"4cb6fb85cb0ed4269de34a8f2e7b3f1f6bc363fe145cffa22b25f8c5e0644b61"} Dec 02 22:49:41 crc kubenswrapper[4696]: I1202 22:49:41.168719 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4cb6fb85cb0ed4269de34a8f2e7b3f1f6bc363fe145cffa22b25f8c5e0644b61" Dec 02 22:49:41 crc kubenswrapper[4696]: I1202 22:49:41.209441 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-rjkp2" Dec 02 22:49:41 crc kubenswrapper[4696]: I1202 22:49:41.336535 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"24488e8a-3522-4214-ab83-684d76eb1501\" (UID: \"24488e8a-3522-4214-ab83-684d76eb1501\") " Dec 02 22:49:41 crc kubenswrapper[4696]: I1202 22:49:41.336589 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/24488e8a-3522-4214-ab83-684d76eb1501-installation-pull-secrets\") pod \"24488e8a-3522-4214-ab83-684d76eb1501\" (UID: \"24488e8a-3522-4214-ab83-684d76eb1501\") " Dec 02 22:49:41 crc kubenswrapper[4696]: I1202 22:49:41.336616 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/24488e8a-3522-4214-ab83-684d76eb1501-trusted-ca\") pod \"24488e8a-3522-4214-ab83-684d76eb1501\" (UID: \"24488e8a-3522-4214-ab83-684d76eb1501\") " Dec 02 22:49:41 crc kubenswrapper[4696]: I1202 22:49:41.336667 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/24488e8a-3522-4214-ab83-684d76eb1501-bound-sa-token\") pod \"24488e8a-3522-4214-ab83-684d76eb1501\" (UID: \"24488e8a-3522-4214-ab83-684d76eb1501\") " Dec 02 22:49:41 crc kubenswrapper[4696]: I1202 22:49:41.336683 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/24488e8a-3522-4214-ab83-684d76eb1501-registry-certificates\") pod \"24488e8a-3522-4214-ab83-684d76eb1501\" (UID: \"24488e8a-3522-4214-ab83-684d76eb1501\") " Dec 02 22:49:41 crc kubenswrapper[4696]: I1202 22:49:41.336821 4696 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/24488e8a-3522-4214-ab83-684d76eb1501-ca-trust-extracted\") pod \"24488e8a-3522-4214-ab83-684d76eb1501\" (UID: \"24488e8a-3522-4214-ab83-684d76eb1501\") " Dec 02 22:49:41 crc kubenswrapper[4696]: I1202 22:49:41.336843 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tvkj5\" (UniqueName: \"kubernetes.io/projected/24488e8a-3522-4214-ab83-684d76eb1501-kube-api-access-tvkj5\") pod \"24488e8a-3522-4214-ab83-684d76eb1501\" (UID: \"24488e8a-3522-4214-ab83-684d76eb1501\") " Dec 02 22:49:41 crc kubenswrapper[4696]: I1202 22:49:41.336890 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/24488e8a-3522-4214-ab83-684d76eb1501-registry-tls\") pod \"24488e8a-3522-4214-ab83-684d76eb1501\" (UID: \"24488e8a-3522-4214-ab83-684d76eb1501\") " Dec 02 22:49:41 crc kubenswrapper[4696]: I1202 22:49:41.338968 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24488e8a-3522-4214-ab83-684d76eb1501-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "24488e8a-3522-4214-ab83-684d76eb1501" (UID: "24488e8a-3522-4214-ab83-684d76eb1501"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:49:41 crc kubenswrapper[4696]: I1202 22:49:41.338995 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24488e8a-3522-4214-ab83-684d76eb1501-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "24488e8a-3522-4214-ab83-684d76eb1501" (UID: "24488e8a-3522-4214-ab83-684d76eb1501"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:49:41 crc kubenswrapper[4696]: I1202 22:49:41.345211 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24488e8a-3522-4214-ab83-684d76eb1501-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "24488e8a-3522-4214-ab83-684d76eb1501" (UID: "24488e8a-3522-4214-ab83-684d76eb1501"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:49:41 crc kubenswrapper[4696]: I1202 22:49:41.347755 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24488e8a-3522-4214-ab83-684d76eb1501-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "24488e8a-3522-4214-ab83-684d76eb1501" (UID: "24488e8a-3522-4214-ab83-684d76eb1501"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:49:41 crc kubenswrapper[4696]: I1202 22:49:41.347726 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24488e8a-3522-4214-ab83-684d76eb1501-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "24488e8a-3522-4214-ab83-684d76eb1501" (UID: "24488e8a-3522-4214-ab83-684d76eb1501"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:49:41 crc kubenswrapper[4696]: I1202 22:49:41.348191 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24488e8a-3522-4214-ab83-684d76eb1501-kube-api-access-tvkj5" (OuterVolumeSpecName: "kube-api-access-tvkj5") pod "24488e8a-3522-4214-ab83-684d76eb1501" (UID: "24488e8a-3522-4214-ab83-684d76eb1501"). InnerVolumeSpecName "kube-api-access-tvkj5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:49:41 crc kubenswrapper[4696]: I1202 22:49:41.354991 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "24488e8a-3522-4214-ab83-684d76eb1501" (UID: "24488e8a-3522-4214-ab83-684d76eb1501"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 02 22:49:41 crc kubenswrapper[4696]: I1202 22:49:41.372817 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24488e8a-3522-4214-ab83-684d76eb1501-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "24488e8a-3522-4214-ab83-684d76eb1501" (UID: "24488e8a-3522-4214-ab83-684d76eb1501"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 22:49:41 crc kubenswrapper[4696]: I1202 22:49:41.440248 4696 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/24488e8a-3522-4214-ab83-684d76eb1501-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 02 22:49:41 crc kubenswrapper[4696]: I1202 22:49:41.441022 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tvkj5\" (UniqueName: \"kubernetes.io/projected/24488e8a-3522-4214-ab83-684d76eb1501-kube-api-access-tvkj5\") on node \"crc\" DevicePath \"\"" Dec 02 22:49:41 crc kubenswrapper[4696]: I1202 22:49:41.441223 4696 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/24488e8a-3522-4214-ab83-684d76eb1501-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 02 22:49:41 crc kubenswrapper[4696]: I1202 22:49:41.441365 4696 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/24488e8a-3522-4214-ab83-684d76eb1501-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 02 22:49:41 crc kubenswrapper[4696]: I1202 22:49:41.441498 4696 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/24488e8a-3522-4214-ab83-684d76eb1501-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 02 22:49:41 crc kubenswrapper[4696]: I1202 22:49:41.441658 4696 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/24488e8a-3522-4214-ab83-684d76eb1501-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 02 22:49:41 crc kubenswrapper[4696]: I1202 22:49:41.448460 4696 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/24488e8a-3522-4214-ab83-684d76eb1501-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 02 22:49:42 crc kubenswrapper[4696]: I1202 22:49:42.176633 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-rjkp2" Dec 02 22:49:42 crc kubenswrapper[4696]: I1202 22:49:42.211404 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-rjkp2"] Dec 02 22:49:42 crc kubenswrapper[4696]: I1202 22:49:42.231581 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-rjkp2"] Dec 02 22:49:43 crc kubenswrapper[4696]: I1202 22:49:43.440439 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24488e8a-3522-4214-ab83-684d76eb1501" path="/var/lib/kubelet/pods/24488e8a-3522-4214-ab83-684d76eb1501/volumes" Dec 02 22:51:17 crc kubenswrapper[4696]: I1202 22:51:17.674797 4696 scope.go:117] "RemoveContainer" containerID="0ee8494271b38f77e7784c7aeccd2db24ace23e233f86fc0a594380408c0289a" Dec 02 22:51:52 crc kubenswrapper[4696]: I1202 22:51:52.974628 4696 patch_prober.go:28] interesting pod/machine-config-daemon-chq65 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 22:51:52 crc kubenswrapper[4696]: I1202 22:51:52.975467 4696 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 22:52:22 crc kubenswrapper[4696]: I1202 22:52:22.974452 4696 patch_prober.go:28] interesting pod/machine-config-daemon-chq65 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 22:52:22 
crc kubenswrapper[4696]: I1202 22:52:22.975556 4696 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 22:52:52 crc kubenswrapper[4696]: I1202 22:52:52.974060 4696 patch_prober.go:28] interesting pod/machine-config-daemon-chq65 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 22:52:52 crc kubenswrapper[4696]: I1202 22:52:52.974995 4696 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 22:52:52 crc kubenswrapper[4696]: I1202 22:52:52.975072 4696 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-chq65" Dec 02 22:52:52 crc kubenswrapper[4696]: I1202 22:52:52.975950 4696 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3abfb6374bbfb811db57d1f3b4095d464fff02776e083b36d869e7869f2cbc02"} pod="openshift-machine-config-operator/machine-config-daemon-chq65" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 22:52:52 crc kubenswrapper[4696]: I1202 22:52:52.976054 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-chq65" 
podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" containerName="machine-config-daemon" containerID="cri-o://3abfb6374bbfb811db57d1f3b4095d464fff02776e083b36d869e7869f2cbc02" gracePeriod=600 Dec 02 22:52:53 crc kubenswrapper[4696]: I1202 22:52:53.960991 4696 generic.go:334] "Generic (PLEG): container finished" podID="53353260-c7c9-435c-91eb-3d5a1b441c4a" containerID="3abfb6374bbfb811db57d1f3b4095d464fff02776e083b36d869e7869f2cbc02" exitCode=0 Dec 02 22:52:53 crc kubenswrapper[4696]: I1202 22:52:53.961083 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-chq65" event={"ID":"53353260-c7c9-435c-91eb-3d5a1b441c4a","Type":"ContainerDied","Data":"3abfb6374bbfb811db57d1f3b4095d464fff02776e083b36d869e7869f2cbc02"} Dec 02 22:52:53 crc kubenswrapper[4696]: I1202 22:52:53.961649 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-chq65" event={"ID":"53353260-c7c9-435c-91eb-3d5a1b441c4a","Type":"ContainerStarted","Data":"488c3298a630d75021615076f70747ecaa2bb06970c4d5f097346d0dc1a68976"} Dec 02 22:52:53 crc kubenswrapper[4696]: I1202 22:52:53.961690 4696 scope.go:117] "RemoveContainer" containerID="cd2d2fd1bee3bd1f6238f890b6611be55459ebdfd2de430b173a71f76f25b35f" Dec 02 22:53:28 crc kubenswrapper[4696]: I1202 22:53:28.629166 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-n8ntr"] Dec 02 22:53:28 crc kubenswrapper[4696]: E1202 22:53:28.631768 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24488e8a-3522-4214-ab83-684d76eb1501" containerName="registry" Dec 02 22:53:28 crc kubenswrapper[4696]: I1202 22:53:28.631867 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="24488e8a-3522-4214-ab83-684d76eb1501" containerName="registry" Dec 02 22:53:28 crc kubenswrapper[4696]: I1202 22:53:28.632084 4696 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="24488e8a-3522-4214-ab83-684d76eb1501" containerName="registry" Dec 02 22:53:28 crc kubenswrapper[4696]: I1202 22:53:28.632665 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-n8ntr" Dec 02 22:53:28 crc kubenswrapper[4696]: I1202 22:53:28.637464 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Dec 02 22:53:28 crc kubenswrapper[4696]: I1202 22:53:28.637839 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Dec 02 22:53:28 crc kubenswrapper[4696]: I1202 22:53:28.637933 4696 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-rgqps" Dec 02 22:53:28 crc kubenswrapper[4696]: I1202 22:53:28.642775 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-n8ntr"] Dec 02 22:53:28 crc kubenswrapper[4696]: I1202 22:53:28.669133 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-jlqs7"] Dec 02 22:53:28 crc kubenswrapper[4696]: I1202 22:53:28.671124 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-jlqs7" Dec 02 22:53:28 crc kubenswrapper[4696]: I1202 22:53:28.679641 4696 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-st2ws" Dec 02 22:53:28 crc kubenswrapper[4696]: I1202 22:53:28.689039 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-jbtws"] Dec 02 22:53:28 crc kubenswrapper[4696]: I1202 22:53:28.691070 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-jbtws" Dec 02 22:53:28 crc kubenswrapper[4696]: I1202 22:53:28.693183 4696 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-wbc6c" Dec 02 22:53:28 crc kubenswrapper[4696]: I1202 22:53:28.698274 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-jlqs7"] Dec 02 22:53:28 crc kubenswrapper[4696]: I1202 22:53:28.702380 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-jbtws"] Dec 02 22:53:28 crc kubenswrapper[4696]: I1202 22:53:28.783482 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgkvx\" (UniqueName: \"kubernetes.io/projected/2af9e90d-fb84-4f01-9ed3-c0c1eaef6369-kube-api-access-cgkvx\") pod \"cert-manager-cainjector-7f985d654d-n8ntr\" (UID: \"2af9e90d-fb84-4f01-9ed3-c0c1eaef6369\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-n8ntr" Dec 02 22:53:28 crc kubenswrapper[4696]: I1202 22:53:28.783629 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w95b8\" (UniqueName: \"kubernetes.io/projected/72d4a613-3c9c-4b7d-a840-3c76247572f6-kube-api-access-w95b8\") pod \"cert-manager-webhook-5655c58dd6-jlqs7\" (UID: \"72d4a613-3c9c-4b7d-a840-3c76247572f6\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-jlqs7" Dec 02 22:53:28 crc kubenswrapper[4696]: I1202 22:53:28.783664 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctz4j\" (UniqueName: \"kubernetes.io/projected/cbe42e42-7252-40cb-bfe8-7484eb822ff9-kube-api-access-ctz4j\") pod \"cert-manager-5b446d88c5-jbtws\" (UID: \"cbe42e42-7252-40cb-bfe8-7484eb822ff9\") " pod="cert-manager/cert-manager-5b446d88c5-jbtws" Dec 02 22:53:28 crc kubenswrapper[4696]: I1202 22:53:28.884622 4696 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cgkvx\" (UniqueName: \"kubernetes.io/projected/2af9e90d-fb84-4f01-9ed3-c0c1eaef6369-kube-api-access-cgkvx\") pod \"cert-manager-cainjector-7f985d654d-n8ntr\" (UID: \"2af9e90d-fb84-4f01-9ed3-c0c1eaef6369\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-n8ntr" Dec 02 22:53:28 crc kubenswrapper[4696]: I1202 22:53:28.884817 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w95b8\" (UniqueName: \"kubernetes.io/projected/72d4a613-3c9c-4b7d-a840-3c76247572f6-kube-api-access-w95b8\") pod \"cert-manager-webhook-5655c58dd6-jlqs7\" (UID: \"72d4a613-3c9c-4b7d-a840-3c76247572f6\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-jlqs7" Dec 02 22:53:28 crc kubenswrapper[4696]: I1202 22:53:28.884856 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ctz4j\" (UniqueName: \"kubernetes.io/projected/cbe42e42-7252-40cb-bfe8-7484eb822ff9-kube-api-access-ctz4j\") pod \"cert-manager-5b446d88c5-jbtws\" (UID: \"cbe42e42-7252-40cb-bfe8-7484eb822ff9\") " pod="cert-manager/cert-manager-5b446d88c5-jbtws" Dec 02 22:53:28 crc kubenswrapper[4696]: I1202 22:53:28.904520 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgkvx\" (UniqueName: \"kubernetes.io/projected/2af9e90d-fb84-4f01-9ed3-c0c1eaef6369-kube-api-access-cgkvx\") pod \"cert-manager-cainjector-7f985d654d-n8ntr\" (UID: \"2af9e90d-fb84-4f01-9ed3-c0c1eaef6369\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-n8ntr" Dec 02 22:53:28 crc kubenswrapper[4696]: I1202 22:53:28.904578 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w95b8\" (UniqueName: \"kubernetes.io/projected/72d4a613-3c9c-4b7d-a840-3c76247572f6-kube-api-access-w95b8\") pod \"cert-manager-webhook-5655c58dd6-jlqs7\" (UID: \"72d4a613-3c9c-4b7d-a840-3c76247572f6\") " 
pod="cert-manager/cert-manager-webhook-5655c58dd6-jlqs7" Dec 02 22:53:28 crc kubenswrapper[4696]: I1202 22:53:28.905498 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctz4j\" (UniqueName: \"kubernetes.io/projected/cbe42e42-7252-40cb-bfe8-7484eb822ff9-kube-api-access-ctz4j\") pod \"cert-manager-5b446d88c5-jbtws\" (UID: \"cbe42e42-7252-40cb-bfe8-7484eb822ff9\") " pod="cert-manager/cert-manager-5b446d88c5-jbtws" Dec 02 22:53:28 crc kubenswrapper[4696]: I1202 22:53:28.961528 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-n8ntr" Dec 02 22:53:29 crc kubenswrapper[4696]: I1202 22:53:29.021005 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-jlqs7" Dec 02 22:53:29 crc kubenswrapper[4696]: I1202 22:53:29.031234 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-jbtws" Dec 02 22:53:29 crc kubenswrapper[4696]: I1202 22:53:29.219953 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-n8ntr"] Dec 02 22:53:29 crc kubenswrapper[4696]: I1202 22:53:29.242571 4696 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 02 22:53:29 crc kubenswrapper[4696]: I1202 22:53:29.540422 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-jbtws"] Dec 02 22:53:29 crc kubenswrapper[4696]: W1202 22:53:29.548985 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcbe42e42_7252_40cb_bfe8_7484eb822ff9.slice/crio-0d9039fe2594ea9aee8ed0367d1a0458e679b7282828d8df2f6d1fa0e43ed0f2 WatchSource:0}: Error finding container 0d9039fe2594ea9aee8ed0367d1a0458e679b7282828d8df2f6d1fa0e43ed0f2: Status 404 returned error can't 
find the container with id 0d9039fe2594ea9aee8ed0367d1a0458e679b7282828d8df2f6d1fa0e43ed0f2 Dec 02 22:53:29 crc kubenswrapper[4696]: I1202 22:53:29.555527 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-jlqs7"] Dec 02 22:53:29 crc kubenswrapper[4696]: W1202 22:53:29.570529 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod72d4a613_3c9c_4b7d_a840_3c76247572f6.slice/crio-8229bea326b4679a3ffe5716afbb16ed26f72562e7d15a9de50d74a2b6575777 WatchSource:0}: Error finding container 8229bea326b4679a3ffe5716afbb16ed26f72562e7d15a9de50d74a2b6575777: Status 404 returned error can't find the container with id 8229bea326b4679a3ffe5716afbb16ed26f72562e7d15a9de50d74a2b6575777 Dec 02 22:53:30 crc kubenswrapper[4696]: I1202 22:53:30.214227 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-jlqs7" event={"ID":"72d4a613-3c9c-4b7d-a840-3c76247572f6","Type":"ContainerStarted","Data":"8229bea326b4679a3ffe5716afbb16ed26f72562e7d15a9de50d74a2b6575777"} Dec 02 22:53:30 crc kubenswrapper[4696]: I1202 22:53:30.216075 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-jbtws" event={"ID":"cbe42e42-7252-40cb-bfe8-7484eb822ff9","Type":"ContainerStarted","Data":"0d9039fe2594ea9aee8ed0367d1a0458e679b7282828d8df2f6d1fa0e43ed0f2"} Dec 02 22:53:30 crc kubenswrapper[4696]: I1202 22:53:30.217611 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-n8ntr" event={"ID":"2af9e90d-fb84-4f01-9ed3-c0c1eaef6369","Type":"ContainerStarted","Data":"d14d3e53c00bb058bff5af50fcb4d32af98b34e78a8473737ee35506f5a50bb3"} Dec 02 22:53:32 crc kubenswrapper[4696]: I1202 22:53:32.230684 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-n8ntr" 
event={"ID":"2af9e90d-fb84-4f01-9ed3-c0c1eaef6369","Type":"ContainerStarted","Data":"064b2dd6ee8a0010d18c52b3c81e443e56b652d654e5cb7bbb4cad5f289cfecf"} Dec 02 22:53:32 crc kubenswrapper[4696]: I1202 22:53:32.257787 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-n8ntr" podStartSLOduration=1.990657672 podStartE2EDuration="4.257731238s" podCreationTimestamp="2025-12-02 22:53:28 +0000 UTC" firstStartedPulling="2025-12-02 22:53:29.242366588 +0000 UTC m=+672.123046589" lastFinishedPulling="2025-12-02 22:53:31.509440154 +0000 UTC m=+674.390120155" observedRunningTime="2025-12-02 22:53:32.245396015 +0000 UTC m=+675.126076016" watchObservedRunningTime="2025-12-02 22:53:32.257731238 +0000 UTC m=+675.138411259" Dec 02 22:53:34 crc kubenswrapper[4696]: I1202 22:53:34.244078 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-jlqs7" event={"ID":"72d4a613-3c9c-4b7d-a840-3c76247572f6","Type":"ContainerStarted","Data":"3ada4797fdc41db04b4a70143228074127bbc6f4636d0bc18ac723340b8a1812"} Dec 02 22:53:34 crc kubenswrapper[4696]: I1202 22:53:34.244531 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-jlqs7" Dec 02 22:53:34 crc kubenswrapper[4696]: I1202 22:53:34.246533 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-jbtws" event={"ID":"cbe42e42-7252-40cb-bfe8-7484eb822ff9","Type":"ContainerStarted","Data":"cd85e44a4773383723f6c7c7fa99fccc9e9bc9298f3fd69760664c2294c1e785"} Dec 02 22:53:34 crc kubenswrapper[4696]: I1202 22:53:34.264956 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-jlqs7" podStartSLOduration=2.53510079 podStartE2EDuration="6.264923717s" podCreationTimestamp="2025-12-02 22:53:28 +0000 UTC" firstStartedPulling="2025-12-02 22:53:29.573266363 +0000 UTC 
m=+672.453946364" lastFinishedPulling="2025-12-02 22:53:33.30308929 +0000 UTC m=+676.183769291" observedRunningTime="2025-12-02 22:53:34.261229571 +0000 UTC m=+677.141909602" watchObservedRunningTime="2025-12-02 22:53:34.264923717 +0000 UTC m=+677.145603748" Dec 02 22:53:34 crc kubenswrapper[4696]: I1202 22:53:34.285067 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-jbtws" podStartSLOduration=2.59178076 podStartE2EDuration="6.285045412s" podCreationTimestamp="2025-12-02 22:53:28 +0000 UTC" firstStartedPulling="2025-12-02 22:53:29.552814049 +0000 UTC m=+672.433494050" lastFinishedPulling="2025-12-02 22:53:33.246078701 +0000 UTC m=+676.126758702" observedRunningTime="2025-12-02 22:53:34.281554292 +0000 UTC m=+677.162234313" watchObservedRunningTime="2025-12-02 22:53:34.285045412 +0000 UTC m=+677.165725443" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.027294 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-jlqs7" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.276065 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-qb2zq"] Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.278713 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" podUID="c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b" containerName="ovn-controller" containerID="cri-o://32717a82edb718f5e690fcecd578f48279d134a5dc12f00824fa25f7ec727156" gracePeriod=30 Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.278783 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" podUID="c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://85fbce2a7ea0550f53c1216216022e60a4e2eae0bb1d5c88254ace025fbd4503" gracePeriod=30 Dec 
02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.278974 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" podUID="c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b" containerName="northd" containerID="cri-o://22ce112f515d4ef59dca4c21bdc8623e88a8a0df222fb2ed0990a2f68e3f75f9" gracePeriod=30 Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.279002 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" podUID="c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b" containerName="kube-rbac-proxy-node" containerID="cri-o://36feb7e9f6a6bad97e87fe54ff9b661be62f6f4962352d55bcc905b80320a48d" gracePeriod=30 Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.279111 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" podUID="c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b" containerName="ovn-acl-logging" containerID="cri-o://52f3d5529aafac7c27c963a50b0318e99d9a9ca35e142d4c273431105c495e68" gracePeriod=30 Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.278757 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" podUID="c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b" containerName="nbdb" containerID="cri-o://64b0c2025c377ab19b3853017c94cb501f472f2485093afc60749f7739ba25d7" gracePeriod=30 Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.279123 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" podUID="c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b" containerName="sbdb" containerID="cri-o://23b80358b654eec7dc1057ef35a6c09f88ccd7aca7a05336a3f29e7036bd55f8" gracePeriod=30 Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.361716 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" 
podUID="c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b" containerName="ovnkube-controller" containerID="cri-o://27ae2799ce649487a5ac78954e82e66a53c94a3525ba06300e8ffa7083c5884c" gracePeriod=30 Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.653624 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qb2zq_c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b/ovnkube-controller/3.log" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.656605 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qb2zq_c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b/ovn-acl-logging/0.log" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.657231 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qb2zq_c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b/ovn-controller/0.log" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.657804 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.722255 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-9l7sx"] Dec 02 22:53:39 crc kubenswrapper[4696]: E1202 22:53:39.722778 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b" containerName="ovnkube-controller" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.722873 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b" containerName="ovnkube-controller" Dec 02 22:53:39 crc kubenswrapper[4696]: E1202 22:53:39.722929 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b" containerName="northd" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.722977 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b" 
containerName="northd" Dec 02 22:53:39 crc kubenswrapper[4696]: E1202 22:53:39.723029 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b" containerName="ovnkube-controller" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.723088 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b" containerName="ovnkube-controller" Dec 02 22:53:39 crc kubenswrapper[4696]: E1202 22:53:39.723143 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b" containerName="ovnkube-controller" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.723201 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b" containerName="ovnkube-controller" Dec 02 22:53:39 crc kubenswrapper[4696]: E1202 22:53:39.723289 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b" containerName="kube-rbac-proxy-ovn-metrics" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.723362 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b" containerName="kube-rbac-proxy-ovn-metrics" Dec 02 22:53:39 crc kubenswrapper[4696]: E1202 22:53:39.723435 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b" containerName="nbdb" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.723503 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b" containerName="nbdb" Dec 02 22:53:39 crc kubenswrapper[4696]: E1202 22:53:39.723565 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b" containerName="ovn-controller" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.723625 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b" 
containerName="ovn-controller" Dec 02 22:53:39 crc kubenswrapper[4696]: E1202 22:53:39.723694 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b" containerName="kube-rbac-proxy-node" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.723769 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b" containerName="kube-rbac-proxy-node" Dec 02 22:53:39 crc kubenswrapper[4696]: E1202 22:53:39.723844 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b" containerName="kubecfg-setup" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.723911 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b" containerName="kubecfg-setup" Dec 02 22:53:39 crc kubenswrapper[4696]: E1202 22:53:39.723975 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b" containerName="sbdb" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.724033 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b" containerName="sbdb" Dec 02 22:53:39 crc kubenswrapper[4696]: E1202 22:53:39.724092 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b" containerName="ovn-acl-logging" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.724162 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b" containerName="ovn-acl-logging" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.724338 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b" containerName="ovnkube-controller" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.724414 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b" 
containerName="ovnkube-controller" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.724483 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b" containerName="sbdb" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.724547 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b" containerName="ovn-acl-logging" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.724613 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b" containerName="kube-rbac-proxy-node" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.724678 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b" containerName="nbdb" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.724768 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b" containerName="northd" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.724848 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b" containerName="ovn-controller" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.724922 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b" containerName="kube-rbac-proxy-ovn-metrics" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.724988 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b" containerName="ovnkube-controller" Dec 02 22:53:39 crc kubenswrapper[4696]: E1202 22:53:39.725174 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b" containerName="ovnkube-controller" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.725251 4696 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b" containerName="ovnkube-controller" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.725429 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b" containerName="ovnkube-controller" Dec 02 22:53:39 crc kubenswrapper[4696]: E1202 22:53:39.725630 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b" containerName="ovnkube-controller" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.725706 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b" containerName="ovnkube-controller" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.725903 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b" containerName="ovnkube-controller" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.727911 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-9l7sx" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.857388 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-host-cni-bin\") pod \"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\" (UID: \"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\") " Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.857452 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-host-run-netns\") pod \"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\" (UID: \"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\") " Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.857482 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\" (UID: \"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\") " Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.857540 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-ovnkube-script-lib\") pod \"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\" (UID: \"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\") " Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.857583 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-log-socket\") pod \"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\" (UID: \"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\") " Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.857625 4696 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-host-cni-netd\") pod \"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\" (UID: \"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\") " Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.857651 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b" (UID: "c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.857710 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b" (UID: "c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.857710 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b" (UID: "c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b"). InnerVolumeSpecName "host-run-netns". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.857777 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b" (UID: "c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.857796 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-log-socket" (OuterVolumeSpecName: "log-socket") pod "c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b" (UID: "c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.857905 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-systemd-units\") pod \"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\" (UID: \"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\") " Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.857927 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-etc-openvswitch\") pod \"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\" (UID: \"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\") " Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.857988 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b" (UID: "c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b"). 
InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.858022 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-run-openvswitch\") pod \"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\" (UID: \"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\") " Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.858043 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-run-systemd\") pod \"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\" (UID: \"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\") " Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.858091 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b" (UID: "c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.858134 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b" (UID: "c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b"). InnerVolumeSpecName "run-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.858171 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-host-slash\") pod \"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\" (UID: \"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\") " Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.858201 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-env-overrides\") pod \"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\" (UID: \"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\") " Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.858725 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-ovn-node-metrics-cert\") pod \"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\" (UID: \"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\") " Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.858765 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-run-ovn\") pod \"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\" (UID: \"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\") " Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.858791 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-ovnkube-config\") pod \"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\" (UID: \"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\") " Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.858815 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-var-lib-openvswitch\") pod \"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\" (UID: \"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\") " Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.858857 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-host-kubelet\") pod \"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\" (UID: \"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\") " Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.858879 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4dk6j\" (UniqueName: \"kubernetes.io/projected/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-kube-api-access-4dk6j\") pod \"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\" (UID: \"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\") " Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.858902 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-node-log\") pod \"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\" (UID: \"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\") " Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.858925 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-host-run-ovn-kubernetes\") pod \"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\" (UID: \"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b\") " Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.858239 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-host-slash" (OuterVolumeSpecName: "host-slash") pod "c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b" (UID: "c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b"). InnerVolumeSpecName "host-slash". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.858359 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b" (UID: "c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.858676 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b" (UID: "c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.859095 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b" (UID: "c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.859118 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b" (UID: "c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b"). InnerVolumeSpecName "var-lib-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.859148 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b" (UID: "c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.859227 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-node-log" (OuterVolumeSpecName: "node-log") pod "c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b" (UID: "c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.859312 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b" (UID: "c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.859332 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b" (UID: "c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.859659 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/cf492831-89d3-4583-a64a-3384b9cd6270-run-systemd\") pod \"ovnkube-node-9l7sx\" (UID: \"cf492831-89d3-4583-a64a-3384b9cd6270\") " pod="openshift-ovn-kubernetes/ovnkube-node-9l7sx" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.859778 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/cf492831-89d3-4583-a64a-3384b9cd6270-host-cni-netd\") pod \"ovnkube-node-9l7sx\" (UID: \"cf492831-89d3-4583-a64a-3384b9cd6270\") " pod="openshift-ovn-kubernetes/ovnkube-node-9l7sx" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.859845 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cf492831-89d3-4583-a64a-3384b9cd6270-run-openvswitch\") pod \"ovnkube-node-9l7sx\" (UID: \"cf492831-89d3-4583-a64a-3384b9cd6270\") " pod="openshift-ovn-kubernetes/ovnkube-node-9l7sx" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.859885 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znc49\" (UniqueName: \"kubernetes.io/projected/cf492831-89d3-4583-a64a-3384b9cd6270-kube-api-access-znc49\") pod \"ovnkube-node-9l7sx\" (UID: \"cf492831-89d3-4583-a64a-3384b9cd6270\") " pod="openshift-ovn-kubernetes/ovnkube-node-9l7sx" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.859981 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/cf492831-89d3-4583-a64a-3384b9cd6270-systemd-units\") pod \"ovnkube-node-9l7sx\" (UID: 
\"cf492831-89d3-4583-a64a-3384b9cd6270\") " pod="openshift-ovn-kubernetes/ovnkube-node-9l7sx" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.860028 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cf492831-89d3-4583-a64a-3384b9cd6270-host-run-netns\") pod \"ovnkube-node-9l7sx\" (UID: \"cf492831-89d3-4583-a64a-3384b9cd6270\") " pod="openshift-ovn-kubernetes/ovnkube-node-9l7sx" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.860176 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cf492831-89d3-4583-a64a-3384b9cd6270-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9l7sx\" (UID: \"cf492831-89d3-4583-a64a-3384b9cd6270\") " pod="openshift-ovn-kubernetes/ovnkube-node-9l7sx" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.860276 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/cf492831-89d3-4583-a64a-3384b9cd6270-host-slash\") pod \"ovnkube-node-9l7sx\" (UID: \"cf492831-89d3-4583-a64a-3384b9cd6270\") " pod="openshift-ovn-kubernetes/ovnkube-node-9l7sx" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.860305 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cf492831-89d3-4583-a64a-3384b9cd6270-etc-openvswitch\") pod \"ovnkube-node-9l7sx\" (UID: \"cf492831-89d3-4583-a64a-3384b9cd6270\") " pod="openshift-ovn-kubernetes/ovnkube-node-9l7sx" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.860346 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/cf492831-89d3-4583-a64a-3384b9cd6270-log-socket\") pod \"ovnkube-node-9l7sx\" (UID: \"cf492831-89d3-4583-a64a-3384b9cd6270\") " pod="openshift-ovn-kubernetes/ovnkube-node-9l7sx" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.860384 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/cf492831-89d3-4583-a64a-3384b9cd6270-host-kubelet\") pod \"ovnkube-node-9l7sx\" (UID: \"cf492831-89d3-4583-a64a-3384b9cd6270\") " pod="openshift-ovn-kubernetes/ovnkube-node-9l7sx" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.860463 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/cf492831-89d3-4583-a64a-3384b9cd6270-node-log\") pod \"ovnkube-node-9l7sx\" (UID: \"cf492831-89d3-4583-a64a-3384b9cd6270\") " pod="openshift-ovn-kubernetes/ovnkube-node-9l7sx" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.860517 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cf492831-89d3-4583-a64a-3384b9cd6270-host-run-ovn-kubernetes\") pod \"ovnkube-node-9l7sx\" (UID: \"cf492831-89d3-4583-a64a-3384b9cd6270\") " pod="openshift-ovn-kubernetes/ovnkube-node-9l7sx" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.860556 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/cf492831-89d3-4583-a64a-3384b9cd6270-ovn-node-metrics-cert\") pod \"ovnkube-node-9l7sx\" (UID: \"cf492831-89d3-4583-a64a-3384b9cd6270\") " pod="openshift-ovn-kubernetes/ovnkube-node-9l7sx" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.860596 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"run-ovn\" (UniqueName: \"kubernetes.io/host-path/cf492831-89d3-4583-a64a-3384b9cd6270-run-ovn\") pod \"ovnkube-node-9l7sx\" (UID: \"cf492831-89d3-4583-a64a-3384b9cd6270\") " pod="openshift-ovn-kubernetes/ovnkube-node-9l7sx" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.860659 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/cf492831-89d3-4583-a64a-3384b9cd6270-ovnkube-config\") pod \"ovnkube-node-9l7sx\" (UID: \"cf492831-89d3-4583-a64a-3384b9cd6270\") " pod="openshift-ovn-kubernetes/ovnkube-node-9l7sx" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.860686 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/cf492831-89d3-4583-a64a-3384b9cd6270-env-overrides\") pod \"ovnkube-node-9l7sx\" (UID: \"cf492831-89d3-4583-a64a-3384b9cd6270\") " pod="openshift-ovn-kubernetes/ovnkube-node-9l7sx" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.860713 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cf492831-89d3-4583-a64a-3384b9cd6270-host-cni-bin\") pod \"ovnkube-node-9l7sx\" (UID: \"cf492831-89d3-4583-a64a-3384b9cd6270\") " pod="openshift-ovn-kubernetes/ovnkube-node-9l7sx" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.860770 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/cf492831-89d3-4583-a64a-3384b9cd6270-ovnkube-script-lib\") pod \"ovnkube-node-9l7sx\" (UID: \"cf492831-89d3-4583-a64a-3384b9cd6270\") " pod="openshift-ovn-kubernetes/ovnkube-node-9l7sx" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.860805 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cf492831-89d3-4583-a64a-3384b9cd6270-var-lib-openvswitch\") pod \"ovnkube-node-9l7sx\" (UID: \"cf492831-89d3-4583-a64a-3384b9cd6270\") " pod="openshift-ovn-kubernetes/ovnkube-node-9l7sx" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.860898 4696 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-host-kubelet\") on node \"crc\" DevicePath \"\"" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.860916 4696 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-node-log\") on node \"crc\" DevicePath \"\"" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.860935 4696 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.860952 4696 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-host-cni-bin\") on node \"crc\" DevicePath \"\"" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.860967 4696 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-host-run-netns\") on node \"crc\" DevicePath \"\"" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.860983 4696 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.861000 4696 
reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.861015 4696 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-log-socket\") on node \"crc\" DevicePath \"\"" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.861030 4696 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-host-cni-netd\") on node \"crc\" DevicePath \"\"" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.861045 4696 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-systemd-units\") on node \"crc\" DevicePath \"\"" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.861060 4696 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.861075 4696 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-run-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.861090 4696 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-host-slash\") on node \"crc\" DevicePath \"\"" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.861104 4696 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.861121 4696 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.861135 4696 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.861149 4696 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.865944 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-kube-api-access-4dk6j" (OuterVolumeSpecName: "kube-api-access-4dk6j") pod "c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b" (UID: "c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b"). InnerVolumeSpecName "kube-api-access-4dk6j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.870373 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b" (UID: "c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.873444 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b" (UID: "c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.962001 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/cf492831-89d3-4583-a64a-3384b9cd6270-host-slash\") pod \"ovnkube-node-9l7sx\" (UID: \"cf492831-89d3-4583-a64a-3384b9cd6270\") " pod="openshift-ovn-kubernetes/ovnkube-node-9l7sx" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.962140 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cf492831-89d3-4583-a64a-3384b9cd6270-etc-openvswitch\") pod \"ovnkube-node-9l7sx\" (UID: \"cf492831-89d3-4583-a64a-3384b9cd6270\") " pod="openshift-ovn-kubernetes/ovnkube-node-9l7sx" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.962191 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/cf492831-89d3-4583-a64a-3384b9cd6270-log-socket\") pod \"ovnkube-node-9l7sx\" (UID: \"cf492831-89d3-4583-a64a-3384b9cd6270\") " pod="openshift-ovn-kubernetes/ovnkube-node-9l7sx" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.962228 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cf492831-89d3-4583-a64a-3384b9cd6270-etc-openvswitch\") pod \"ovnkube-node-9l7sx\" (UID: \"cf492831-89d3-4583-a64a-3384b9cd6270\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-9l7sx" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.962242 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/cf492831-89d3-4583-a64a-3384b9cd6270-host-kubelet\") pod \"ovnkube-node-9l7sx\" (UID: \"cf492831-89d3-4583-a64a-3384b9cd6270\") " pod="openshift-ovn-kubernetes/ovnkube-node-9l7sx" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.962319 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/cf492831-89d3-4583-a64a-3384b9cd6270-host-kubelet\") pod \"ovnkube-node-9l7sx\" (UID: \"cf492831-89d3-4583-a64a-3384b9cd6270\") " pod="openshift-ovn-kubernetes/ovnkube-node-9l7sx" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.962338 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/cf492831-89d3-4583-a64a-3384b9cd6270-node-log\") pod \"ovnkube-node-9l7sx\" (UID: \"cf492831-89d3-4583-a64a-3384b9cd6270\") " pod="openshift-ovn-kubernetes/ovnkube-node-9l7sx" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.962183 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/cf492831-89d3-4583-a64a-3384b9cd6270-host-slash\") pod \"ovnkube-node-9l7sx\" (UID: \"cf492831-89d3-4583-a64a-3384b9cd6270\") " pod="openshift-ovn-kubernetes/ovnkube-node-9l7sx" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.962411 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/cf492831-89d3-4583-a64a-3384b9cd6270-node-log\") pod \"ovnkube-node-9l7sx\" (UID: \"cf492831-89d3-4583-a64a-3384b9cd6270\") " pod="openshift-ovn-kubernetes/ovnkube-node-9l7sx" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.962344 4696 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/cf492831-89d3-4583-a64a-3384b9cd6270-log-socket\") pod \"ovnkube-node-9l7sx\" (UID: \"cf492831-89d3-4583-a64a-3384b9cd6270\") " pod="openshift-ovn-kubernetes/ovnkube-node-9l7sx" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.962558 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cf492831-89d3-4583-a64a-3384b9cd6270-host-run-ovn-kubernetes\") pod \"ovnkube-node-9l7sx\" (UID: \"cf492831-89d3-4583-a64a-3384b9cd6270\") " pod="openshift-ovn-kubernetes/ovnkube-node-9l7sx" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.962602 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/cf492831-89d3-4583-a64a-3384b9cd6270-ovn-node-metrics-cert\") pod \"ovnkube-node-9l7sx\" (UID: \"cf492831-89d3-4583-a64a-3384b9cd6270\") " pod="openshift-ovn-kubernetes/ovnkube-node-9l7sx" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.962639 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/cf492831-89d3-4583-a64a-3384b9cd6270-run-ovn\") pod \"ovnkube-node-9l7sx\" (UID: \"cf492831-89d3-4583-a64a-3384b9cd6270\") " pod="openshift-ovn-kubernetes/ovnkube-node-9l7sx" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.962666 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cf492831-89d3-4583-a64a-3384b9cd6270-host-run-ovn-kubernetes\") pod \"ovnkube-node-9l7sx\" (UID: \"cf492831-89d3-4583-a64a-3384b9cd6270\") " pod="openshift-ovn-kubernetes/ovnkube-node-9l7sx" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.962688 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/cf492831-89d3-4583-a64a-3384b9cd6270-ovnkube-config\") pod \"ovnkube-node-9l7sx\" (UID: \"cf492831-89d3-4583-a64a-3384b9cd6270\") " pod="openshift-ovn-kubernetes/ovnkube-node-9l7sx" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.962722 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/cf492831-89d3-4583-a64a-3384b9cd6270-run-ovn\") pod \"ovnkube-node-9l7sx\" (UID: \"cf492831-89d3-4583-a64a-3384b9cd6270\") " pod="openshift-ovn-kubernetes/ovnkube-node-9l7sx" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.962730 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/cf492831-89d3-4583-a64a-3384b9cd6270-env-overrides\") pod \"ovnkube-node-9l7sx\" (UID: \"cf492831-89d3-4583-a64a-3384b9cd6270\") " pod="openshift-ovn-kubernetes/ovnkube-node-9l7sx" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.962797 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cf492831-89d3-4583-a64a-3384b9cd6270-host-cni-bin\") pod \"ovnkube-node-9l7sx\" (UID: \"cf492831-89d3-4583-a64a-3384b9cd6270\") " pod="openshift-ovn-kubernetes/ovnkube-node-9l7sx" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.962837 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/cf492831-89d3-4583-a64a-3384b9cd6270-ovnkube-script-lib\") pod \"ovnkube-node-9l7sx\" (UID: \"cf492831-89d3-4583-a64a-3384b9cd6270\") " pod="openshift-ovn-kubernetes/ovnkube-node-9l7sx" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.962878 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/cf492831-89d3-4583-a64a-3384b9cd6270-var-lib-openvswitch\") pod \"ovnkube-node-9l7sx\" (UID: \"cf492831-89d3-4583-a64a-3384b9cd6270\") " pod="openshift-ovn-kubernetes/ovnkube-node-9l7sx" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.962921 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/cf492831-89d3-4583-a64a-3384b9cd6270-run-systemd\") pod \"ovnkube-node-9l7sx\" (UID: \"cf492831-89d3-4583-a64a-3384b9cd6270\") " pod="openshift-ovn-kubernetes/ovnkube-node-9l7sx" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.962955 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cf492831-89d3-4583-a64a-3384b9cd6270-host-cni-bin\") pod \"ovnkube-node-9l7sx\" (UID: \"cf492831-89d3-4583-a64a-3384b9cd6270\") " pod="openshift-ovn-kubernetes/ovnkube-node-9l7sx" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.962973 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/cf492831-89d3-4583-a64a-3384b9cd6270-host-cni-netd\") pod \"ovnkube-node-9l7sx\" (UID: \"cf492831-89d3-4583-a64a-3384b9cd6270\") " pod="openshift-ovn-kubernetes/ovnkube-node-9l7sx" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.963062 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/cf492831-89d3-4583-a64a-3384b9cd6270-run-systemd\") pod \"ovnkube-node-9l7sx\" (UID: \"cf492831-89d3-4583-a64a-3384b9cd6270\") " pod="openshift-ovn-kubernetes/ovnkube-node-9l7sx" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.963115 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cf492831-89d3-4583-a64a-3384b9cd6270-run-openvswitch\") pod 
\"ovnkube-node-9l7sx\" (UID: \"cf492831-89d3-4583-a64a-3384b9cd6270\") " pod="openshift-ovn-kubernetes/ovnkube-node-9l7sx" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.963159 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-znc49\" (UniqueName: \"kubernetes.io/projected/cf492831-89d3-4583-a64a-3384b9cd6270-kube-api-access-znc49\") pod \"ovnkube-node-9l7sx\" (UID: \"cf492831-89d3-4583-a64a-3384b9cd6270\") " pod="openshift-ovn-kubernetes/ovnkube-node-9l7sx" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.963182 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cf492831-89d3-4583-a64a-3384b9cd6270-var-lib-openvswitch\") pod \"ovnkube-node-9l7sx\" (UID: \"cf492831-89d3-4583-a64a-3384b9cd6270\") " pod="openshift-ovn-kubernetes/ovnkube-node-9l7sx" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.963023 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/cf492831-89d3-4583-a64a-3384b9cd6270-host-cni-netd\") pod \"ovnkube-node-9l7sx\" (UID: \"cf492831-89d3-4583-a64a-3384b9cd6270\") " pod="openshift-ovn-kubernetes/ovnkube-node-9l7sx" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.963190 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cf492831-89d3-4583-a64a-3384b9cd6270-run-openvswitch\") pod \"ovnkube-node-9l7sx\" (UID: \"cf492831-89d3-4583-a64a-3384b9cd6270\") " pod="openshift-ovn-kubernetes/ovnkube-node-9l7sx" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.963413 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/cf492831-89d3-4583-a64a-3384b9cd6270-systemd-units\") pod \"ovnkube-node-9l7sx\" (UID: \"cf492831-89d3-4583-a64a-3384b9cd6270\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-9l7sx" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.963517 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/cf492831-89d3-4583-a64a-3384b9cd6270-systemd-units\") pod \"ovnkube-node-9l7sx\" (UID: \"cf492831-89d3-4583-a64a-3384b9cd6270\") " pod="openshift-ovn-kubernetes/ovnkube-node-9l7sx" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.963624 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cf492831-89d3-4583-a64a-3384b9cd6270-host-run-netns\") pod \"ovnkube-node-9l7sx\" (UID: \"cf492831-89d3-4583-a64a-3384b9cd6270\") " pod="openshift-ovn-kubernetes/ovnkube-node-9l7sx" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.963679 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cf492831-89d3-4583-a64a-3384b9cd6270-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9l7sx\" (UID: \"cf492831-89d3-4583-a64a-3384b9cd6270\") " pod="openshift-ovn-kubernetes/ovnkube-node-9l7sx" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.963817 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/cf492831-89d3-4583-a64a-3384b9cd6270-ovnkube-script-lib\") pod \"ovnkube-node-9l7sx\" (UID: \"cf492831-89d3-4583-a64a-3384b9cd6270\") " pod="openshift-ovn-kubernetes/ovnkube-node-9l7sx" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.963830 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/cf492831-89d3-4583-a64a-3384b9cd6270-ovnkube-config\") pod \"ovnkube-node-9l7sx\" (UID: \"cf492831-89d3-4583-a64a-3384b9cd6270\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-9l7sx" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.963793 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cf492831-89d3-4583-a64a-3384b9cd6270-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9l7sx\" (UID: \"cf492831-89d3-4583-a64a-3384b9cd6270\") " pod="openshift-ovn-kubernetes/ovnkube-node-9l7sx" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.963713 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cf492831-89d3-4583-a64a-3384b9cd6270-host-run-netns\") pod \"ovnkube-node-9l7sx\" (UID: \"cf492831-89d3-4583-a64a-3384b9cd6270\") " pod="openshift-ovn-kubernetes/ovnkube-node-9l7sx" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.964056 4696 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-run-systemd\") on node \"crc\" DevicePath \"\"" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.964091 4696 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.964092 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/cf492831-89d3-4583-a64a-3384b9cd6270-env-overrides\") pod \"ovnkube-node-9l7sx\" (UID: \"cf492831-89d3-4583-a64a-3384b9cd6270\") " pod="openshift-ovn-kubernetes/ovnkube-node-9l7sx" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.964116 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4dk6j\" (UniqueName: 
\"kubernetes.io/projected/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b-kube-api-access-4dk6j\") on node \"crc\" DevicePath \"\"" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.971725 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/cf492831-89d3-4583-a64a-3384b9cd6270-ovn-node-metrics-cert\") pod \"ovnkube-node-9l7sx\" (UID: \"cf492831-89d3-4583-a64a-3384b9cd6270\") " pod="openshift-ovn-kubernetes/ovnkube-node-9l7sx" Dec 02 22:53:39 crc kubenswrapper[4696]: I1202 22:53:39.992589 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-znc49\" (UniqueName: \"kubernetes.io/projected/cf492831-89d3-4583-a64a-3384b9cd6270-kube-api-access-znc49\") pod \"ovnkube-node-9l7sx\" (UID: \"cf492831-89d3-4583-a64a-3384b9cd6270\") " pod="openshift-ovn-kubernetes/ovnkube-node-9l7sx" Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.044657 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-9l7sx" Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.288970 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qb2zq_c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b/ovnkube-controller/3.log" Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.292286 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qb2zq_c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b/ovn-acl-logging/0.log" Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.293142 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qb2zq_c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b/ovn-controller/0.log" Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.293624 4696 generic.go:334] "Generic (PLEG): container finished" podID="c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b" containerID="27ae2799ce649487a5ac78954e82e66a53c94a3525ba06300e8ffa7083c5884c" exitCode=0 Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.293685 4696 generic.go:334] "Generic (PLEG): container finished" podID="c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b" containerID="23b80358b654eec7dc1057ef35a6c09f88ccd7aca7a05336a3f29e7036bd55f8" exitCode=0 Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.293713 4696 generic.go:334] "Generic (PLEG): container finished" podID="c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b" containerID="64b0c2025c377ab19b3853017c94cb501f472f2485093afc60749f7739ba25d7" exitCode=0 Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.293787 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" event={"ID":"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b","Type":"ContainerDied","Data":"27ae2799ce649487a5ac78954e82e66a53c94a3525ba06300e8ffa7083c5884c"} Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.293736 4696 generic.go:334] "Generic (PLEG): container finished" 
podID="c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b" containerID="22ce112f515d4ef59dca4c21bdc8623e88a8a0df222fb2ed0990a2f68e3f75f9" exitCode=0 Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.293882 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" event={"ID":"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b","Type":"ContainerDied","Data":"23b80358b654eec7dc1057ef35a6c09f88ccd7aca7a05336a3f29e7036bd55f8"} Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.293898 4696 generic.go:334] "Generic (PLEG): container finished" podID="c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b" containerID="85fbce2a7ea0550f53c1216216022e60a4e2eae0bb1d5c88254ace025fbd4503" exitCode=0 Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.293927 4696 scope.go:117] "RemoveContainer" containerID="27ae2799ce649487a5ac78954e82e66a53c94a3525ba06300e8ffa7083c5884c" Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.293938 4696 generic.go:334] "Generic (PLEG): container finished" podID="c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b" containerID="36feb7e9f6a6bad97e87fe54ff9b661be62f6f4962352d55bcc905b80320a48d" exitCode=0 Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.293988 4696 generic.go:334] "Generic (PLEG): container finished" podID="c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b" containerID="52f3d5529aafac7c27c963a50b0318e99d9a9ca35e142d4c273431105c495e68" exitCode=143 Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.294006 4696 generic.go:334] "Generic (PLEG): container finished" podID="c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b" containerID="32717a82edb718f5e690fcecd578f48279d134a5dc12f00824fa25f7ec727156" exitCode=143 Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.293909 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" event={"ID":"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b","Type":"ContainerDied","Data":"64b0c2025c377ab19b3853017c94cb501f472f2485093afc60749f7739ba25d7"} Dec 02 22:53:40 crc 
kubenswrapper[4696]: I1202 22:53:40.294179 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" event={"ID":"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b","Type":"ContainerDied","Data":"22ce112f515d4ef59dca4c21bdc8623e88a8a0df222fb2ed0990a2f68e3f75f9"} Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.294236 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" event={"ID":"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b","Type":"ContainerDied","Data":"85fbce2a7ea0550f53c1216216022e60a4e2eae0bb1d5c88254ace025fbd4503"} Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.294267 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" event={"ID":"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b","Type":"ContainerDied","Data":"36feb7e9f6a6bad97e87fe54ff9b661be62f6f4962352d55bcc905b80320a48d"} Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.294400 4696 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c89495285a741bfa9e535eb22f1110da359462742bc321664ea98562028cf21d"} Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.294512 4696 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"23b80358b654eec7dc1057ef35a6c09f88ccd7aca7a05336a3f29e7036bd55f8"} Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.294578 4696 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"64b0c2025c377ab19b3853017c94cb501f472f2485093afc60749f7739ba25d7"} Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.294595 4696 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"22ce112f515d4ef59dca4c21bdc8623e88a8a0df222fb2ed0990a2f68e3f75f9"} Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.294710 4696 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"85fbce2a7ea0550f53c1216216022e60a4e2eae0bb1d5c88254ace025fbd4503"} Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.294841 4696 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"36feb7e9f6a6bad97e87fe54ff9b661be62f6f4962352d55bcc905b80320a48d"} Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.294858 4696 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"52f3d5529aafac7c27c963a50b0318e99d9a9ca35e142d4c273431105c495e68"} Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.294876 4696 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"32717a82edb718f5e690fcecd578f48279d134a5dc12f00824fa25f7ec727156"} Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.294892 4696 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"492cfa3fb8646c09da06d92ec8a303562da558d4d944eaf68ca95de6f5b94da8"} Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.294809 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.294918 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" event={"ID":"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b","Type":"ContainerDied","Data":"52f3d5529aafac7c27c963a50b0318e99d9a9ca35e142d4c273431105c495e68"} Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.294953 4696 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"27ae2799ce649487a5ac78954e82e66a53c94a3525ba06300e8ffa7083c5884c"} Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.294972 4696 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c89495285a741bfa9e535eb22f1110da359462742bc321664ea98562028cf21d"} Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.294987 4696 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"23b80358b654eec7dc1057ef35a6c09f88ccd7aca7a05336a3f29e7036bd55f8"} Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.295005 4696 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"64b0c2025c377ab19b3853017c94cb501f472f2485093afc60749f7739ba25d7"} Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.295020 4696 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"22ce112f515d4ef59dca4c21bdc8623e88a8a0df222fb2ed0990a2f68e3f75f9"} Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.295034 4696 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"85fbce2a7ea0550f53c1216216022e60a4e2eae0bb1d5c88254ace025fbd4503"} Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.295160 4696 pod_container_deletor.go:114] "Failed to 
issue the request to remove container" containerID={"Type":"cri-o","ID":"36feb7e9f6a6bad97e87fe54ff9b661be62f6f4962352d55bcc905b80320a48d"} Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.295176 4696 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"52f3d5529aafac7c27c963a50b0318e99d9a9ca35e142d4c273431105c495e68"} Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.295192 4696 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"32717a82edb718f5e690fcecd578f48279d134a5dc12f00824fa25f7ec727156"} Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.295208 4696 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"492cfa3fb8646c09da06d92ec8a303562da558d4d944eaf68ca95de6f5b94da8"} Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.295232 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" event={"ID":"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b","Type":"ContainerDied","Data":"32717a82edb718f5e690fcecd578f48279d134a5dc12f00824fa25f7ec727156"} Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.295260 4696 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"27ae2799ce649487a5ac78954e82e66a53c94a3525ba06300e8ffa7083c5884c"} Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.295278 4696 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c89495285a741bfa9e535eb22f1110da359462742bc321664ea98562028cf21d"} Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.295293 4696 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"23b80358b654eec7dc1057ef35a6c09f88ccd7aca7a05336a3f29e7036bd55f8"} Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 
22:53:40.295307 4696 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"64b0c2025c377ab19b3853017c94cb501f472f2485093afc60749f7739ba25d7"} Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.295324 4696 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"22ce112f515d4ef59dca4c21bdc8623e88a8a0df222fb2ed0990a2f68e3f75f9"} Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.295340 4696 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"85fbce2a7ea0550f53c1216216022e60a4e2eae0bb1d5c88254ace025fbd4503"} Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.295355 4696 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"36feb7e9f6a6bad97e87fe54ff9b661be62f6f4962352d55bcc905b80320a48d"} Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.295370 4696 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"52f3d5529aafac7c27c963a50b0318e99d9a9ca35e142d4c273431105c495e68"} Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.295385 4696 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"32717a82edb718f5e690fcecd578f48279d134a5dc12f00824fa25f7ec727156"} Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.295398 4696 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"492cfa3fb8646c09da06d92ec8a303562da558d4d944eaf68ca95de6f5b94da8"} Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.295420 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qb2zq" 
event={"ID":"c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b","Type":"ContainerDied","Data":"81fc9d2c388d599b51bca4c279af177a4e77c75ce58bdaccdc7ce2e7343a4c7f"} Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.295442 4696 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"27ae2799ce649487a5ac78954e82e66a53c94a3525ba06300e8ffa7083c5884c"} Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.295463 4696 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c89495285a741bfa9e535eb22f1110da359462742bc321664ea98562028cf21d"} Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.295478 4696 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"23b80358b654eec7dc1057ef35a6c09f88ccd7aca7a05336a3f29e7036bd55f8"} Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.295494 4696 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"64b0c2025c377ab19b3853017c94cb501f472f2485093afc60749f7739ba25d7"} Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.295511 4696 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"22ce112f515d4ef59dca4c21bdc8623e88a8a0df222fb2ed0990a2f68e3f75f9"} Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.295526 4696 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"85fbce2a7ea0550f53c1216216022e60a4e2eae0bb1d5c88254ace025fbd4503"} Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.295541 4696 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"36feb7e9f6a6bad97e87fe54ff9b661be62f6f4962352d55bcc905b80320a48d"} Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.295556 4696 pod_container_deletor.go:114] "Failed 
to issue the request to remove container" containerID={"Type":"cri-o","ID":"52f3d5529aafac7c27c963a50b0318e99d9a9ca35e142d4c273431105c495e68"} Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.295570 4696 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"32717a82edb718f5e690fcecd578f48279d134a5dc12f00824fa25f7ec727156"} Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.295585 4696 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"492cfa3fb8646c09da06d92ec8a303562da558d4d944eaf68ca95de6f5b94da8"} Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.303528 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wthxr_86a37d2a-37c5-4fbd-b10b-f5e4706772f4/kube-multus/2.log" Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.304191 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wthxr_86a37d2a-37c5-4fbd-b10b-f5e4706772f4/kube-multus/1.log" Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.304252 4696 generic.go:334] "Generic (PLEG): container finished" podID="86a37d2a-37c5-4fbd-b10b-f5e4706772f4" containerID="953a9deff8f534d3434a995707bbba83840a74ff792d977deb83c2021c2d4427" exitCode=2 Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.304338 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wthxr" event={"ID":"86a37d2a-37c5-4fbd-b10b-f5e4706772f4","Type":"ContainerDied","Data":"953a9deff8f534d3434a995707bbba83840a74ff792d977deb83c2021c2d4427"} Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.304379 4696 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"32ae69a2c1cd350e1b62b8453e9d286e105623f4b9ea8ff27ce8c9d37b055e4f"} Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.305286 4696 scope.go:117] "RemoveContainer" 
containerID="953a9deff8f534d3434a995707bbba83840a74ff792d977deb83c2021c2d4427" Dec 02 22:53:40 crc kubenswrapper[4696]: E1202 22:53:40.305604 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-wthxr_openshift-multus(86a37d2a-37c5-4fbd-b10b-f5e4706772f4)\"" pod="openshift-multus/multus-wthxr" podUID="86a37d2a-37c5-4fbd-b10b-f5e4706772f4" Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.306260 4696 generic.go:334] "Generic (PLEG): container finished" podID="cf492831-89d3-4583-a64a-3384b9cd6270" containerID="29a6d489652c680fde699c46e095490b869ba388fec7f06eed06e6e0cd904029" exitCode=0 Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.306314 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9l7sx" event={"ID":"cf492831-89d3-4583-a64a-3384b9cd6270","Type":"ContainerDied","Data":"29a6d489652c680fde699c46e095490b869ba388fec7f06eed06e6e0cd904029"} Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.306346 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9l7sx" event={"ID":"cf492831-89d3-4583-a64a-3384b9cd6270","Type":"ContainerStarted","Data":"0b01ac4b9b573203819da35c355a05eca7d5a49b6555b27ed162402118edb817"} Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.318533 4696 scope.go:117] "RemoveContainer" containerID="c89495285a741bfa9e535eb22f1110da359462742bc321664ea98562028cf21d" Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.348111 4696 scope.go:117] "RemoveContainer" containerID="23b80358b654eec7dc1057ef35a6c09f88ccd7aca7a05336a3f29e7036bd55f8" Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.368641 4696 scope.go:117] "RemoveContainer" containerID="64b0c2025c377ab19b3853017c94cb501f472f2485093afc60749f7739ba25d7" Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.404446 4696 scope.go:117] 
"RemoveContainer" containerID="22ce112f515d4ef59dca4c21bdc8623e88a8a0df222fb2ed0990a2f68e3f75f9" Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.420006 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-qb2zq"] Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.423808 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-qb2zq"] Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.434950 4696 scope.go:117] "RemoveContainer" containerID="85fbce2a7ea0550f53c1216216022e60a4e2eae0bb1d5c88254ace025fbd4503" Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.474892 4696 scope.go:117] "RemoveContainer" containerID="36feb7e9f6a6bad97e87fe54ff9b661be62f6f4962352d55bcc905b80320a48d" Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.489377 4696 scope.go:117] "RemoveContainer" containerID="52f3d5529aafac7c27c963a50b0318e99d9a9ca35e142d4c273431105c495e68" Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.548734 4696 scope.go:117] "RemoveContainer" containerID="32717a82edb718f5e690fcecd578f48279d134a5dc12f00824fa25f7ec727156" Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.583135 4696 scope.go:117] "RemoveContainer" containerID="492cfa3fb8646c09da06d92ec8a303562da558d4d944eaf68ca95de6f5b94da8" Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.626306 4696 scope.go:117] "RemoveContainer" containerID="27ae2799ce649487a5ac78954e82e66a53c94a3525ba06300e8ffa7083c5884c" Dec 02 22:53:40 crc kubenswrapper[4696]: E1202 22:53:40.626971 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27ae2799ce649487a5ac78954e82e66a53c94a3525ba06300e8ffa7083c5884c\": container with ID starting with 27ae2799ce649487a5ac78954e82e66a53c94a3525ba06300e8ffa7083c5884c not found: ID does not exist" containerID="27ae2799ce649487a5ac78954e82e66a53c94a3525ba06300e8ffa7083c5884c" Dec 02 22:53:40 crc 
kubenswrapper[4696]: I1202 22:53:40.627011 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27ae2799ce649487a5ac78954e82e66a53c94a3525ba06300e8ffa7083c5884c"} err="failed to get container status \"27ae2799ce649487a5ac78954e82e66a53c94a3525ba06300e8ffa7083c5884c\": rpc error: code = NotFound desc = could not find container \"27ae2799ce649487a5ac78954e82e66a53c94a3525ba06300e8ffa7083c5884c\": container with ID starting with 27ae2799ce649487a5ac78954e82e66a53c94a3525ba06300e8ffa7083c5884c not found: ID does not exist" Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.627055 4696 scope.go:117] "RemoveContainer" containerID="c89495285a741bfa9e535eb22f1110da359462742bc321664ea98562028cf21d" Dec 02 22:53:40 crc kubenswrapper[4696]: E1202 22:53:40.627584 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c89495285a741bfa9e535eb22f1110da359462742bc321664ea98562028cf21d\": container with ID starting with c89495285a741bfa9e535eb22f1110da359462742bc321664ea98562028cf21d not found: ID does not exist" containerID="c89495285a741bfa9e535eb22f1110da359462742bc321664ea98562028cf21d" Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.627605 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c89495285a741bfa9e535eb22f1110da359462742bc321664ea98562028cf21d"} err="failed to get container status \"c89495285a741bfa9e535eb22f1110da359462742bc321664ea98562028cf21d\": rpc error: code = NotFound desc = could not find container \"c89495285a741bfa9e535eb22f1110da359462742bc321664ea98562028cf21d\": container with ID starting with c89495285a741bfa9e535eb22f1110da359462742bc321664ea98562028cf21d not found: ID does not exist" Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.627618 4696 scope.go:117] "RemoveContainer" containerID="23b80358b654eec7dc1057ef35a6c09f88ccd7aca7a05336a3f29e7036bd55f8" Dec 02 
22:53:40 crc kubenswrapper[4696]: E1202 22:53:40.628271 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23b80358b654eec7dc1057ef35a6c09f88ccd7aca7a05336a3f29e7036bd55f8\": container with ID starting with 23b80358b654eec7dc1057ef35a6c09f88ccd7aca7a05336a3f29e7036bd55f8 not found: ID does not exist" containerID="23b80358b654eec7dc1057ef35a6c09f88ccd7aca7a05336a3f29e7036bd55f8" Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.628339 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23b80358b654eec7dc1057ef35a6c09f88ccd7aca7a05336a3f29e7036bd55f8"} err="failed to get container status \"23b80358b654eec7dc1057ef35a6c09f88ccd7aca7a05336a3f29e7036bd55f8\": rpc error: code = NotFound desc = could not find container \"23b80358b654eec7dc1057ef35a6c09f88ccd7aca7a05336a3f29e7036bd55f8\": container with ID starting with 23b80358b654eec7dc1057ef35a6c09f88ccd7aca7a05336a3f29e7036bd55f8 not found: ID does not exist" Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.628394 4696 scope.go:117] "RemoveContainer" containerID="64b0c2025c377ab19b3853017c94cb501f472f2485093afc60749f7739ba25d7" Dec 02 22:53:40 crc kubenswrapper[4696]: E1202 22:53:40.628754 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64b0c2025c377ab19b3853017c94cb501f472f2485093afc60749f7739ba25d7\": container with ID starting with 64b0c2025c377ab19b3853017c94cb501f472f2485093afc60749f7739ba25d7 not found: ID does not exist" containerID="64b0c2025c377ab19b3853017c94cb501f472f2485093afc60749f7739ba25d7" Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.628777 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64b0c2025c377ab19b3853017c94cb501f472f2485093afc60749f7739ba25d7"} err="failed to get container status 
\"64b0c2025c377ab19b3853017c94cb501f472f2485093afc60749f7739ba25d7\": rpc error: code = NotFound desc = could not find container \"64b0c2025c377ab19b3853017c94cb501f472f2485093afc60749f7739ba25d7\": container with ID starting with 64b0c2025c377ab19b3853017c94cb501f472f2485093afc60749f7739ba25d7 not found: ID does not exist" Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.628791 4696 scope.go:117] "RemoveContainer" containerID="22ce112f515d4ef59dca4c21bdc8623e88a8a0df222fb2ed0990a2f68e3f75f9" Dec 02 22:53:40 crc kubenswrapper[4696]: E1202 22:53:40.629228 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22ce112f515d4ef59dca4c21bdc8623e88a8a0df222fb2ed0990a2f68e3f75f9\": container with ID starting with 22ce112f515d4ef59dca4c21bdc8623e88a8a0df222fb2ed0990a2f68e3f75f9 not found: ID does not exist" containerID="22ce112f515d4ef59dca4c21bdc8623e88a8a0df222fb2ed0990a2f68e3f75f9" Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.629253 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22ce112f515d4ef59dca4c21bdc8623e88a8a0df222fb2ed0990a2f68e3f75f9"} err="failed to get container status \"22ce112f515d4ef59dca4c21bdc8623e88a8a0df222fb2ed0990a2f68e3f75f9\": rpc error: code = NotFound desc = could not find container \"22ce112f515d4ef59dca4c21bdc8623e88a8a0df222fb2ed0990a2f68e3f75f9\": container with ID starting with 22ce112f515d4ef59dca4c21bdc8623e88a8a0df222fb2ed0990a2f68e3f75f9 not found: ID does not exist" Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.629266 4696 scope.go:117] "RemoveContainer" containerID="85fbce2a7ea0550f53c1216216022e60a4e2eae0bb1d5c88254ace025fbd4503" Dec 02 22:53:40 crc kubenswrapper[4696]: E1202 22:53:40.629689 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"85fbce2a7ea0550f53c1216216022e60a4e2eae0bb1d5c88254ace025fbd4503\": container with ID starting with 85fbce2a7ea0550f53c1216216022e60a4e2eae0bb1d5c88254ace025fbd4503 not found: ID does not exist" containerID="85fbce2a7ea0550f53c1216216022e60a4e2eae0bb1d5c88254ace025fbd4503" Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.629735 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85fbce2a7ea0550f53c1216216022e60a4e2eae0bb1d5c88254ace025fbd4503"} err="failed to get container status \"85fbce2a7ea0550f53c1216216022e60a4e2eae0bb1d5c88254ace025fbd4503\": rpc error: code = NotFound desc = could not find container \"85fbce2a7ea0550f53c1216216022e60a4e2eae0bb1d5c88254ace025fbd4503\": container with ID starting with 85fbce2a7ea0550f53c1216216022e60a4e2eae0bb1d5c88254ace025fbd4503 not found: ID does not exist" Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.630010 4696 scope.go:117] "RemoveContainer" containerID="36feb7e9f6a6bad97e87fe54ff9b661be62f6f4962352d55bcc905b80320a48d" Dec 02 22:53:40 crc kubenswrapper[4696]: E1202 22:53:40.630638 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36feb7e9f6a6bad97e87fe54ff9b661be62f6f4962352d55bcc905b80320a48d\": container with ID starting with 36feb7e9f6a6bad97e87fe54ff9b661be62f6f4962352d55bcc905b80320a48d not found: ID does not exist" containerID="36feb7e9f6a6bad97e87fe54ff9b661be62f6f4962352d55bcc905b80320a48d" Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.630694 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36feb7e9f6a6bad97e87fe54ff9b661be62f6f4962352d55bcc905b80320a48d"} err="failed to get container status \"36feb7e9f6a6bad97e87fe54ff9b661be62f6f4962352d55bcc905b80320a48d\": rpc error: code = NotFound desc = could not find container \"36feb7e9f6a6bad97e87fe54ff9b661be62f6f4962352d55bcc905b80320a48d\": container with ID 
starting with 36feb7e9f6a6bad97e87fe54ff9b661be62f6f4962352d55bcc905b80320a48d not found: ID does not exist" Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.630730 4696 scope.go:117] "RemoveContainer" containerID="52f3d5529aafac7c27c963a50b0318e99d9a9ca35e142d4c273431105c495e68" Dec 02 22:53:40 crc kubenswrapper[4696]: E1202 22:53:40.631261 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52f3d5529aafac7c27c963a50b0318e99d9a9ca35e142d4c273431105c495e68\": container with ID starting with 52f3d5529aafac7c27c963a50b0318e99d9a9ca35e142d4c273431105c495e68 not found: ID does not exist" containerID="52f3d5529aafac7c27c963a50b0318e99d9a9ca35e142d4c273431105c495e68" Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.631296 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52f3d5529aafac7c27c963a50b0318e99d9a9ca35e142d4c273431105c495e68"} err="failed to get container status \"52f3d5529aafac7c27c963a50b0318e99d9a9ca35e142d4c273431105c495e68\": rpc error: code = NotFound desc = could not find container \"52f3d5529aafac7c27c963a50b0318e99d9a9ca35e142d4c273431105c495e68\": container with ID starting with 52f3d5529aafac7c27c963a50b0318e99d9a9ca35e142d4c273431105c495e68 not found: ID does not exist" Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.631317 4696 scope.go:117] "RemoveContainer" containerID="32717a82edb718f5e690fcecd578f48279d134a5dc12f00824fa25f7ec727156" Dec 02 22:53:40 crc kubenswrapper[4696]: E1202 22:53:40.632240 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32717a82edb718f5e690fcecd578f48279d134a5dc12f00824fa25f7ec727156\": container with ID starting with 32717a82edb718f5e690fcecd578f48279d134a5dc12f00824fa25f7ec727156 not found: ID does not exist" containerID="32717a82edb718f5e690fcecd578f48279d134a5dc12f00824fa25f7ec727156" Dec 02 
22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.632272 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32717a82edb718f5e690fcecd578f48279d134a5dc12f00824fa25f7ec727156"} err="failed to get container status \"32717a82edb718f5e690fcecd578f48279d134a5dc12f00824fa25f7ec727156\": rpc error: code = NotFound desc = could not find container \"32717a82edb718f5e690fcecd578f48279d134a5dc12f00824fa25f7ec727156\": container with ID starting with 32717a82edb718f5e690fcecd578f48279d134a5dc12f00824fa25f7ec727156 not found: ID does not exist" Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.632296 4696 scope.go:117] "RemoveContainer" containerID="492cfa3fb8646c09da06d92ec8a303562da558d4d944eaf68ca95de6f5b94da8" Dec 02 22:53:40 crc kubenswrapper[4696]: E1202 22:53:40.632679 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"492cfa3fb8646c09da06d92ec8a303562da558d4d944eaf68ca95de6f5b94da8\": container with ID starting with 492cfa3fb8646c09da06d92ec8a303562da558d4d944eaf68ca95de6f5b94da8 not found: ID does not exist" containerID="492cfa3fb8646c09da06d92ec8a303562da558d4d944eaf68ca95de6f5b94da8" Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.632762 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"492cfa3fb8646c09da06d92ec8a303562da558d4d944eaf68ca95de6f5b94da8"} err="failed to get container status \"492cfa3fb8646c09da06d92ec8a303562da558d4d944eaf68ca95de6f5b94da8\": rpc error: code = NotFound desc = could not find container \"492cfa3fb8646c09da06d92ec8a303562da558d4d944eaf68ca95de6f5b94da8\": container with ID starting with 492cfa3fb8646c09da06d92ec8a303562da558d4d944eaf68ca95de6f5b94da8 not found: ID does not exist" Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.632785 4696 scope.go:117] "RemoveContainer" 
containerID="27ae2799ce649487a5ac78954e82e66a53c94a3525ba06300e8ffa7083c5884c" Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.634350 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27ae2799ce649487a5ac78954e82e66a53c94a3525ba06300e8ffa7083c5884c"} err="failed to get container status \"27ae2799ce649487a5ac78954e82e66a53c94a3525ba06300e8ffa7083c5884c\": rpc error: code = NotFound desc = could not find container \"27ae2799ce649487a5ac78954e82e66a53c94a3525ba06300e8ffa7083c5884c\": container with ID starting with 27ae2799ce649487a5ac78954e82e66a53c94a3525ba06300e8ffa7083c5884c not found: ID does not exist" Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.634374 4696 scope.go:117] "RemoveContainer" containerID="c89495285a741bfa9e535eb22f1110da359462742bc321664ea98562028cf21d" Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.634684 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c89495285a741bfa9e535eb22f1110da359462742bc321664ea98562028cf21d"} err="failed to get container status \"c89495285a741bfa9e535eb22f1110da359462742bc321664ea98562028cf21d\": rpc error: code = NotFound desc = could not find container \"c89495285a741bfa9e535eb22f1110da359462742bc321664ea98562028cf21d\": container with ID starting with c89495285a741bfa9e535eb22f1110da359462742bc321664ea98562028cf21d not found: ID does not exist" Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.634769 4696 scope.go:117] "RemoveContainer" containerID="23b80358b654eec7dc1057ef35a6c09f88ccd7aca7a05336a3f29e7036bd55f8" Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.635041 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23b80358b654eec7dc1057ef35a6c09f88ccd7aca7a05336a3f29e7036bd55f8"} err="failed to get container status \"23b80358b654eec7dc1057ef35a6c09f88ccd7aca7a05336a3f29e7036bd55f8\": rpc error: code = NotFound desc = could 
not find container \"23b80358b654eec7dc1057ef35a6c09f88ccd7aca7a05336a3f29e7036bd55f8\": container with ID starting with 23b80358b654eec7dc1057ef35a6c09f88ccd7aca7a05336a3f29e7036bd55f8 not found: ID does not exist" Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.635064 4696 scope.go:117] "RemoveContainer" containerID="64b0c2025c377ab19b3853017c94cb501f472f2485093afc60749f7739ba25d7" Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.635500 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64b0c2025c377ab19b3853017c94cb501f472f2485093afc60749f7739ba25d7"} err="failed to get container status \"64b0c2025c377ab19b3853017c94cb501f472f2485093afc60749f7739ba25d7\": rpc error: code = NotFound desc = could not find container \"64b0c2025c377ab19b3853017c94cb501f472f2485093afc60749f7739ba25d7\": container with ID starting with 64b0c2025c377ab19b3853017c94cb501f472f2485093afc60749f7739ba25d7 not found: ID does not exist" Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.635515 4696 scope.go:117] "RemoveContainer" containerID="22ce112f515d4ef59dca4c21bdc8623e88a8a0df222fb2ed0990a2f68e3f75f9" Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.635923 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22ce112f515d4ef59dca4c21bdc8623e88a8a0df222fb2ed0990a2f68e3f75f9"} err="failed to get container status \"22ce112f515d4ef59dca4c21bdc8623e88a8a0df222fb2ed0990a2f68e3f75f9\": rpc error: code = NotFound desc = could not find container \"22ce112f515d4ef59dca4c21bdc8623e88a8a0df222fb2ed0990a2f68e3f75f9\": container with ID starting with 22ce112f515d4ef59dca4c21bdc8623e88a8a0df222fb2ed0990a2f68e3f75f9 not found: ID does not exist" Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.635942 4696 scope.go:117] "RemoveContainer" containerID="85fbce2a7ea0550f53c1216216022e60a4e2eae0bb1d5c88254ace025fbd4503" Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 
22:53:40.636165 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85fbce2a7ea0550f53c1216216022e60a4e2eae0bb1d5c88254ace025fbd4503"} err="failed to get container status \"85fbce2a7ea0550f53c1216216022e60a4e2eae0bb1d5c88254ace025fbd4503\": rpc error: code = NotFound desc = could not find container \"85fbce2a7ea0550f53c1216216022e60a4e2eae0bb1d5c88254ace025fbd4503\": container with ID starting with 85fbce2a7ea0550f53c1216216022e60a4e2eae0bb1d5c88254ace025fbd4503 not found: ID does not exist" Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.636199 4696 scope.go:117] "RemoveContainer" containerID="36feb7e9f6a6bad97e87fe54ff9b661be62f6f4962352d55bcc905b80320a48d" Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.636609 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36feb7e9f6a6bad97e87fe54ff9b661be62f6f4962352d55bcc905b80320a48d"} err="failed to get container status \"36feb7e9f6a6bad97e87fe54ff9b661be62f6f4962352d55bcc905b80320a48d\": rpc error: code = NotFound desc = could not find container \"36feb7e9f6a6bad97e87fe54ff9b661be62f6f4962352d55bcc905b80320a48d\": container with ID starting with 36feb7e9f6a6bad97e87fe54ff9b661be62f6f4962352d55bcc905b80320a48d not found: ID does not exist" Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.636659 4696 scope.go:117] "RemoveContainer" containerID="52f3d5529aafac7c27c963a50b0318e99d9a9ca35e142d4c273431105c495e68" Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.637532 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52f3d5529aafac7c27c963a50b0318e99d9a9ca35e142d4c273431105c495e68"} err="failed to get container status \"52f3d5529aafac7c27c963a50b0318e99d9a9ca35e142d4c273431105c495e68\": rpc error: code = NotFound desc = could not find container \"52f3d5529aafac7c27c963a50b0318e99d9a9ca35e142d4c273431105c495e68\": container with ID starting with 
52f3d5529aafac7c27c963a50b0318e99d9a9ca35e142d4c273431105c495e68 not found: ID does not exist" Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.637586 4696 scope.go:117] "RemoveContainer" containerID="32717a82edb718f5e690fcecd578f48279d134a5dc12f00824fa25f7ec727156" Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.637921 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32717a82edb718f5e690fcecd578f48279d134a5dc12f00824fa25f7ec727156"} err="failed to get container status \"32717a82edb718f5e690fcecd578f48279d134a5dc12f00824fa25f7ec727156\": rpc error: code = NotFound desc = could not find container \"32717a82edb718f5e690fcecd578f48279d134a5dc12f00824fa25f7ec727156\": container with ID starting with 32717a82edb718f5e690fcecd578f48279d134a5dc12f00824fa25f7ec727156 not found: ID does not exist" Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.637950 4696 scope.go:117] "RemoveContainer" containerID="492cfa3fb8646c09da06d92ec8a303562da558d4d944eaf68ca95de6f5b94da8" Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.638394 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"492cfa3fb8646c09da06d92ec8a303562da558d4d944eaf68ca95de6f5b94da8"} err="failed to get container status \"492cfa3fb8646c09da06d92ec8a303562da558d4d944eaf68ca95de6f5b94da8\": rpc error: code = NotFound desc = could not find container \"492cfa3fb8646c09da06d92ec8a303562da558d4d944eaf68ca95de6f5b94da8\": container with ID starting with 492cfa3fb8646c09da06d92ec8a303562da558d4d944eaf68ca95de6f5b94da8 not found: ID does not exist" Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.638421 4696 scope.go:117] "RemoveContainer" containerID="27ae2799ce649487a5ac78954e82e66a53c94a3525ba06300e8ffa7083c5884c" Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.638866 4696 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"27ae2799ce649487a5ac78954e82e66a53c94a3525ba06300e8ffa7083c5884c"} err="failed to get container status \"27ae2799ce649487a5ac78954e82e66a53c94a3525ba06300e8ffa7083c5884c\": rpc error: code = NotFound desc = could not find container \"27ae2799ce649487a5ac78954e82e66a53c94a3525ba06300e8ffa7083c5884c\": container with ID starting with 27ae2799ce649487a5ac78954e82e66a53c94a3525ba06300e8ffa7083c5884c not found: ID does not exist" Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.638909 4696 scope.go:117] "RemoveContainer" containerID="c89495285a741bfa9e535eb22f1110da359462742bc321664ea98562028cf21d" Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.639526 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c89495285a741bfa9e535eb22f1110da359462742bc321664ea98562028cf21d"} err="failed to get container status \"c89495285a741bfa9e535eb22f1110da359462742bc321664ea98562028cf21d\": rpc error: code = NotFound desc = could not find container \"c89495285a741bfa9e535eb22f1110da359462742bc321664ea98562028cf21d\": container with ID starting with c89495285a741bfa9e535eb22f1110da359462742bc321664ea98562028cf21d not found: ID does not exist" Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.639569 4696 scope.go:117] "RemoveContainer" containerID="23b80358b654eec7dc1057ef35a6c09f88ccd7aca7a05336a3f29e7036bd55f8" Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.640226 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23b80358b654eec7dc1057ef35a6c09f88ccd7aca7a05336a3f29e7036bd55f8"} err="failed to get container status \"23b80358b654eec7dc1057ef35a6c09f88ccd7aca7a05336a3f29e7036bd55f8\": rpc error: code = NotFound desc = could not find container \"23b80358b654eec7dc1057ef35a6c09f88ccd7aca7a05336a3f29e7036bd55f8\": container with ID starting with 23b80358b654eec7dc1057ef35a6c09f88ccd7aca7a05336a3f29e7036bd55f8 not found: ID does not 
exist" Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.640266 4696 scope.go:117] "RemoveContainer" containerID="64b0c2025c377ab19b3853017c94cb501f472f2485093afc60749f7739ba25d7" Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.640703 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64b0c2025c377ab19b3853017c94cb501f472f2485093afc60749f7739ba25d7"} err="failed to get container status \"64b0c2025c377ab19b3853017c94cb501f472f2485093afc60749f7739ba25d7\": rpc error: code = NotFound desc = could not find container \"64b0c2025c377ab19b3853017c94cb501f472f2485093afc60749f7739ba25d7\": container with ID starting with 64b0c2025c377ab19b3853017c94cb501f472f2485093afc60749f7739ba25d7 not found: ID does not exist" Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.640737 4696 scope.go:117] "RemoveContainer" containerID="22ce112f515d4ef59dca4c21bdc8623e88a8a0df222fb2ed0990a2f68e3f75f9" Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.641144 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22ce112f515d4ef59dca4c21bdc8623e88a8a0df222fb2ed0990a2f68e3f75f9"} err="failed to get container status \"22ce112f515d4ef59dca4c21bdc8623e88a8a0df222fb2ed0990a2f68e3f75f9\": rpc error: code = NotFound desc = could not find container \"22ce112f515d4ef59dca4c21bdc8623e88a8a0df222fb2ed0990a2f68e3f75f9\": container with ID starting with 22ce112f515d4ef59dca4c21bdc8623e88a8a0df222fb2ed0990a2f68e3f75f9 not found: ID does not exist" Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.641183 4696 scope.go:117] "RemoveContainer" containerID="85fbce2a7ea0550f53c1216216022e60a4e2eae0bb1d5c88254ace025fbd4503" Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.641883 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85fbce2a7ea0550f53c1216216022e60a4e2eae0bb1d5c88254ace025fbd4503"} err="failed to get container status 
\"85fbce2a7ea0550f53c1216216022e60a4e2eae0bb1d5c88254ace025fbd4503\": rpc error: code = NotFound desc = could not find container \"85fbce2a7ea0550f53c1216216022e60a4e2eae0bb1d5c88254ace025fbd4503\": container with ID starting with 85fbce2a7ea0550f53c1216216022e60a4e2eae0bb1d5c88254ace025fbd4503 not found: ID does not exist" Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.641928 4696 scope.go:117] "RemoveContainer" containerID="36feb7e9f6a6bad97e87fe54ff9b661be62f6f4962352d55bcc905b80320a48d" Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.642600 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36feb7e9f6a6bad97e87fe54ff9b661be62f6f4962352d55bcc905b80320a48d"} err="failed to get container status \"36feb7e9f6a6bad97e87fe54ff9b661be62f6f4962352d55bcc905b80320a48d\": rpc error: code = NotFound desc = could not find container \"36feb7e9f6a6bad97e87fe54ff9b661be62f6f4962352d55bcc905b80320a48d\": container with ID starting with 36feb7e9f6a6bad97e87fe54ff9b661be62f6f4962352d55bcc905b80320a48d not found: ID does not exist" Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.642631 4696 scope.go:117] "RemoveContainer" containerID="52f3d5529aafac7c27c963a50b0318e99d9a9ca35e142d4c273431105c495e68" Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.643251 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52f3d5529aafac7c27c963a50b0318e99d9a9ca35e142d4c273431105c495e68"} err="failed to get container status \"52f3d5529aafac7c27c963a50b0318e99d9a9ca35e142d4c273431105c495e68\": rpc error: code = NotFound desc = could not find container \"52f3d5529aafac7c27c963a50b0318e99d9a9ca35e142d4c273431105c495e68\": container with ID starting with 52f3d5529aafac7c27c963a50b0318e99d9a9ca35e142d4c273431105c495e68 not found: ID does not exist" Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.643299 4696 scope.go:117] "RemoveContainer" 
containerID="32717a82edb718f5e690fcecd578f48279d134a5dc12f00824fa25f7ec727156" Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.643686 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32717a82edb718f5e690fcecd578f48279d134a5dc12f00824fa25f7ec727156"} err="failed to get container status \"32717a82edb718f5e690fcecd578f48279d134a5dc12f00824fa25f7ec727156\": rpc error: code = NotFound desc = could not find container \"32717a82edb718f5e690fcecd578f48279d134a5dc12f00824fa25f7ec727156\": container with ID starting with 32717a82edb718f5e690fcecd578f48279d134a5dc12f00824fa25f7ec727156 not found: ID does not exist" Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.643817 4696 scope.go:117] "RemoveContainer" containerID="492cfa3fb8646c09da06d92ec8a303562da558d4d944eaf68ca95de6f5b94da8" Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.644195 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"492cfa3fb8646c09da06d92ec8a303562da558d4d944eaf68ca95de6f5b94da8"} err="failed to get container status \"492cfa3fb8646c09da06d92ec8a303562da558d4d944eaf68ca95de6f5b94da8\": rpc error: code = NotFound desc = could not find container \"492cfa3fb8646c09da06d92ec8a303562da558d4d944eaf68ca95de6f5b94da8\": container with ID starting with 492cfa3fb8646c09da06d92ec8a303562da558d4d944eaf68ca95de6f5b94da8 not found: ID does not exist" Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.644246 4696 scope.go:117] "RemoveContainer" containerID="27ae2799ce649487a5ac78954e82e66a53c94a3525ba06300e8ffa7083c5884c" Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.644625 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27ae2799ce649487a5ac78954e82e66a53c94a3525ba06300e8ffa7083c5884c"} err="failed to get container status \"27ae2799ce649487a5ac78954e82e66a53c94a3525ba06300e8ffa7083c5884c\": rpc error: code = NotFound desc = could 
not find container \"27ae2799ce649487a5ac78954e82e66a53c94a3525ba06300e8ffa7083c5884c\": container with ID starting with 27ae2799ce649487a5ac78954e82e66a53c94a3525ba06300e8ffa7083c5884c not found: ID does not exist" Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.644649 4696 scope.go:117] "RemoveContainer" containerID="c89495285a741bfa9e535eb22f1110da359462742bc321664ea98562028cf21d" Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.644932 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c89495285a741bfa9e535eb22f1110da359462742bc321664ea98562028cf21d"} err="failed to get container status \"c89495285a741bfa9e535eb22f1110da359462742bc321664ea98562028cf21d\": rpc error: code = NotFound desc = could not find container \"c89495285a741bfa9e535eb22f1110da359462742bc321664ea98562028cf21d\": container with ID starting with c89495285a741bfa9e535eb22f1110da359462742bc321664ea98562028cf21d not found: ID does not exist" Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.644959 4696 scope.go:117] "RemoveContainer" containerID="23b80358b654eec7dc1057ef35a6c09f88ccd7aca7a05336a3f29e7036bd55f8" Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.645232 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23b80358b654eec7dc1057ef35a6c09f88ccd7aca7a05336a3f29e7036bd55f8"} err="failed to get container status \"23b80358b654eec7dc1057ef35a6c09f88ccd7aca7a05336a3f29e7036bd55f8\": rpc error: code = NotFound desc = could not find container \"23b80358b654eec7dc1057ef35a6c09f88ccd7aca7a05336a3f29e7036bd55f8\": container with ID starting with 23b80358b654eec7dc1057ef35a6c09f88ccd7aca7a05336a3f29e7036bd55f8 not found: ID does not exist" Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.645255 4696 scope.go:117] "RemoveContainer" containerID="64b0c2025c377ab19b3853017c94cb501f472f2485093afc60749f7739ba25d7" Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 
22:53:40.645563 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64b0c2025c377ab19b3853017c94cb501f472f2485093afc60749f7739ba25d7"} err="failed to get container status \"64b0c2025c377ab19b3853017c94cb501f472f2485093afc60749f7739ba25d7\": rpc error: code = NotFound desc = could not find container \"64b0c2025c377ab19b3853017c94cb501f472f2485093afc60749f7739ba25d7\": container with ID starting with 64b0c2025c377ab19b3853017c94cb501f472f2485093afc60749f7739ba25d7 not found: ID does not exist" Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.645595 4696 scope.go:117] "RemoveContainer" containerID="22ce112f515d4ef59dca4c21bdc8623e88a8a0df222fb2ed0990a2f68e3f75f9" Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.645946 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22ce112f515d4ef59dca4c21bdc8623e88a8a0df222fb2ed0990a2f68e3f75f9"} err="failed to get container status \"22ce112f515d4ef59dca4c21bdc8623e88a8a0df222fb2ed0990a2f68e3f75f9\": rpc error: code = NotFound desc = could not find container \"22ce112f515d4ef59dca4c21bdc8623e88a8a0df222fb2ed0990a2f68e3f75f9\": container with ID starting with 22ce112f515d4ef59dca4c21bdc8623e88a8a0df222fb2ed0990a2f68e3f75f9 not found: ID does not exist" Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.645969 4696 scope.go:117] "RemoveContainer" containerID="85fbce2a7ea0550f53c1216216022e60a4e2eae0bb1d5c88254ace025fbd4503" Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.646182 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85fbce2a7ea0550f53c1216216022e60a4e2eae0bb1d5c88254ace025fbd4503"} err="failed to get container status \"85fbce2a7ea0550f53c1216216022e60a4e2eae0bb1d5c88254ace025fbd4503\": rpc error: code = NotFound desc = could not find container \"85fbce2a7ea0550f53c1216216022e60a4e2eae0bb1d5c88254ace025fbd4503\": container with ID starting with 
85fbce2a7ea0550f53c1216216022e60a4e2eae0bb1d5c88254ace025fbd4503 not found: ID does not exist" Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.646209 4696 scope.go:117] "RemoveContainer" containerID="36feb7e9f6a6bad97e87fe54ff9b661be62f6f4962352d55bcc905b80320a48d" Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.646442 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36feb7e9f6a6bad97e87fe54ff9b661be62f6f4962352d55bcc905b80320a48d"} err="failed to get container status \"36feb7e9f6a6bad97e87fe54ff9b661be62f6f4962352d55bcc905b80320a48d\": rpc error: code = NotFound desc = could not find container \"36feb7e9f6a6bad97e87fe54ff9b661be62f6f4962352d55bcc905b80320a48d\": container with ID starting with 36feb7e9f6a6bad97e87fe54ff9b661be62f6f4962352d55bcc905b80320a48d not found: ID does not exist" Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.646494 4696 scope.go:117] "RemoveContainer" containerID="52f3d5529aafac7c27c963a50b0318e99d9a9ca35e142d4c273431105c495e68" Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.646788 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52f3d5529aafac7c27c963a50b0318e99d9a9ca35e142d4c273431105c495e68"} err="failed to get container status \"52f3d5529aafac7c27c963a50b0318e99d9a9ca35e142d4c273431105c495e68\": rpc error: code = NotFound desc = could not find container \"52f3d5529aafac7c27c963a50b0318e99d9a9ca35e142d4c273431105c495e68\": container with ID starting with 52f3d5529aafac7c27c963a50b0318e99d9a9ca35e142d4c273431105c495e68 not found: ID does not exist" Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.646810 4696 scope.go:117] "RemoveContainer" containerID="32717a82edb718f5e690fcecd578f48279d134a5dc12f00824fa25f7ec727156" Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.647051 4696 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"32717a82edb718f5e690fcecd578f48279d134a5dc12f00824fa25f7ec727156"} err="failed to get container status \"32717a82edb718f5e690fcecd578f48279d134a5dc12f00824fa25f7ec727156\": rpc error: code = NotFound desc = could not find container \"32717a82edb718f5e690fcecd578f48279d134a5dc12f00824fa25f7ec727156\": container with ID starting with 32717a82edb718f5e690fcecd578f48279d134a5dc12f00824fa25f7ec727156 not found: ID does not exist" Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.647099 4696 scope.go:117] "RemoveContainer" containerID="492cfa3fb8646c09da06d92ec8a303562da558d4d944eaf68ca95de6f5b94da8" Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.647544 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"492cfa3fb8646c09da06d92ec8a303562da558d4d944eaf68ca95de6f5b94da8"} err="failed to get container status \"492cfa3fb8646c09da06d92ec8a303562da558d4d944eaf68ca95de6f5b94da8\": rpc error: code = NotFound desc = could not find container \"492cfa3fb8646c09da06d92ec8a303562da558d4d944eaf68ca95de6f5b94da8\": container with ID starting with 492cfa3fb8646c09da06d92ec8a303562da558d4d944eaf68ca95de6f5b94da8 not found: ID does not exist" Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.647638 4696 scope.go:117] "RemoveContainer" containerID="27ae2799ce649487a5ac78954e82e66a53c94a3525ba06300e8ffa7083c5884c" Dec 02 22:53:40 crc kubenswrapper[4696]: I1202 22:53:40.648049 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27ae2799ce649487a5ac78954e82e66a53c94a3525ba06300e8ffa7083c5884c"} err="failed to get container status \"27ae2799ce649487a5ac78954e82e66a53c94a3525ba06300e8ffa7083c5884c\": rpc error: code = NotFound desc = could not find container \"27ae2799ce649487a5ac78954e82e66a53c94a3525ba06300e8ffa7083c5884c\": container with ID starting with 27ae2799ce649487a5ac78954e82e66a53c94a3525ba06300e8ffa7083c5884c not found: ID does not 
exist" Dec 02 22:53:41 crc kubenswrapper[4696]: I1202 22:53:41.317284 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9l7sx" event={"ID":"cf492831-89d3-4583-a64a-3384b9cd6270","Type":"ContainerStarted","Data":"8f709df5dfc81db67c3344fe275017c4026fdd66c0679070f7dae0091b456b0a"} Dec 02 22:53:41 crc kubenswrapper[4696]: I1202 22:53:41.318081 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9l7sx" event={"ID":"cf492831-89d3-4583-a64a-3384b9cd6270","Type":"ContainerStarted","Data":"ed6b2fc08117c9e8b76f9e0621d966c92df278d91a8affadde81ce45d3903d42"} Dec 02 22:53:41 crc kubenswrapper[4696]: I1202 22:53:41.318142 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9l7sx" event={"ID":"cf492831-89d3-4583-a64a-3384b9cd6270","Type":"ContainerStarted","Data":"68d2599f3b5bbdf3eb838f0dfb23b9dc78addb53e52b3dce5c0c194022d805d6"} Dec 02 22:53:41 crc kubenswrapper[4696]: I1202 22:53:41.318165 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9l7sx" event={"ID":"cf492831-89d3-4583-a64a-3384b9cd6270","Type":"ContainerStarted","Data":"fc5c7ae31079b1ae3c5e49cc7503daa29915291e7642e798e36c22c25e8bf891"} Dec 02 22:53:41 crc kubenswrapper[4696]: I1202 22:53:41.318185 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9l7sx" event={"ID":"cf492831-89d3-4583-a64a-3384b9cd6270","Type":"ContainerStarted","Data":"aec45237e419e6e380ff85e790c50e8265f4503e0ba1ec1547b70a5b73b70699"} Dec 02 22:53:41 crc kubenswrapper[4696]: I1202 22:53:41.318203 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9l7sx" event={"ID":"cf492831-89d3-4583-a64a-3384b9cd6270","Type":"ContainerStarted","Data":"fe58de411b8921eb76e74db4f74fa989032c025ba6128b2aba2283d8f946a19b"} Dec 02 22:53:41 crc kubenswrapper[4696]: I1202 22:53:41.445344 4696 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b" path="/var/lib/kubelet/pods/c9feb35d-e44f-4ebe-a363-a7bffc6d1f3b/volumes" Dec 02 22:53:44 crc kubenswrapper[4696]: I1202 22:53:44.344050 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9l7sx" event={"ID":"cf492831-89d3-4583-a64a-3384b9cd6270","Type":"ContainerStarted","Data":"f49de3bf2ba4ec617a809e1c4be21aa10d5ed0769841ef3ed677272117acc0d4"} Dec 02 22:53:46 crc kubenswrapper[4696]: I1202 22:53:46.361054 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9l7sx" event={"ID":"cf492831-89d3-4583-a64a-3384b9cd6270","Type":"ContainerStarted","Data":"7141d5c249e0e34ba491340b73d056cce1c67a9264094544b97c6600e15b8146"} Dec 02 22:53:46 crc kubenswrapper[4696]: I1202 22:53:46.361796 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-9l7sx" Dec 02 22:53:46 crc kubenswrapper[4696]: I1202 22:53:46.361817 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-9l7sx" Dec 02 22:53:46 crc kubenswrapper[4696]: I1202 22:53:46.391491 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9l7sx" Dec 02 22:53:46 crc kubenswrapper[4696]: I1202 22:53:46.402555 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-9l7sx" podStartSLOduration=7.40252933 podStartE2EDuration="7.40252933s" podCreationTimestamp="2025-12-02 22:53:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 22:53:46.397829476 +0000 UTC m=+689.278509497" watchObservedRunningTime="2025-12-02 22:53:46.40252933 +0000 UTC m=+689.283209351" Dec 02 22:53:47 crc kubenswrapper[4696]: I1202 
22:53:47.366751 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-9l7sx" Dec 02 22:53:47 crc kubenswrapper[4696]: I1202 22:53:47.398171 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9l7sx" Dec 02 22:53:53 crc kubenswrapper[4696]: I1202 22:53:53.434539 4696 scope.go:117] "RemoveContainer" containerID="953a9deff8f534d3434a995707bbba83840a74ff792d977deb83c2021c2d4427" Dec 02 22:53:53 crc kubenswrapper[4696]: E1202 22:53:53.437962 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-wthxr_openshift-multus(86a37d2a-37c5-4fbd-b10b-f5e4706772f4)\"" pod="openshift-multus/multus-wthxr" podUID="86a37d2a-37c5-4fbd-b10b-f5e4706772f4" Dec 02 22:54:04 crc kubenswrapper[4696]: I1202 22:54:04.431543 4696 scope.go:117] "RemoveContainer" containerID="953a9deff8f534d3434a995707bbba83840a74ff792d977deb83c2021c2d4427" Dec 02 22:54:05 crc kubenswrapper[4696]: I1202 22:54:05.494621 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wthxr_86a37d2a-37c5-4fbd-b10b-f5e4706772f4/kube-multus/2.log" Dec 02 22:54:05 crc kubenswrapper[4696]: I1202 22:54:05.495645 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wthxr_86a37d2a-37c5-4fbd-b10b-f5e4706772f4/kube-multus/1.log" Dec 02 22:54:05 crc kubenswrapper[4696]: I1202 22:54:05.495721 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wthxr" event={"ID":"86a37d2a-37c5-4fbd-b10b-f5e4706772f4","Type":"ContainerStarted","Data":"692d2d471286862077f7c885f5e5ef59e6493cfc1ab3b11c26e48e07c9a426c8"} Dec 02 22:54:07 crc kubenswrapper[4696]: I1202 22:54:07.047894 4696 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210tj6d4"] Dec 02 22:54:07 crc kubenswrapper[4696]: I1202 22:54:07.049582 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210tj6d4" Dec 02 22:54:07 crc kubenswrapper[4696]: I1202 22:54:07.052492 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 02 22:54:07 crc kubenswrapper[4696]: I1202 22:54:07.062289 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210tj6d4"] Dec 02 22:54:07 crc kubenswrapper[4696]: I1202 22:54:07.077018 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8fnm\" (UniqueName: \"kubernetes.io/projected/62846e70-8410-4122-8d1a-f05e0ac36cc9-kube-api-access-f8fnm\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210tj6d4\" (UID: \"62846e70-8410-4122-8d1a-f05e0ac36cc9\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210tj6d4" Dec 02 22:54:07 crc kubenswrapper[4696]: I1202 22:54:07.077088 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/62846e70-8410-4122-8d1a-f05e0ac36cc9-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210tj6d4\" (UID: \"62846e70-8410-4122-8d1a-f05e0ac36cc9\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210tj6d4" Dec 02 22:54:07 crc kubenswrapper[4696]: I1202 22:54:07.077167 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/62846e70-8410-4122-8d1a-f05e0ac36cc9-util\") pod 
\"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210tj6d4\" (UID: \"62846e70-8410-4122-8d1a-f05e0ac36cc9\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210tj6d4" Dec 02 22:54:07 crc kubenswrapper[4696]: I1202 22:54:07.177818 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8fnm\" (UniqueName: \"kubernetes.io/projected/62846e70-8410-4122-8d1a-f05e0ac36cc9-kube-api-access-f8fnm\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210tj6d4\" (UID: \"62846e70-8410-4122-8d1a-f05e0ac36cc9\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210tj6d4" Dec 02 22:54:07 crc kubenswrapper[4696]: I1202 22:54:07.177869 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/62846e70-8410-4122-8d1a-f05e0ac36cc9-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210tj6d4\" (UID: \"62846e70-8410-4122-8d1a-f05e0ac36cc9\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210tj6d4" Dec 02 22:54:07 crc kubenswrapper[4696]: I1202 22:54:07.177910 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/62846e70-8410-4122-8d1a-f05e0ac36cc9-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210tj6d4\" (UID: \"62846e70-8410-4122-8d1a-f05e0ac36cc9\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210tj6d4" Dec 02 22:54:07 crc kubenswrapper[4696]: I1202 22:54:07.178414 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/62846e70-8410-4122-8d1a-f05e0ac36cc9-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210tj6d4\" (UID: \"62846e70-8410-4122-8d1a-f05e0ac36cc9\") " 
pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210tj6d4" Dec 02 22:54:07 crc kubenswrapper[4696]: I1202 22:54:07.178579 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/62846e70-8410-4122-8d1a-f05e0ac36cc9-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210tj6d4\" (UID: \"62846e70-8410-4122-8d1a-f05e0ac36cc9\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210tj6d4" Dec 02 22:54:07 crc kubenswrapper[4696]: I1202 22:54:07.199255 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8fnm\" (UniqueName: \"kubernetes.io/projected/62846e70-8410-4122-8d1a-f05e0ac36cc9-kube-api-access-f8fnm\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210tj6d4\" (UID: \"62846e70-8410-4122-8d1a-f05e0ac36cc9\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210tj6d4" Dec 02 22:54:07 crc kubenswrapper[4696]: I1202 22:54:07.409515 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210tj6d4" Dec 02 22:54:07 crc kubenswrapper[4696]: I1202 22:54:07.865357 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210tj6d4"] Dec 02 22:54:07 crc kubenswrapper[4696]: W1202 22:54:07.877472 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod62846e70_8410_4122_8d1a_f05e0ac36cc9.slice/crio-a89cd4c3ee64c41a0e6a51c2dbeeba5ab60ab24524a6e641083f66308886e119 WatchSource:0}: Error finding container a89cd4c3ee64c41a0e6a51c2dbeeba5ab60ab24524a6e641083f66308886e119: Status 404 returned error can't find the container with id a89cd4c3ee64c41a0e6a51c2dbeeba5ab60ab24524a6e641083f66308886e119 Dec 02 22:54:08 crc kubenswrapper[4696]: I1202 22:54:08.512173 4696 generic.go:334] "Generic (PLEG): container finished" podID="62846e70-8410-4122-8d1a-f05e0ac36cc9" containerID="35acdd1a4f0df0945574a59bdcab289712dac18f37ca00e332b7bd9eb62def1a" exitCode=0 Dec 02 22:54:08 crc kubenswrapper[4696]: I1202 22:54:08.512220 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210tj6d4" event={"ID":"62846e70-8410-4122-8d1a-f05e0ac36cc9","Type":"ContainerDied","Data":"35acdd1a4f0df0945574a59bdcab289712dac18f37ca00e332b7bd9eb62def1a"} Dec 02 22:54:08 crc kubenswrapper[4696]: I1202 22:54:08.512251 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210tj6d4" event={"ID":"62846e70-8410-4122-8d1a-f05e0ac36cc9","Type":"ContainerStarted","Data":"a89cd4c3ee64c41a0e6a51c2dbeeba5ab60ab24524a6e641083f66308886e119"} Dec 02 22:54:10 crc kubenswrapper[4696]: I1202 22:54:10.068539 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-ovn-kubernetes/ovnkube-node-9l7sx" Dec 02 22:54:10 crc kubenswrapper[4696]: I1202 22:54:10.539928 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210tj6d4" event={"ID":"62846e70-8410-4122-8d1a-f05e0ac36cc9","Type":"ContainerStarted","Data":"539975d419294a83fbe512b0e17f12abd0b218eed440b06cfe1d350acf9de5bf"} Dec 02 22:54:11 crc kubenswrapper[4696]: I1202 22:54:11.549360 4696 generic.go:334] "Generic (PLEG): container finished" podID="62846e70-8410-4122-8d1a-f05e0ac36cc9" containerID="539975d419294a83fbe512b0e17f12abd0b218eed440b06cfe1d350acf9de5bf" exitCode=0 Dec 02 22:54:11 crc kubenswrapper[4696]: I1202 22:54:11.549428 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210tj6d4" event={"ID":"62846e70-8410-4122-8d1a-f05e0ac36cc9","Type":"ContainerDied","Data":"539975d419294a83fbe512b0e17f12abd0b218eed440b06cfe1d350acf9de5bf"} Dec 02 22:54:12 crc kubenswrapper[4696]: I1202 22:54:12.559148 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210tj6d4" event={"ID":"62846e70-8410-4122-8d1a-f05e0ac36cc9","Type":"ContainerStarted","Data":"13a6b8b8a5d30a48fcaef4f48727ef9f71931eff2832a30186d38687380840c6"} Dec 02 22:54:13 crc kubenswrapper[4696]: I1202 22:54:13.568403 4696 generic.go:334] "Generic (PLEG): container finished" podID="62846e70-8410-4122-8d1a-f05e0ac36cc9" containerID="13a6b8b8a5d30a48fcaef4f48727ef9f71931eff2832a30186d38687380840c6" exitCode=0 Dec 02 22:54:13 crc kubenswrapper[4696]: I1202 22:54:13.568481 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210tj6d4" 
event={"ID":"62846e70-8410-4122-8d1a-f05e0ac36cc9","Type":"ContainerDied","Data":"13a6b8b8a5d30a48fcaef4f48727ef9f71931eff2832a30186d38687380840c6"} Dec 02 22:54:14 crc kubenswrapper[4696]: I1202 22:54:14.941437 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210tj6d4" Dec 02 22:54:15 crc kubenswrapper[4696]: I1202 22:54:15.042829 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f8fnm\" (UniqueName: \"kubernetes.io/projected/62846e70-8410-4122-8d1a-f05e0ac36cc9-kube-api-access-f8fnm\") pod \"62846e70-8410-4122-8d1a-f05e0ac36cc9\" (UID: \"62846e70-8410-4122-8d1a-f05e0ac36cc9\") " Dec 02 22:54:15 crc kubenswrapper[4696]: I1202 22:54:15.042966 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/62846e70-8410-4122-8d1a-f05e0ac36cc9-util\") pod \"62846e70-8410-4122-8d1a-f05e0ac36cc9\" (UID: \"62846e70-8410-4122-8d1a-f05e0ac36cc9\") " Dec 02 22:54:15 crc kubenswrapper[4696]: I1202 22:54:15.043031 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/62846e70-8410-4122-8d1a-f05e0ac36cc9-bundle\") pod \"62846e70-8410-4122-8d1a-f05e0ac36cc9\" (UID: \"62846e70-8410-4122-8d1a-f05e0ac36cc9\") " Dec 02 22:54:15 crc kubenswrapper[4696]: I1202 22:54:15.053238 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62846e70-8410-4122-8d1a-f05e0ac36cc9-kube-api-access-f8fnm" (OuterVolumeSpecName: "kube-api-access-f8fnm") pod "62846e70-8410-4122-8d1a-f05e0ac36cc9" (UID: "62846e70-8410-4122-8d1a-f05e0ac36cc9"). InnerVolumeSpecName "kube-api-access-f8fnm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:54:15 crc kubenswrapper[4696]: I1202 22:54:15.069499 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62846e70-8410-4122-8d1a-f05e0ac36cc9-bundle" (OuterVolumeSpecName: "bundle") pod "62846e70-8410-4122-8d1a-f05e0ac36cc9" (UID: "62846e70-8410-4122-8d1a-f05e0ac36cc9"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 22:54:15 crc kubenswrapper[4696]: I1202 22:54:15.072533 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62846e70-8410-4122-8d1a-f05e0ac36cc9-util" (OuterVolumeSpecName: "util") pod "62846e70-8410-4122-8d1a-f05e0ac36cc9" (UID: "62846e70-8410-4122-8d1a-f05e0ac36cc9"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 22:54:15 crc kubenswrapper[4696]: I1202 22:54:15.144490 4696 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/62846e70-8410-4122-8d1a-f05e0ac36cc9-util\") on node \"crc\" DevicePath \"\"" Dec 02 22:54:15 crc kubenswrapper[4696]: I1202 22:54:15.144531 4696 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/62846e70-8410-4122-8d1a-f05e0ac36cc9-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 22:54:15 crc kubenswrapper[4696]: I1202 22:54:15.144541 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f8fnm\" (UniqueName: \"kubernetes.io/projected/62846e70-8410-4122-8d1a-f05e0ac36cc9-kube-api-access-f8fnm\") on node \"crc\" DevicePath \"\"" Dec 02 22:54:15 crc kubenswrapper[4696]: I1202 22:54:15.584725 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210tj6d4" 
event={"ID":"62846e70-8410-4122-8d1a-f05e0ac36cc9","Type":"ContainerDied","Data":"a89cd4c3ee64c41a0e6a51c2dbeeba5ab60ab24524a6e641083f66308886e119"} Dec 02 22:54:15 crc kubenswrapper[4696]: I1202 22:54:15.585243 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a89cd4c3ee64c41a0e6a51c2dbeeba5ab60ab24524a6e641083f66308886e119" Dec 02 22:54:15 crc kubenswrapper[4696]: I1202 22:54:15.584827 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210tj6d4" Dec 02 22:54:17 crc kubenswrapper[4696]: I1202 22:54:17.751311 4696 scope.go:117] "RemoveContainer" containerID="32ae69a2c1cd350e1b62b8453e9d286e105623f4b9ea8ff27ce8c9d37b055e4f" Dec 02 22:54:18 crc kubenswrapper[4696]: I1202 22:54:18.623058 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wthxr_86a37d2a-37c5-4fbd-b10b-f5e4706772f4/kube-multus/2.log" Dec 02 22:54:25 crc kubenswrapper[4696]: I1202 22:54:25.713456 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-4rjbb"] Dec 02 22:54:25 crc kubenswrapper[4696]: E1202 22:54:25.714199 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62846e70-8410-4122-8d1a-f05e0ac36cc9" containerName="extract" Dec 02 22:54:25 crc kubenswrapper[4696]: I1202 22:54:25.714217 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="62846e70-8410-4122-8d1a-f05e0ac36cc9" containerName="extract" Dec 02 22:54:25 crc kubenswrapper[4696]: E1202 22:54:25.714245 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62846e70-8410-4122-8d1a-f05e0ac36cc9" containerName="util" Dec 02 22:54:25 crc kubenswrapper[4696]: I1202 22:54:25.714254 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="62846e70-8410-4122-8d1a-f05e0ac36cc9" containerName="util" Dec 02 22:54:25 crc kubenswrapper[4696]: E1202 22:54:25.714266 
4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62846e70-8410-4122-8d1a-f05e0ac36cc9" containerName="pull" Dec 02 22:54:25 crc kubenswrapper[4696]: I1202 22:54:25.714276 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="62846e70-8410-4122-8d1a-f05e0ac36cc9" containerName="pull" Dec 02 22:54:25 crc kubenswrapper[4696]: I1202 22:54:25.714397 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="62846e70-8410-4122-8d1a-f05e0ac36cc9" containerName="extract" Dec 02 22:54:25 crc kubenswrapper[4696]: I1202 22:54:25.714958 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-4rjbb" Dec 02 22:54:25 crc kubenswrapper[4696]: I1202 22:54:25.717949 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-86nwt" Dec 02 22:54:25 crc kubenswrapper[4696]: I1202 22:54:25.718010 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Dec 02 22:54:25 crc kubenswrapper[4696]: I1202 22:54:25.718068 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Dec 02 22:54:25 crc kubenswrapper[4696]: I1202 22:54:25.727454 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-4rjbb"] Dec 02 22:54:25 crc kubenswrapper[4696]: I1202 22:54:25.802153 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dd7x8\" (UniqueName: \"kubernetes.io/projected/ea569052-61d0-4847-90f1-3e085d6a5363-kube-api-access-dd7x8\") pod \"obo-prometheus-operator-668cf9dfbb-4rjbb\" (UID: \"ea569052-61d0-4847-90f1-3e085d6a5363\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-4rjbb" Dec 02 22:54:25 crc kubenswrapper[4696]: I1202 22:54:25.844966 4696 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-bfc855f8c-b9nzn"] Dec 02 22:54:25 crc kubenswrapper[4696]: I1202 22:54:25.845836 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-bfc855f8c-b9nzn" Dec 02 22:54:25 crc kubenswrapper[4696]: I1202 22:54:25.847599 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Dec 02 22:54:25 crc kubenswrapper[4696]: I1202 22:54:25.851789 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-lpds5" Dec 02 22:54:25 crc kubenswrapper[4696]: I1202 22:54:25.853124 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-bfc855f8c-28cln"] Dec 02 22:54:25 crc kubenswrapper[4696]: I1202 22:54:25.854247 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-bfc855f8c-28cln" Dec 02 22:54:25 crc kubenswrapper[4696]: I1202 22:54:25.869872 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-bfc855f8c-28cln"] Dec 02 22:54:25 crc kubenswrapper[4696]: I1202 22:54:25.879258 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-bfc855f8c-b9nzn"] Dec 02 22:54:25 crc kubenswrapper[4696]: I1202 22:54:25.903307 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dd7x8\" (UniqueName: \"kubernetes.io/projected/ea569052-61d0-4847-90f1-3e085d6a5363-kube-api-access-dd7x8\") pod \"obo-prometheus-operator-668cf9dfbb-4rjbb\" (UID: \"ea569052-61d0-4847-90f1-3e085d6a5363\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-4rjbb" Dec 02 22:54:25 crc kubenswrapper[4696]: I1202 22:54:25.948487 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dd7x8\" (UniqueName: \"kubernetes.io/projected/ea569052-61d0-4847-90f1-3e085d6a5363-kube-api-access-dd7x8\") pod \"obo-prometheus-operator-668cf9dfbb-4rjbb\" (UID: \"ea569052-61d0-4847-90f1-3e085d6a5363\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-4rjbb" Dec 02 22:54:26 crc kubenswrapper[4696]: I1202 22:54:26.004629 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6cef468e-8250-42c5-8ae4-75dccc1b10a5-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-bfc855f8c-b9nzn\" (UID: \"6cef468e-8250-42c5-8ae4-75dccc1b10a5\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-bfc855f8c-b9nzn" Dec 02 22:54:26 crc kubenswrapper[4696]: I1202 22:54:26.004693 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e25bae4c-bb72-4fe7-8f1b-f6e61100727c-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-bfc855f8c-28cln\" (UID: \"e25bae4c-bb72-4fe7-8f1b-f6e61100727c\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-bfc855f8c-28cln"
Dec 02 22:54:26 crc kubenswrapper[4696]: I1202 22:54:26.004775 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e25bae4c-bb72-4fe7-8f1b-f6e61100727c-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-bfc855f8c-28cln\" (UID: \"e25bae4c-bb72-4fe7-8f1b-f6e61100727c\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-bfc855f8c-28cln"
Dec 02 22:54:26 crc kubenswrapper[4696]: I1202 22:54:26.004820 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6cef468e-8250-42c5-8ae4-75dccc1b10a5-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-bfc855f8c-b9nzn\" (UID: \"6cef468e-8250-42c5-8ae4-75dccc1b10a5\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-bfc855f8c-b9nzn"
Dec 02 22:54:26 crc kubenswrapper[4696]: I1202 22:54:26.022862 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-rc522"]
Dec 02 22:54:26 crc kubenswrapper[4696]: I1202 22:54:26.023761 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-rc522"
Dec 02 22:54:26 crc kubenswrapper[4696]: I1202 22:54:26.025864 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-vjlb8"
Dec 02 22:54:26 crc kubenswrapper[4696]: I1202 22:54:26.027722 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls"
Dec 02 22:54:26 crc kubenswrapper[4696]: I1202 22:54:26.040863 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-4rjbb"
Dec 02 22:54:26 crc kubenswrapper[4696]: I1202 22:54:26.047238 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-rc522"]
Dec 02 22:54:26 crc kubenswrapper[4696]: I1202 22:54:26.107464 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e25bae4c-bb72-4fe7-8f1b-f6e61100727c-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-bfc855f8c-28cln\" (UID: \"e25bae4c-bb72-4fe7-8f1b-f6e61100727c\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-bfc855f8c-28cln"
Dec 02 22:54:26 crc kubenswrapper[4696]: I1202 22:54:26.107507 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6cef468e-8250-42c5-8ae4-75dccc1b10a5-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-bfc855f8c-b9nzn\" (UID: \"6cef468e-8250-42c5-8ae4-75dccc1b10a5\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-bfc855f8c-b9nzn"
Dec 02 22:54:26 crc kubenswrapper[4696]: I1202 22:54:26.107549 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/9ee2e5e1-ad58-448a-973e-2207d5cde11b-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-rc522\" (UID: \"9ee2e5e1-ad58-448a-973e-2207d5cde11b\") " pod="openshift-operators/observability-operator-d8bb48f5d-rc522"
Dec 02 22:54:26 crc kubenswrapper[4696]: I1202 22:54:26.107589 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s596b\" (UniqueName: \"kubernetes.io/projected/9ee2e5e1-ad58-448a-973e-2207d5cde11b-kube-api-access-s596b\") pod \"observability-operator-d8bb48f5d-rc522\" (UID: \"9ee2e5e1-ad58-448a-973e-2207d5cde11b\") " pod="openshift-operators/observability-operator-d8bb48f5d-rc522"
Dec 02 22:54:26 crc kubenswrapper[4696]: I1202 22:54:26.107615 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6cef468e-8250-42c5-8ae4-75dccc1b10a5-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-bfc855f8c-b9nzn\" (UID: \"6cef468e-8250-42c5-8ae4-75dccc1b10a5\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-bfc855f8c-b9nzn"
Dec 02 22:54:26 crc kubenswrapper[4696]: I1202 22:54:26.107637 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e25bae4c-bb72-4fe7-8f1b-f6e61100727c-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-bfc855f8c-28cln\" (UID: \"e25bae4c-bb72-4fe7-8f1b-f6e61100727c\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-bfc855f8c-28cln"
Dec 02 22:54:26 crc kubenswrapper[4696]: I1202 22:54:26.114450 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e25bae4c-bb72-4fe7-8f1b-f6e61100727c-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-bfc855f8c-28cln\" (UID: \"e25bae4c-bb72-4fe7-8f1b-f6e61100727c\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-bfc855f8c-28cln"
Dec 02 22:54:26 crc kubenswrapper[4696]: I1202 22:54:26.114530 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6cef468e-8250-42c5-8ae4-75dccc1b10a5-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-bfc855f8c-b9nzn\" (UID: \"6cef468e-8250-42c5-8ae4-75dccc1b10a5\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-bfc855f8c-b9nzn"
Dec 02 22:54:26 crc kubenswrapper[4696]: I1202 22:54:26.115918 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6cef468e-8250-42c5-8ae4-75dccc1b10a5-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-bfc855f8c-b9nzn\" (UID: \"6cef468e-8250-42c5-8ae4-75dccc1b10a5\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-bfc855f8c-b9nzn"
Dec 02 22:54:26 crc kubenswrapper[4696]: I1202 22:54:26.117664 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e25bae4c-bb72-4fe7-8f1b-f6e61100727c-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-bfc855f8c-28cln\" (UID: \"e25bae4c-bb72-4fe7-8f1b-f6e61100727c\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-bfc855f8c-28cln"
Dec 02 22:54:26 crc kubenswrapper[4696]: I1202 22:54:26.174123 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-bfc855f8c-b9nzn"
Dec 02 22:54:26 crc kubenswrapper[4696]: I1202 22:54:26.183773 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-bfc855f8c-28cln"
Dec 02 22:54:26 crc kubenswrapper[4696]: I1202 22:54:26.208438 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/9ee2e5e1-ad58-448a-973e-2207d5cde11b-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-rc522\" (UID: \"9ee2e5e1-ad58-448a-973e-2207d5cde11b\") " pod="openshift-operators/observability-operator-d8bb48f5d-rc522"
Dec 02 22:54:26 crc kubenswrapper[4696]: I1202 22:54:26.208501 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s596b\" (UniqueName: \"kubernetes.io/projected/9ee2e5e1-ad58-448a-973e-2207d5cde11b-kube-api-access-s596b\") pod \"observability-operator-d8bb48f5d-rc522\" (UID: \"9ee2e5e1-ad58-448a-973e-2207d5cde11b\") " pod="openshift-operators/observability-operator-d8bb48f5d-rc522"
Dec 02 22:54:26 crc kubenswrapper[4696]: I1202 22:54:26.213034 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/9ee2e5e1-ad58-448a-973e-2207d5cde11b-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-rc522\" (UID: \"9ee2e5e1-ad58-448a-973e-2207d5cde11b\") " pod="openshift-operators/observability-operator-d8bb48f5d-rc522"
Dec 02 22:54:26 crc kubenswrapper[4696]: I1202 22:54:26.229427 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s596b\" (UniqueName: \"kubernetes.io/projected/9ee2e5e1-ad58-448a-973e-2207d5cde11b-kube-api-access-s596b\") pod \"observability-operator-d8bb48f5d-rc522\" (UID: \"9ee2e5e1-ad58-448a-973e-2207d5cde11b\") " pod="openshift-operators/observability-operator-d8bb48f5d-rc522"
Dec 02 22:54:26 crc kubenswrapper[4696]: I1202 22:54:26.258657 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5446b9c989-t8wvk"]
Dec 02 22:54:26 crc kubenswrapper[4696]: I1202 22:54:26.259545 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-t8wvk"
Dec 02 22:54:26 crc kubenswrapper[4696]: I1202 22:54:26.262287 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-d4nvp"
Dec 02 22:54:26 crc kubenswrapper[4696]: I1202 22:54:26.265759 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5446b9c989-t8wvk"]
Dec 02 22:54:26 crc kubenswrapper[4696]: I1202 22:54:26.342789 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-rc522"
Dec 02 22:54:26 crc kubenswrapper[4696]: I1202 22:54:26.403770 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-4rjbb"]
Dec 02 22:54:26 crc kubenswrapper[4696]: I1202 22:54:26.410647 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/20a6ffb2-7272-4be4-9ed2-ba78389166d6-openshift-service-ca\") pod \"perses-operator-5446b9c989-t8wvk\" (UID: \"20a6ffb2-7272-4be4-9ed2-ba78389166d6\") " pod="openshift-operators/perses-operator-5446b9c989-t8wvk"
Dec 02 22:54:26 crc kubenswrapper[4696]: I1202 22:54:26.410690 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mg5b4\" (UniqueName: \"kubernetes.io/projected/20a6ffb2-7272-4be4-9ed2-ba78389166d6-kube-api-access-mg5b4\") pod \"perses-operator-5446b9c989-t8wvk\" (UID: \"20a6ffb2-7272-4be4-9ed2-ba78389166d6\") " pod="openshift-operators/perses-operator-5446b9c989-t8wvk"
Dec 02 22:54:26 crc kubenswrapper[4696]: I1202 22:54:26.511613 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mg5b4\" (UniqueName: \"kubernetes.io/projected/20a6ffb2-7272-4be4-9ed2-ba78389166d6-kube-api-access-mg5b4\") pod \"perses-operator-5446b9c989-t8wvk\" (UID: \"20a6ffb2-7272-4be4-9ed2-ba78389166d6\") " pod="openshift-operators/perses-operator-5446b9c989-t8wvk"
Dec 02 22:54:26 crc kubenswrapper[4696]: I1202 22:54:26.511725 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/20a6ffb2-7272-4be4-9ed2-ba78389166d6-openshift-service-ca\") pod \"perses-operator-5446b9c989-t8wvk\" (UID: \"20a6ffb2-7272-4be4-9ed2-ba78389166d6\") " pod="openshift-operators/perses-operator-5446b9c989-t8wvk"
Dec 02 22:54:26 crc kubenswrapper[4696]: I1202 22:54:26.512783 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/20a6ffb2-7272-4be4-9ed2-ba78389166d6-openshift-service-ca\") pod \"perses-operator-5446b9c989-t8wvk\" (UID: \"20a6ffb2-7272-4be4-9ed2-ba78389166d6\") " pod="openshift-operators/perses-operator-5446b9c989-t8wvk"
Dec 02 22:54:26 crc kubenswrapper[4696]: I1202 22:54:26.554615 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mg5b4\" (UniqueName: \"kubernetes.io/projected/20a6ffb2-7272-4be4-9ed2-ba78389166d6-kube-api-access-mg5b4\") pod \"perses-operator-5446b9c989-t8wvk\" (UID: \"20a6ffb2-7272-4be4-9ed2-ba78389166d6\") " pod="openshift-operators/perses-operator-5446b9c989-t8wvk"
Dec 02 22:54:26 crc kubenswrapper[4696]: I1202 22:54:26.602065 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-t8wvk"
Dec 02 22:54:26 crc kubenswrapper[4696]: I1202 22:54:26.680978 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-4rjbb" event={"ID":"ea569052-61d0-4847-90f1-3e085d6a5363","Type":"ContainerStarted","Data":"8e638b26ec8422fcc065a07c735c21d98915cf39445e5e887f15fadb4e270304"}
Dec 02 22:54:26 crc kubenswrapper[4696]: I1202 22:54:26.705285 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-bfc855f8c-b9nzn"]
Dec 02 22:54:26 crc kubenswrapper[4696]: I1202 22:54:26.724536 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-bfc855f8c-28cln"]
Dec 02 22:54:26 crc kubenswrapper[4696]: I1202 22:54:26.800212 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-rc522"]
Dec 02 22:54:26 crc kubenswrapper[4696]: W1202 22:54:26.813949 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ee2e5e1_ad58_448a_973e_2207d5cde11b.slice/crio-a4f19dbb550f79b9a4b6059032a68e2c057acf7387ea8f580a806c5c2ef378f7 WatchSource:0}: Error finding container a4f19dbb550f79b9a4b6059032a68e2c057acf7387ea8f580a806c5c2ef378f7: Status 404 returned error can't find the container with id a4f19dbb550f79b9a4b6059032a68e2c057acf7387ea8f580a806c5c2ef378f7
Dec 02 22:54:26 crc kubenswrapper[4696]: I1202 22:54:26.969872 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5446b9c989-t8wvk"]
Dec 02 22:54:26 crc kubenswrapper[4696]: W1202 22:54:26.972778 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod20a6ffb2_7272_4be4_9ed2_ba78389166d6.slice/crio-b905d8557fa077c1f997763c6a23d4cba42cc1c4dd5569f4e1124029031a5408 WatchSource:0}: Error finding container b905d8557fa077c1f997763c6a23d4cba42cc1c4dd5569f4e1124029031a5408: Status 404 returned error can't find the container with id b905d8557fa077c1f997763c6a23d4cba42cc1c4dd5569f4e1124029031a5408
Dec 02 22:54:27 crc kubenswrapper[4696]: I1202 22:54:27.698831 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-d8bb48f5d-rc522" event={"ID":"9ee2e5e1-ad58-448a-973e-2207d5cde11b","Type":"ContainerStarted","Data":"a4f19dbb550f79b9a4b6059032a68e2c057acf7387ea8f580a806c5c2ef378f7"}
Dec 02 22:54:27 crc kubenswrapper[4696]: I1202 22:54:27.702407 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5446b9c989-t8wvk" event={"ID":"20a6ffb2-7272-4be4-9ed2-ba78389166d6","Type":"ContainerStarted","Data":"b905d8557fa077c1f997763c6a23d4cba42cc1c4dd5569f4e1124029031a5408"}
Dec 02 22:54:27 crc kubenswrapper[4696]: I1202 22:54:27.712905 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-bfc855f8c-28cln" event={"ID":"e25bae4c-bb72-4fe7-8f1b-f6e61100727c","Type":"ContainerStarted","Data":"6a25147dd9cbf7d6709a7fab92685ca2443a9aabb5df81290597a8c7322bb7ec"}
Dec 02 22:54:27 crc kubenswrapper[4696]: I1202 22:54:27.716623 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-bfc855f8c-b9nzn" event={"ID":"6cef468e-8250-42c5-8ae4-75dccc1b10a5","Type":"ContainerStarted","Data":"54777fb2a0d4aec8cc1d6e5addb309f6c9121d3152146aaa87cacc7f2076b344"}
Dec 02 22:54:43 crc kubenswrapper[4696]: E1202 22:54:43.178148 4696 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/cluster-observability-operator/obo-prometheus-rhel9-operator@sha256:203cf5b9dc1460f09e75f58d8b5cf7df5e57c18c8c6a41c14b5e8977d83263f3"
Dec 02 22:54:43 crc kubenswrapper[4696]: E1202 22:54:43.179692 4696 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:prometheus-operator,Image:registry.redhat.io/cluster-observability-operator/obo-prometheus-rhel9-operator@sha256:203cf5b9dc1460f09e75f58d8b5cf7df5e57c18c8c6a41c14b5e8977d83263f3,Command:[],Args:[--prometheus-config-reloader=$(RELATED_IMAGE_PROMETHEUS_CONFIG_RELOADER) --prometheus-instance-selector=app.kubernetes.io/managed-by=observability-operator --alertmanager-instance-selector=app.kubernetes.io/managed-by=observability-operator --thanos-ruler-instance-selector=app.kubernetes.io/managed-by=observability-operator],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:http,HostPort:0,ContainerPort:8080,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:GOGC,Value:30,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PROMETHEUS_CONFIG_RELOADER,Value:registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-prometheus-config-reloader-rhel9@sha256:1133c973c7472c665f910a722e19c8e2e27accb34b90fab67f14548627ce9c62,ValueFrom:nil,},EnvVar{Name:OPERATOR_CONDITION_NAME,Value:cluster-observability-operator.v1.3.0,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{100 -3} {} 100m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{157286400 0} {} 150Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dd7x8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod obo-prometheus-operator-668cf9dfbb-4rjbb_openshift-operators(ea569052-61d0-4847-90f1-3e085d6a5363): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Dec 02 22:54:43 crc kubenswrapper[4696]: E1202 22:54:43.181088 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus-operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-4rjbb" podUID="ea569052-61d0-4847-90f1-3e085d6a5363"
Dec 02 22:54:43 crc kubenswrapper[4696]: E1202 22:54:43.871253 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus-operator\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/obo-prometheus-rhel9-operator@sha256:203cf5b9dc1460f09e75f58d8b5cf7df5e57c18c8c6a41c14b5e8977d83263f3\\\"\"" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-4rjbb" podUID="ea569052-61d0-4847-90f1-3e085d6a5363"
Dec 02 22:54:45 crc kubenswrapper[4696]: E1202 22:54:45.437313 4696 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/cluster-observability-operator/cluster-observability-rhel9-operator@sha256:ce7d2904f7b238aa37dfe74a0b76bf73629e7a14fa52bf54b0ecf030ca36f1bb"
Dec 02 22:54:45 crc kubenswrapper[4696]: E1202 22:54:45.438114 4696 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:registry.redhat.io/cluster-observability-operator/cluster-observability-rhel9-operator@sha256:ce7d2904f7b238aa37dfe74a0b76bf73629e7a14fa52bf54b0ecf030ca36f1bb,Command:[],Args:[--namespace=$(NAMESPACE) --images=perses=$(RELATED_IMAGE_PERSES) --images=alertmanager=$(RELATED_IMAGE_ALERTMANAGER) --images=prometheus=$(RELATED_IMAGE_PROMETHEUS) --images=thanos=$(RELATED_IMAGE_THANOS) --images=ui-dashboards=$(RELATED_IMAGE_CONSOLE_DASHBOARDS_PLUGIN) --images=ui-distributed-tracing=$(RELATED_IMAGE_CONSOLE_DISTRIBUTED_TRACING_PLUGIN) --images=ui-distributed-tracing-pf5=$(RELATED_IMAGE_CONSOLE_DISTRIBUTED_TRACING_PLUGIN_PF5) --images=ui-distributed-tracing-pf4=$(RELATED_IMAGE_CONSOLE_DISTRIBUTED_TRACING_PLUGIN_PF4) --images=ui-logging=$(RELATED_IMAGE_CONSOLE_LOGGING_PLUGIN) --images=ui-logging-pf4=$(RELATED_IMAGE_CONSOLE_LOGGING_PLUGIN_PF4) --images=ui-troubleshooting-panel=$(RELATED_IMAGE_CONSOLE_TROUBLESHOOTING_PANEL_PLUGIN) --images=ui-monitoring=$(RELATED_IMAGE_CONSOLE_MONITORING_PLUGIN) --images=ui-monitoring-pf5=$(RELATED_IMAGE_CONSOLE_MONITORING_PLUGIN_PF5) --images=korrel8r=$(RELATED_IMAGE_KORREL8R) --images=health-analyzer=$(RELATED_IMAGE_CLUSTER_HEALTH_ANALYZER) --openshift.enabled=true],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:RELATED_IMAGE_ALERTMANAGER,Value:registry.redhat.io/cluster-observability-operator/alertmanager-rhel9@sha256:e718854a7d6ca8accf0fa72db0eb902e46c44d747ad51dc3f06bba0cefaa3c01,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PROMETHEUS,Value:registry.redhat.io/cluster-observability-operator/prometheus-rhel9@sha256:17ea20be390a94ab39f5cdd7f0cbc2498046eebcf77fe3dec9aa288d5c2cf46b,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_THANOS,Value:registry.redhat.io/cluster-observability-operator/thanos-rhel9@sha256:d972f4faa5e9c121402d23ed85002f26af48ec36b1b71a7489d677b3913d08b4,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PERSES,Value:registry.redhat.io/cluster-observability-operator/perses-rhel9@sha256:91531137fc1dcd740e277e0f65e120a0176a16f788c14c27925b61aa0b792ade,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_DASHBOARDS_PLUGIN,Value:registry.redhat.io/cluster-observability-operator/dashboards-console-plugin-rhel9@sha256:a69da8bbca8a28dd2925f864d51cc31cf761b10532c553095ba40b242ef701cb,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_DISTRIBUTED_TRACING_PLUGIN,Value:registry.redhat.io/cluster-observability-operator/distributed-tracing-console-plugin-rhel9@sha256:897e1bfad1187062725b54d87107bd0155972257a50d8335dd29e1999b828a4f,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_DISTRIBUTED_TRACING_PLUGIN_PF5,Value:registry.redhat.io/cluster-observability-operator/distributed-tracing-console-plugin-pf5-rhel9@sha256:95fe5b5746ca8c07ac9217ce2d8ac8e6afad17af210f9d8e0074df1310b209a8,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_DISTRIBUTED_TRACING_PLUGIN_PF4,Value:registry.redhat.io/cluster-observability-operator/distributed-tracing-console-plugin-pf4-rhel9@sha256:e9d9a89e4d8126a62b1852055482258ee528cac6398dd5d43ebad75ace0f33c9,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_LOGGING_PLUGIN,Value:registry.redhat.io/cluster-observability-operator/logging-console-plugin-rhel9@sha256:ec684a0645ceb917b019af7ddba68c3533416e356ab0d0320a30e75ca7ebb31b,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_LOGGING_PLUGIN_PF4,Value:registry.redhat.io/cluster-observability-operator/logging-console-plugin-pf4-rhel9@sha256:3b9693fcde9b3a9494fb04735b1f7cfd0426f10be820fdc3f024175c0d3df1c9,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_TROUBLESHOOTING_PANEL_PLUGIN,Value:registry.redhat.io/cluster-observability-operator/troubleshooting-panel-console-plugin-rhel9@sha256:580606f194180accc8abba099e17a26dca7522ec6d233fa2fdd40312771703e3,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_MONITORING_PLUGIN,Value:registry.redhat.io/cluster-observability-operator/monitoring-console-plugin-rhel9@sha256:e03777be39e71701935059cd877603874a13ac94daa73219d4e5e545599d78a9,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_MONITORING_PLUGIN_PF5,Value:registry.redhat.io/cluster-observability-operator/monitoring-console-plugin-pf5-rhel9@sha256:aa47256193cfd2877853878e1ae97d2ab8b8e5deae62b387cbfad02b284d379c,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KORREL8R,Value:registry.redhat.io/cluster-observability-operator/korrel8r-rhel9@sha256:c595ff56b2cb85514bf4784db6ddb82e4e657e3e708a7fb695fc4997379a94d4,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CLUSTER_HEALTH_ANALYZER,Value:registry.redhat.io/cluster-observability-operator/cluster-health-analyzer-rhel9@sha256:45a4ec2a519bcec99e886aa91596d5356a2414a2bd103baaef9fa7838c672eb2,ValueFrom:nil,},EnvVar{Name:OPERATOR_CONDITION_NAME,Value:cluster-observability-operator.v1.3.0,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{400 -3} {} 400m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{100 -3} {} 100m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:observability-operator-tls,ReadOnly:true,MountPath:/etc/tls/private,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s596b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000350000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod observability-operator-d8bb48f5d-rc522_openshift-operators(9ee2e5e1-ad58-448a-973e-2207d5cde11b): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Dec 02 22:54:45 crc kubenswrapper[4696]: E1202 22:54:45.440177 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-operators/observability-operator-d8bb48f5d-rc522" podUID="9ee2e5e1-ad58-448a-973e-2207d5cde11b"
Dec 02 22:54:45 crc kubenswrapper[4696]: E1202 22:54:45.739911 4696 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-admission-webhook-rhel9@sha256:43d33f0125e6b990f4a972ac4e952a065d7e72dc1690c6c836963b7341734aec"
Dec 02 22:54:45 crc kubenswrapper[4696]: E1202 22:54:45.740092 4696 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:prometheus-operator-admission-webhook,Image:registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-admission-webhook-rhel9@sha256:43d33f0125e6b990f4a972ac4e952a065d7e72dc1690c6c836963b7341734aec,Command:[],Args:[--web.enable-tls=true --web.cert-file=/tmp/k8s-webhook-server/serving-certs/tls.crt --web.key-file=/tmp/k8s-webhook-server/serving-certs/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_CONDITION_NAME,Value:cluster-observability-operator.v1.3.0,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{209715200 0} {} BinarySI},},Requests:ResourceList{cpu: {{50 -3} {} 50m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:apiservice-cert,ReadOnly:false,MountPath:/apiserver.local.config/certificates,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod obo-prometheus-operator-admission-webhook-bfc855f8c-28cln_openshift-operators(e25bae4c-bb72-4fe7-8f1b-f6e61100727c): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Dec 02 22:54:45 crc kubenswrapper[4696]: E1202 22:54:45.742091 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus-operator-admission-webhook\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-bfc855f8c-28cln" podUID="e25bae4c-bb72-4fe7-8f1b-f6e61100727c"
Dec 02 22:54:45 crc kubenswrapper[4696]: E1202 22:54:45.752922 4696 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-admission-webhook-rhel9@sha256:43d33f0125e6b990f4a972ac4e952a065d7e72dc1690c6c836963b7341734aec"
Dec 02 22:54:45 crc kubenswrapper[4696]: E1202 22:54:45.753105 4696 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:prometheus-operator-admission-webhook,Image:registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-admission-webhook-rhel9@sha256:43d33f0125e6b990f4a972ac4e952a065d7e72dc1690c6c836963b7341734aec,Command:[],Args:[--web.enable-tls=true --web.cert-file=/tmp/k8s-webhook-server/serving-certs/tls.crt --web.key-file=/tmp/k8s-webhook-server/serving-certs/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_CONDITION_NAME,Value:cluster-observability-operator.v1.3.0,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{209715200 0} {} BinarySI},},Requests:ResourceList{cpu: {{50 -3} {} 50m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:apiservice-cert,ReadOnly:false,MountPath:/apiserver.local.config/certificates,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod obo-prometheus-operator-admission-webhook-bfc855f8c-b9nzn_openshift-operators(6cef468e-8250-42c5-8ae4-75dccc1b10a5): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Dec 02 22:54:45 crc kubenswrapper[4696]: E1202 22:54:45.754351 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus-operator-admission-webhook\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-bfc855f8c-b9nzn" podUID="6cef468e-8250-42c5-8ae4-75dccc1b10a5"
Dec 02 22:54:45 crc kubenswrapper[4696]: I1202 22:54:45.880539 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5446b9c989-t8wvk" event={"ID":"20a6ffb2-7272-4be4-9ed2-ba78389166d6","Type":"ContainerStarted","Data":"422d8f75128b65e5b46db0a5c7da617deddda36c4562a3d2ff574fb99a112cfc"}
Dec 02 22:54:45 crc kubenswrapper[4696]: I1202 22:54:45.881442 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5446b9c989-t8wvk"
Dec 02 22:54:45 crc kubenswrapper[4696]: E1202 22:54:45.882920 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus-operator-admission-webhook\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-admission-webhook-rhel9@sha256:43d33f0125e6b990f4a972ac4e952a065d7e72dc1690c6c836963b7341734aec\\\"\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-bfc855f8c-28cln" podUID="e25bae4c-bb72-4fe7-8f1b-f6e61100727c"
Dec 02 22:54:45 crc kubenswrapper[4696]: E1202 22:54:45.883684 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus-operator-admission-webhook\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-admission-webhook-rhel9@sha256:43d33f0125e6b990f4a972ac4e952a065d7e72dc1690c6c836963b7341734aec\\\"\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-bfc855f8c-b9nzn" podUID="6cef468e-8250-42c5-8ae4-75dccc1b10a5"
Dec 02 22:54:45 crc kubenswrapper[4696]: E1202 22:54:45.883763 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/cluster-observability-rhel9-operator@sha256:ce7d2904f7b238aa37dfe74a0b76bf73629e7a14fa52bf54b0ecf030ca36f1bb\\\"\"" pod="openshift-operators/observability-operator-d8bb48f5d-rc522" podUID="9ee2e5e1-ad58-448a-973e-2207d5cde11b"
Dec 02 22:54:45 crc kubenswrapper[4696]: I1202 22:54:45.969869 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5446b9c989-t8wvk" podStartSLOduration=1.210898659 podStartE2EDuration="19.969850431s" podCreationTimestamp="2025-12-02 22:54:26 +0000 UTC" firstStartedPulling="2025-12-02 22:54:26.975374068 +0000 UTC m=+729.856054069" lastFinishedPulling="2025-12-02 22:54:45.73432584 +0000 UTC m=+748.615005841" observedRunningTime="2025-12-02 22:54:45.968913894 +0000 UTC m=+748.849593905" watchObservedRunningTime="2025-12-02 22:54:45.969850431 +0000 UTC m=+748.850530432"
Dec 02 22:54:56 crc kubenswrapper[4696]: I1202 22:54:56.606659 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5446b9c989-t8wvk"
Dec 02 22:54:58 crc kubenswrapper[4696]: I1202 22:54:58.976922 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-4rjbb" event={"ID":"ea569052-61d0-4847-90f1-3e085d6a5363","Type":"ContainerStarted","Data":"e951a8372fb18186a051027f78a1c43a75bf4386094bdbc6b1aadf6c4f123ad2"}
Dec 02 22:54:59 crc kubenswrapper[4696]: I1202 22:54:59.004084 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-4rjbb" podStartSLOduration=2.237219895 podStartE2EDuration="34.004063636s" podCreationTimestamp="2025-12-02 22:54:25 +0000 UTC" firstStartedPulling="2025-12-02 22:54:26.441680793 +0000 UTC m=+729.322360784" lastFinishedPulling="2025-12-02 22:54:58.208524524 +0000 UTC m=+761.089204525" observedRunningTime="2025-12-02 22:54:59.001430731 +0000 UTC m=+761.882110742" watchObservedRunningTime="2025-12-02 22:54:59.004063636 +0000 UTC m=+761.884743637"
Dec 02 22:54:59 crc kubenswrapper[4696]: I1202 22:54:59.985233 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-d8bb48f5d-rc522" event={"ID":"9ee2e5e1-ad58-448a-973e-2207d5cde11b","Type":"ContainerStarted","Data":"166862c64c46944f722b6c2260db57c57f2914f01ce607ff222e3fa95a153012"}
Dec 02 22:54:59 crc kubenswrapper[4696]: I1202 22:54:59.987225 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-d8bb48f5d-rc522"
Dec 02 22:55:00 crc kubenswrapper[4696]: I1202 22:55:00.013875 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-d8bb48f5d-rc522" podStartSLOduration=1.5605977439999998 podStartE2EDuration="34.013853347s" podCreationTimestamp="2025-12-02 22:54:26 +0000 UTC" firstStartedPulling="2025-12-02 22:54:26.825956816 +0000 UTC m=+729.706636827" lastFinishedPulling="2025-12-02 22:54:59.279212429 +0000 UTC m=+762.159892430" observedRunningTime="2025-12-02 22:55:00.009630158 +0000 UTC m=+762.890310159" watchObservedRunningTime="2025-12-02 22:55:00.013853347 +0000 UTC m=+762.894533348"
Dec 02 22:55:00 crc kubenswrapper[4696]: I1202 22:55:00.178997 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-d8bb48f5d-rc522"
Dec 02 22:55:02 crc kubenswrapper[4696]: I1202 22:55:02.000559 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-bfc855f8c-b9nzn" event={"ID":"6cef468e-8250-42c5-8ae4-75dccc1b10a5","Type":"ContainerStarted","Data":"4ee65767639d491132f54945a5c8f88f051b05346ac1da9708fa173c38bfe8de"}
Dec 02 22:55:02 crc kubenswrapper[4696]: I1202 22:55:02.004960 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-bfc855f8c-28cln" event={"ID":"e25bae4c-bb72-4fe7-8f1b-f6e61100727c","Type":"ContainerStarted","Data":"1c338646d734a67f9f0cdf5c9896bfa5504a79c7c02d41748b043a69ae7ee49f"}
Dec 02 22:55:02 crc kubenswrapper[4696]: I1202 22:55:02.035211 4696
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-bfc855f8c-b9nzn" podStartSLOduration=2.366866037 podStartE2EDuration="37.035172849s" podCreationTimestamp="2025-12-02 22:54:25 +0000 UTC" firstStartedPulling="2025-12-02 22:54:26.73500615 +0000 UTC m=+729.615686151" lastFinishedPulling="2025-12-02 22:55:01.403312962 +0000 UTC m=+764.283992963" observedRunningTime="2025-12-02 22:55:02.031518485 +0000 UTC m=+764.912198486" watchObservedRunningTime="2025-12-02 22:55:02.035172849 +0000 UTC m=+764.915852900" Dec 02 22:55:02 crc kubenswrapper[4696]: I1202 22:55:02.063080 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-bfc855f8c-28cln" podStartSLOduration=2.469889705 podStartE2EDuration="37.063016968s" podCreationTimestamp="2025-12-02 22:54:25 +0000 UTC" firstStartedPulling="2025-12-02 22:54:26.759426782 +0000 UTC m=+729.640106783" lastFinishedPulling="2025-12-02 22:55:01.352554045 +0000 UTC m=+764.233234046" observedRunningTime="2025-12-02 22:55:02.060809215 +0000 UTC m=+764.941489226" watchObservedRunningTime="2025-12-02 22:55:02.063016968 +0000 UTC m=+764.943696989" Dec 02 22:55:05 crc kubenswrapper[4696]: I1202 22:55:05.612958 4696 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 02 22:55:20 crc kubenswrapper[4696]: I1202 22:55:20.525221 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7f559"] Dec 02 22:55:20 crc kubenswrapper[4696]: I1202 22:55:20.527339 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7f559" Dec 02 22:55:20 crc kubenswrapper[4696]: I1202 22:55:20.531435 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 02 22:55:20 crc kubenswrapper[4696]: I1202 22:55:20.540276 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7f559"] Dec 02 22:55:20 crc kubenswrapper[4696]: I1202 22:55:20.620108 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5c7fcd36-0d45-4703-89ab-df95e3ff5804-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7f559\" (UID: \"5c7fcd36-0d45-4703-89ab-df95e3ff5804\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7f559" Dec 02 22:55:20 crc kubenswrapper[4696]: I1202 22:55:20.620181 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r78kj\" (UniqueName: \"kubernetes.io/projected/5c7fcd36-0d45-4703-89ab-df95e3ff5804-kube-api-access-r78kj\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7f559\" (UID: \"5c7fcd36-0d45-4703-89ab-df95e3ff5804\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7f559" Dec 02 22:55:20 crc kubenswrapper[4696]: I1202 22:55:20.620226 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5c7fcd36-0d45-4703-89ab-df95e3ff5804-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7f559\" (UID: \"5c7fcd36-0d45-4703-89ab-df95e3ff5804\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7f559" Dec 02 22:55:20 crc kubenswrapper[4696]: 
I1202 22:55:20.721869 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5c7fcd36-0d45-4703-89ab-df95e3ff5804-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7f559\" (UID: \"5c7fcd36-0d45-4703-89ab-df95e3ff5804\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7f559" Dec 02 22:55:20 crc kubenswrapper[4696]: I1202 22:55:20.721938 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r78kj\" (UniqueName: \"kubernetes.io/projected/5c7fcd36-0d45-4703-89ab-df95e3ff5804-kube-api-access-r78kj\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7f559\" (UID: \"5c7fcd36-0d45-4703-89ab-df95e3ff5804\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7f559" Dec 02 22:55:20 crc kubenswrapper[4696]: I1202 22:55:20.721995 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5c7fcd36-0d45-4703-89ab-df95e3ff5804-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7f559\" (UID: \"5c7fcd36-0d45-4703-89ab-df95e3ff5804\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7f559" Dec 02 22:55:20 crc kubenswrapper[4696]: I1202 22:55:20.722564 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5c7fcd36-0d45-4703-89ab-df95e3ff5804-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7f559\" (UID: \"5c7fcd36-0d45-4703-89ab-df95e3ff5804\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7f559" Dec 02 22:55:20 crc kubenswrapper[4696]: I1202 22:55:20.722573 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/5c7fcd36-0d45-4703-89ab-df95e3ff5804-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7f559\" (UID: \"5c7fcd36-0d45-4703-89ab-df95e3ff5804\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7f559" Dec 02 22:55:20 crc kubenswrapper[4696]: I1202 22:55:20.749052 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r78kj\" (UniqueName: \"kubernetes.io/projected/5c7fcd36-0d45-4703-89ab-df95e3ff5804-kube-api-access-r78kj\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7f559\" (UID: \"5c7fcd36-0d45-4703-89ab-df95e3ff5804\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7f559" Dec 02 22:55:20 crc kubenswrapper[4696]: I1202 22:55:20.846386 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7f559" Dec 02 22:55:21 crc kubenswrapper[4696]: I1202 22:55:21.125860 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7f559"] Dec 02 22:55:21 crc kubenswrapper[4696]: W1202 22:55:21.135826 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c7fcd36_0d45_4703_89ab_df95e3ff5804.slice/crio-1d86b2ccd1f441bfada20fc16a416835d85e0d7d657e47b486b8d396ff4648ca WatchSource:0}: Error finding container 1d86b2ccd1f441bfada20fc16a416835d85e0d7d657e47b486b8d396ff4648ca: Status 404 returned error can't find the container with id 1d86b2ccd1f441bfada20fc16a416835d85e0d7d657e47b486b8d396ff4648ca Dec 02 22:55:22 crc kubenswrapper[4696]: I1202 22:55:22.146444 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7f559" 
event={"ID":"5c7fcd36-0d45-4703-89ab-df95e3ff5804","Type":"ContainerStarted","Data":"1d86b2ccd1f441bfada20fc16a416835d85e0d7d657e47b486b8d396ff4648ca"} Dec 02 22:55:22 crc kubenswrapper[4696]: I1202 22:55:22.858882 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ghj9t"] Dec 02 22:55:22 crc kubenswrapper[4696]: I1202 22:55:22.861091 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ghj9t" Dec 02 22:55:22 crc kubenswrapper[4696]: I1202 22:55:22.877376 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ghj9t"] Dec 02 22:55:22 crc kubenswrapper[4696]: I1202 22:55:22.957976 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pr5n7\" (UniqueName: \"kubernetes.io/projected/adbd550b-14c1-4011-8e76-2cab0258cb28-kube-api-access-pr5n7\") pod \"redhat-operators-ghj9t\" (UID: \"adbd550b-14c1-4011-8e76-2cab0258cb28\") " pod="openshift-marketplace/redhat-operators-ghj9t" Dec 02 22:55:22 crc kubenswrapper[4696]: I1202 22:55:22.958354 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/adbd550b-14c1-4011-8e76-2cab0258cb28-utilities\") pod \"redhat-operators-ghj9t\" (UID: \"adbd550b-14c1-4011-8e76-2cab0258cb28\") " pod="openshift-marketplace/redhat-operators-ghj9t" Dec 02 22:55:22 crc kubenswrapper[4696]: I1202 22:55:22.958502 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/adbd550b-14c1-4011-8e76-2cab0258cb28-catalog-content\") pod \"redhat-operators-ghj9t\" (UID: \"adbd550b-14c1-4011-8e76-2cab0258cb28\") " pod="openshift-marketplace/redhat-operators-ghj9t" Dec 02 22:55:22 crc kubenswrapper[4696]: I1202 22:55:22.974056 4696 patch_prober.go:28] 
interesting pod/machine-config-daemon-chq65 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 22:55:22 crc kubenswrapper[4696]: I1202 22:55:22.974150 4696 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 22:55:23 crc kubenswrapper[4696]: I1202 22:55:23.059851 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pr5n7\" (UniqueName: \"kubernetes.io/projected/adbd550b-14c1-4011-8e76-2cab0258cb28-kube-api-access-pr5n7\") pod \"redhat-operators-ghj9t\" (UID: \"adbd550b-14c1-4011-8e76-2cab0258cb28\") " pod="openshift-marketplace/redhat-operators-ghj9t" Dec 02 22:55:23 crc kubenswrapper[4696]: I1202 22:55:23.059935 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/adbd550b-14c1-4011-8e76-2cab0258cb28-utilities\") pod \"redhat-operators-ghj9t\" (UID: \"adbd550b-14c1-4011-8e76-2cab0258cb28\") " pod="openshift-marketplace/redhat-operators-ghj9t" Dec 02 22:55:23 crc kubenswrapper[4696]: I1202 22:55:23.059962 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/adbd550b-14c1-4011-8e76-2cab0258cb28-catalog-content\") pod \"redhat-operators-ghj9t\" (UID: \"adbd550b-14c1-4011-8e76-2cab0258cb28\") " pod="openshift-marketplace/redhat-operators-ghj9t" Dec 02 22:55:23 crc kubenswrapper[4696]: I1202 22:55:23.060573 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/adbd550b-14c1-4011-8e76-2cab0258cb28-utilities\") pod \"redhat-operators-ghj9t\" (UID: \"adbd550b-14c1-4011-8e76-2cab0258cb28\") " pod="openshift-marketplace/redhat-operators-ghj9t" Dec 02 22:55:23 crc kubenswrapper[4696]: I1202 22:55:23.060655 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/adbd550b-14c1-4011-8e76-2cab0258cb28-catalog-content\") pod \"redhat-operators-ghj9t\" (UID: \"adbd550b-14c1-4011-8e76-2cab0258cb28\") " pod="openshift-marketplace/redhat-operators-ghj9t" Dec 02 22:55:23 crc kubenswrapper[4696]: I1202 22:55:23.084509 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pr5n7\" (UniqueName: \"kubernetes.io/projected/adbd550b-14c1-4011-8e76-2cab0258cb28-kube-api-access-pr5n7\") pod \"redhat-operators-ghj9t\" (UID: \"adbd550b-14c1-4011-8e76-2cab0258cb28\") " pod="openshift-marketplace/redhat-operators-ghj9t" Dec 02 22:55:23 crc kubenswrapper[4696]: I1202 22:55:23.153391 4696 generic.go:334] "Generic (PLEG): container finished" podID="5c7fcd36-0d45-4703-89ab-df95e3ff5804" containerID="790b392950d96aa80c47a873b2c22eda7609c3f3c36b99ebacc9df67a549fa23" exitCode=0 Dec 02 22:55:23 crc kubenswrapper[4696]: I1202 22:55:23.153461 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7f559" event={"ID":"5c7fcd36-0d45-4703-89ab-df95e3ff5804","Type":"ContainerDied","Data":"790b392950d96aa80c47a873b2c22eda7609c3f3c36b99ebacc9df67a549fa23"} Dec 02 22:55:23 crc kubenswrapper[4696]: I1202 22:55:23.195799 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ghj9t" Dec 02 22:55:23 crc kubenswrapper[4696]: I1202 22:55:23.476974 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ghj9t"] Dec 02 22:55:24 crc kubenswrapper[4696]: I1202 22:55:24.162194 4696 generic.go:334] "Generic (PLEG): container finished" podID="adbd550b-14c1-4011-8e76-2cab0258cb28" containerID="502f7d3ca63a630b481df722b133c2998abe56cd8c0d6090fafa29e40e3d8365" exitCode=0 Dec 02 22:55:24 crc kubenswrapper[4696]: I1202 22:55:24.162256 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ghj9t" event={"ID":"adbd550b-14c1-4011-8e76-2cab0258cb28","Type":"ContainerDied","Data":"502f7d3ca63a630b481df722b133c2998abe56cd8c0d6090fafa29e40e3d8365"} Dec 02 22:55:24 crc kubenswrapper[4696]: I1202 22:55:24.162292 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ghj9t" event={"ID":"adbd550b-14c1-4011-8e76-2cab0258cb28","Type":"ContainerStarted","Data":"6f6b822f18d2ef2d818c5c86ca6bd1bcfce1f44bb3663475490629ba6bc56b86"} Dec 02 22:55:25 crc kubenswrapper[4696]: I1202 22:55:25.169366 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ghj9t" event={"ID":"adbd550b-14c1-4011-8e76-2cab0258cb28","Type":"ContainerStarted","Data":"626cf22adf1dbddd173201724f4b648753eb457ea5319380c21a2dded2a550b5"} Dec 02 22:55:25 crc kubenswrapper[4696]: I1202 22:55:25.171932 4696 generic.go:334] "Generic (PLEG): container finished" podID="5c7fcd36-0d45-4703-89ab-df95e3ff5804" containerID="89334401016b5b80688acddf5ed3c28554d5010944098f0d1e767e9a67ec89b7" exitCode=0 Dec 02 22:55:25 crc kubenswrapper[4696]: I1202 22:55:25.171981 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7f559" 
event={"ID":"5c7fcd36-0d45-4703-89ab-df95e3ff5804","Type":"ContainerDied","Data":"89334401016b5b80688acddf5ed3c28554d5010944098f0d1e767e9a67ec89b7"} Dec 02 22:55:26 crc kubenswrapper[4696]: I1202 22:55:26.179448 4696 generic.go:334] "Generic (PLEG): container finished" podID="5c7fcd36-0d45-4703-89ab-df95e3ff5804" containerID="c33cdb88cf5841c5910e275152cbbe0b6e1c80e72b21ed4b56fda0f68600e0ce" exitCode=0 Dec 02 22:55:26 crc kubenswrapper[4696]: I1202 22:55:26.179561 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7f559" event={"ID":"5c7fcd36-0d45-4703-89ab-df95e3ff5804","Type":"ContainerDied","Data":"c33cdb88cf5841c5910e275152cbbe0b6e1c80e72b21ed4b56fda0f68600e0ce"} Dec 02 22:55:27 crc kubenswrapper[4696]: I1202 22:55:27.190108 4696 generic.go:334] "Generic (PLEG): container finished" podID="adbd550b-14c1-4011-8e76-2cab0258cb28" containerID="626cf22adf1dbddd173201724f4b648753eb457ea5319380c21a2dded2a550b5" exitCode=0 Dec 02 22:55:27 crc kubenswrapper[4696]: I1202 22:55:27.190254 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ghj9t" event={"ID":"adbd550b-14c1-4011-8e76-2cab0258cb28","Type":"ContainerDied","Data":"626cf22adf1dbddd173201724f4b648753eb457ea5319380c21a2dded2a550b5"} Dec 02 22:55:27 crc kubenswrapper[4696]: I1202 22:55:27.625181 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7f559" Dec 02 22:55:27 crc kubenswrapper[4696]: I1202 22:55:27.725125 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5c7fcd36-0d45-4703-89ab-df95e3ff5804-bundle\") pod \"5c7fcd36-0d45-4703-89ab-df95e3ff5804\" (UID: \"5c7fcd36-0d45-4703-89ab-df95e3ff5804\") " Dec 02 22:55:27 crc kubenswrapper[4696]: I1202 22:55:27.725247 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r78kj\" (UniqueName: \"kubernetes.io/projected/5c7fcd36-0d45-4703-89ab-df95e3ff5804-kube-api-access-r78kj\") pod \"5c7fcd36-0d45-4703-89ab-df95e3ff5804\" (UID: \"5c7fcd36-0d45-4703-89ab-df95e3ff5804\") " Dec 02 22:55:27 crc kubenswrapper[4696]: I1202 22:55:27.725367 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5c7fcd36-0d45-4703-89ab-df95e3ff5804-util\") pod \"5c7fcd36-0d45-4703-89ab-df95e3ff5804\" (UID: \"5c7fcd36-0d45-4703-89ab-df95e3ff5804\") " Dec 02 22:55:27 crc kubenswrapper[4696]: I1202 22:55:27.726337 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c7fcd36-0d45-4703-89ab-df95e3ff5804-bundle" (OuterVolumeSpecName: "bundle") pod "5c7fcd36-0d45-4703-89ab-df95e3ff5804" (UID: "5c7fcd36-0d45-4703-89ab-df95e3ff5804"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 22:55:27 crc kubenswrapper[4696]: I1202 22:55:27.731258 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c7fcd36-0d45-4703-89ab-df95e3ff5804-kube-api-access-r78kj" (OuterVolumeSpecName: "kube-api-access-r78kj") pod "5c7fcd36-0d45-4703-89ab-df95e3ff5804" (UID: "5c7fcd36-0d45-4703-89ab-df95e3ff5804"). InnerVolumeSpecName "kube-api-access-r78kj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:55:27 crc kubenswrapper[4696]: I1202 22:55:27.827289 4696 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5c7fcd36-0d45-4703-89ab-df95e3ff5804-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 22:55:27 crc kubenswrapper[4696]: I1202 22:55:27.827319 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r78kj\" (UniqueName: \"kubernetes.io/projected/5c7fcd36-0d45-4703-89ab-df95e3ff5804-kube-api-access-r78kj\") on node \"crc\" DevicePath \"\"" Dec 02 22:55:27 crc kubenswrapper[4696]: I1202 22:55:27.896176 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c7fcd36-0d45-4703-89ab-df95e3ff5804-util" (OuterVolumeSpecName: "util") pod "5c7fcd36-0d45-4703-89ab-df95e3ff5804" (UID: "5c7fcd36-0d45-4703-89ab-df95e3ff5804"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 22:55:27 crc kubenswrapper[4696]: I1202 22:55:27.928396 4696 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5c7fcd36-0d45-4703-89ab-df95e3ff5804-util\") on node \"crc\" DevicePath \"\"" Dec 02 22:55:28 crc kubenswrapper[4696]: I1202 22:55:28.199000 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ghj9t" event={"ID":"adbd550b-14c1-4011-8e76-2cab0258cb28","Type":"ContainerStarted","Data":"7f504aa8037cdf85d80288686958fa0292ee2ef336610528829076b736d844ca"} Dec 02 22:55:28 crc kubenswrapper[4696]: I1202 22:55:28.204372 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7f559" event={"ID":"5c7fcd36-0d45-4703-89ab-df95e3ff5804","Type":"ContainerDied","Data":"1d86b2ccd1f441bfada20fc16a416835d85e0d7d657e47b486b8d396ff4648ca"} Dec 02 22:55:28 crc kubenswrapper[4696]: I1202 22:55:28.204421 4696 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1d86b2ccd1f441bfada20fc16a416835d85e0d7d657e47b486b8d396ff4648ca" Dec 02 22:55:28 crc kubenswrapper[4696]: I1202 22:55:28.204489 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7f559" Dec 02 22:55:28 crc kubenswrapper[4696]: I1202 22:55:28.223876 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ghj9t" podStartSLOduration=2.643046488 podStartE2EDuration="6.223854532s" podCreationTimestamp="2025-12-02 22:55:22 +0000 UTC" firstStartedPulling="2025-12-02 22:55:24.164064993 +0000 UTC m=+787.044744994" lastFinishedPulling="2025-12-02 22:55:27.744873037 +0000 UTC m=+790.625553038" observedRunningTime="2025-12-02 22:55:28.218118622 +0000 UTC m=+791.098798623" watchObservedRunningTime="2025-12-02 22:55:28.223854532 +0000 UTC m=+791.104534533" Dec 02 22:55:29 crc kubenswrapper[4696]: I1202 22:55:29.580489 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-cts9d"] Dec 02 22:55:29 crc kubenswrapper[4696]: E1202 22:55:29.581870 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c7fcd36-0d45-4703-89ab-df95e3ff5804" containerName="util" Dec 02 22:55:29 crc kubenswrapper[4696]: I1202 22:55:29.581974 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c7fcd36-0d45-4703-89ab-df95e3ff5804" containerName="util" Dec 02 22:55:29 crc kubenswrapper[4696]: E1202 22:55:29.582049 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c7fcd36-0d45-4703-89ab-df95e3ff5804" containerName="pull" Dec 02 22:55:29 crc kubenswrapper[4696]: I1202 22:55:29.582117 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c7fcd36-0d45-4703-89ab-df95e3ff5804" containerName="pull" Dec 02 22:55:29 crc kubenswrapper[4696]: E1202 
22:55:29.582192 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c7fcd36-0d45-4703-89ab-df95e3ff5804" containerName="extract" Dec 02 22:55:29 crc kubenswrapper[4696]: I1202 22:55:29.582260 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c7fcd36-0d45-4703-89ab-df95e3ff5804" containerName="extract" Dec 02 22:55:29 crc kubenswrapper[4696]: I1202 22:55:29.582477 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c7fcd36-0d45-4703-89ab-df95e3ff5804" containerName="extract" Dec 02 22:55:29 crc kubenswrapper[4696]: I1202 22:55:29.583695 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-cts9d" Dec 02 22:55:29 crc kubenswrapper[4696]: I1202 22:55:29.586068 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-5p9cl" Dec 02 22:55:29 crc kubenswrapper[4696]: I1202 22:55:29.586462 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Dec 02 22:55:29 crc kubenswrapper[4696]: I1202 22:55:29.586620 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Dec 02 22:55:29 crc kubenswrapper[4696]: I1202 22:55:29.600567 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-cts9d"] Dec 02 22:55:29 crc kubenswrapper[4696]: I1202 22:55:29.651755 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmbzz\" (UniqueName: \"kubernetes.io/projected/98262b9a-2be3-48d1-becc-84c3e9585c46-kube-api-access-cmbzz\") pod \"nmstate-operator-5b5b58f5c8-cts9d\" (UID: \"98262b9a-2be3-48d1-becc-84c3e9585c46\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-cts9d" Dec 02 22:55:29 crc kubenswrapper[4696]: I1202 22:55:29.752962 4696 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-cmbzz\" (UniqueName: \"kubernetes.io/projected/98262b9a-2be3-48d1-becc-84c3e9585c46-kube-api-access-cmbzz\") pod \"nmstate-operator-5b5b58f5c8-cts9d\" (UID: \"98262b9a-2be3-48d1-becc-84c3e9585c46\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-cts9d" Dec 02 22:55:29 crc kubenswrapper[4696]: I1202 22:55:29.778122 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmbzz\" (UniqueName: \"kubernetes.io/projected/98262b9a-2be3-48d1-becc-84c3e9585c46-kube-api-access-cmbzz\") pod \"nmstate-operator-5b5b58f5c8-cts9d\" (UID: \"98262b9a-2be3-48d1-becc-84c3e9585c46\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-cts9d" Dec 02 22:55:29 crc kubenswrapper[4696]: I1202 22:55:29.905469 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-cts9d" Dec 02 22:55:30 crc kubenswrapper[4696]: I1202 22:55:30.463691 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-cts9d"] Dec 02 22:55:31 crc kubenswrapper[4696]: I1202 22:55:31.223263 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-cts9d" event={"ID":"98262b9a-2be3-48d1-becc-84c3e9585c46","Type":"ContainerStarted","Data":"67beab4aa25d43b5a232bc06f43e534738b451c74c3d3af78e83093111375dc4"} Dec 02 22:55:33 crc kubenswrapper[4696]: I1202 22:55:33.196091 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-ghj9t" Dec 02 22:55:33 crc kubenswrapper[4696]: I1202 22:55:33.196639 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ghj9t" Dec 02 22:55:34 crc kubenswrapper[4696]: I1202 22:55:34.243275 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-cts9d" 
event={"ID":"98262b9a-2be3-48d1-becc-84c3e9585c46","Type":"ContainerStarted","Data":"db76deeb84faf19226484f466734f8fa1709b20e98a75b3202bab1aa6ac306fd"} Dec 02 22:55:34 crc kubenswrapper[4696]: I1202 22:55:34.264917 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-cts9d" podStartSLOduration=2.110885669 podStartE2EDuration="5.264895654s" podCreationTimestamp="2025-12-02 22:55:29 +0000 UTC" firstStartedPulling="2025-12-02 22:55:30.473187105 +0000 UTC m=+793.353867096" lastFinishedPulling="2025-12-02 22:55:33.62719707 +0000 UTC m=+796.507877081" observedRunningTime="2025-12-02 22:55:34.262267731 +0000 UTC m=+797.142947742" watchObservedRunningTime="2025-12-02 22:55:34.264895654 +0000 UTC m=+797.145575655" Dec 02 22:55:34 crc kubenswrapper[4696]: I1202 22:55:34.264941 4696 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-ghj9t" podUID="adbd550b-14c1-4011-8e76-2cab0258cb28" containerName="registry-server" probeResult="failure" output=< Dec 02 22:55:34 crc kubenswrapper[4696]: timeout: failed to connect service ":50051" within 1s Dec 02 22:55:34 crc kubenswrapper[4696]: > Dec 02 22:55:35 crc kubenswrapper[4696]: I1202 22:55:35.348120 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-lcb5l"] Dec 02 22:55:35 crc kubenswrapper[4696]: I1202 22:55:35.349277 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-lcb5l" Dec 02 22:55:35 crc kubenswrapper[4696]: I1202 22:55:35.352405 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-qvx8c" Dec 02 22:55:35 crc kubenswrapper[4696]: I1202 22:55:35.359436 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-srjgk"] Dec 02 22:55:35 crc kubenswrapper[4696]: I1202 22:55:35.360375 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-srjgk" Dec 02 22:55:35 crc kubenswrapper[4696]: I1202 22:55:35.363641 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Dec 02 22:55:35 crc kubenswrapper[4696]: I1202 22:55:35.376829 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-lcb5l"] Dec 02 22:55:35 crc kubenswrapper[4696]: I1202 22:55:35.383852 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-srjgk"] Dec 02 22:55:35 crc kubenswrapper[4696]: I1202 22:55:35.396419 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-8mz82"] Dec 02 22:55:35 crc kubenswrapper[4696]: I1202 22:55:35.398157 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-8mz82" Dec 02 22:55:35 crc kubenswrapper[4696]: I1202 22:55:35.501613 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-nl682"] Dec 02 22:55:35 crc kubenswrapper[4696]: I1202 22:55:35.502847 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-nl682" Dec 02 22:55:35 crc kubenswrapper[4696]: I1202 22:55:35.504825 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Dec 02 22:55:35 crc kubenswrapper[4696]: I1202 22:55:35.505128 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Dec 02 22:55:35 crc kubenswrapper[4696]: I1202 22:55:35.505373 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-ht6kl" Dec 02 22:55:35 crc kubenswrapper[4696]: I1202 22:55:35.519311 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-nl682"] Dec 02 22:55:35 crc kubenswrapper[4696]: I1202 22:55:35.527449 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/e69b657f-75dd-418a-80f8-1e3820f1ff88-ovs-socket\") pod \"nmstate-handler-8mz82\" (UID: \"e69b657f-75dd-418a-80f8-1e3820f1ff88\") " pod="openshift-nmstate/nmstate-handler-8mz82" Dec 02 22:55:35 crc kubenswrapper[4696]: I1202 22:55:35.527826 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/e69b657f-75dd-418a-80f8-1e3820f1ff88-dbus-socket\") pod \"nmstate-handler-8mz82\" (UID: \"e69b657f-75dd-418a-80f8-1e3820f1ff88\") " pod="openshift-nmstate/nmstate-handler-8mz82" Dec 02 22:55:35 crc kubenswrapper[4696]: I1202 22:55:35.527902 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/e69b657f-75dd-418a-80f8-1e3820f1ff88-nmstate-lock\") pod \"nmstate-handler-8mz82\" (UID: \"e69b657f-75dd-418a-80f8-1e3820f1ff88\") " pod="openshift-nmstate/nmstate-handler-8mz82" Dec 02 22:55:35 crc 
kubenswrapper[4696]: I1202 22:55:35.528005 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxpkt\" (UniqueName: \"kubernetes.io/projected/0301b6ea-801b-41a5-b96a-018412c37fc8-kube-api-access-sxpkt\") pod \"nmstate-webhook-5f6d4c5ccb-srjgk\" (UID: \"0301b6ea-801b-41a5-b96a-018412c37fc8\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-srjgk" Dec 02 22:55:35 crc kubenswrapper[4696]: I1202 22:55:35.528084 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/0301b6ea-801b-41a5-b96a-018412c37fc8-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-srjgk\" (UID: \"0301b6ea-801b-41a5-b96a-018412c37fc8\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-srjgk" Dec 02 22:55:35 crc kubenswrapper[4696]: I1202 22:55:35.528156 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pcw5d\" (UniqueName: \"kubernetes.io/projected/52790af0-09aa-4b8f-8350-054135e80896-kube-api-access-pcw5d\") pod \"nmstate-metrics-7f946cbc9-lcb5l\" (UID: \"52790af0-09aa-4b8f-8350-054135e80896\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-lcb5l" Dec 02 22:55:35 crc kubenswrapper[4696]: I1202 22:55:35.528218 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkxd7\" (UniqueName: \"kubernetes.io/projected/e69b657f-75dd-418a-80f8-1e3820f1ff88-kube-api-access-nkxd7\") pod \"nmstate-handler-8mz82\" (UID: \"e69b657f-75dd-418a-80f8-1e3820f1ff88\") " pod="openshift-nmstate/nmstate-handler-8mz82" Dec 02 22:55:35 crc kubenswrapper[4696]: I1202 22:55:35.630025 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqdl8\" (UniqueName: \"kubernetes.io/projected/16ae587c-763d-46f6-b211-e9b3752339c9-kube-api-access-bqdl8\") pod 
\"nmstate-console-plugin-7fbb5f6569-nl682\" (UID: \"16ae587c-763d-46f6-b211-e9b3752339c9\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-nl682" Dec 02 22:55:35 crc kubenswrapper[4696]: I1202 22:55:35.630400 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/16ae587c-763d-46f6-b211-e9b3752339c9-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-nl682\" (UID: \"16ae587c-763d-46f6-b211-e9b3752339c9\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-nl682" Dec 02 22:55:35 crc kubenswrapper[4696]: I1202 22:55:35.630496 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/e69b657f-75dd-418a-80f8-1e3820f1ff88-dbus-socket\") pod \"nmstate-handler-8mz82\" (UID: \"e69b657f-75dd-418a-80f8-1e3820f1ff88\") " pod="openshift-nmstate/nmstate-handler-8mz82" Dec 02 22:55:35 crc kubenswrapper[4696]: I1202 22:55:35.630569 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/e69b657f-75dd-418a-80f8-1e3820f1ff88-nmstate-lock\") pod \"nmstate-handler-8mz82\" (UID: \"e69b657f-75dd-418a-80f8-1e3820f1ff88\") " pod="openshift-nmstate/nmstate-handler-8mz82" Dec 02 22:55:35 crc kubenswrapper[4696]: I1202 22:55:35.630673 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/e69b657f-75dd-418a-80f8-1e3820f1ff88-nmstate-lock\") pod \"nmstate-handler-8mz82\" (UID: \"e69b657f-75dd-418a-80f8-1e3820f1ff88\") " pod="openshift-nmstate/nmstate-handler-8mz82" Dec 02 22:55:35 crc kubenswrapper[4696]: I1202 22:55:35.630687 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxpkt\" (UniqueName: 
\"kubernetes.io/projected/0301b6ea-801b-41a5-b96a-018412c37fc8-kube-api-access-sxpkt\") pod \"nmstate-webhook-5f6d4c5ccb-srjgk\" (UID: \"0301b6ea-801b-41a5-b96a-018412c37fc8\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-srjgk" Dec 02 22:55:35 crc kubenswrapper[4696]: I1202 22:55:35.630866 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/0301b6ea-801b-41a5-b96a-018412c37fc8-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-srjgk\" (UID: \"0301b6ea-801b-41a5-b96a-018412c37fc8\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-srjgk" Dec 02 22:55:35 crc kubenswrapper[4696]: I1202 22:55:35.630947 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pcw5d\" (UniqueName: \"kubernetes.io/projected/52790af0-09aa-4b8f-8350-054135e80896-kube-api-access-pcw5d\") pod \"nmstate-metrics-7f946cbc9-lcb5l\" (UID: \"52790af0-09aa-4b8f-8350-054135e80896\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-lcb5l" Dec 02 22:55:35 crc kubenswrapper[4696]: I1202 22:55:35.630996 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nkxd7\" (UniqueName: \"kubernetes.io/projected/e69b657f-75dd-418a-80f8-1e3820f1ff88-kube-api-access-nkxd7\") pod \"nmstate-handler-8mz82\" (UID: \"e69b657f-75dd-418a-80f8-1e3820f1ff88\") " pod="openshift-nmstate/nmstate-handler-8mz82" Dec 02 22:55:35 crc kubenswrapper[4696]: I1202 22:55:35.631043 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/e69b657f-75dd-418a-80f8-1e3820f1ff88-ovs-socket\") pod \"nmstate-handler-8mz82\" (UID: \"e69b657f-75dd-418a-80f8-1e3820f1ff88\") " pod="openshift-nmstate/nmstate-handler-8mz82" Dec 02 22:55:35 crc kubenswrapper[4696]: I1202 22:55:35.631094 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/16ae587c-763d-46f6-b211-e9b3752339c9-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-nl682\" (UID: \"16ae587c-763d-46f6-b211-e9b3752339c9\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-nl682" Dec 02 22:55:35 crc kubenswrapper[4696]: I1202 22:55:35.631029 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/e69b657f-75dd-418a-80f8-1e3820f1ff88-dbus-socket\") pod \"nmstate-handler-8mz82\" (UID: \"e69b657f-75dd-418a-80f8-1e3820f1ff88\") " pod="openshift-nmstate/nmstate-handler-8mz82" Dec 02 22:55:35 crc kubenswrapper[4696]: I1202 22:55:35.631255 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/e69b657f-75dd-418a-80f8-1e3820f1ff88-ovs-socket\") pod \"nmstate-handler-8mz82\" (UID: \"e69b657f-75dd-418a-80f8-1e3820f1ff88\") " pod="openshift-nmstate/nmstate-handler-8mz82" Dec 02 22:55:35 crc kubenswrapper[4696]: I1202 22:55:35.638793 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/0301b6ea-801b-41a5-b96a-018412c37fc8-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-srjgk\" (UID: \"0301b6ea-801b-41a5-b96a-018412c37fc8\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-srjgk" Dec 02 22:55:35 crc kubenswrapper[4696]: I1202 22:55:35.652027 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxpkt\" (UniqueName: \"kubernetes.io/projected/0301b6ea-801b-41a5-b96a-018412c37fc8-kube-api-access-sxpkt\") pod \"nmstate-webhook-5f6d4c5ccb-srjgk\" (UID: \"0301b6ea-801b-41a5-b96a-018412c37fc8\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-srjgk" Dec 02 22:55:35 crc kubenswrapper[4696]: I1202 22:55:35.655040 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkxd7\" (UniqueName: 
\"kubernetes.io/projected/e69b657f-75dd-418a-80f8-1e3820f1ff88-kube-api-access-nkxd7\") pod \"nmstate-handler-8mz82\" (UID: \"e69b657f-75dd-418a-80f8-1e3820f1ff88\") " pod="openshift-nmstate/nmstate-handler-8mz82" Dec 02 22:55:35 crc kubenswrapper[4696]: I1202 22:55:35.671518 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pcw5d\" (UniqueName: \"kubernetes.io/projected/52790af0-09aa-4b8f-8350-054135e80896-kube-api-access-pcw5d\") pod \"nmstate-metrics-7f946cbc9-lcb5l\" (UID: \"52790af0-09aa-4b8f-8350-054135e80896\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-lcb5l" Dec 02 22:55:35 crc kubenswrapper[4696]: I1202 22:55:35.677392 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-srjgk" Dec 02 22:55:35 crc kubenswrapper[4696]: I1202 22:55:35.710583 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-586ccddd-nv8zx"] Dec 02 22:55:35 crc kubenswrapper[4696]: I1202 22:55:35.720288 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-8mz82" Dec 02 22:55:35 crc kubenswrapper[4696]: I1202 22:55:35.721296 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-586ccddd-nv8zx" Dec 02 22:55:35 crc kubenswrapper[4696]: I1202 22:55:35.732005 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/16ae587c-763d-46f6-b211-e9b3752339c9-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-nl682\" (UID: \"16ae587c-763d-46f6-b211-e9b3752339c9\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-nl682" Dec 02 22:55:35 crc kubenswrapper[4696]: I1202 22:55:35.732306 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-586ccddd-nv8zx"] Dec 02 22:55:35 crc kubenswrapper[4696]: I1202 22:55:35.733110 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/16ae587c-763d-46f6-b211-e9b3752339c9-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-nl682\" (UID: \"16ae587c-763d-46f6-b211-e9b3752339c9\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-nl682" Dec 02 22:55:35 crc kubenswrapper[4696]: I1202 22:55:35.733177 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqdl8\" (UniqueName: \"kubernetes.io/projected/16ae587c-763d-46f6-b211-e9b3752339c9-kube-api-access-bqdl8\") pod \"nmstate-console-plugin-7fbb5f6569-nl682\" (UID: \"16ae587c-763d-46f6-b211-e9b3752339c9\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-nl682" Dec 02 22:55:35 crc kubenswrapper[4696]: I1202 22:55:35.733695 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/16ae587c-763d-46f6-b211-e9b3752339c9-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-nl682\" (UID: \"16ae587c-763d-46f6-b211-e9b3752339c9\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-nl682" Dec 02 22:55:35 crc kubenswrapper[4696]: I1202 22:55:35.737612 4696 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/16ae587c-763d-46f6-b211-e9b3752339c9-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-nl682\" (UID: \"16ae587c-763d-46f6-b211-e9b3752339c9\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-nl682" Dec 02 22:55:35 crc kubenswrapper[4696]: I1202 22:55:35.758388 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqdl8\" (UniqueName: \"kubernetes.io/projected/16ae587c-763d-46f6-b211-e9b3752339c9-kube-api-access-bqdl8\") pod \"nmstate-console-plugin-7fbb5f6569-nl682\" (UID: \"16ae587c-763d-46f6-b211-e9b3752339c9\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-nl682" Dec 02 22:55:35 crc kubenswrapper[4696]: W1202 22:55:35.771203 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode69b657f_75dd_418a_80f8_1e3820f1ff88.slice/crio-eb2b4d8f01342134f5581d4f836dde35ca97ff11178528ab6392f0962556669a WatchSource:0}: Error finding container eb2b4d8f01342134f5581d4f836dde35ca97ff11178528ab6392f0962556669a: Status 404 returned error can't find the container with id eb2b4d8f01342134f5581d4f836dde35ca97ff11178528ab6392f0962556669a Dec 02 22:55:35 crc kubenswrapper[4696]: I1202 22:55:35.822384 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-nl682" Dec 02 22:55:35 crc kubenswrapper[4696]: I1202 22:55:35.834856 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/955e3849-3105-4718-bc27-9be326955f76-oauth-serving-cert\") pod \"console-586ccddd-nv8zx\" (UID: \"955e3849-3105-4718-bc27-9be326955f76\") " pod="openshift-console/console-586ccddd-nv8zx" Dec 02 22:55:35 crc kubenswrapper[4696]: I1202 22:55:35.834916 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/955e3849-3105-4718-bc27-9be326955f76-console-serving-cert\") pod \"console-586ccddd-nv8zx\" (UID: \"955e3849-3105-4718-bc27-9be326955f76\") " pod="openshift-console/console-586ccddd-nv8zx" Dec 02 22:55:35 crc kubenswrapper[4696]: I1202 22:55:35.834945 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/955e3849-3105-4718-bc27-9be326955f76-console-oauth-config\") pod \"console-586ccddd-nv8zx\" (UID: \"955e3849-3105-4718-bc27-9be326955f76\") " pod="openshift-console/console-586ccddd-nv8zx" Dec 02 22:55:35 crc kubenswrapper[4696]: I1202 22:55:35.834975 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjtcx\" (UniqueName: \"kubernetes.io/projected/955e3849-3105-4718-bc27-9be326955f76-kube-api-access-sjtcx\") pod \"console-586ccddd-nv8zx\" (UID: \"955e3849-3105-4718-bc27-9be326955f76\") " pod="openshift-console/console-586ccddd-nv8zx" Dec 02 22:55:35 crc kubenswrapper[4696]: I1202 22:55:35.834997 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/955e3849-3105-4718-bc27-9be326955f76-service-ca\") pod \"console-586ccddd-nv8zx\" (UID: \"955e3849-3105-4718-bc27-9be326955f76\") " pod="openshift-console/console-586ccddd-nv8zx" Dec 02 22:55:35 crc kubenswrapper[4696]: I1202 22:55:35.835035 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/955e3849-3105-4718-bc27-9be326955f76-trusted-ca-bundle\") pod \"console-586ccddd-nv8zx\" (UID: \"955e3849-3105-4718-bc27-9be326955f76\") " pod="openshift-console/console-586ccddd-nv8zx" Dec 02 22:55:35 crc kubenswrapper[4696]: I1202 22:55:35.835067 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/955e3849-3105-4718-bc27-9be326955f76-console-config\") pod \"console-586ccddd-nv8zx\" (UID: \"955e3849-3105-4718-bc27-9be326955f76\") " pod="openshift-console/console-586ccddd-nv8zx" Dec 02 22:55:35 crc kubenswrapper[4696]: I1202 22:55:35.936150 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/955e3849-3105-4718-bc27-9be326955f76-console-config\") pod \"console-586ccddd-nv8zx\" (UID: \"955e3849-3105-4718-bc27-9be326955f76\") " pod="openshift-console/console-586ccddd-nv8zx" Dec 02 22:55:35 crc kubenswrapper[4696]: I1202 22:55:35.936227 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/955e3849-3105-4718-bc27-9be326955f76-oauth-serving-cert\") pod \"console-586ccddd-nv8zx\" (UID: \"955e3849-3105-4718-bc27-9be326955f76\") " pod="openshift-console/console-586ccddd-nv8zx" Dec 02 22:55:35 crc kubenswrapper[4696]: I1202 22:55:35.936259 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/955e3849-3105-4718-bc27-9be326955f76-console-serving-cert\") pod \"console-586ccddd-nv8zx\" (UID: \"955e3849-3105-4718-bc27-9be326955f76\") " pod="openshift-console/console-586ccddd-nv8zx" Dec 02 22:55:35 crc kubenswrapper[4696]: I1202 22:55:35.936288 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/955e3849-3105-4718-bc27-9be326955f76-console-oauth-config\") pod \"console-586ccddd-nv8zx\" (UID: \"955e3849-3105-4718-bc27-9be326955f76\") " pod="openshift-console/console-586ccddd-nv8zx" Dec 02 22:55:35 crc kubenswrapper[4696]: I1202 22:55:35.936327 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjtcx\" (UniqueName: \"kubernetes.io/projected/955e3849-3105-4718-bc27-9be326955f76-kube-api-access-sjtcx\") pod \"console-586ccddd-nv8zx\" (UID: \"955e3849-3105-4718-bc27-9be326955f76\") " pod="openshift-console/console-586ccddd-nv8zx" Dec 02 22:55:35 crc kubenswrapper[4696]: I1202 22:55:35.936351 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/955e3849-3105-4718-bc27-9be326955f76-service-ca\") pod \"console-586ccddd-nv8zx\" (UID: \"955e3849-3105-4718-bc27-9be326955f76\") " pod="openshift-console/console-586ccddd-nv8zx" Dec 02 22:55:35 crc kubenswrapper[4696]: I1202 22:55:35.936398 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/955e3849-3105-4718-bc27-9be326955f76-trusted-ca-bundle\") pod \"console-586ccddd-nv8zx\" (UID: \"955e3849-3105-4718-bc27-9be326955f76\") " pod="openshift-console/console-586ccddd-nv8zx" Dec 02 22:55:35 crc kubenswrapper[4696]: I1202 22:55:35.938219 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/955e3849-3105-4718-bc27-9be326955f76-trusted-ca-bundle\") pod \"console-586ccddd-nv8zx\" (UID: \"955e3849-3105-4718-bc27-9be326955f76\") " pod="openshift-console/console-586ccddd-nv8zx" Dec 02 22:55:35 crc kubenswrapper[4696]: I1202 22:55:35.939457 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/955e3849-3105-4718-bc27-9be326955f76-console-config\") pod \"console-586ccddd-nv8zx\" (UID: \"955e3849-3105-4718-bc27-9be326955f76\") " pod="openshift-console/console-586ccddd-nv8zx" Dec 02 22:55:35 crc kubenswrapper[4696]: I1202 22:55:35.941436 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/955e3849-3105-4718-bc27-9be326955f76-service-ca\") pod \"console-586ccddd-nv8zx\" (UID: \"955e3849-3105-4718-bc27-9be326955f76\") " pod="openshift-console/console-586ccddd-nv8zx" Dec 02 22:55:35 crc kubenswrapper[4696]: I1202 22:55:35.942437 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/955e3849-3105-4718-bc27-9be326955f76-oauth-serving-cert\") pod \"console-586ccddd-nv8zx\" (UID: \"955e3849-3105-4718-bc27-9be326955f76\") " pod="openshift-console/console-586ccddd-nv8zx" Dec 02 22:55:35 crc kubenswrapper[4696]: I1202 22:55:35.955099 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-srjgk"] Dec 02 22:55:35 crc kubenswrapper[4696]: I1202 22:55:35.959228 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjtcx\" (UniqueName: \"kubernetes.io/projected/955e3849-3105-4718-bc27-9be326955f76-kube-api-access-sjtcx\") pod \"console-586ccddd-nv8zx\" (UID: \"955e3849-3105-4718-bc27-9be326955f76\") " pod="openshift-console/console-586ccddd-nv8zx" Dec 02 22:55:35 crc kubenswrapper[4696]: I1202 22:55:35.959281 4696 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/955e3849-3105-4718-bc27-9be326955f76-console-serving-cert\") pod \"console-586ccddd-nv8zx\" (UID: \"955e3849-3105-4718-bc27-9be326955f76\") " pod="openshift-console/console-586ccddd-nv8zx" Dec 02 22:55:35 crc kubenswrapper[4696]: I1202 22:55:35.961627 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/955e3849-3105-4718-bc27-9be326955f76-console-oauth-config\") pod \"console-586ccddd-nv8zx\" (UID: \"955e3849-3105-4718-bc27-9be326955f76\") " pod="openshift-console/console-586ccddd-nv8zx" Dec 02 22:55:35 crc kubenswrapper[4696]: I1202 22:55:35.967713 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-lcb5l" Dec 02 22:55:36 crc kubenswrapper[4696]: W1202 22:55:36.040441 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16ae587c_763d_46f6_b211_e9b3752339c9.slice/crio-ff774a69bc28adcb4782b83b7db4b58fb97ff5a2724dcd173ef0a1714920e057 WatchSource:0}: Error finding container ff774a69bc28adcb4782b83b7db4b58fb97ff5a2724dcd173ef0a1714920e057: Status 404 returned error can't find the container with id ff774a69bc28adcb4782b83b7db4b58fb97ff5a2724dcd173ef0a1714920e057 Dec 02 22:55:36 crc kubenswrapper[4696]: I1202 22:55:36.045453 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-nl682"] Dec 02 22:55:36 crc kubenswrapper[4696]: I1202 22:55:36.075542 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-586ccddd-nv8zx" Dec 02 22:55:36 crc kubenswrapper[4696]: I1202 22:55:36.201983 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-lcb5l"] Dec 02 22:55:36 crc kubenswrapper[4696]: I1202 22:55:36.260102 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-8mz82" event={"ID":"e69b657f-75dd-418a-80f8-1e3820f1ff88","Type":"ContainerStarted","Data":"eb2b4d8f01342134f5581d4f836dde35ca97ff11178528ab6392f0962556669a"} Dec 02 22:55:36 crc kubenswrapper[4696]: I1202 22:55:36.262697 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-nl682" event={"ID":"16ae587c-763d-46f6-b211-e9b3752339c9","Type":"ContainerStarted","Data":"ff774a69bc28adcb4782b83b7db4b58fb97ff5a2724dcd173ef0a1714920e057"} Dec 02 22:55:36 crc kubenswrapper[4696]: I1202 22:55:36.263622 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-lcb5l" event={"ID":"52790af0-09aa-4b8f-8350-054135e80896","Type":"ContainerStarted","Data":"5e9ec1821b4cd0c20069010186a65324b0244ff6d4f61ecb321465a48d395539"} Dec 02 22:55:36 crc kubenswrapper[4696]: I1202 22:55:36.264616 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-srjgk" event={"ID":"0301b6ea-801b-41a5-b96a-018412c37fc8","Type":"ContainerStarted","Data":"9a1a926741822cd89bd6c8648847d52612e222befdea0716361c27cd61e9a72c"} Dec 02 22:55:36 crc kubenswrapper[4696]: I1202 22:55:36.529127 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-586ccddd-nv8zx"] Dec 02 22:55:36 crc kubenswrapper[4696]: W1202 22:55:36.532419 4696 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod955e3849_3105_4718_bc27_9be326955f76.slice/crio-1dba4ea74ccd4e42dc7abbd2b12d7c29198bdc274fc79cb4fe03ffd65f55ee16 WatchSource:0}: Error finding container 1dba4ea74ccd4e42dc7abbd2b12d7c29198bdc274fc79cb4fe03ffd65f55ee16: Status 404 returned error can't find the container with id 1dba4ea74ccd4e42dc7abbd2b12d7c29198bdc274fc79cb4fe03ffd65f55ee16 Dec 02 22:55:37 crc kubenswrapper[4696]: I1202 22:55:37.281059 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-586ccddd-nv8zx" event={"ID":"955e3849-3105-4718-bc27-9be326955f76","Type":"ContainerStarted","Data":"1dba4ea74ccd4e42dc7abbd2b12d7c29198bdc274fc79cb4fe03ffd65f55ee16"} Dec 02 22:55:38 crc kubenswrapper[4696]: I1202 22:55:38.292138 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-586ccddd-nv8zx" event={"ID":"955e3849-3105-4718-bc27-9be326955f76","Type":"ContainerStarted","Data":"a044e4dfd85486e5fda03c9fb413691c373348a684be52819cf5600f4daf72c0"} Dec 02 22:55:38 crc kubenswrapper[4696]: I1202 22:55:38.314923 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-586ccddd-nv8zx" podStartSLOduration=3.314895269 podStartE2EDuration="3.314895269s" podCreationTimestamp="2025-12-02 22:55:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 22:55:38.310468225 +0000 UTC m=+801.191148226" watchObservedRunningTime="2025-12-02 22:55:38.314895269 +0000 UTC m=+801.195575270" Dec 02 22:55:40 crc kubenswrapper[4696]: I1202 22:55:40.325640 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-srjgk" Dec 02 22:55:40 crc kubenswrapper[4696]: I1202 22:55:40.361310 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-srjgk" 
podStartSLOduration=1.239086599 podStartE2EDuration="5.361289409s" podCreationTimestamp="2025-12-02 22:55:35 +0000 UTC" firstStartedPulling="2025-12-02 22:55:35.966713709 +0000 UTC m=+798.847393710" lastFinishedPulling="2025-12-02 22:55:40.088916509 +0000 UTC m=+802.969596520" observedRunningTime="2025-12-02 22:55:40.361115494 +0000 UTC m=+803.241795505" watchObservedRunningTime="2025-12-02 22:55:40.361289409 +0000 UTC m=+803.241969420" Dec 02 22:55:41 crc kubenswrapper[4696]: I1202 22:55:41.337356 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-lcb5l" event={"ID":"52790af0-09aa-4b8f-8350-054135e80896","Type":"ContainerStarted","Data":"43217cca763cb1f68ea79ee427207978482101c316cd51dee3e52687589be323"} Dec 02 22:55:41 crc kubenswrapper[4696]: I1202 22:55:41.339496 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-srjgk" event={"ID":"0301b6ea-801b-41a5-b96a-018412c37fc8","Type":"ContainerStarted","Data":"ca2ee5fa180959b24f65b8493b61d6731428156d41665f177d710b62225e2318"} Dec 02 22:55:41 crc kubenswrapper[4696]: I1202 22:55:41.342824 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-8mz82" event={"ID":"e69b657f-75dd-418a-80f8-1e3820f1ff88","Type":"ContainerStarted","Data":"7950f95a60b52c375c72f4fd863a40b645175b65edcea810423f0e77a85f34d9"} Dec 02 22:55:41 crc kubenswrapper[4696]: I1202 22:55:41.342900 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-8mz82" Dec 02 22:55:41 crc kubenswrapper[4696]: I1202 22:55:41.345262 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-nl682" event={"ID":"16ae587c-763d-46f6-b211-e9b3752339c9","Type":"ContainerStarted","Data":"5bf84572953d6862bcf44a409944ba9272987e78212dd7ac9aefe7e0acb7afa0"} Dec 02 22:55:41 crc kubenswrapper[4696]: I1202 22:55:41.369877 4696 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-8mz82" podStartSLOduration=2.051096256 podStartE2EDuration="6.369856731s" podCreationTimestamp="2025-12-02 22:55:35 +0000 UTC" firstStartedPulling="2025-12-02 22:55:35.773267591 +0000 UTC m=+798.653947592" lastFinishedPulling="2025-12-02 22:55:40.092028036 +0000 UTC m=+802.972708067" observedRunningTime="2025-12-02 22:55:41.369149641 +0000 UTC m=+804.249829682" watchObservedRunningTime="2025-12-02 22:55:41.369856731 +0000 UTC m=+804.250536732" Dec 02 22:55:41 crc kubenswrapper[4696]: I1202 22:55:41.400450 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-nl682" podStartSLOduration=2.362017032 podStartE2EDuration="6.400413264s" podCreationTimestamp="2025-12-02 22:55:35 +0000 UTC" firstStartedPulling="2025-12-02 22:55:36.04308481 +0000 UTC m=+798.923764811" lastFinishedPulling="2025-12-02 22:55:40.081481032 +0000 UTC m=+802.962161043" observedRunningTime="2025-12-02 22:55:41.3880879 +0000 UTC m=+804.268767911" watchObservedRunningTime="2025-12-02 22:55:41.400413264 +0000 UTC m=+804.281093305" Dec 02 22:55:43 crc kubenswrapper[4696]: I1202 22:55:43.289807 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ghj9t" Dec 02 22:55:43 crc kubenswrapper[4696]: I1202 22:55:43.364400 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-lcb5l" event={"ID":"52790af0-09aa-4b8f-8350-054135e80896","Type":"ContainerStarted","Data":"27fc803adabd1be182c4a98e7550d54507e5a30703dfba8ea1d44d7190c737ab"} Dec 02 22:55:43 crc kubenswrapper[4696]: I1202 22:55:43.382217 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-ghj9t" Dec 02 22:55:43 crc kubenswrapper[4696]: I1202 22:55:43.388193 4696 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-lcb5l" podStartSLOduration=2.11502316 podStartE2EDuration="8.388174787s" podCreationTimestamp="2025-12-02 22:55:35 +0000 UTC" firstStartedPulling="2025-12-02 22:55:36.2247993 +0000 UTC m=+799.105479301" lastFinishedPulling="2025-12-02 22:55:42.497950917 +0000 UTC m=+805.378630928" observedRunningTime="2025-12-02 22:55:43.386870041 +0000 UTC m=+806.267550052" watchObservedRunningTime="2025-12-02 22:55:43.388174787 +0000 UTC m=+806.268854788" Dec 02 22:55:43 crc kubenswrapper[4696]: I1202 22:55:43.539819 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ghj9t"] Dec 02 22:55:44 crc kubenswrapper[4696]: I1202 22:55:44.374584 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-ghj9t" podUID="adbd550b-14c1-4011-8e76-2cab0258cb28" containerName="registry-server" containerID="cri-o://7f504aa8037cdf85d80288686958fa0292ee2ef336610528829076b736d844ca" gracePeriod=2 Dec 02 22:55:45 crc kubenswrapper[4696]: I1202 22:55:45.341078 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ghj9t" Dec 02 22:55:45 crc kubenswrapper[4696]: I1202 22:55:45.385586 4696 generic.go:334] "Generic (PLEG): container finished" podID="adbd550b-14c1-4011-8e76-2cab0258cb28" containerID="7f504aa8037cdf85d80288686958fa0292ee2ef336610528829076b736d844ca" exitCode=0 Dec 02 22:55:45 crc kubenswrapper[4696]: I1202 22:55:45.385660 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ghj9t" event={"ID":"adbd550b-14c1-4011-8e76-2cab0258cb28","Type":"ContainerDied","Data":"7f504aa8037cdf85d80288686958fa0292ee2ef336610528829076b736d844ca"} Dec 02 22:55:45 crc kubenswrapper[4696]: I1202 22:55:45.385712 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ghj9t" event={"ID":"adbd550b-14c1-4011-8e76-2cab0258cb28","Type":"ContainerDied","Data":"6f6b822f18d2ef2d818c5c86ca6bd1bcfce1f44bb3663475490629ba6bc56b86"} Dec 02 22:55:45 crc kubenswrapper[4696]: I1202 22:55:45.385708 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ghj9t" Dec 02 22:55:45 crc kubenswrapper[4696]: I1202 22:55:45.385734 4696 scope.go:117] "RemoveContainer" containerID="7f504aa8037cdf85d80288686958fa0292ee2ef336610528829076b736d844ca" Dec 02 22:55:45 crc kubenswrapper[4696]: I1202 22:55:45.408428 4696 scope.go:117] "RemoveContainer" containerID="626cf22adf1dbddd173201724f4b648753eb457ea5319380c21a2dded2a550b5" Dec 02 22:55:45 crc kubenswrapper[4696]: I1202 22:55:45.428724 4696 scope.go:117] "RemoveContainer" containerID="502f7d3ca63a630b481df722b133c2998abe56cd8c0d6090fafa29e40e3d8365" Dec 02 22:55:45 crc kubenswrapper[4696]: I1202 22:55:45.453840 4696 scope.go:117] "RemoveContainer" containerID="7f504aa8037cdf85d80288686958fa0292ee2ef336610528829076b736d844ca" Dec 02 22:55:45 crc kubenswrapper[4696]: E1202 22:55:45.454394 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f504aa8037cdf85d80288686958fa0292ee2ef336610528829076b736d844ca\": container with ID starting with 7f504aa8037cdf85d80288686958fa0292ee2ef336610528829076b736d844ca not found: ID does not exist" containerID="7f504aa8037cdf85d80288686958fa0292ee2ef336610528829076b736d844ca" Dec 02 22:55:45 crc kubenswrapper[4696]: I1202 22:55:45.454446 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f504aa8037cdf85d80288686958fa0292ee2ef336610528829076b736d844ca"} err="failed to get container status \"7f504aa8037cdf85d80288686958fa0292ee2ef336610528829076b736d844ca\": rpc error: code = NotFound desc = could not find container \"7f504aa8037cdf85d80288686958fa0292ee2ef336610528829076b736d844ca\": container with ID starting with 7f504aa8037cdf85d80288686958fa0292ee2ef336610528829076b736d844ca not found: ID does not exist" Dec 02 22:55:45 crc kubenswrapper[4696]: I1202 22:55:45.454477 4696 scope.go:117] "RemoveContainer" 
containerID="626cf22adf1dbddd173201724f4b648753eb457ea5319380c21a2dded2a550b5" Dec 02 22:55:45 crc kubenswrapper[4696]: E1202 22:55:45.454946 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"626cf22adf1dbddd173201724f4b648753eb457ea5319380c21a2dded2a550b5\": container with ID starting with 626cf22adf1dbddd173201724f4b648753eb457ea5319380c21a2dded2a550b5 not found: ID does not exist" containerID="626cf22adf1dbddd173201724f4b648753eb457ea5319380c21a2dded2a550b5" Dec 02 22:55:45 crc kubenswrapper[4696]: I1202 22:55:45.454986 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"626cf22adf1dbddd173201724f4b648753eb457ea5319380c21a2dded2a550b5"} err="failed to get container status \"626cf22adf1dbddd173201724f4b648753eb457ea5319380c21a2dded2a550b5\": rpc error: code = NotFound desc = could not find container \"626cf22adf1dbddd173201724f4b648753eb457ea5319380c21a2dded2a550b5\": container with ID starting with 626cf22adf1dbddd173201724f4b648753eb457ea5319380c21a2dded2a550b5 not found: ID does not exist" Dec 02 22:55:45 crc kubenswrapper[4696]: I1202 22:55:45.455015 4696 scope.go:117] "RemoveContainer" containerID="502f7d3ca63a630b481df722b133c2998abe56cd8c0d6090fafa29e40e3d8365" Dec 02 22:55:45 crc kubenswrapper[4696]: E1202 22:55:45.455342 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"502f7d3ca63a630b481df722b133c2998abe56cd8c0d6090fafa29e40e3d8365\": container with ID starting with 502f7d3ca63a630b481df722b133c2998abe56cd8c0d6090fafa29e40e3d8365 not found: ID does not exist" containerID="502f7d3ca63a630b481df722b133c2998abe56cd8c0d6090fafa29e40e3d8365" Dec 02 22:55:45 crc kubenswrapper[4696]: I1202 22:55:45.455386 4696 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"502f7d3ca63a630b481df722b133c2998abe56cd8c0d6090fafa29e40e3d8365"} err="failed to get container status \"502f7d3ca63a630b481df722b133c2998abe56cd8c0d6090fafa29e40e3d8365\": rpc error: code = NotFound desc = could not find container \"502f7d3ca63a630b481df722b133c2998abe56cd8c0d6090fafa29e40e3d8365\": container with ID starting with 502f7d3ca63a630b481df722b133c2998abe56cd8c0d6090fafa29e40e3d8365 not found: ID does not exist" Dec 02 22:55:45 crc kubenswrapper[4696]: I1202 22:55:45.498777 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/adbd550b-14c1-4011-8e76-2cab0258cb28-utilities\") pod \"adbd550b-14c1-4011-8e76-2cab0258cb28\" (UID: \"adbd550b-14c1-4011-8e76-2cab0258cb28\") " Dec 02 22:55:45 crc kubenswrapper[4696]: I1202 22:55:45.499062 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/adbd550b-14c1-4011-8e76-2cab0258cb28-catalog-content\") pod \"adbd550b-14c1-4011-8e76-2cab0258cb28\" (UID: \"adbd550b-14c1-4011-8e76-2cab0258cb28\") " Dec 02 22:55:45 crc kubenswrapper[4696]: I1202 22:55:45.499155 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pr5n7\" (UniqueName: \"kubernetes.io/projected/adbd550b-14c1-4011-8e76-2cab0258cb28-kube-api-access-pr5n7\") pod \"adbd550b-14c1-4011-8e76-2cab0258cb28\" (UID: \"adbd550b-14c1-4011-8e76-2cab0258cb28\") " Dec 02 22:55:45 crc kubenswrapper[4696]: I1202 22:55:45.500282 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/adbd550b-14c1-4011-8e76-2cab0258cb28-utilities" (OuterVolumeSpecName: "utilities") pod "adbd550b-14c1-4011-8e76-2cab0258cb28" (UID: "adbd550b-14c1-4011-8e76-2cab0258cb28"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 22:55:45 crc kubenswrapper[4696]: I1202 22:55:45.506508 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/adbd550b-14c1-4011-8e76-2cab0258cb28-kube-api-access-pr5n7" (OuterVolumeSpecName: "kube-api-access-pr5n7") pod "adbd550b-14c1-4011-8e76-2cab0258cb28" (UID: "adbd550b-14c1-4011-8e76-2cab0258cb28"). InnerVolumeSpecName "kube-api-access-pr5n7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:55:45 crc kubenswrapper[4696]: I1202 22:55:45.601595 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pr5n7\" (UniqueName: \"kubernetes.io/projected/adbd550b-14c1-4011-8e76-2cab0258cb28-kube-api-access-pr5n7\") on node \"crc\" DevicePath \"\"" Dec 02 22:55:45 crc kubenswrapper[4696]: I1202 22:55:45.602011 4696 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/adbd550b-14c1-4011-8e76-2cab0258cb28-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 22:55:45 crc kubenswrapper[4696]: I1202 22:55:45.635021 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/adbd550b-14c1-4011-8e76-2cab0258cb28-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "adbd550b-14c1-4011-8e76-2cab0258cb28" (UID: "adbd550b-14c1-4011-8e76-2cab0258cb28"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 22:55:45 crc kubenswrapper[4696]: I1202 22:55:45.703446 4696 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/adbd550b-14c1-4011-8e76-2cab0258cb28-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 22:55:45 crc kubenswrapper[4696]: I1202 22:55:45.743686 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ghj9t"] Dec 02 22:55:45 crc kubenswrapper[4696]: I1202 22:55:45.753815 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-ghj9t"] Dec 02 22:55:45 crc kubenswrapper[4696]: I1202 22:55:45.764913 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-8mz82" Dec 02 22:55:46 crc kubenswrapper[4696]: I1202 22:55:46.076876 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-586ccddd-nv8zx" Dec 02 22:55:46 crc kubenswrapper[4696]: I1202 22:55:46.076963 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-586ccddd-nv8zx" Dec 02 22:55:46 crc kubenswrapper[4696]: I1202 22:55:46.087342 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-586ccddd-nv8zx" Dec 02 22:55:46 crc kubenswrapper[4696]: I1202 22:55:46.398166 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-586ccddd-nv8zx" Dec 02 22:55:46 crc kubenswrapper[4696]: I1202 22:55:46.451981 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-f6wj6"] Dec 02 22:55:47 crc kubenswrapper[4696]: I1202 22:55:47.443818 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="adbd550b-14c1-4011-8e76-2cab0258cb28" path="/var/lib/kubelet/pods/adbd550b-14c1-4011-8e76-2cab0258cb28/volumes" 
Dec 02 22:55:52 crc kubenswrapper[4696]: I1202 22:55:52.974395 4696 patch_prober.go:28] interesting pod/machine-config-daemon-chq65 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 22:55:52 crc kubenswrapper[4696]: I1202 22:55:52.975249 4696 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 22:55:55 crc kubenswrapper[4696]: I1202 22:55:55.686385 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-srjgk" Dec 02 22:56:11 crc kubenswrapper[4696]: I1202 22:56:11.503307 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-f6wj6" podUID="cab80860-b375-43ce-9df7-16ed59a8247a" containerName="console" containerID="cri-o://f6be4a8b680a81b35c99eb6be900fc50326d748ca3b91a6439e06e2b7f8aa689" gracePeriod=15 Dec 02 22:56:12 crc kubenswrapper[4696]: I1202 22:56:12.093792 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83pt4w2"] Dec 02 22:56:12 crc kubenswrapper[4696]: E1202 22:56:12.094105 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="adbd550b-14c1-4011-8e76-2cab0258cb28" containerName="extract-utilities" Dec 02 22:56:12 crc kubenswrapper[4696]: I1202 22:56:12.094122 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="adbd550b-14c1-4011-8e76-2cab0258cb28" containerName="extract-utilities" Dec 02 22:56:12 crc kubenswrapper[4696]: E1202 22:56:12.094143 4696 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="adbd550b-14c1-4011-8e76-2cab0258cb28" containerName="registry-server" Dec 02 22:56:12 crc kubenswrapper[4696]: I1202 22:56:12.094154 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="adbd550b-14c1-4011-8e76-2cab0258cb28" containerName="registry-server" Dec 02 22:56:12 crc kubenswrapper[4696]: E1202 22:56:12.094170 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="adbd550b-14c1-4011-8e76-2cab0258cb28" containerName="extract-content" Dec 02 22:56:12 crc kubenswrapper[4696]: I1202 22:56:12.094180 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="adbd550b-14c1-4011-8e76-2cab0258cb28" containerName="extract-content" Dec 02 22:56:12 crc kubenswrapper[4696]: I1202 22:56:12.094334 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="adbd550b-14c1-4011-8e76-2cab0258cb28" containerName="registry-server" Dec 02 22:56:12 crc kubenswrapper[4696]: I1202 22:56:12.095448 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83pt4w2" Dec 02 22:56:12 crc kubenswrapper[4696]: I1202 22:56:12.100091 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 02 22:56:12 crc kubenswrapper[4696]: I1202 22:56:12.111766 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83pt4w2"] Dec 02 22:56:12 crc kubenswrapper[4696]: I1202 22:56:12.228103 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/97c440c0-e159-4a54-a3b4-eb53d72ae698-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83pt4w2\" (UID: \"97c440c0-e159-4a54-a3b4-eb53d72ae698\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83pt4w2" Dec 02 22:56:12 crc kubenswrapper[4696]: I1202 22:56:12.228202 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/97c440c0-e159-4a54-a3b4-eb53d72ae698-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83pt4w2\" (UID: \"97c440c0-e159-4a54-a3b4-eb53d72ae698\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83pt4w2" Dec 02 22:56:12 crc kubenswrapper[4696]: I1202 22:56:12.228230 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbht9\" (UniqueName: \"kubernetes.io/projected/97c440c0-e159-4a54-a3b4-eb53d72ae698-kube-api-access-bbht9\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83pt4w2\" (UID: \"97c440c0-e159-4a54-a3b4-eb53d72ae698\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83pt4w2" Dec 02 22:56:12 crc kubenswrapper[4696]: 
I1202 22:56:12.330406 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/97c440c0-e159-4a54-a3b4-eb53d72ae698-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83pt4w2\" (UID: \"97c440c0-e159-4a54-a3b4-eb53d72ae698\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83pt4w2" Dec 02 22:56:12 crc kubenswrapper[4696]: I1202 22:56:12.330478 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbht9\" (UniqueName: \"kubernetes.io/projected/97c440c0-e159-4a54-a3b4-eb53d72ae698-kube-api-access-bbht9\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83pt4w2\" (UID: \"97c440c0-e159-4a54-a3b4-eb53d72ae698\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83pt4w2" Dec 02 22:56:12 crc kubenswrapper[4696]: I1202 22:56:12.330519 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/97c440c0-e159-4a54-a3b4-eb53d72ae698-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83pt4w2\" (UID: \"97c440c0-e159-4a54-a3b4-eb53d72ae698\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83pt4w2" Dec 02 22:56:12 crc kubenswrapper[4696]: I1202 22:56:12.331468 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/97c440c0-e159-4a54-a3b4-eb53d72ae698-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83pt4w2\" (UID: \"97c440c0-e159-4a54-a3b4-eb53d72ae698\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83pt4w2" Dec 02 22:56:12 crc kubenswrapper[4696]: I1202 22:56:12.331548 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/97c440c0-e159-4a54-a3b4-eb53d72ae698-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83pt4w2\" (UID: \"97c440c0-e159-4a54-a3b4-eb53d72ae698\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83pt4w2" Dec 02 22:56:12 crc kubenswrapper[4696]: I1202 22:56:12.355086 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbht9\" (UniqueName: \"kubernetes.io/projected/97c440c0-e159-4a54-a3b4-eb53d72ae698-kube-api-access-bbht9\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83pt4w2\" (UID: \"97c440c0-e159-4a54-a3b4-eb53d72ae698\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83pt4w2" Dec 02 22:56:12 crc kubenswrapper[4696]: I1202 22:56:12.397834 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-f6wj6_cab80860-b375-43ce-9df7-16ed59a8247a/console/0.log" Dec 02 22:56:12 crc kubenswrapper[4696]: I1202 22:56:12.397915 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-f6wj6" Dec 02 22:56:12 crc kubenswrapper[4696]: I1202 22:56:12.415014 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83pt4w2" Dec 02 22:56:12 crc kubenswrapper[4696]: I1202 22:56:12.532615 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cab80860-b375-43ce-9df7-16ed59a8247a-console-serving-cert\") pod \"cab80860-b375-43ce-9df7-16ed59a8247a\" (UID: \"cab80860-b375-43ce-9df7-16ed59a8247a\") " Dec 02 22:56:12 crc kubenswrapper[4696]: I1202 22:56:12.533017 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cab80860-b375-43ce-9df7-16ed59a8247a-console-oauth-config\") pod \"cab80860-b375-43ce-9df7-16ed59a8247a\" (UID: \"cab80860-b375-43ce-9df7-16ed59a8247a\") " Dec 02 22:56:12 crc kubenswrapper[4696]: I1202 22:56:12.533049 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cab80860-b375-43ce-9df7-16ed59a8247a-trusted-ca-bundle\") pod \"cab80860-b375-43ce-9df7-16ed59a8247a\" (UID: \"cab80860-b375-43ce-9df7-16ed59a8247a\") " Dec 02 22:56:12 crc kubenswrapper[4696]: I1202 22:56:12.533078 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cab80860-b375-43ce-9df7-16ed59a8247a-service-ca\") pod \"cab80860-b375-43ce-9df7-16ed59a8247a\" (UID: \"cab80860-b375-43ce-9df7-16ed59a8247a\") " Dec 02 22:56:12 crc kubenswrapper[4696]: I1202 22:56:12.533117 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cab80860-b375-43ce-9df7-16ed59a8247a-console-config\") pod \"cab80860-b375-43ce-9df7-16ed59a8247a\" (UID: \"cab80860-b375-43ce-9df7-16ed59a8247a\") " Dec 02 22:56:12 crc kubenswrapper[4696]: I1202 22:56:12.533156 4696 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cab80860-b375-43ce-9df7-16ed59a8247a-oauth-serving-cert\") pod \"cab80860-b375-43ce-9df7-16ed59a8247a\" (UID: \"cab80860-b375-43ce-9df7-16ed59a8247a\") " Dec 02 22:56:12 crc kubenswrapper[4696]: I1202 22:56:12.533213 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hmhvw\" (UniqueName: \"kubernetes.io/projected/cab80860-b375-43ce-9df7-16ed59a8247a-kube-api-access-hmhvw\") pod \"cab80860-b375-43ce-9df7-16ed59a8247a\" (UID: \"cab80860-b375-43ce-9df7-16ed59a8247a\") " Dec 02 22:56:12 crc kubenswrapper[4696]: I1202 22:56:12.534481 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cab80860-b375-43ce-9df7-16ed59a8247a-service-ca" (OuterVolumeSpecName: "service-ca") pod "cab80860-b375-43ce-9df7-16ed59a8247a" (UID: "cab80860-b375-43ce-9df7-16ed59a8247a"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:56:12 crc kubenswrapper[4696]: I1202 22:56:12.534498 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cab80860-b375-43ce-9df7-16ed59a8247a-console-config" (OuterVolumeSpecName: "console-config") pod "cab80860-b375-43ce-9df7-16ed59a8247a" (UID: "cab80860-b375-43ce-9df7-16ed59a8247a"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:56:12 crc kubenswrapper[4696]: I1202 22:56:12.534524 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cab80860-b375-43ce-9df7-16ed59a8247a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "cab80860-b375-43ce-9df7-16ed59a8247a" (UID: "cab80860-b375-43ce-9df7-16ed59a8247a"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:56:12 crc kubenswrapper[4696]: I1202 22:56:12.534615 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cab80860-b375-43ce-9df7-16ed59a8247a-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "cab80860-b375-43ce-9df7-16ed59a8247a" (UID: "cab80860-b375-43ce-9df7-16ed59a8247a"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:56:12 crc kubenswrapper[4696]: I1202 22:56:12.537421 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cab80860-b375-43ce-9df7-16ed59a8247a-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "cab80860-b375-43ce-9df7-16ed59a8247a" (UID: "cab80860-b375-43ce-9df7-16ed59a8247a"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:56:12 crc kubenswrapper[4696]: I1202 22:56:12.537694 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cab80860-b375-43ce-9df7-16ed59a8247a-kube-api-access-hmhvw" (OuterVolumeSpecName: "kube-api-access-hmhvw") pod "cab80860-b375-43ce-9df7-16ed59a8247a" (UID: "cab80860-b375-43ce-9df7-16ed59a8247a"). InnerVolumeSpecName "kube-api-access-hmhvw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:56:12 crc kubenswrapper[4696]: I1202 22:56:12.537887 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cab80860-b375-43ce-9df7-16ed59a8247a-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "cab80860-b375-43ce-9df7-16ed59a8247a" (UID: "cab80860-b375-43ce-9df7-16ed59a8247a"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:56:12 crc kubenswrapper[4696]: I1202 22:56:12.619126 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-f6wj6_cab80860-b375-43ce-9df7-16ed59a8247a/console/0.log" Dec 02 22:56:12 crc kubenswrapper[4696]: I1202 22:56:12.619200 4696 generic.go:334] "Generic (PLEG): container finished" podID="cab80860-b375-43ce-9df7-16ed59a8247a" containerID="f6be4a8b680a81b35c99eb6be900fc50326d748ca3b91a6439e06e2b7f8aa689" exitCode=2 Dec 02 22:56:12 crc kubenswrapper[4696]: I1202 22:56:12.619248 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-f6wj6" event={"ID":"cab80860-b375-43ce-9df7-16ed59a8247a","Type":"ContainerDied","Data":"f6be4a8b680a81b35c99eb6be900fc50326d748ca3b91a6439e06e2b7f8aa689"} Dec 02 22:56:12 crc kubenswrapper[4696]: I1202 22:56:12.619301 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-f6wj6" event={"ID":"cab80860-b375-43ce-9df7-16ed59a8247a","Type":"ContainerDied","Data":"7672f2c34177fee4a6f5508e91fcd14d12c64a1d94145d6f411daccf3c4cdc74"} Dec 02 22:56:12 crc kubenswrapper[4696]: I1202 22:56:12.619334 4696 scope.go:117] "RemoveContainer" containerID="f6be4a8b680a81b35c99eb6be900fc50326d748ca3b91a6439e06e2b7f8aa689" Dec 02 22:56:12 crc kubenswrapper[4696]: I1202 22:56:12.619350 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-f6wj6" Dec 02 22:56:12 crc kubenswrapper[4696]: I1202 22:56:12.634532 4696 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cab80860-b375-43ce-9df7-16ed59a8247a-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 22:56:12 crc kubenswrapper[4696]: I1202 22:56:12.635069 4696 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cab80860-b375-43ce-9df7-16ed59a8247a-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 02 22:56:12 crc kubenswrapper[4696]: I1202 22:56:12.635081 4696 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cab80860-b375-43ce-9df7-16ed59a8247a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 22:56:12 crc kubenswrapper[4696]: I1202 22:56:12.635093 4696 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cab80860-b375-43ce-9df7-16ed59a8247a-service-ca\") on node \"crc\" DevicePath \"\"" Dec 02 22:56:12 crc kubenswrapper[4696]: I1202 22:56:12.635101 4696 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cab80860-b375-43ce-9df7-16ed59a8247a-console-config\") on node \"crc\" DevicePath \"\"" Dec 02 22:56:12 crc kubenswrapper[4696]: I1202 22:56:12.635109 4696 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cab80860-b375-43ce-9df7-16ed59a8247a-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 22:56:12 crc kubenswrapper[4696]: I1202 22:56:12.635118 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hmhvw\" (UniqueName: \"kubernetes.io/projected/cab80860-b375-43ce-9df7-16ed59a8247a-kube-api-access-hmhvw\") on node \"crc\" DevicePath \"\"" Dec 02 
22:56:12 crc kubenswrapper[4696]: I1202 22:56:12.645776 4696 scope.go:117] "RemoveContainer" containerID="f6be4a8b680a81b35c99eb6be900fc50326d748ca3b91a6439e06e2b7f8aa689" Dec 02 22:56:12 crc kubenswrapper[4696]: E1202 22:56:12.646280 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6be4a8b680a81b35c99eb6be900fc50326d748ca3b91a6439e06e2b7f8aa689\": container with ID starting with f6be4a8b680a81b35c99eb6be900fc50326d748ca3b91a6439e06e2b7f8aa689 not found: ID does not exist" containerID="f6be4a8b680a81b35c99eb6be900fc50326d748ca3b91a6439e06e2b7f8aa689" Dec 02 22:56:12 crc kubenswrapper[4696]: I1202 22:56:12.646329 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6be4a8b680a81b35c99eb6be900fc50326d748ca3b91a6439e06e2b7f8aa689"} err="failed to get container status \"f6be4a8b680a81b35c99eb6be900fc50326d748ca3b91a6439e06e2b7f8aa689\": rpc error: code = NotFound desc = could not find container \"f6be4a8b680a81b35c99eb6be900fc50326d748ca3b91a6439e06e2b7f8aa689\": container with ID starting with f6be4a8b680a81b35c99eb6be900fc50326d748ca3b91a6439e06e2b7f8aa689 not found: ID does not exist" Dec 02 22:56:12 crc kubenswrapper[4696]: I1202 22:56:12.660707 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-f6wj6"] Dec 02 22:56:12 crc kubenswrapper[4696]: I1202 22:56:12.670410 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-f6wj6"] Dec 02 22:56:12 crc kubenswrapper[4696]: I1202 22:56:12.854329 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83pt4w2"] Dec 02 22:56:13 crc kubenswrapper[4696]: I1202 22:56:13.440052 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cab80860-b375-43ce-9df7-16ed59a8247a" 
path="/var/lib/kubelet/pods/cab80860-b375-43ce-9df7-16ed59a8247a/volumes" Dec 02 22:56:13 crc kubenswrapper[4696]: I1202 22:56:13.629498 4696 generic.go:334] "Generic (PLEG): container finished" podID="97c440c0-e159-4a54-a3b4-eb53d72ae698" containerID="e75a79728f759350e5dd712af4175889fbb332426821222aea08314883961d48" exitCode=0 Dec 02 22:56:13 crc kubenswrapper[4696]: I1202 22:56:13.629638 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83pt4w2" event={"ID":"97c440c0-e159-4a54-a3b4-eb53d72ae698","Type":"ContainerDied","Data":"e75a79728f759350e5dd712af4175889fbb332426821222aea08314883961d48"} Dec 02 22:56:13 crc kubenswrapper[4696]: I1202 22:56:13.629715 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83pt4w2" event={"ID":"97c440c0-e159-4a54-a3b4-eb53d72ae698","Type":"ContainerStarted","Data":"d013b687fcd2b201a6815f42ef4d81e1bd84d91c78f601540eb60a0550d61eb1"} Dec 02 22:56:15 crc kubenswrapper[4696]: I1202 22:56:15.645750 4696 generic.go:334] "Generic (PLEG): container finished" podID="97c440c0-e159-4a54-a3b4-eb53d72ae698" containerID="6b8d677c857a2887a4b2ef5da6bd15a93ef93af6d2026e065f72f4a06d2a3cda" exitCode=0 Dec 02 22:56:15 crc kubenswrapper[4696]: I1202 22:56:15.645828 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83pt4w2" event={"ID":"97c440c0-e159-4a54-a3b4-eb53d72ae698","Type":"ContainerDied","Data":"6b8d677c857a2887a4b2ef5da6bd15a93ef93af6d2026e065f72f4a06d2a3cda"} Dec 02 22:56:16 crc kubenswrapper[4696]: I1202 22:56:16.658929 4696 generic.go:334] "Generic (PLEG): container finished" podID="97c440c0-e159-4a54-a3b4-eb53d72ae698" containerID="874e40865251a295b3d837b7fb698757efc7d0d284992c0fef4986ffdb7e2d7c" exitCode=0 Dec 02 22:56:16 crc kubenswrapper[4696]: I1202 22:56:16.659010 4696 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83pt4w2" event={"ID":"97c440c0-e159-4a54-a3b4-eb53d72ae698","Type":"ContainerDied","Data":"874e40865251a295b3d837b7fb698757efc7d0d284992c0fef4986ffdb7e2d7c"} Dec 02 22:56:17 crc kubenswrapper[4696]: I1202 22:56:17.928165 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83pt4w2" Dec 02 22:56:18 crc kubenswrapper[4696]: I1202 22:56:18.010435 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/97c440c0-e159-4a54-a3b4-eb53d72ae698-util\") pod \"97c440c0-e159-4a54-a3b4-eb53d72ae698\" (UID: \"97c440c0-e159-4a54-a3b4-eb53d72ae698\") " Dec 02 22:56:18 crc kubenswrapper[4696]: I1202 22:56:18.010498 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bbht9\" (UniqueName: \"kubernetes.io/projected/97c440c0-e159-4a54-a3b4-eb53d72ae698-kube-api-access-bbht9\") pod \"97c440c0-e159-4a54-a3b4-eb53d72ae698\" (UID: \"97c440c0-e159-4a54-a3b4-eb53d72ae698\") " Dec 02 22:56:18 crc kubenswrapper[4696]: I1202 22:56:18.010526 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/97c440c0-e159-4a54-a3b4-eb53d72ae698-bundle\") pod \"97c440c0-e159-4a54-a3b4-eb53d72ae698\" (UID: \"97c440c0-e159-4a54-a3b4-eb53d72ae698\") " Dec 02 22:56:18 crc kubenswrapper[4696]: I1202 22:56:18.013036 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97c440c0-e159-4a54-a3b4-eb53d72ae698-bundle" (OuterVolumeSpecName: "bundle") pod "97c440c0-e159-4a54-a3b4-eb53d72ae698" (UID: "97c440c0-e159-4a54-a3b4-eb53d72ae698"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 22:56:18 crc kubenswrapper[4696]: I1202 22:56:18.019105 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97c440c0-e159-4a54-a3b4-eb53d72ae698-kube-api-access-bbht9" (OuterVolumeSpecName: "kube-api-access-bbht9") pod "97c440c0-e159-4a54-a3b4-eb53d72ae698" (UID: "97c440c0-e159-4a54-a3b4-eb53d72ae698"). InnerVolumeSpecName "kube-api-access-bbht9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:56:18 crc kubenswrapper[4696]: I1202 22:56:18.027394 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97c440c0-e159-4a54-a3b4-eb53d72ae698-util" (OuterVolumeSpecName: "util") pod "97c440c0-e159-4a54-a3b4-eb53d72ae698" (UID: "97c440c0-e159-4a54-a3b4-eb53d72ae698"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 22:56:18 crc kubenswrapper[4696]: I1202 22:56:18.111526 4696 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/97c440c0-e159-4a54-a3b4-eb53d72ae698-util\") on node \"crc\" DevicePath \"\"" Dec 02 22:56:18 crc kubenswrapper[4696]: I1202 22:56:18.111947 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bbht9\" (UniqueName: \"kubernetes.io/projected/97c440c0-e159-4a54-a3b4-eb53d72ae698-kube-api-access-bbht9\") on node \"crc\" DevicePath \"\"" Dec 02 22:56:18 crc kubenswrapper[4696]: I1202 22:56:18.112038 4696 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/97c440c0-e159-4a54-a3b4-eb53d72ae698-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 22:56:18 crc kubenswrapper[4696]: I1202 22:56:18.673462 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83pt4w2" 
event={"ID":"97c440c0-e159-4a54-a3b4-eb53d72ae698","Type":"ContainerDied","Data":"d013b687fcd2b201a6815f42ef4d81e1bd84d91c78f601540eb60a0550d61eb1"} Dec 02 22:56:18 crc kubenswrapper[4696]: I1202 22:56:18.673908 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d013b687fcd2b201a6815f42ef4d81e1bd84d91c78f601540eb60a0550d61eb1" Dec 02 22:56:18 crc kubenswrapper[4696]: I1202 22:56:18.673507 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83pt4w2" Dec 02 22:56:18 crc kubenswrapper[4696]: E1202 22:56:18.795202 4696 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97c440c0_e159_4a54_a3b4_eb53d72ae698.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97c440c0_e159_4a54_a3b4_eb53d72ae698.slice/crio-d013b687fcd2b201a6815f42ef4d81e1bd84d91c78f601540eb60a0550d61eb1\": RecentStats: unable to find data in memory cache]" Dec 02 22:56:22 crc kubenswrapper[4696]: I1202 22:56:22.974114 4696 patch_prober.go:28] interesting pod/machine-config-daemon-chq65 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 22:56:22 crc kubenswrapper[4696]: I1202 22:56:22.974623 4696 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 22:56:22 crc kubenswrapper[4696]: I1202 22:56:22.974683 4696 
kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-chq65" Dec 02 22:56:22 crc kubenswrapper[4696]: I1202 22:56:22.975475 4696 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"488c3298a630d75021615076f70747ecaa2bb06970c4d5f097346d0dc1a68976"} pod="openshift-machine-config-operator/machine-config-daemon-chq65" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 22:56:22 crc kubenswrapper[4696]: I1202 22:56:22.975548 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" containerName="machine-config-daemon" containerID="cri-o://488c3298a630d75021615076f70747ecaa2bb06970c4d5f097346d0dc1a68976" gracePeriod=600 Dec 02 22:56:23 crc kubenswrapper[4696]: I1202 22:56:23.708117 4696 generic.go:334] "Generic (PLEG): container finished" podID="53353260-c7c9-435c-91eb-3d5a1b441c4a" containerID="488c3298a630d75021615076f70747ecaa2bb06970c4d5f097346d0dc1a68976" exitCode=0 Dec 02 22:56:23 crc kubenswrapper[4696]: I1202 22:56:23.708204 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-chq65" event={"ID":"53353260-c7c9-435c-91eb-3d5a1b441c4a","Type":"ContainerDied","Data":"488c3298a630d75021615076f70747ecaa2bb06970c4d5f097346d0dc1a68976"} Dec 02 22:56:23 crc kubenswrapper[4696]: I1202 22:56:23.708535 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-chq65" event={"ID":"53353260-c7c9-435c-91eb-3d5a1b441c4a","Type":"ContainerStarted","Data":"d851463a087e8da5113eee7095bcc5e11085a475884d43c64676423d484437b6"} Dec 02 22:56:23 crc kubenswrapper[4696]: I1202 22:56:23.708561 4696 scope.go:117] "RemoveContainer" 
containerID="3abfb6374bbfb811db57d1f3b4095d464fff02776e083b36d869e7869f2cbc02" Dec 02 22:56:27 crc kubenswrapper[4696]: I1202 22:56:27.351470 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-c67fd5d6c-gjrcg"] Dec 02 22:56:27 crc kubenswrapper[4696]: E1202 22:56:27.352539 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97c440c0-e159-4a54-a3b4-eb53d72ae698" containerName="pull" Dec 02 22:56:27 crc kubenswrapper[4696]: I1202 22:56:27.352556 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="97c440c0-e159-4a54-a3b4-eb53d72ae698" containerName="pull" Dec 02 22:56:27 crc kubenswrapper[4696]: E1202 22:56:27.352581 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cab80860-b375-43ce-9df7-16ed59a8247a" containerName="console" Dec 02 22:56:27 crc kubenswrapper[4696]: I1202 22:56:27.352590 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="cab80860-b375-43ce-9df7-16ed59a8247a" containerName="console" Dec 02 22:56:27 crc kubenswrapper[4696]: E1202 22:56:27.352599 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97c440c0-e159-4a54-a3b4-eb53d72ae698" containerName="util" Dec 02 22:56:27 crc kubenswrapper[4696]: I1202 22:56:27.352605 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="97c440c0-e159-4a54-a3b4-eb53d72ae698" containerName="util" Dec 02 22:56:27 crc kubenswrapper[4696]: E1202 22:56:27.352615 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97c440c0-e159-4a54-a3b4-eb53d72ae698" containerName="extract" Dec 02 22:56:27 crc kubenswrapper[4696]: I1202 22:56:27.352621 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="97c440c0-e159-4a54-a3b4-eb53d72ae698" containerName="extract" Dec 02 22:56:27 crc kubenswrapper[4696]: I1202 22:56:27.352715 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="97c440c0-e159-4a54-a3b4-eb53d72ae698" containerName="extract" Dec 02 22:56:27 crc 
kubenswrapper[4696]: I1202 22:56:27.352731 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="cab80860-b375-43ce-9df7-16ed59a8247a" containerName="console" Dec 02 22:56:27 crc kubenswrapper[4696]: I1202 22:56:27.353227 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-c67fd5d6c-gjrcg" Dec 02 22:56:27 crc kubenswrapper[4696]: I1202 22:56:27.357243 4696 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Dec 02 22:56:27 crc kubenswrapper[4696]: I1202 22:56:27.357524 4696 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Dec 02 22:56:27 crc kubenswrapper[4696]: I1202 22:56:27.358134 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Dec 02 22:56:27 crc kubenswrapper[4696]: I1202 22:56:27.358416 4696 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-9j6xh" Dec 02 22:56:27 crc kubenswrapper[4696]: I1202 22:56:27.359440 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Dec 02 22:56:27 crc kubenswrapper[4696]: I1202 22:56:27.367155 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-c67fd5d6c-gjrcg"] Dec 02 22:56:27 crc kubenswrapper[4696]: I1202 22:56:27.444875 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26cp8\" (UniqueName: \"kubernetes.io/projected/978a6167-34da-4d05-a693-a9f7f4d865b2-kube-api-access-26cp8\") pod \"metallb-operator-controller-manager-c67fd5d6c-gjrcg\" (UID: \"978a6167-34da-4d05-a693-a9f7f4d865b2\") " pod="metallb-system/metallb-operator-controller-manager-c67fd5d6c-gjrcg" Dec 02 22:56:27 crc 
kubenswrapper[4696]: I1202 22:56:27.444994 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/978a6167-34da-4d05-a693-a9f7f4d865b2-webhook-cert\") pod \"metallb-operator-controller-manager-c67fd5d6c-gjrcg\" (UID: \"978a6167-34da-4d05-a693-a9f7f4d865b2\") " pod="metallb-system/metallb-operator-controller-manager-c67fd5d6c-gjrcg" Dec 02 22:56:27 crc kubenswrapper[4696]: I1202 22:56:27.445043 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/978a6167-34da-4d05-a693-a9f7f4d865b2-apiservice-cert\") pod \"metallb-operator-controller-manager-c67fd5d6c-gjrcg\" (UID: \"978a6167-34da-4d05-a693-a9f7f4d865b2\") " pod="metallb-system/metallb-operator-controller-manager-c67fd5d6c-gjrcg" Dec 02 22:56:27 crc kubenswrapper[4696]: I1202 22:56:27.546554 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/978a6167-34da-4d05-a693-a9f7f4d865b2-webhook-cert\") pod \"metallb-operator-controller-manager-c67fd5d6c-gjrcg\" (UID: \"978a6167-34da-4d05-a693-a9f7f4d865b2\") " pod="metallb-system/metallb-operator-controller-manager-c67fd5d6c-gjrcg" Dec 02 22:56:27 crc kubenswrapper[4696]: I1202 22:56:27.546649 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/978a6167-34da-4d05-a693-a9f7f4d865b2-apiservice-cert\") pod \"metallb-operator-controller-manager-c67fd5d6c-gjrcg\" (UID: \"978a6167-34da-4d05-a693-a9f7f4d865b2\") " pod="metallb-system/metallb-operator-controller-manager-c67fd5d6c-gjrcg" Dec 02 22:56:27 crc kubenswrapper[4696]: I1202 22:56:27.546715 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26cp8\" (UniqueName: 
\"kubernetes.io/projected/978a6167-34da-4d05-a693-a9f7f4d865b2-kube-api-access-26cp8\") pod \"metallb-operator-controller-manager-c67fd5d6c-gjrcg\" (UID: \"978a6167-34da-4d05-a693-a9f7f4d865b2\") " pod="metallb-system/metallb-operator-controller-manager-c67fd5d6c-gjrcg" Dec 02 22:56:27 crc kubenswrapper[4696]: I1202 22:56:27.555504 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/978a6167-34da-4d05-a693-a9f7f4d865b2-apiservice-cert\") pod \"metallb-operator-controller-manager-c67fd5d6c-gjrcg\" (UID: \"978a6167-34da-4d05-a693-a9f7f4d865b2\") " pod="metallb-system/metallb-operator-controller-manager-c67fd5d6c-gjrcg" Dec 02 22:56:27 crc kubenswrapper[4696]: I1202 22:56:27.558317 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/978a6167-34da-4d05-a693-a9f7f4d865b2-webhook-cert\") pod \"metallb-operator-controller-manager-c67fd5d6c-gjrcg\" (UID: \"978a6167-34da-4d05-a693-a9f7f4d865b2\") " pod="metallb-system/metallb-operator-controller-manager-c67fd5d6c-gjrcg" Dec 02 22:56:27 crc kubenswrapper[4696]: I1202 22:56:27.564322 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26cp8\" (UniqueName: \"kubernetes.io/projected/978a6167-34da-4d05-a693-a9f7f4d865b2-kube-api-access-26cp8\") pod \"metallb-operator-controller-manager-c67fd5d6c-gjrcg\" (UID: \"978a6167-34da-4d05-a693-a9f7f4d865b2\") " pod="metallb-system/metallb-operator-controller-manager-c67fd5d6c-gjrcg" Dec 02 22:56:27 crc kubenswrapper[4696]: I1202 22:56:27.603482 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-6c7867ffbb-nxw6n"] Dec 02 22:56:27 crc kubenswrapper[4696]: I1202 22:56:27.604598 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-6c7867ffbb-nxw6n" Dec 02 22:56:27 crc kubenswrapper[4696]: I1202 22:56:27.607383 4696 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 02 22:56:27 crc kubenswrapper[4696]: I1202 22:56:27.607648 4696 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Dec 02 22:56:27 crc kubenswrapper[4696]: I1202 22:56:27.608058 4696 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-fv5l9" Dec 02 22:56:27 crc kubenswrapper[4696]: I1202 22:56:27.633124 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-6c7867ffbb-nxw6n"] Dec 02 22:56:27 crc kubenswrapper[4696]: I1202 22:56:27.670661 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-c67fd5d6c-gjrcg" Dec 02 22:56:27 crc kubenswrapper[4696]: I1202 22:56:27.750046 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4e9c6038-441a-483a-b7e3-ff298010cf18-apiservice-cert\") pod \"metallb-operator-webhook-server-6c7867ffbb-nxw6n\" (UID: \"4e9c6038-441a-483a-b7e3-ff298010cf18\") " pod="metallb-system/metallb-operator-webhook-server-6c7867ffbb-nxw6n" Dec 02 22:56:27 crc kubenswrapper[4696]: I1202 22:56:27.750569 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4e9c6038-441a-483a-b7e3-ff298010cf18-webhook-cert\") pod \"metallb-operator-webhook-server-6c7867ffbb-nxw6n\" (UID: \"4e9c6038-441a-483a-b7e3-ff298010cf18\") " pod="metallb-system/metallb-operator-webhook-server-6c7867ffbb-nxw6n" Dec 02 22:56:27 crc kubenswrapper[4696]: I1202 22:56:27.750604 
4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nc4r\" (UniqueName: \"kubernetes.io/projected/4e9c6038-441a-483a-b7e3-ff298010cf18-kube-api-access-8nc4r\") pod \"metallb-operator-webhook-server-6c7867ffbb-nxw6n\" (UID: \"4e9c6038-441a-483a-b7e3-ff298010cf18\") " pod="metallb-system/metallb-operator-webhook-server-6c7867ffbb-nxw6n" Dec 02 22:56:27 crc kubenswrapper[4696]: I1202 22:56:27.852460 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8nc4r\" (UniqueName: \"kubernetes.io/projected/4e9c6038-441a-483a-b7e3-ff298010cf18-kube-api-access-8nc4r\") pod \"metallb-operator-webhook-server-6c7867ffbb-nxw6n\" (UID: \"4e9c6038-441a-483a-b7e3-ff298010cf18\") " pod="metallb-system/metallb-operator-webhook-server-6c7867ffbb-nxw6n" Dec 02 22:56:27 crc kubenswrapper[4696]: I1202 22:56:27.852583 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4e9c6038-441a-483a-b7e3-ff298010cf18-apiservice-cert\") pod \"metallb-operator-webhook-server-6c7867ffbb-nxw6n\" (UID: \"4e9c6038-441a-483a-b7e3-ff298010cf18\") " pod="metallb-system/metallb-operator-webhook-server-6c7867ffbb-nxw6n" Dec 02 22:56:27 crc kubenswrapper[4696]: I1202 22:56:27.852610 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4e9c6038-441a-483a-b7e3-ff298010cf18-webhook-cert\") pod \"metallb-operator-webhook-server-6c7867ffbb-nxw6n\" (UID: \"4e9c6038-441a-483a-b7e3-ff298010cf18\") " pod="metallb-system/metallb-operator-webhook-server-6c7867ffbb-nxw6n" Dec 02 22:56:27 crc kubenswrapper[4696]: I1202 22:56:27.868693 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4e9c6038-441a-483a-b7e3-ff298010cf18-apiservice-cert\") pod 
\"metallb-operator-webhook-server-6c7867ffbb-nxw6n\" (UID: \"4e9c6038-441a-483a-b7e3-ff298010cf18\") " pod="metallb-system/metallb-operator-webhook-server-6c7867ffbb-nxw6n" Dec 02 22:56:27 crc kubenswrapper[4696]: I1202 22:56:27.868846 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4e9c6038-441a-483a-b7e3-ff298010cf18-webhook-cert\") pod \"metallb-operator-webhook-server-6c7867ffbb-nxw6n\" (UID: \"4e9c6038-441a-483a-b7e3-ff298010cf18\") " pod="metallb-system/metallb-operator-webhook-server-6c7867ffbb-nxw6n" Dec 02 22:56:27 crc kubenswrapper[4696]: I1202 22:56:27.874598 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nc4r\" (UniqueName: \"kubernetes.io/projected/4e9c6038-441a-483a-b7e3-ff298010cf18-kube-api-access-8nc4r\") pod \"metallb-operator-webhook-server-6c7867ffbb-nxw6n\" (UID: \"4e9c6038-441a-483a-b7e3-ff298010cf18\") " pod="metallb-system/metallb-operator-webhook-server-6c7867ffbb-nxw6n" Dec 02 22:56:27 crc kubenswrapper[4696]: I1202 22:56:27.940463 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-6c7867ffbb-nxw6n" Dec 02 22:56:28 crc kubenswrapper[4696]: I1202 22:56:28.158370 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-c67fd5d6c-gjrcg"] Dec 02 22:56:28 crc kubenswrapper[4696]: I1202 22:56:28.395846 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-6c7867ffbb-nxw6n"] Dec 02 22:56:28 crc kubenswrapper[4696]: W1202 22:56:28.400434 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4e9c6038_441a_483a_b7e3_ff298010cf18.slice/crio-1d93f342275b8f5b6790ff5f8487271fa4b487eb8f1c213cdc01e3ecbca04907 WatchSource:0}: Error finding container 1d93f342275b8f5b6790ff5f8487271fa4b487eb8f1c213cdc01e3ecbca04907: Status 404 returned error can't find the container with id 1d93f342275b8f5b6790ff5f8487271fa4b487eb8f1c213cdc01e3ecbca04907 Dec 02 22:56:28 crc kubenswrapper[4696]: I1202 22:56:28.748417 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-6c7867ffbb-nxw6n" event={"ID":"4e9c6038-441a-483a-b7e3-ff298010cf18","Type":"ContainerStarted","Data":"1d93f342275b8f5b6790ff5f8487271fa4b487eb8f1c213cdc01e3ecbca04907"} Dec 02 22:56:28 crc kubenswrapper[4696]: I1202 22:56:28.749417 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-c67fd5d6c-gjrcg" event={"ID":"978a6167-34da-4d05-a693-a9f7f4d865b2","Type":"ContainerStarted","Data":"c6c71cfed81c2faf71fdfaadec97f854f3379deca889181d549b8ae520ba602c"} Dec 02 22:56:33 crc kubenswrapper[4696]: I1202 22:56:33.802948 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-6c7867ffbb-nxw6n" 
event={"ID":"4e9c6038-441a-483a-b7e3-ff298010cf18","Type":"ContainerStarted","Data":"fb3160d6bd4b8e3cc2ca01a9e49a563f80c755da0661195e90a02c0fb3efc314"} Dec 02 22:56:33 crc kubenswrapper[4696]: I1202 22:56:33.804845 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-6c7867ffbb-nxw6n" Dec 02 22:56:33 crc kubenswrapper[4696]: I1202 22:56:33.806398 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-c67fd5d6c-gjrcg" event={"ID":"978a6167-34da-4d05-a693-a9f7f4d865b2","Type":"ContainerStarted","Data":"f85fa80b64f77965e1ee2633d35a42a5069922f0d469a53c0ccf373c4d499d67"} Dec 02 22:56:33 crc kubenswrapper[4696]: I1202 22:56:33.806548 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-c67fd5d6c-gjrcg" Dec 02 22:56:33 crc kubenswrapper[4696]: I1202 22:56:33.845557 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-c67fd5d6c-gjrcg" podStartSLOduration=1.7466323419999998 podStartE2EDuration="6.845531104s" podCreationTimestamp="2025-12-02 22:56:27 +0000 UTC" firstStartedPulling="2025-12-02 22:56:28.16668734 +0000 UTC m=+851.047367341" lastFinishedPulling="2025-12-02 22:56:33.265586102 +0000 UTC m=+856.146266103" observedRunningTime="2025-12-02 22:56:33.840050341 +0000 UTC m=+856.720730382" watchObservedRunningTime="2025-12-02 22:56:33.845531104 +0000 UTC m=+856.726211145" Dec 02 22:56:33 crc kubenswrapper[4696]: I1202 22:56:33.847789 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-6c7867ffbb-nxw6n" podStartSLOduration=1.950584412 podStartE2EDuration="6.847782127s" podCreationTimestamp="2025-12-02 22:56:27 +0000 UTC" firstStartedPulling="2025-12-02 22:56:28.404246237 +0000 UTC m=+851.284926238" lastFinishedPulling="2025-12-02 
22:56:33.301443942 +0000 UTC m=+856.182123953" observedRunningTime="2025-12-02 22:56:33.821768161 +0000 UTC m=+856.702448182" watchObservedRunningTime="2025-12-02 22:56:33.847782127 +0000 UTC m=+856.728462158" Dec 02 22:56:47 crc kubenswrapper[4696]: I1202 22:56:47.957152 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-6c7867ffbb-nxw6n" Dec 02 22:57:07 crc kubenswrapper[4696]: I1202 22:57:07.674466 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-c67fd5d6c-gjrcg" Dec 02 22:57:08 crc kubenswrapper[4696]: I1202 22:57:08.518400 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-jwtm6"] Dec 02 22:57:08 crc kubenswrapper[4696]: I1202 22:57:08.521521 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-jwtm6" Dec 02 22:57:08 crc kubenswrapper[4696]: I1202 22:57:08.524666 4696 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-h9mdk" Dec 02 22:57:08 crc kubenswrapper[4696]: I1202 22:57:08.525905 4696 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Dec 02 22:57:08 crc kubenswrapper[4696]: I1202 22:57:08.529469 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Dec 02 22:57:08 crc kubenswrapper[4696]: I1202 22:57:08.531356 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-p5cdf"] Dec 02 22:57:08 crc kubenswrapper[4696]: I1202 22:57:08.532486 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-p5cdf" Dec 02 22:57:08 crc kubenswrapper[4696]: I1202 22:57:08.538245 4696 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Dec 02 22:57:08 crc kubenswrapper[4696]: I1202 22:57:08.562728 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-p5cdf"] Dec 02 22:57:08 crc kubenswrapper[4696]: I1202 22:57:08.639353 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-7xqrp"] Dec 02 22:57:08 crc kubenswrapper[4696]: I1202 22:57:08.641028 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-7xqrp" Dec 02 22:57:08 crc kubenswrapper[4696]: I1202 22:57:08.645401 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Dec 02 22:57:08 crc kubenswrapper[4696]: I1202 22:57:08.645842 4696 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Dec 02 22:57:08 crc kubenswrapper[4696]: I1202 22:57:08.646185 4696 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-t6r6h" Dec 02 22:57:08 crc kubenswrapper[4696]: I1202 22:57:08.648081 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-f8648f98b-86pps"] Dec 02 22:57:08 crc kubenswrapper[4696]: I1202 22:57:08.649287 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-f8648f98b-86pps" Dec 02 22:57:08 crc kubenswrapper[4696]: I1202 22:57:08.653791 4696 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Dec 02 22:57:08 crc kubenswrapper[4696]: I1202 22:57:08.658118 4696 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Dec 02 22:57:08 crc kubenswrapper[4696]: I1202 22:57:08.663408 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-86pps"] Dec 02 22:57:08 crc kubenswrapper[4696]: I1202 22:57:08.671166 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95p5n\" (UniqueName: \"kubernetes.io/projected/a4a3825b-89ac-43dc-b2cf-6f9df48d98d9-kube-api-access-95p5n\") pod \"frr-k8s-jwtm6\" (UID: \"a4a3825b-89ac-43dc-b2cf-6f9df48d98d9\") " pod="metallb-system/frr-k8s-jwtm6" Dec 02 22:57:08 crc kubenswrapper[4696]: I1202 22:57:08.671219 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/a4a3825b-89ac-43dc-b2cf-6f9df48d98d9-reloader\") pod \"frr-k8s-jwtm6\" (UID: \"a4a3825b-89ac-43dc-b2cf-6f9df48d98d9\") " pod="metallb-system/frr-k8s-jwtm6" Dec 02 22:57:08 crc kubenswrapper[4696]: I1202 22:57:08.671251 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/a4a3825b-89ac-43dc-b2cf-6f9df48d98d9-frr-conf\") pod \"frr-k8s-jwtm6\" (UID: \"a4a3825b-89ac-43dc-b2cf-6f9df48d98d9\") " pod="metallb-system/frr-k8s-jwtm6" Dec 02 22:57:08 crc kubenswrapper[4696]: I1202 22:57:08.671281 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bc16ac0a-e284-468e-b6a9-a8b78572ac06-cert\") pod 
\"frr-k8s-webhook-server-7fcb986d4-p5cdf\" (UID: \"bc16ac0a-e284-468e-b6a9-a8b78572ac06\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-p5cdf" Dec 02 22:57:08 crc kubenswrapper[4696]: I1202 22:57:08.671321 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5sh8\" (UniqueName: \"kubernetes.io/projected/bc16ac0a-e284-468e-b6a9-a8b78572ac06-kube-api-access-w5sh8\") pod \"frr-k8s-webhook-server-7fcb986d4-p5cdf\" (UID: \"bc16ac0a-e284-468e-b6a9-a8b78572ac06\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-p5cdf" Dec 02 22:57:08 crc kubenswrapper[4696]: I1202 22:57:08.671350 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a4a3825b-89ac-43dc-b2cf-6f9df48d98d9-metrics-certs\") pod \"frr-k8s-jwtm6\" (UID: \"a4a3825b-89ac-43dc-b2cf-6f9df48d98d9\") " pod="metallb-system/frr-k8s-jwtm6" Dec 02 22:57:08 crc kubenswrapper[4696]: I1202 22:57:08.671366 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/a4a3825b-89ac-43dc-b2cf-6f9df48d98d9-frr-startup\") pod \"frr-k8s-jwtm6\" (UID: \"a4a3825b-89ac-43dc-b2cf-6f9df48d98d9\") " pod="metallb-system/frr-k8s-jwtm6" Dec 02 22:57:08 crc kubenswrapper[4696]: I1202 22:57:08.671382 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/a4a3825b-89ac-43dc-b2cf-6f9df48d98d9-frr-sockets\") pod \"frr-k8s-jwtm6\" (UID: \"a4a3825b-89ac-43dc-b2cf-6f9df48d98d9\") " pod="metallb-system/frr-k8s-jwtm6" Dec 02 22:57:08 crc kubenswrapper[4696]: I1202 22:57:08.671401 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/a4a3825b-89ac-43dc-b2cf-6f9df48d98d9-metrics\") pod 
\"frr-k8s-jwtm6\" (UID: \"a4a3825b-89ac-43dc-b2cf-6f9df48d98d9\") " pod="metallb-system/frr-k8s-jwtm6" Dec 02 22:57:08 crc kubenswrapper[4696]: I1202 22:57:08.772239 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/58e7c36f-4f09-4ae1-99ce-e18c2612b6ec-metrics-certs\") pod \"speaker-7xqrp\" (UID: \"58e7c36f-4f09-4ae1-99ce-e18c2612b6ec\") " pod="metallb-system/speaker-7xqrp" Dec 02 22:57:08 crc kubenswrapper[4696]: I1202 22:57:08.772296 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnwdg\" (UniqueName: \"kubernetes.io/projected/e4a5e393-9801-4de5-86b3-ac2cb60bcdae-kube-api-access-xnwdg\") pod \"controller-f8648f98b-86pps\" (UID: \"e4a5e393-9801-4de5-86b3-ac2cb60bcdae\") " pod="metallb-system/controller-f8648f98b-86pps" Dec 02 22:57:08 crc kubenswrapper[4696]: I1202 22:57:08.772331 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5sh8\" (UniqueName: \"kubernetes.io/projected/bc16ac0a-e284-468e-b6a9-a8b78572ac06-kube-api-access-w5sh8\") pod \"frr-k8s-webhook-server-7fcb986d4-p5cdf\" (UID: \"bc16ac0a-e284-468e-b6a9-a8b78572ac06\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-p5cdf" Dec 02 22:57:08 crc kubenswrapper[4696]: I1202 22:57:08.772348 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/58e7c36f-4f09-4ae1-99ce-e18c2612b6ec-metallb-excludel2\") pod \"speaker-7xqrp\" (UID: \"58e7c36f-4f09-4ae1-99ce-e18c2612b6ec\") " pod="metallb-system/speaker-7xqrp" Dec 02 22:57:08 crc kubenswrapper[4696]: I1202 22:57:08.772370 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/58e7c36f-4f09-4ae1-99ce-e18c2612b6ec-memberlist\") pod 
\"speaker-7xqrp\" (UID: \"58e7c36f-4f09-4ae1-99ce-e18c2612b6ec\") " pod="metallb-system/speaker-7xqrp" Dec 02 22:57:08 crc kubenswrapper[4696]: I1202 22:57:08.772393 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a4a3825b-89ac-43dc-b2cf-6f9df48d98d9-metrics-certs\") pod \"frr-k8s-jwtm6\" (UID: \"a4a3825b-89ac-43dc-b2cf-6f9df48d98d9\") " pod="metallb-system/frr-k8s-jwtm6" Dec 02 22:57:08 crc kubenswrapper[4696]: I1202 22:57:08.772409 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/a4a3825b-89ac-43dc-b2cf-6f9df48d98d9-frr-startup\") pod \"frr-k8s-jwtm6\" (UID: \"a4a3825b-89ac-43dc-b2cf-6f9df48d98d9\") " pod="metallb-system/frr-k8s-jwtm6" Dec 02 22:57:08 crc kubenswrapper[4696]: I1202 22:57:08.772429 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rz47d\" (UniqueName: \"kubernetes.io/projected/58e7c36f-4f09-4ae1-99ce-e18c2612b6ec-kube-api-access-rz47d\") pod \"speaker-7xqrp\" (UID: \"58e7c36f-4f09-4ae1-99ce-e18c2612b6ec\") " pod="metallb-system/speaker-7xqrp" Dec 02 22:57:08 crc kubenswrapper[4696]: I1202 22:57:08.772447 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/a4a3825b-89ac-43dc-b2cf-6f9df48d98d9-frr-sockets\") pod \"frr-k8s-jwtm6\" (UID: \"a4a3825b-89ac-43dc-b2cf-6f9df48d98d9\") " pod="metallb-system/frr-k8s-jwtm6" Dec 02 22:57:08 crc kubenswrapper[4696]: I1202 22:57:08.772468 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/a4a3825b-89ac-43dc-b2cf-6f9df48d98d9-metrics\") pod \"frr-k8s-jwtm6\" (UID: \"a4a3825b-89ac-43dc-b2cf-6f9df48d98d9\") " pod="metallb-system/frr-k8s-jwtm6" Dec 02 22:57:08 crc kubenswrapper[4696]: I1202 22:57:08.772485 4696 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e4a5e393-9801-4de5-86b3-ac2cb60bcdae-metrics-certs\") pod \"controller-f8648f98b-86pps\" (UID: \"e4a5e393-9801-4de5-86b3-ac2cb60bcdae\") " pod="metallb-system/controller-f8648f98b-86pps" Dec 02 22:57:08 crc kubenswrapper[4696]: I1202 22:57:08.772522 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95p5n\" (UniqueName: \"kubernetes.io/projected/a4a3825b-89ac-43dc-b2cf-6f9df48d98d9-kube-api-access-95p5n\") pod \"frr-k8s-jwtm6\" (UID: \"a4a3825b-89ac-43dc-b2cf-6f9df48d98d9\") " pod="metallb-system/frr-k8s-jwtm6" Dec 02 22:57:08 crc kubenswrapper[4696]: I1202 22:57:08.772548 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/a4a3825b-89ac-43dc-b2cf-6f9df48d98d9-reloader\") pod \"frr-k8s-jwtm6\" (UID: \"a4a3825b-89ac-43dc-b2cf-6f9df48d98d9\") " pod="metallb-system/frr-k8s-jwtm6" Dec 02 22:57:08 crc kubenswrapper[4696]: I1202 22:57:08.772568 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/a4a3825b-89ac-43dc-b2cf-6f9df48d98d9-frr-conf\") pod \"frr-k8s-jwtm6\" (UID: \"a4a3825b-89ac-43dc-b2cf-6f9df48d98d9\") " pod="metallb-system/frr-k8s-jwtm6" Dec 02 22:57:08 crc kubenswrapper[4696]: I1202 22:57:08.772596 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bc16ac0a-e284-468e-b6a9-a8b78572ac06-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-p5cdf\" (UID: \"bc16ac0a-e284-468e-b6a9-a8b78572ac06\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-p5cdf" Dec 02 22:57:08 crc kubenswrapper[4696]: I1202 22:57:08.772613 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/e4a5e393-9801-4de5-86b3-ac2cb60bcdae-cert\") pod \"controller-f8648f98b-86pps\" (UID: \"e4a5e393-9801-4de5-86b3-ac2cb60bcdae\") " pod="metallb-system/controller-f8648f98b-86pps" Dec 02 22:57:08 crc kubenswrapper[4696]: E1202 22:57:08.772811 4696 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Dec 02 22:57:08 crc kubenswrapper[4696]: E1202 22:57:08.772932 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a4a3825b-89ac-43dc-b2cf-6f9df48d98d9-metrics-certs podName:a4a3825b-89ac-43dc-b2cf-6f9df48d98d9 nodeName:}" failed. No retries permitted until 2025-12-02 22:57:09.272893312 +0000 UTC m=+892.153573523 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a4a3825b-89ac-43dc-b2cf-6f9df48d98d9-metrics-certs") pod "frr-k8s-jwtm6" (UID: "a4a3825b-89ac-43dc-b2cf-6f9df48d98d9") : secret "frr-k8s-certs-secret" not found Dec 02 22:57:08 crc kubenswrapper[4696]: I1202 22:57:08.773063 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/a4a3825b-89ac-43dc-b2cf-6f9df48d98d9-metrics\") pod \"frr-k8s-jwtm6\" (UID: \"a4a3825b-89ac-43dc-b2cf-6f9df48d98d9\") " pod="metallb-system/frr-k8s-jwtm6" Dec 02 22:57:08 crc kubenswrapper[4696]: I1202 22:57:08.773227 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/a4a3825b-89ac-43dc-b2cf-6f9df48d98d9-frr-conf\") pod \"frr-k8s-jwtm6\" (UID: \"a4a3825b-89ac-43dc-b2cf-6f9df48d98d9\") " pod="metallb-system/frr-k8s-jwtm6" Dec 02 22:57:08 crc kubenswrapper[4696]: I1202 22:57:08.773337 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/a4a3825b-89ac-43dc-b2cf-6f9df48d98d9-reloader\") pod \"frr-k8s-jwtm6\" (UID: 
\"a4a3825b-89ac-43dc-b2cf-6f9df48d98d9\") " pod="metallb-system/frr-k8s-jwtm6" Dec 02 22:57:08 crc kubenswrapper[4696]: I1202 22:57:08.773363 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/a4a3825b-89ac-43dc-b2cf-6f9df48d98d9-frr-sockets\") pod \"frr-k8s-jwtm6\" (UID: \"a4a3825b-89ac-43dc-b2cf-6f9df48d98d9\") " pod="metallb-system/frr-k8s-jwtm6" Dec 02 22:57:08 crc kubenswrapper[4696]: I1202 22:57:08.774369 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/a4a3825b-89ac-43dc-b2cf-6f9df48d98d9-frr-startup\") pod \"frr-k8s-jwtm6\" (UID: \"a4a3825b-89ac-43dc-b2cf-6f9df48d98d9\") " pod="metallb-system/frr-k8s-jwtm6" Dec 02 22:57:08 crc kubenswrapper[4696]: I1202 22:57:08.780463 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bc16ac0a-e284-468e-b6a9-a8b78572ac06-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-p5cdf\" (UID: \"bc16ac0a-e284-468e-b6a9-a8b78572ac06\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-p5cdf" Dec 02 22:57:08 crc kubenswrapper[4696]: I1202 22:57:08.793611 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95p5n\" (UniqueName: \"kubernetes.io/projected/a4a3825b-89ac-43dc-b2cf-6f9df48d98d9-kube-api-access-95p5n\") pod \"frr-k8s-jwtm6\" (UID: \"a4a3825b-89ac-43dc-b2cf-6f9df48d98d9\") " pod="metallb-system/frr-k8s-jwtm6" Dec 02 22:57:08 crc kubenswrapper[4696]: I1202 22:57:08.794242 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5sh8\" (UniqueName: \"kubernetes.io/projected/bc16ac0a-e284-468e-b6a9-a8b78572ac06-kube-api-access-w5sh8\") pod \"frr-k8s-webhook-server-7fcb986d4-p5cdf\" (UID: \"bc16ac0a-e284-468e-b6a9-a8b78572ac06\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-p5cdf" Dec 02 22:57:08 crc kubenswrapper[4696]: I1202 
22:57:08.861054 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-p5cdf" Dec 02 22:57:08 crc kubenswrapper[4696]: I1202 22:57:08.874305 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/58e7c36f-4f09-4ae1-99ce-e18c2612b6ec-memberlist\") pod \"speaker-7xqrp\" (UID: \"58e7c36f-4f09-4ae1-99ce-e18c2612b6ec\") " pod="metallb-system/speaker-7xqrp" Dec 02 22:57:08 crc kubenswrapper[4696]: I1202 22:57:08.874397 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rz47d\" (UniqueName: \"kubernetes.io/projected/58e7c36f-4f09-4ae1-99ce-e18c2612b6ec-kube-api-access-rz47d\") pod \"speaker-7xqrp\" (UID: \"58e7c36f-4f09-4ae1-99ce-e18c2612b6ec\") " pod="metallb-system/speaker-7xqrp" Dec 02 22:57:08 crc kubenswrapper[4696]: I1202 22:57:08.874418 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e4a5e393-9801-4de5-86b3-ac2cb60bcdae-metrics-certs\") pod \"controller-f8648f98b-86pps\" (UID: \"e4a5e393-9801-4de5-86b3-ac2cb60bcdae\") " pod="metallb-system/controller-f8648f98b-86pps" Dec 02 22:57:08 crc kubenswrapper[4696]: I1202 22:57:08.874490 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e4a5e393-9801-4de5-86b3-ac2cb60bcdae-cert\") pod \"controller-f8648f98b-86pps\" (UID: \"e4a5e393-9801-4de5-86b3-ac2cb60bcdae\") " pod="metallb-system/controller-f8648f98b-86pps" Dec 02 22:57:08 crc kubenswrapper[4696]: I1202 22:57:08.874519 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/58e7c36f-4f09-4ae1-99ce-e18c2612b6ec-metrics-certs\") pod \"speaker-7xqrp\" (UID: \"58e7c36f-4f09-4ae1-99ce-e18c2612b6ec\") " pod="metallb-system/speaker-7xqrp" Dec 
02 22:57:08 crc kubenswrapper[4696]: I1202 22:57:08.874543 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xnwdg\" (UniqueName: \"kubernetes.io/projected/e4a5e393-9801-4de5-86b3-ac2cb60bcdae-kube-api-access-xnwdg\") pod \"controller-f8648f98b-86pps\" (UID: \"e4a5e393-9801-4de5-86b3-ac2cb60bcdae\") " pod="metallb-system/controller-f8648f98b-86pps" Dec 02 22:57:08 crc kubenswrapper[4696]: I1202 22:57:08.874562 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/58e7c36f-4f09-4ae1-99ce-e18c2612b6ec-metallb-excludel2\") pod \"speaker-7xqrp\" (UID: \"58e7c36f-4f09-4ae1-99ce-e18c2612b6ec\") " pod="metallb-system/speaker-7xqrp" Dec 02 22:57:08 crc kubenswrapper[4696]: I1202 22:57:08.875447 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/58e7c36f-4f09-4ae1-99ce-e18c2612b6ec-metallb-excludel2\") pod \"speaker-7xqrp\" (UID: \"58e7c36f-4f09-4ae1-99ce-e18c2612b6ec\") " pod="metallb-system/speaker-7xqrp" Dec 02 22:57:08 crc kubenswrapper[4696]: E1202 22:57:08.876115 4696 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 02 22:57:08 crc kubenswrapper[4696]: E1202 22:57:08.876214 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/58e7c36f-4f09-4ae1-99ce-e18c2612b6ec-memberlist podName:58e7c36f-4f09-4ae1-99ce-e18c2612b6ec nodeName:}" failed. No retries permitted until 2025-12-02 22:57:09.376189704 +0000 UTC m=+892.256869705 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/58e7c36f-4f09-4ae1-99ce-e18c2612b6ec-memberlist") pod "speaker-7xqrp" (UID: "58e7c36f-4f09-4ae1-99ce-e18c2612b6ec") : secret "metallb-memberlist" not found Dec 02 22:57:08 crc kubenswrapper[4696]: I1202 22:57:08.878487 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e4a5e393-9801-4de5-86b3-ac2cb60bcdae-cert\") pod \"controller-f8648f98b-86pps\" (UID: \"e4a5e393-9801-4de5-86b3-ac2cb60bcdae\") " pod="metallb-system/controller-f8648f98b-86pps" Dec 02 22:57:08 crc kubenswrapper[4696]: I1202 22:57:08.880902 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/58e7c36f-4f09-4ae1-99ce-e18c2612b6ec-metrics-certs\") pod \"speaker-7xqrp\" (UID: \"58e7c36f-4f09-4ae1-99ce-e18c2612b6ec\") " pod="metallb-system/speaker-7xqrp" Dec 02 22:57:08 crc kubenswrapper[4696]: I1202 22:57:08.881682 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e4a5e393-9801-4de5-86b3-ac2cb60bcdae-metrics-certs\") pod \"controller-f8648f98b-86pps\" (UID: \"e4a5e393-9801-4de5-86b3-ac2cb60bcdae\") " pod="metallb-system/controller-f8648f98b-86pps" Dec 02 22:57:08 crc kubenswrapper[4696]: I1202 22:57:08.894768 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rz47d\" (UniqueName: \"kubernetes.io/projected/58e7c36f-4f09-4ae1-99ce-e18c2612b6ec-kube-api-access-rz47d\") pod \"speaker-7xqrp\" (UID: \"58e7c36f-4f09-4ae1-99ce-e18c2612b6ec\") " pod="metallb-system/speaker-7xqrp" Dec 02 22:57:08 crc kubenswrapper[4696]: I1202 22:57:08.896901 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnwdg\" (UniqueName: \"kubernetes.io/projected/e4a5e393-9801-4de5-86b3-ac2cb60bcdae-kube-api-access-xnwdg\") pod 
\"controller-f8648f98b-86pps\" (UID: \"e4a5e393-9801-4de5-86b3-ac2cb60bcdae\") " pod="metallb-system/controller-f8648f98b-86pps" Dec 02 22:57:08 crc kubenswrapper[4696]: I1202 22:57:08.969295 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-f8648f98b-86pps" Dec 02 22:57:09 crc kubenswrapper[4696]: I1202 22:57:09.231477 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-86pps"] Dec 02 22:57:09 crc kubenswrapper[4696]: I1202 22:57:09.282961 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a4a3825b-89ac-43dc-b2cf-6f9df48d98d9-metrics-certs\") pod \"frr-k8s-jwtm6\" (UID: \"a4a3825b-89ac-43dc-b2cf-6f9df48d98d9\") " pod="metallb-system/frr-k8s-jwtm6" Dec 02 22:57:09 crc kubenswrapper[4696]: I1202 22:57:09.291641 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-p5cdf"] Dec 02 22:57:09 crc kubenswrapper[4696]: I1202 22:57:09.293826 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a4a3825b-89ac-43dc-b2cf-6f9df48d98d9-metrics-certs\") pod \"frr-k8s-jwtm6\" (UID: \"a4a3825b-89ac-43dc-b2cf-6f9df48d98d9\") " pod="metallb-system/frr-k8s-jwtm6" Dec 02 22:57:09 crc kubenswrapper[4696]: W1202 22:57:09.301505 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbc16ac0a_e284_468e_b6a9_a8b78572ac06.slice/crio-5cf924556b61bea482450c0b88f58f9601a73a0896cbc5914dce8417b41018d8 WatchSource:0}: Error finding container 5cf924556b61bea482450c0b88f58f9601a73a0896cbc5914dce8417b41018d8: Status 404 returned error can't find the container with id 5cf924556b61bea482450c0b88f58f9601a73a0896cbc5914dce8417b41018d8 Dec 02 22:57:09 crc kubenswrapper[4696]: I1202 22:57:09.383682 4696 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/58e7c36f-4f09-4ae1-99ce-e18c2612b6ec-memberlist\") pod \"speaker-7xqrp\" (UID: \"58e7c36f-4f09-4ae1-99ce-e18c2612b6ec\") " pod="metallb-system/speaker-7xqrp" Dec 02 22:57:09 crc kubenswrapper[4696]: E1202 22:57:09.383872 4696 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 02 22:57:09 crc kubenswrapper[4696]: E1202 22:57:09.383951 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/58e7c36f-4f09-4ae1-99ce-e18c2612b6ec-memberlist podName:58e7c36f-4f09-4ae1-99ce-e18c2612b6ec nodeName:}" failed. No retries permitted until 2025-12-02 22:57:10.383932052 +0000 UTC m=+893.264612063 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/58e7c36f-4f09-4ae1-99ce-e18c2612b6ec-memberlist") pod "speaker-7xqrp" (UID: "58e7c36f-4f09-4ae1-99ce-e18c2612b6ec") : secret "metallb-memberlist" not found Dec 02 22:57:09 crc kubenswrapper[4696]: I1202 22:57:09.448674 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-jwtm6" Dec 02 22:57:10 crc kubenswrapper[4696]: I1202 22:57:10.052953 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jwtm6" event={"ID":"a4a3825b-89ac-43dc-b2cf-6f9df48d98d9","Type":"ContainerStarted","Data":"33f39ae92067a0f1d471287fdf932d8d56c40289dc6981b6c9e06d75e16a6383"} Dec 02 22:57:10 crc kubenswrapper[4696]: I1202 22:57:10.055287 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-86pps" event={"ID":"e4a5e393-9801-4de5-86b3-ac2cb60bcdae","Type":"ContainerStarted","Data":"1d9ce29cb291214b3a434c61d81699926df4abc8a17ad7ec24ab3bc47694cdb6"} Dec 02 22:57:10 crc kubenswrapper[4696]: I1202 22:57:10.055385 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-86pps" event={"ID":"e4a5e393-9801-4de5-86b3-ac2cb60bcdae","Type":"ContainerStarted","Data":"d17c5a08647d8f2290e32a547e929064e54efd64f678e812d499eb957c3462e7"} Dec 02 22:57:10 crc kubenswrapper[4696]: I1202 22:57:10.055400 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-86pps" event={"ID":"e4a5e393-9801-4de5-86b3-ac2cb60bcdae","Type":"ContainerStarted","Data":"08790aaafa794f65f3122d6ae98a4240657537ef8e6bd7f96a7c1939025705bf"} Dec 02 22:57:10 crc kubenswrapper[4696]: I1202 22:57:10.055710 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-f8648f98b-86pps" Dec 02 22:57:10 crc kubenswrapper[4696]: I1202 22:57:10.059459 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-p5cdf" event={"ID":"bc16ac0a-e284-468e-b6a9-a8b78572ac06","Type":"ContainerStarted","Data":"5cf924556b61bea482450c0b88f58f9601a73a0896cbc5914dce8417b41018d8"} Dec 02 22:57:10 crc kubenswrapper[4696]: I1202 22:57:10.082629 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="metallb-system/controller-f8648f98b-86pps" podStartSLOduration=2.082610997 podStartE2EDuration="2.082610997s" podCreationTimestamp="2025-12-02 22:57:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 22:57:10.078973865 +0000 UTC m=+892.959653856" watchObservedRunningTime="2025-12-02 22:57:10.082610997 +0000 UTC m=+892.963290998" Dec 02 22:57:10 crc kubenswrapper[4696]: I1202 22:57:10.398988 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/58e7c36f-4f09-4ae1-99ce-e18c2612b6ec-memberlist\") pod \"speaker-7xqrp\" (UID: \"58e7c36f-4f09-4ae1-99ce-e18c2612b6ec\") " pod="metallb-system/speaker-7xqrp" Dec 02 22:57:10 crc kubenswrapper[4696]: I1202 22:57:10.408713 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/58e7c36f-4f09-4ae1-99ce-e18c2612b6ec-memberlist\") pod \"speaker-7xqrp\" (UID: \"58e7c36f-4f09-4ae1-99ce-e18c2612b6ec\") " pod="metallb-system/speaker-7xqrp" Dec 02 22:57:10 crc kubenswrapper[4696]: I1202 22:57:10.459969 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-7xqrp" Dec 02 22:57:11 crc kubenswrapper[4696]: I1202 22:57:11.071021 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-7xqrp" event={"ID":"58e7c36f-4f09-4ae1-99ce-e18c2612b6ec","Type":"ContainerStarted","Data":"303a2d02dafc12971cec9c095db4209a6ba88a872cbe23e10158e7232c44b300"} Dec 02 22:57:11 crc kubenswrapper[4696]: I1202 22:57:11.071077 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-7xqrp" event={"ID":"58e7c36f-4f09-4ae1-99ce-e18c2612b6ec","Type":"ContainerStarted","Data":"6ca57db5579c34a8fb0e331734a521b35da4e03bc3874a8723a3b70cef04149d"} Dec 02 22:57:12 crc kubenswrapper[4696]: I1202 22:57:12.089894 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-7xqrp" event={"ID":"58e7c36f-4f09-4ae1-99ce-e18c2612b6ec","Type":"ContainerStarted","Data":"45200200947c2e2136b53c7d7cf13b5fd514f4707d38dcb8dffa5b15e67ad27b"} Dec 02 22:57:12 crc kubenswrapper[4696]: I1202 22:57:12.090411 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-7xqrp" Dec 02 22:57:12 crc kubenswrapper[4696]: I1202 22:57:12.114384 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-7xqrp" podStartSLOduration=4.114363298 podStartE2EDuration="4.114363298s" podCreationTimestamp="2025-12-02 22:57:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 22:57:12.108120263 +0000 UTC m=+894.988800264" watchObservedRunningTime="2025-12-02 22:57:12.114363298 +0000 UTC m=+894.995043299" Dec 02 22:57:18 crc kubenswrapper[4696]: I1202 22:57:18.179281 4696 generic.go:334] "Generic (PLEG): container finished" podID="a4a3825b-89ac-43dc-b2cf-6f9df48d98d9" containerID="dd08f719c3ae4f2b20f7bfcdfb20c1fec952b5faa8aa4d85de37018ec22ae2f8" exitCode=0 Dec 02 22:57:18 crc kubenswrapper[4696]: 
I1202 22:57:18.179485 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jwtm6" event={"ID":"a4a3825b-89ac-43dc-b2cf-6f9df48d98d9","Type":"ContainerDied","Data":"dd08f719c3ae4f2b20f7bfcdfb20c1fec952b5faa8aa4d85de37018ec22ae2f8"} Dec 02 22:57:18 crc kubenswrapper[4696]: I1202 22:57:18.182843 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-p5cdf" event={"ID":"bc16ac0a-e284-468e-b6a9-a8b78572ac06","Type":"ContainerStarted","Data":"9576b72b4164f0361c1c956ecbb0a1c8a9791b7c1abedfcfca65981d2b3c0a99"} Dec 02 22:57:18 crc kubenswrapper[4696]: I1202 22:57:18.183070 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-p5cdf" Dec 02 22:57:18 crc kubenswrapper[4696]: I1202 22:57:18.246445 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-p5cdf" podStartSLOduration=2.153663709 podStartE2EDuration="10.246420729s" podCreationTimestamp="2025-12-02 22:57:08 +0000 UTC" firstStartedPulling="2025-12-02 22:57:09.304184956 +0000 UTC m=+892.184864957" lastFinishedPulling="2025-12-02 22:57:17.396941976 +0000 UTC m=+900.277621977" observedRunningTime="2025-12-02 22:57:18.242077048 +0000 UTC m=+901.122757049" watchObservedRunningTime="2025-12-02 22:57:18.246420729 +0000 UTC m=+901.127100740" Dec 02 22:57:19 crc kubenswrapper[4696]: I1202 22:57:19.191650 4696 generic.go:334] "Generic (PLEG): container finished" podID="a4a3825b-89ac-43dc-b2cf-6f9df48d98d9" containerID="77dcaed44938b27078c093878bc64c1e2570064d0c4c14b0609cf2d939c82b17" exitCode=0 Dec 02 22:57:19 crc kubenswrapper[4696]: I1202 22:57:19.191766 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jwtm6" event={"ID":"a4a3825b-89ac-43dc-b2cf-6f9df48d98d9","Type":"ContainerDied","Data":"77dcaed44938b27078c093878bc64c1e2570064d0c4c14b0609cf2d939c82b17"} Dec 02 22:57:20 crc 
kubenswrapper[4696]: I1202 22:57:20.202327 4696 generic.go:334] "Generic (PLEG): container finished" podID="a4a3825b-89ac-43dc-b2cf-6f9df48d98d9" containerID="9321d831fc73ed86e39d4158b459b4a5bbe42b265f7b01c2ed3edd9fbec515a3" exitCode=0 Dec 02 22:57:20 crc kubenswrapper[4696]: I1202 22:57:20.202400 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jwtm6" event={"ID":"a4a3825b-89ac-43dc-b2cf-6f9df48d98d9","Type":"ContainerDied","Data":"9321d831fc73ed86e39d4158b459b4a5bbe42b265f7b01c2ed3edd9fbec515a3"} Dec 02 22:57:20 crc kubenswrapper[4696]: I1202 22:57:20.466726 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-7xqrp" Dec 02 22:57:21 crc kubenswrapper[4696]: I1202 22:57:21.219678 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jwtm6" event={"ID":"a4a3825b-89ac-43dc-b2cf-6f9df48d98d9","Type":"ContainerStarted","Data":"54ffd466ee53e3fba592e8c36ff9233eaf87a8f7363d3c65e6c05953c3244200"} Dec 02 22:57:21 crc kubenswrapper[4696]: I1202 22:57:21.219968 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jwtm6" event={"ID":"a4a3825b-89ac-43dc-b2cf-6f9df48d98d9","Type":"ContainerStarted","Data":"35bdcbac8f24f68973745e691c9eb2ec38da522285ebb0913416cf12ac3ae313"} Dec 02 22:57:21 crc kubenswrapper[4696]: I1202 22:57:21.219988 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jwtm6" event={"ID":"a4a3825b-89ac-43dc-b2cf-6f9df48d98d9","Type":"ContainerStarted","Data":"f84f7417204d67409f9ca180832351035d20ddd25978d492a3862742bd278c08"} Dec 02 22:57:21 crc kubenswrapper[4696]: I1202 22:57:21.220005 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jwtm6" event={"ID":"a4a3825b-89ac-43dc-b2cf-6f9df48d98d9","Type":"ContainerStarted","Data":"3cc865382633cbf76e2a8a52ad208cbef3d3e197146f171f5d28fb1ac52dbd04"} Dec 02 22:57:21 crc kubenswrapper[4696]: I1202 22:57:21.220018 4696 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jwtm6" event={"ID":"a4a3825b-89ac-43dc-b2cf-6f9df48d98d9","Type":"ContainerStarted","Data":"0cec9040c50f98b12274c2051b3369404799d6fcde3c8db26c51eb730af44bf2"} Dec 02 22:57:22 crc kubenswrapper[4696]: I1202 22:57:22.234934 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jwtm6" event={"ID":"a4a3825b-89ac-43dc-b2cf-6f9df48d98d9","Type":"ContainerStarted","Data":"0f26b73af3602413865a45f327e17f00ca8209b4bead136e8860d363bad679cd"} Dec 02 22:57:22 crc kubenswrapper[4696]: I1202 22:57:22.235207 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-jwtm6" Dec 02 22:57:22 crc kubenswrapper[4696]: I1202 22:57:22.272045 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-jwtm6" podStartSLOduration=6.440607206 podStartE2EDuration="14.272020903s" podCreationTimestamp="2025-12-02 22:57:08 +0000 UTC" firstStartedPulling="2025-12-02 22:57:09.574103678 +0000 UTC m=+892.454783679" lastFinishedPulling="2025-12-02 22:57:17.405517375 +0000 UTC m=+900.286197376" observedRunningTime="2025-12-02 22:57:22.264647477 +0000 UTC m=+905.145327488" watchObservedRunningTime="2025-12-02 22:57:22.272020903 +0000 UTC m=+905.152700914" Dec 02 22:57:23 crc kubenswrapper[4696]: I1202 22:57:23.588897 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-xk9p8"] Dec 02 22:57:23 crc kubenswrapper[4696]: I1202 22:57:23.591594 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-xk9p8" Dec 02 22:57:23 crc kubenswrapper[4696]: I1202 22:57:23.596202 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-vc5lm" Dec 02 22:57:23 crc kubenswrapper[4696]: I1202 22:57:23.596582 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Dec 02 22:57:23 crc kubenswrapper[4696]: I1202 22:57:23.596803 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Dec 02 22:57:23 crc kubenswrapper[4696]: I1202 22:57:23.609867 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-xk9p8"] Dec 02 22:57:23 crc kubenswrapper[4696]: I1202 22:57:23.720233 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5q4g\" (UniqueName: \"kubernetes.io/projected/81df406b-b45a-42a5-84c1-9e96c37e2ef4-kube-api-access-t5q4g\") pod \"openstack-operator-index-xk9p8\" (UID: \"81df406b-b45a-42a5-84c1-9e96c37e2ef4\") " pod="openstack-operators/openstack-operator-index-xk9p8" Dec 02 22:57:23 crc kubenswrapper[4696]: I1202 22:57:23.821954 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5q4g\" (UniqueName: \"kubernetes.io/projected/81df406b-b45a-42a5-84c1-9e96c37e2ef4-kube-api-access-t5q4g\") pod \"openstack-operator-index-xk9p8\" (UID: \"81df406b-b45a-42a5-84c1-9e96c37e2ef4\") " pod="openstack-operators/openstack-operator-index-xk9p8" Dec 02 22:57:23 crc kubenswrapper[4696]: I1202 22:57:23.842481 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5q4g\" (UniqueName: \"kubernetes.io/projected/81df406b-b45a-42a5-84c1-9e96c37e2ef4-kube-api-access-t5q4g\") pod \"openstack-operator-index-xk9p8\" (UID: 
\"81df406b-b45a-42a5-84c1-9e96c37e2ef4\") " pod="openstack-operators/openstack-operator-index-xk9p8" Dec 02 22:57:23 crc kubenswrapper[4696]: I1202 22:57:23.920602 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-xk9p8" Dec 02 22:57:24 crc kubenswrapper[4696]: I1202 22:57:24.183083 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-xk9p8"] Dec 02 22:57:24 crc kubenswrapper[4696]: I1202 22:57:24.256885 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-xk9p8" event={"ID":"81df406b-b45a-42a5-84c1-9e96c37e2ef4","Type":"ContainerStarted","Data":"427aa71280c6fc43c4c021d9f3deed2546b3e9affa56c14e3b3d97bfaa292fe5"} Dec 02 22:57:24 crc kubenswrapper[4696]: I1202 22:57:24.449297 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-jwtm6" Dec 02 22:57:24 crc kubenswrapper[4696]: I1202 22:57:24.503677 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-jwtm6" Dec 02 22:57:26 crc kubenswrapper[4696]: I1202 22:57:26.942268 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-xk9p8"] Dec 02 22:57:27 crc kubenswrapper[4696]: I1202 22:57:27.554262 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-k25sp"] Dec 02 22:57:27 crc kubenswrapper[4696]: I1202 22:57:27.556729 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-k25sp" Dec 02 22:57:27 crc kubenswrapper[4696]: I1202 22:57:27.562217 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-k25sp"] Dec 02 22:57:27 crc kubenswrapper[4696]: I1202 22:57:27.698205 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgvjf\" (UniqueName: \"kubernetes.io/projected/2bbe83e8-36bc-401e-84b6-917b6aeb6398-kube-api-access-fgvjf\") pod \"openstack-operator-index-k25sp\" (UID: \"2bbe83e8-36bc-401e-84b6-917b6aeb6398\") " pod="openstack-operators/openstack-operator-index-k25sp" Dec 02 22:57:27 crc kubenswrapper[4696]: I1202 22:57:27.799948 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgvjf\" (UniqueName: \"kubernetes.io/projected/2bbe83e8-36bc-401e-84b6-917b6aeb6398-kube-api-access-fgvjf\") pod \"openstack-operator-index-k25sp\" (UID: \"2bbe83e8-36bc-401e-84b6-917b6aeb6398\") " pod="openstack-operators/openstack-operator-index-k25sp" Dec 02 22:57:27 crc kubenswrapper[4696]: I1202 22:57:27.830243 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgvjf\" (UniqueName: \"kubernetes.io/projected/2bbe83e8-36bc-401e-84b6-917b6aeb6398-kube-api-access-fgvjf\") pod \"openstack-operator-index-k25sp\" (UID: \"2bbe83e8-36bc-401e-84b6-917b6aeb6398\") " pod="openstack-operators/openstack-operator-index-k25sp" Dec 02 22:57:27 crc kubenswrapper[4696]: I1202 22:57:27.891155 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-k25sp" Dec 02 22:57:28 crc kubenswrapper[4696]: I1202 22:57:28.430441 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-k25sp"] Dec 02 22:57:28 crc kubenswrapper[4696]: I1202 22:57:28.870367 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-p5cdf" Dec 02 22:57:28 crc kubenswrapper[4696]: I1202 22:57:28.980275 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-f8648f98b-86pps" Dec 02 22:57:29 crc kubenswrapper[4696]: I1202 22:57:29.298132 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-xk9p8" event={"ID":"81df406b-b45a-42a5-84c1-9e96c37e2ef4","Type":"ContainerStarted","Data":"b19451a0df9878f5ed4ef97525a35d146d7a28d82854d37236243746389e45f9"} Dec 02 22:57:29 crc kubenswrapper[4696]: I1202 22:57:29.298911 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-xk9p8" podUID="81df406b-b45a-42a5-84c1-9e96c37e2ef4" containerName="registry-server" containerID="cri-o://b19451a0df9878f5ed4ef97525a35d146d7a28d82854d37236243746389e45f9" gracePeriod=2 Dec 02 22:57:29 crc kubenswrapper[4696]: I1202 22:57:29.300697 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-k25sp" event={"ID":"2bbe83e8-36bc-401e-84b6-917b6aeb6398","Type":"ContainerStarted","Data":"c92fd4c39d7d8f23b1583fd59540e9a1e8faaa2e353c974e046d413d65dcb80f"} Dec 02 22:57:29 crc kubenswrapper[4696]: I1202 22:57:29.300729 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-k25sp" event={"ID":"2bbe83e8-36bc-401e-84b6-917b6aeb6398","Type":"ContainerStarted","Data":"e25ebacb645c154ed98943dccf2d67f0c0372c3ed5f8934afb838c29fd707143"} Dec 
02 22:57:29 crc kubenswrapper[4696]: I1202 22:57:29.316327 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-xk9p8" podStartSLOduration=2.233655234 podStartE2EDuration="6.316294449s" podCreationTimestamp="2025-12-02 22:57:23 +0000 UTC" firstStartedPulling="2025-12-02 22:57:24.185867465 +0000 UTC m=+907.066547466" lastFinishedPulling="2025-12-02 22:57:28.26850668 +0000 UTC m=+911.149186681" observedRunningTime="2025-12-02 22:57:29.316167496 +0000 UTC m=+912.196847497" watchObservedRunningTime="2025-12-02 22:57:29.316294449 +0000 UTC m=+912.196974480" Dec 02 22:57:29 crc kubenswrapper[4696]: I1202 22:57:29.345656 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-k25sp" podStartSLOduration=2.271263965 podStartE2EDuration="2.34563504s" podCreationTimestamp="2025-12-02 22:57:27 +0000 UTC" firstStartedPulling="2025-12-02 22:57:28.443635077 +0000 UTC m=+911.324315078" lastFinishedPulling="2025-12-02 22:57:28.518006152 +0000 UTC m=+911.398686153" observedRunningTime="2025-12-02 22:57:29.340590077 +0000 UTC m=+912.221270088" watchObservedRunningTime="2025-12-02 22:57:29.34563504 +0000 UTC m=+912.226315041" Dec 02 22:57:30 crc kubenswrapper[4696]: I1202 22:57:30.310569 4696 generic.go:334] "Generic (PLEG): container finished" podID="81df406b-b45a-42a5-84c1-9e96c37e2ef4" containerID="b19451a0df9878f5ed4ef97525a35d146d7a28d82854d37236243746389e45f9" exitCode=0 Dec 02 22:57:30 crc kubenswrapper[4696]: I1202 22:57:30.310662 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-xk9p8" event={"ID":"81df406b-b45a-42a5-84c1-9e96c37e2ef4","Type":"ContainerDied","Data":"b19451a0df9878f5ed4ef97525a35d146d7a28d82854d37236243746389e45f9"} Dec 02 22:57:30 crc kubenswrapper[4696]: I1202 22:57:30.919298 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-xk9p8" Dec 02 22:57:31 crc kubenswrapper[4696]: I1202 22:57:31.053217 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t5q4g\" (UniqueName: \"kubernetes.io/projected/81df406b-b45a-42a5-84c1-9e96c37e2ef4-kube-api-access-t5q4g\") pod \"81df406b-b45a-42a5-84c1-9e96c37e2ef4\" (UID: \"81df406b-b45a-42a5-84c1-9e96c37e2ef4\") " Dec 02 22:57:31 crc kubenswrapper[4696]: I1202 22:57:31.063516 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81df406b-b45a-42a5-84c1-9e96c37e2ef4-kube-api-access-t5q4g" (OuterVolumeSpecName: "kube-api-access-t5q4g") pod "81df406b-b45a-42a5-84c1-9e96c37e2ef4" (UID: "81df406b-b45a-42a5-84c1-9e96c37e2ef4"). InnerVolumeSpecName "kube-api-access-t5q4g". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:57:31 crc kubenswrapper[4696]: I1202 22:57:31.155691 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t5q4g\" (UniqueName: \"kubernetes.io/projected/81df406b-b45a-42a5-84c1-9e96c37e2ef4-kube-api-access-t5q4g\") on node \"crc\" DevicePath \"\"" Dec 02 22:57:31 crc kubenswrapper[4696]: I1202 22:57:31.318828 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-xk9p8" event={"ID":"81df406b-b45a-42a5-84c1-9e96c37e2ef4","Type":"ContainerDied","Data":"427aa71280c6fc43c4c021d9f3deed2546b3e9affa56c14e3b3d97bfaa292fe5"} Dec 02 22:57:31 crc kubenswrapper[4696]: I1202 22:57:31.318899 4696 scope.go:117] "RemoveContainer" containerID="b19451a0df9878f5ed4ef97525a35d146d7a28d82854d37236243746389e45f9" Dec 02 22:57:31 crc kubenswrapper[4696]: I1202 22:57:31.318925 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-xk9p8" Dec 02 22:57:37 crc kubenswrapper[4696]: I1202 22:57:37.891646 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-k25sp" Dec 02 22:57:37 crc kubenswrapper[4696]: I1202 22:57:37.892604 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-k25sp" Dec 02 22:57:37 crc kubenswrapper[4696]: I1202 22:57:37.944242 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-k25sp" Dec 02 22:57:38 crc kubenswrapper[4696]: I1202 22:57:38.442587 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-k25sp" Dec 02 22:57:39 crc kubenswrapper[4696]: I1202 22:57:39.452929 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-jwtm6" Dec 02 22:57:39 crc kubenswrapper[4696]: I1202 22:57:39.816973 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/4b44fd46acbfb58b875116701dfb531286e4b634e6faeebc661bfd385ezk8mc"] Dec 02 22:57:39 crc kubenswrapper[4696]: E1202 22:57:39.817365 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81df406b-b45a-42a5-84c1-9e96c37e2ef4" containerName="registry-server" Dec 02 22:57:39 crc kubenswrapper[4696]: I1202 22:57:39.817386 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="81df406b-b45a-42a5-84c1-9e96c37e2ef4" containerName="registry-server" Dec 02 22:57:39 crc kubenswrapper[4696]: I1202 22:57:39.817629 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="81df406b-b45a-42a5-84c1-9e96c37e2ef4" containerName="registry-server" Dec 02 22:57:39 crc kubenswrapper[4696]: I1202 22:57:39.819284 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/4b44fd46acbfb58b875116701dfb531286e4b634e6faeebc661bfd385ezk8mc" Dec 02 22:57:39 crc kubenswrapper[4696]: I1202 22:57:39.822394 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-52jfm" Dec 02 22:57:39 crc kubenswrapper[4696]: I1202 22:57:39.834427 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/4b44fd46acbfb58b875116701dfb531286e4b634e6faeebc661bfd385ezk8mc"] Dec 02 22:57:39 crc kubenswrapper[4696]: I1202 22:57:39.998948 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e0a517f4-97e3-42f4-a78c-4ba5b8bfcba7-bundle\") pod \"4b44fd46acbfb58b875116701dfb531286e4b634e6faeebc661bfd385ezk8mc\" (UID: \"e0a517f4-97e3-42f4-a78c-4ba5b8bfcba7\") " pod="openstack-operators/4b44fd46acbfb58b875116701dfb531286e4b634e6faeebc661bfd385ezk8mc" Dec 02 22:57:39 crc kubenswrapper[4696]: I1202 22:57:39.999100 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7gbp\" (UniqueName: \"kubernetes.io/projected/e0a517f4-97e3-42f4-a78c-4ba5b8bfcba7-kube-api-access-r7gbp\") pod \"4b44fd46acbfb58b875116701dfb531286e4b634e6faeebc661bfd385ezk8mc\" (UID: \"e0a517f4-97e3-42f4-a78c-4ba5b8bfcba7\") " pod="openstack-operators/4b44fd46acbfb58b875116701dfb531286e4b634e6faeebc661bfd385ezk8mc" Dec 02 22:57:39 crc kubenswrapper[4696]: I1202 22:57:39.999248 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e0a517f4-97e3-42f4-a78c-4ba5b8bfcba7-util\") pod \"4b44fd46acbfb58b875116701dfb531286e4b634e6faeebc661bfd385ezk8mc\" (UID: \"e0a517f4-97e3-42f4-a78c-4ba5b8bfcba7\") " pod="openstack-operators/4b44fd46acbfb58b875116701dfb531286e4b634e6faeebc661bfd385ezk8mc" Dec 02 22:57:40 crc kubenswrapper[4696]: I1202 
22:57:40.100206 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e0a517f4-97e3-42f4-a78c-4ba5b8bfcba7-util\") pod \"4b44fd46acbfb58b875116701dfb531286e4b634e6faeebc661bfd385ezk8mc\" (UID: \"e0a517f4-97e3-42f4-a78c-4ba5b8bfcba7\") " pod="openstack-operators/4b44fd46acbfb58b875116701dfb531286e4b634e6faeebc661bfd385ezk8mc" Dec 02 22:57:40 crc kubenswrapper[4696]: I1202 22:57:40.100308 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e0a517f4-97e3-42f4-a78c-4ba5b8bfcba7-bundle\") pod \"4b44fd46acbfb58b875116701dfb531286e4b634e6faeebc661bfd385ezk8mc\" (UID: \"e0a517f4-97e3-42f4-a78c-4ba5b8bfcba7\") " pod="openstack-operators/4b44fd46acbfb58b875116701dfb531286e4b634e6faeebc661bfd385ezk8mc" Dec 02 22:57:40 crc kubenswrapper[4696]: I1202 22:57:40.100373 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7gbp\" (UniqueName: \"kubernetes.io/projected/e0a517f4-97e3-42f4-a78c-4ba5b8bfcba7-kube-api-access-r7gbp\") pod \"4b44fd46acbfb58b875116701dfb531286e4b634e6faeebc661bfd385ezk8mc\" (UID: \"e0a517f4-97e3-42f4-a78c-4ba5b8bfcba7\") " pod="openstack-operators/4b44fd46acbfb58b875116701dfb531286e4b634e6faeebc661bfd385ezk8mc" Dec 02 22:57:40 crc kubenswrapper[4696]: I1202 22:57:40.101076 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e0a517f4-97e3-42f4-a78c-4ba5b8bfcba7-bundle\") pod \"4b44fd46acbfb58b875116701dfb531286e4b634e6faeebc661bfd385ezk8mc\" (UID: \"e0a517f4-97e3-42f4-a78c-4ba5b8bfcba7\") " pod="openstack-operators/4b44fd46acbfb58b875116701dfb531286e4b634e6faeebc661bfd385ezk8mc" Dec 02 22:57:40 crc kubenswrapper[4696]: I1202 22:57:40.101308 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/e0a517f4-97e3-42f4-a78c-4ba5b8bfcba7-util\") pod \"4b44fd46acbfb58b875116701dfb531286e4b634e6faeebc661bfd385ezk8mc\" (UID: \"e0a517f4-97e3-42f4-a78c-4ba5b8bfcba7\") " pod="openstack-operators/4b44fd46acbfb58b875116701dfb531286e4b634e6faeebc661bfd385ezk8mc" Dec 02 22:57:40 crc kubenswrapper[4696]: I1202 22:57:40.128190 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7gbp\" (UniqueName: \"kubernetes.io/projected/e0a517f4-97e3-42f4-a78c-4ba5b8bfcba7-kube-api-access-r7gbp\") pod \"4b44fd46acbfb58b875116701dfb531286e4b634e6faeebc661bfd385ezk8mc\" (UID: \"e0a517f4-97e3-42f4-a78c-4ba5b8bfcba7\") " pod="openstack-operators/4b44fd46acbfb58b875116701dfb531286e4b634e6faeebc661bfd385ezk8mc" Dec 02 22:57:40 crc kubenswrapper[4696]: I1202 22:57:40.149770 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/4b44fd46acbfb58b875116701dfb531286e4b634e6faeebc661bfd385ezk8mc" Dec 02 22:57:40 crc kubenswrapper[4696]: I1202 22:57:40.595279 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/4b44fd46acbfb58b875116701dfb531286e4b634e6faeebc661bfd385ezk8mc"] Dec 02 22:57:41 crc kubenswrapper[4696]: I1202 22:57:41.426908 4696 generic.go:334] "Generic (PLEG): container finished" podID="e0a517f4-97e3-42f4-a78c-4ba5b8bfcba7" containerID="14edda7c4a1b513659d903c8c192d48eac8b7a78dbccd9da46e11076c8c256b9" exitCode=0 Dec 02 22:57:41 crc kubenswrapper[4696]: I1202 22:57:41.426981 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/4b44fd46acbfb58b875116701dfb531286e4b634e6faeebc661bfd385ezk8mc" event={"ID":"e0a517f4-97e3-42f4-a78c-4ba5b8bfcba7","Type":"ContainerDied","Data":"14edda7c4a1b513659d903c8c192d48eac8b7a78dbccd9da46e11076c8c256b9"} Dec 02 22:57:41 crc kubenswrapper[4696]: I1202 22:57:41.427025 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/4b44fd46acbfb58b875116701dfb531286e4b634e6faeebc661bfd385ezk8mc" event={"ID":"e0a517f4-97e3-42f4-a78c-4ba5b8bfcba7","Type":"ContainerStarted","Data":"4c61cad3bdbcc31dcee6c49fc1cf545a678e714f3c6c9aaa0b049e403087f2e4"} Dec 02 22:57:42 crc kubenswrapper[4696]: I1202 22:57:42.441508 4696 generic.go:334] "Generic (PLEG): container finished" podID="e0a517f4-97e3-42f4-a78c-4ba5b8bfcba7" containerID="f3fb38a090cfce456d840d4f5ceabeac8b8e658e6768fd89e4c269eece61d475" exitCode=0 Dec 02 22:57:42 crc kubenswrapper[4696]: I1202 22:57:42.441657 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/4b44fd46acbfb58b875116701dfb531286e4b634e6faeebc661bfd385ezk8mc" event={"ID":"e0a517f4-97e3-42f4-a78c-4ba5b8bfcba7","Type":"ContainerDied","Data":"f3fb38a090cfce456d840d4f5ceabeac8b8e658e6768fd89e4c269eece61d475"} Dec 02 22:57:43 crc kubenswrapper[4696]: I1202 22:57:43.452521 4696 generic.go:334] "Generic (PLEG): container finished" podID="e0a517f4-97e3-42f4-a78c-4ba5b8bfcba7" containerID="7b0bb6cd31be973b86db88113befd8ce7941828b46f8908e40e4a21e3273d1e6" exitCode=0 Dec 02 22:57:43 crc kubenswrapper[4696]: I1202 22:57:43.452628 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/4b44fd46acbfb58b875116701dfb531286e4b634e6faeebc661bfd385ezk8mc" event={"ID":"e0a517f4-97e3-42f4-a78c-4ba5b8bfcba7","Type":"ContainerDied","Data":"7b0bb6cd31be973b86db88113befd8ce7941828b46f8908e40e4a21e3273d1e6"} Dec 02 22:57:44 crc kubenswrapper[4696]: I1202 22:57:44.811825 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/4b44fd46acbfb58b875116701dfb531286e4b634e6faeebc661bfd385ezk8mc" Dec 02 22:57:44 crc kubenswrapper[4696]: I1202 22:57:44.977346 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e0a517f4-97e3-42f4-a78c-4ba5b8bfcba7-bundle\") pod \"e0a517f4-97e3-42f4-a78c-4ba5b8bfcba7\" (UID: \"e0a517f4-97e3-42f4-a78c-4ba5b8bfcba7\") " Dec 02 22:57:44 crc kubenswrapper[4696]: I1202 22:57:44.977557 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e0a517f4-97e3-42f4-a78c-4ba5b8bfcba7-util\") pod \"e0a517f4-97e3-42f4-a78c-4ba5b8bfcba7\" (UID: \"e0a517f4-97e3-42f4-a78c-4ba5b8bfcba7\") " Dec 02 22:57:44 crc kubenswrapper[4696]: I1202 22:57:44.977636 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r7gbp\" (UniqueName: \"kubernetes.io/projected/e0a517f4-97e3-42f4-a78c-4ba5b8bfcba7-kube-api-access-r7gbp\") pod \"e0a517f4-97e3-42f4-a78c-4ba5b8bfcba7\" (UID: \"e0a517f4-97e3-42f4-a78c-4ba5b8bfcba7\") " Dec 02 22:57:44 crc kubenswrapper[4696]: I1202 22:57:44.979004 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0a517f4-97e3-42f4-a78c-4ba5b8bfcba7-bundle" (OuterVolumeSpecName: "bundle") pod "e0a517f4-97e3-42f4-a78c-4ba5b8bfcba7" (UID: "e0a517f4-97e3-42f4-a78c-4ba5b8bfcba7"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 22:57:44 crc kubenswrapper[4696]: I1202 22:57:44.987902 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0a517f4-97e3-42f4-a78c-4ba5b8bfcba7-kube-api-access-r7gbp" (OuterVolumeSpecName: "kube-api-access-r7gbp") pod "e0a517f4-97e3-42f4-a78c-4ba5b8bfcba7" (UID: "e0a517f4-97e3-42f4-a78c-4ba5b8bfcba7"). InnerVolumeSpecName "kube-api-access-r7gbp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:57:44 crc kubenswrapper[4696]: I1202 22:57:44.993028 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0a517f4-97e3-42f4-a78c-4ba5b8bfcba7-util" (OuterVolumeSpecName: "util") pod "e0a517f4-97e3-42f4-a78c-4ba5b8bfcba7" (UID: "e0a517f4-97e3-42f4-a78c-4ba5b8bfcba7"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 22:57:45 crc kubenswrapper[4696]: I1202 22:57:45.079964 4696 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e0a517f4-97e3-42f4-a78c-4ba5b8bfcba7-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 22:57:45 crc kubenswrapper[4696]: I1202 22:57:45.080017 4696 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e0a517f4-97e3-42f4-a78c-4ba5b8bfcba7-util\") on node \"crc\" DevicePath \"\"" Dec 02 22:57:45 crc kubenswrapper[4696]: I1202 22:57:45.080033 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r7gbp\" (UniqueName: \"kubernetes.io/projected/e0a517f4-97e3-42f4-a78c-4ba5b8bfcba7-kube-api-access-r7gbp\") on node \"crc\" DevicePath \"\"" Dec 02 22:57:45 crc kubenswrapper[4696]: I1202 22:57:45.472398 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/4b44fd46acbfb58b875116701dfb531286e4b634e6faeebc661bfd385ezk8mc" event={"ID":"e0a517f4-97e3-42f4-a78c-4ba5b8bfcba7","Type":"ContainerDied","Data":"4c61cad3bdbcc31dcee6c49fc1cf545a678e714f3c6c9aaa0b049e403087f2e4"} Dec 02 22:57:45 crc kubenswrapper[4696]: I1202 22:57:45.472461 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4c61cad3bdbcc31dcee6c49fc1cf545a678e714f3c6c9aaa0b049e403087f2e4" Dec 02 22:57:45 crc kubenswrapper[4696]: I1202 22:57:45.472486 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/4b44fd46acbfb58b875116701dfb531286e4b634e6faeebc661bfd385ezk8mc" Dec 02 22:57:47 crc kubenswrapper[4696]: I1202 22:57:47.366459 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-dfb58c988-g96v2"] Dec 02 22:57:47 crc kubenswrapper[4696]: E1202 22:57:47.367305 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0a517f4-97e3-42f4-a78c-4ba5b8bfcba7" containerName="util" Dec 02 22:57:47 crc kubenswrapper[4696]: I1202 22:57:47.367321 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0a517f4-97e3-42f4-a78c-4ba5b8bfcba7" containerName="util" Dec 02 22:57:47 crc kubenswrapper[4696]: E1202 22:57:47.367349 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0a517f4-97e3-42f4-a78c-4ba5b8bfcba7" containerName="pull" Dec 02 22:57:47 crc kubenswrapper[4696]: I1202 22:57:47.367358 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0a517f4-97e3-42f4-a78c-4ba5b8bfcba7" containerName="pull" Dec 02 22:57:47 crc kubenswrapper[4696]: E1202 22:57:47.367367 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0a517f4-97e3-42f4-a78c-4ba5b8bfcba7" containerName="extract" Dec 02 22:57:47 crc kubenswrapper[4696]: I1202 22:57:47.367376 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0a517f4-97e3-42f4-a78c-4ba5b8bfcba7" containerName="extract" Dec 02 22:57:47 crc kubenswrapper[4696]: I1202 22:57:47.367538 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0a517f4-97e3-42f4-a78c-4ba5b8bfcba7" containerName="extract" Dec 02 22:57:47 crc kubenswrapper[4696]: I1202 22:57:47.368266 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-dfb58c988-g96v2" Dec 02 22:57:47 crc kubenswrapper[4696]: I1202 22:57:47.376452 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-27f5z" Dec 02 22:57:47 crc kubenswrapper[4696]: I1202 22:57:47.386127 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-dfb58c988-g96v2"] Dec 02 22:57:47 crc kubenswrapper[4696]: I1202 22:57:47.416275 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6vnk\" (UniqueName: \"kubernetes.io/projected/f7e88453-0fd4-401a-92cd-f75809f14f21-kube-api-access-x6vnk\") pod \"openstack-operator-controller-operator-dfb58c988-g96v2\" (UID: \"f7e88453-0fd4-401a-92cd-f75809f14f21\") " pod="openstack-operators/openstack-operator-controller-operator-dfb58c988-g96v2" Dec 02 22:57:47 crc kubenswrapper[4696]: I1202 22:57:47.517597 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6vnk\" (UniqueName: \"kubernetes.io/projected/f7e88453-0fd4-401a-92cd-f75809f14f21-kube-api-access-x6vnk\") pod \"openstack-operator-controller-operator-dfb58c988-g96v2\" (UID: \"f7e88453-0fd4-401a-92cd-f75809f14f21\") " pod="openstack-operators/openstack-operator-controller-operator-dfb58c988-g96v2" Dec 02 22:57:47 crc kubenswrapper[4696]: I1202 22:57:47.536534 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6vnk\" (UniqueName: \"kubernetes.io/projected/f7e88453-0fd4-401a-92cd-f75809f14f21-kube-api-access-x6vnk\") pod \"openstack-operator-controller-operator-dfb58c988-g96v2\" (UID: \"f7e88453-0fd4-401a-92cd-f75809f14f21\") " pod="openstack-operators/openstack-operator-controller-operator-dfb58c988-g96v2" Dec 02 22:57:47 crc kubenswrapper[4696]: I1202 22:57:47.689622 4696 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-dfb58c988-g96v2" Dec 02 22:57:47 crc kubenswrapper[4696]: I1202 22:57:47.959172 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-dfb58c988-g96v2"] Dec 02 22:57:48 crc kubenswrapper[4696]: I1202 22:57:48.497799 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-dfb58c988-g96v2" event={"ID":"f7e88453-0fd4-401a-92cd-f75809f14f21","Type":"ContainerStarted","Data":"5745fc11fe24d7a41f36c799a5c8cc0bf0455fab0c0315d9e6d56d672a05daa4"} Dec 02 22:57:53 crc kubenswrapper[4696]: I1202 22:57:53.559802 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-dfb58c988-g96v2" event={"ID":"f7e88453-0fd4-401a-92cd-f75809f14f21","Type":"ContainerStarted","Data":"4a63d069194d0e4920ea928ec30b83f4f8e9e552a08da04b2cdb891e97d9a5a9"} Dec 02 22:57:53 crc kubenswrapper[4696]: I1202 22:57:53.560626 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-dfb58c988-g96v2" Dec 02 22:57:53 crc kubenswrapper[4696]: I1202 22:57:53.601133 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-dfb58c988-g96v2" podStartSLOduration=1.791067456 podStartE2EDuration="6.601104077s" podCreationTimestamp="2025-12-02 22:57:47 +0000 UTC" firstStartedPulling="2025-12-02 22:57:47.972519725 +0000 UTC m=+930.853199726" lastFinishedPulling="2025-12-02 22:57:52.782556346 +0000 UTC m=+935.663236347" observedRunningTime="2025-12-02 22:57:53.594672975 +0000 UTC m=+936.475353026" watchObservedRunningTime="2025-12-02 22:57:53.601104077 +0000 UTC m=+936.481784098" Dec 02 22:57:57 crc kubenswrapper[4696]: I1202 22:57:57.693913 4696 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-dfb58c988-g96v2" Dec 02 22:58:01 crc kubenswrapper[4696]: I1202 22:58:01.494715 4696 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","burstable","pod81df406b-b45a-42a5-84c1-9e96c37e2ef4"] err="unable to destroy cgroup paths for cgroup [kubepods burstable pod81df406b-b45a-42a5-84c1-9e96c37e2ef4] : Timed out while waiting for systemd to remove kubepods-burstable-pod81df406b_b45a_42a5_84c1_9e96c37e2ef4.slice" Dec 02 22:58:01 crc kubenswrapper[4696]: E1202 22:58:01.495374 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods burstable pod81df406b-b45a-42a5-84c1-9e96c37e2ef4] : unable to destroy cgroup paths for cgroup [kubepods burstable pod81df406b-b45a-42a5-84c1-9e96c37e2ef4] : Timed out while waiting for systemd to remove kubepods-burstable-pod81df406b_b45a_42a5_84c1_9e96c37e2ef4.slice" pod="openstack-operators/openstack-operator-index-xk9p8" podUID="81df406b-b45a-42a5-84c1-9e96c37e2ef4" Dec 02 22:58:01 crc kubenswrapper[4696]: I1202 22:58:01.623975 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-xk9p8" Dec 02 22:58:01 crc kubenswrapper[4696]: I1202 22:58:01.652193 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-xk9p8"] Dec 02 22:58:01 crc kubenswrapper[4696]: I1202 22:58:01.662606 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-xk9p8"] Dec 02 22:58:03 crc kubenswrapper[4696]: I1202 22:58:03.439662 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81df406b-b45a-42a5-84c1-9e96c37e2ef4" path="/var/lib/kubelet/pods/81df406b-b45a-42a5-84c1-9e96c37e2ef4/volumes" Dec 02 22:58:06 crc kubenswrapper[4696]: I1202 22:58:06.552310 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-b92fc"] Dec 02 22:58:06 crc kubenswrapper[4696]: I1202 22:58:06.554663 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-b92fc" Dec 02 22:58:06 crc kubenswrapper[4696]: I1202 22:58:06.570385 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-b92fc"] Dec 02 22:58:06 crc kubenswrapper[4696]: I1202 22:58:06.712445 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jw9p7\" (UniqueName: \"kubernetes.io/projected/526e39ba-a8d3-4ac4-a90e-9867bc481a59-kube-api-access-jw9p7\") pod \"certified-operators-b92fc\" (UID: \"526e39ba-a8d3-4ac4-a90e-9867bc481a59\") " pod="openshift-marketplace/certified-operators-b92fc" Dec 02 22:58:06 crc kubenswrapper[4696]: I1202 22:58:06.712513 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/526e39ba-a8d3-4ac4-a90e-9867bc481a59-utilities\") pod \"certified-operators-b92fc\" (UID: \"526e39ba-a8d3-4ac4-a90e-9867bc481a59\") " 
pod="openshift-marketplace/certified-operators-b92fc" Dec 02 22:58:06 crc kubenswrapper[4696]: I1202 22:58:06.712553 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/526e39ba-a8d3-4ac4-a90e-9867bc481a59-catalog-content\") pod \"certified-operators-b92fc\" (UID: \"526e39ba-a8d3-4ac4-a90e-9867bc481a59\") " pod="openshift-marketplace/certified-operators-b92fc" Dec 02 22:58:06 crc kubenswrapper[4696]: I1202 22:58:06.814528 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jw9p7\" (UniqueName: \"kubernetes.io/projected/526e39ba-a8d3-4ac4-a90e-9867bc481a59-kube-api-access-jw9p7\") pod \"certified-operators-b92fc\" (UID: \"526e39ba-a8d3-4ac4-a90e-9867bc481a59\") " pod="openshift-marketplace/certified-operators-b92fc" Dec 02 22:58:06 crc kubenswrapper[4696]: I1202 22:58:06.814598 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/526e39ba-a8d3-4ac4-a90e-9867bc481a59-utilities\") pod \"certified-operators-b92fc\" (UID: \"526e39ba-a8d3-4ac4-a90e-9867bc481a59\") " pod="openshift-marketplace/certified-operators-b92fc" Dec 02 22:58:06 crc kubenswrapper[4696]: I1202 22:58:06.814643 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/526e39ba-a8d3-4ac4-a90e-9867bc481a59-catalog-content\") pod \"certified-operators-b92fc\" (UID: \"526e39ba-a8d3-4ac4-a90e-9867bc481a59\") " pod="openshift-marketplace/certified-operators-b92fc" Dec 02 22:58:06 crc kubenswrapper[4696]: I1202 22:58:06.815323 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/526e39ba-a8d3-4ac4-a90e-9867bc481a59-catalog-content\") pod \"certified-operators-b92fc\" (UID: \"526e39ba-a8d3-4ac4-a90e-9867bc481a59\") " 
pod="openshift-marketplace/certified-operators-b92fc" Dec 02 22:58:06 crc kubenswrapper[4696]: I1202 22:58:06.815407 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/526e39ba-a8d3-4ac4-a90e-9867bc481a59-utilities\") pod \"certified-operators-b92fc\" (UID: \"526e39ba-a8d3-4ac4-a90e-9867bc481a59\") " pod="openshift-marketplace/certified-operators-b92fc" Dec 02 22:58:06 crc kubenswrapper[4696]: I1202 22:58:06.848580 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jw9p7\" (UniqueName: \"kubernetes.io/projected/526e39ba-a8d3-4ac4-a90e-9867bc481a59-kube-api-access-jw9p7\") pod \"certified-operators-b92fc\" (UID: \"526e39ba-a8d3-4ac4-a90e-9867bc481a59\") " pod="openshift-marketplace/certified-operators-b92fc" Dec 02 22:58:06 crc kubenswrapper[4696]: I1202 22:58:06.880245 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-b92fc" Dec 02 22:58:07 crc kubenswrapper[4696]: I1202 22:58:07.299716 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-b92fc"] Dec 02 22:58:07 crc kubenswrapper[4696]: W1202 22:58:07.307324 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod526e39ba_a8d3_4ac4_a90e_9867bc481a59.slice/crio-a5304c3c842d96c881ab66b4ca48a79b7c46268cf18301ca2e5ad688c45611d5 WatchSource:0}: Error finding container a5304c3c842d96c881ab66b4ca48a79b7c46268cf18301ca2e5ad688c45611d5: Status 404 returned error can't find the container with id a5304c3c842d96c881ab66b4ca48a79b7c46268cf18301ca2e5ad688c45611d5 Dec 02 22:58:07 crc kubenswrapper[4696]: I1202 22:58:07.665554 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b92fc" 
event={"ID":"526e39ba-a8d3-4ac4-a90e-9867bc481a59","Type":"ContainerStarted","Data":"a5304c3c842d96c881ab66b4ca48a79b7c46268cf18301ca2e5ad688c45611d5"} Dec 02 22:58:08 crc kubenswrapper[4696]: I1202 22:58:08.679480 4696 generic.go:334] "Generic (PLEG): container finished" podID="526e39ba-a8d3-4ac4-a90e-9867bc481a59" containerID="d04003f8c88dbd05e8d79d7fb6d4314aa78e476d0c93764c68e7298fd1c56b35" exitCode=0 Dec 02 22:58:08 crc kubenswrapper[4696]: I1202 22:58:08.679555 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b92fc" event={"ID":"526e39ba-a8d3-4ac4-a90e-9867bc481a59","Type":"ContainerDied","Data":"d04003f8c88dbd05e8d79d7fb6d4314aa78e476d0c93764c68e7298fd1c56b35"} Dec 02 22:58:09 crc kubenswrapper[4696]: I1202 22:58:09.689454 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b92fc" event={"ID":"526e39ba-a8d3-4ac4-a90e-9867bc481a59","Type":"ContainerStarted","Data":"94060245f3175ef5b2997ee75ce24bfbcb51c31b2b81435b7b5a0428422771ce"} Dec 02 22:58:10 crc kubenswrapper[4696]: I1202 22:58:10.714243 4696 generic.go:334] "Generic (PLEG): container finished" podID="526e39ba-a8d3-4ac4-a90e-9867bc481a59" containerID="94060245f3175ef5b2997ee75ce24bfbcb51c31b2b81435b7b5a0428422771ce" exitCode=0 Dec 02 22:58:10 crc kubenswrapper[4696]: I1202 22:58:10.714815 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b92fc" event={"ID":"526e39ba-a8d3-4ac4-a90e-9867bc481a59","Type":"ContainerDied","Data":"94060245f3175ef5b2997ee75ce24bfbcb51c31b2b81435b7b5a0428422771ce"} Dec 02 22:58:11 crc kubenswrapper[4696]: I1202 22:58:11.724437 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b92fc" event={"ID":"526e39ba-a8d3-4ac4-a90e-9867bc481a59","Type":"ContainerStarted","Data":"79217a48dfaa51d854e0d1f6286c16bbe09b4f4c9b0af8a8401cc082d3e22d6a"} Dec 02 22:58:11 crc kubenswrapper[4696]: 
I1202 22:58:11.757301 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-b92fc" podStartSLOduration=3.272034612 podStartE2EDuration="5.757279504s" podCreationTimestamp="2025-12-02 22:58:06 +0000 UTC" firstStartedPulling="2025-12-02 22:58:08.682613447 +0000 UTC m=+951.563293488" lastFinishedPulling="2025-12-02 22:58:11.167858369 +0000 UTC m=+954.048538380" observedRunningTime="2025-12-02 22:58:11.75395804 +0000 UTC m=+954.634638051" watchObservedRunningTime="2025-12-02 22:58:11.757279504 +0000 UTC m=+954.637959505" Dec 02 22:58:16 crc kubenswrapper[4696]: I1202 22:58:16.000070 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-sp4gt"] Dec 02 22:58:16 crc kubenswrapper[4696]: I1202 22:58:16.004812 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sp4gt" Dec 02 22:58:16 crc kubenswrapper[4696]: I1202 22:58:16.014091 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sp4gt"] Dec 02 22:58:16 crc kubenswrapper[4696]: I1202 22:58:16.056242 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3dcedd0c-cb54-4bc5-8c69-9506424a0b91-catalog-content\") pod \"community-operators-sp4gt\" (UID: \"3dcedd0c-cb54-4bc5-8c69-9506424a0b91\") " pod="openshift-marketplace/community-operators-sp4gt" Dec 02 22:58:16 crc kubenswrapper[4696]: I1202 22:58:16.056354 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pp994\" (UniqueName: \"kubernetes.io/projected/3dcedd0c-cb54-4bc5-8c69-9506424a0b91-kube-api-access-pp994\") pod \"community-operators-sp4gt\" (UID: \"3dcedd0c-cb54-4bc5-8c69-9506424a0b91\") " pod="openshift-marketplace/community-operators-sp4gt" Dec 02 22:58:16 crc 
kubenswrapper[4696]: I1202 22:58:16.056387 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3dcedd0c-cb54-4bc5-8c69-9506424a0b91-utilities\") pod \"community-operators-sp4gt\" (UID: \"3dcedd0c-cb54-4bc5-8c69-9506424a0b91\") " pod="openshift-marketplace/community-operators-sp4gt" Dec 02 22:58:16 crc kubenswrapper[4696]: I1202 22:58:16.158077 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pp994\" (UniqueName: \"kubernetes.io/projected/3dcedd0c-cb54-4bc5-8c69-9506424a0b91-kube-api-access-pp994\") pod \"community-operators-sp4gt\" (UID: \"3dcedd0c-cb54-4bc5-8c69-9506424a0b91\") " pod="openshift-marketplace/community-operators-sp4gt" Dec 02 22:58:16 crc kubenswrapper[4696]: I1202 22:58:16.158159 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3dcedd0c-cb54-4bc5-8c69-9506424a0b91-utilities\") pod \"community-operators-sp4gt\" (UID: \"3dcedd0c-cb54-4bc5-8c69-9506424a0b91\") " pod="openshift-marketplace/community-operators-sp4gt" Dec 02 22:58:16 crc kubenswrapper[4696]: I1202 22:58:16.158225 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3dcedd0c-cb54-4bc5-8c69-9506424a0b91-catalog-content\") pod \"community-operators-sp4gt\" (UID: \"3dcedd0c-cb54-4bc5-8c69-9506424a0b91\") " pod="openshift-marketplace/community-operators-sp4gt" Dec 02 22:58:16 crc kubenswrapper[4696]: I1202 22:58:16.159104 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3dcedd0c-cb54-4bc5-8c69-9506424a0b91-utilities\") pod \"community-operators-sp4gt\" (UID: \"3dcedd0c-cb54-4bc5-8c69-9506424a0b91\") " pod="openshift-marketplace/community-operators-sp4gt" Dec 02 22:58:16 crc kubenswrapper[4696]: I1202 
22:58:16.159151 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3dcedd0c-cb54-4bc5-8c69-9506424a0b91-catalog-content\") pod \"community-operators-sp4gt\" (UID: \"3dcedd0c-cb54-4bc5-8c69-9506424a0b91\") " pod="openshift-marketplace/community-operators-sp4gt" Dec 02 22:58:16 crc kubenswrapper[4696]: I1202 22:58:16.180128 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pp994\" (UniqueName: \"kubernetes.io/projected/3dcedd0c-cb54-4bc5-8c69-9506424a0b91-kube-api-access-pp994\") pod \"community-operators-sp4gt\" (UID: \"3dcedd0c-cb54-4bc5-8c69-9506424a0b91\") " pod="openshift-marketplace/community-operators-sp4gt" Dec 02 22:58:16 crc kubenswrapper[4696]: I1202 22:58:16.338821 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sp4gt" Dec 02 22:58:16 crc kubenswrapper[4696]: I1202 22:58:16.881325 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-b92fc" Dec 02 22:58:16 crc kubenswrapper[4696]: I1202 22:58:16.881877 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-b92fc" Dec 02 22:58:16 crc kubenswrapper[4696]: I1202 22:58:16.943019 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-b92fc" Dec 02 22:58:16 crc kubenswrapper[4696]: I1202 22:58:16.971562 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sp4gt"] Dec 02 22:58:17 crc kubenswrapper[4696]: I1202 22:58:17.611722 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-xxm5w"] Dec 02 22:58:17 crc kubenswrapper[4696]: I1202 22:58:17.613347 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-xxm5w" Dec 02 22:58:17 crc kubenswrapper[4696]: I1202 22:58:17.616029 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-fgs6m" Dec 02 22:58:17 crc kubenswrapper[4696]: I1202 22:58:17.632809 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-rjlxh"] Dec 02 22:58:17 crc kubenswrapper[4696]: I1202 22:58:17.634003 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-rjlxh" Dec 02 22:58:17 crc kubenswrapper[4696]: I1202 22:58:17.639917 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-hxr78" Dec 02 22:58:17 crc kubenswrapper[4696]: I1202 22:58:17.648125 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-xxm5w"] Dec 02 22:58:17 crc kubenswrapper[4696]: I1202 22:58:17.655102 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-rjlxh"] Dec 02 22:58:17 crc kubenswrapper[4696]: I1202 22:58:17.672429 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-tzgnf"] Dec 02 22:58:17 crc kubenswrapper[4696]: I1202 22:58:17.673531 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-tzgnf" Dec 02 22:58:17 crc kubenswrapper[4696]: I1202 22:58:17.677445 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-n5m79" Dec 02 22:58:17 crc kubenswrapper[4696]: I1202 22:58:17.685943 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llbzb\" (UniqueName: \"kubernetes.io/projected/351a13fb-8e8e-4393-adef-28523ab05ccb-kube-api-access-llbzb\") pod \"barbican-operator-controller-manager-7d9dfd778-xxm5w\" (UID: \"351a13fb-8e8e-4393-adef-28523ab05ccb\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-xxm5w" Dec 02 22:58:17 crc kubenswrapper[4696]: I1202 22:58:17.686000 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5j2b\" (UniqueName: \"kubernetes.io/projected/755f9574-a31b-430c-a2a2-92554020d96b-kube-api-access-z5j2b\") pod \"cinder-operator-controller-manager-859b6ccc6-rjlxh\" (UID: \"755f9574-a31b-430c-a2a2-92554020d96b\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-rjlxh" Dec 02 22:58:17 crc kubenswrapper[4696]: I1202 22:58:17.686035 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52ktm\" (UniqueName: \"kubernetes.io/projected/5706d5c2-8bbe-40b3-8820-0d547363fa96-kube-api-access-52ktm\") pod \"designate-operator-controller-manager-78b4bc895b-tzgnf\" (UID: \"5706d5c2-8bbe-40b3-8820-0d547363fa96\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-tzgnf" Dec 02 22:58:17 crc kubenswrapper[4696]: I1202 22:58:17.725696 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987cd8cd-nvqf8"] Dec 02 22:58:17 crc kubenswrapper[4696]: I1202 
22:58:17.727176 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-nvqf8" Dec 02 22:58:17 crc kubenswrapper[4696]: I1202 22:58:17.730105 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-9hfw8" Dec 02 22:58:17 crc kubenswrapper[4696]: I1202 22:58:17.749995 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-s9pk4"] Dec 02 22:58:17 crc kubenswrapper[4696]: I1202 22:58:17.751214 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-s9pk4" Dec 02 22:58:17 crc kubenswrapper[4696]: I1202 22:58:17.755929 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-mpvhw" Dec 02 22:58:17 crc kubenswrapper[4696]: I1202 22:58:17.771867 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-tzgnf"] Dec 02 22:58:17 crc kubenswrapper[4696]: I1202 22:58:17.790530 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llbzb\" (UniqueName: \"kubernetes.io/projected/351a13fb-8e8e-4393-adef-28523ab05ccb-kube-api-access-llbzb\") pod \"barbican-operator-controller-manager-7d9dfd778-xxm5w\" (UID: \"351a13fb-8e8e-4393-adef-28523ab05ccb\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-xxm5w" Dec 02 22:58:17 crc kubenswrapper[4696]: I1202 22:58:17.790889 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5j2b\" (UniqueName: \"kubernetes.io/projected/755f9574-a31b-430c-a2a2-92554020d96b-kube-api-access-z5j2b\") pod \"cinder-operator-controller-manager-859b6ccc6-rjlxh\" (UID: 
\"755f9574-a31b-430c-a2a2-92554020d96b\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-rjlxh" Dec 02 22:58:17 crc kubenswrapper[4696]: I1202 22:58:17.790962 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52ktm\" (UniqueName: \"kubernetes.io/projected/5706d5c2-8bbe-40b3-8820-0d547363fa96-kube-api-access-52ktm\") pod \"designate-operator-controller-manager-78b4bc895b-tzgnf\" (UID: \"5706d5c2-8bbe-40b3-8820-0d547363fa96\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-tzgnf" Dec 02 22:58:17 crc kubenswrapper[4696]: I1202 22:58:17.791583 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987cd8cd-nvqf8"] Dec 02 22:58:17 crc kubenswrapper[4696]: I1202 22:58:17.800296 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-tng2n"] Dec 02 22:58:17 crc kubenswrapper[4696]: I1202 22:58:17.801594 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sp4gt" event={"ID":"3dcedd0c-cb54-4bc5-8c69-9506424a0b91","Type":"ContainerStarted","Data":"01cf7f435d069c3f2b220a7879e313771a3c425c388bc13671f98b080b295bd8"} Dec 02 22:58:17 crc kubenswrapper[4696]: I1202 22:58:17.808993 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sp4gt" event={"ID":"3dcedd0c-cb54-4bc5-8c69-9506424a0b91","Type":"ContainerStarted","Data":"b862c0816a6b1fb505663b78545dfe42015a3ecb93d9a201178083438a623556"} Dec 02 22:58:17 crc kubenswrapper[4696]: I1202 22:58:17.815203 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-tng2n" Dec 02 22:58:17 crc kubenswrapper[4696]: I1202 22:58:17.822077 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-4kcd7" Dec 02 22:58:17 crc kubenswrapper[4696]: I1202 22:58:17.841001 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52ktm\" (UniqueName: \"kubernetes.io/projected/5706d5c2-8bbe-40b3-8820-0d547363fa96-kube-api-access-52ktm\") pod \"designate-operator-controller-manager-78b4bc895b-tzgnf\" (UID: \"5706d5c2-8bbe-40b3-8820-0d547363fa96\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-tzgnf" Dec 02 22:58:17 crc kubenswrapper[4696]: I1202 22:58:17.856404 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-tng2n"] Dec 02 22:58:17 crc kubenswrapper[4696]: I1202 22:58:17.858161 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5j2b\" (UniqueName: \"kubernetes.io/projected/755f9574-a31b-430c-a2a2-92554020d96b-kube-api-access-z5j2b\") pod \"cinder-operator-controller-manager-859b6ccc6-rjlxh\" (UID: \"755f9574-a31b-430c-a2a2-92554020d96b\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-rjlxh" Dec 02 22:58:17 crc kubenswrapper[4696]: I1202 22:58:17.877983 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llbzb\" (UniqueName: \"kubernetes.io/projected/351a13fb-8e8e-4393-adef-28523ab05ccb-kube-api-access-llbzb\") pod \"barbican-operator-controller-manager-7d9dfd778-xxm5w\" (UID: \"351a13fb-8e8e-4393-adef-28523ab05ccb\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-xxm5w" Dec 02 22:58:17 crc kubenswrapper[4696]: I1202 22:58:17.895619 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-s9pk4"] Dec 02 22:58:17 crc kubenswrapper[4696]: I1202 22:58:17.898266 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dglp\" (UniqueName: \"kubernetes.io/projected/5315d589-3bb7-4776-b842-ffc18e1a89e1-kube-api-access-2dglp\") pod \"heat-operator-controller-manager-5f64f6f8bb-s9pk4\" (UID: \"5315d589-3bb7-4776-b842-ffc18e1a89e1\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-s9pk4" Dec 02 22:58:17 crc kubenswrapper[4696]: I1202 22:58:17.902097 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkhg4\" (UniqueName: \"kubernetes.io/projected/e693a226-52c3-413c-b607-c0050ab5e553-kube-api-access-rkhg4\") pod \"glance-operator-controller-manager-77987cd8cd-nvqf8\" (UID: \"e693a226-52c3-413c-b607-c0050ab5e553\") " pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-nvqf8" Dec 02 22:58:17 crc kubenswrapper[4696]: I1202 22:58:17.917986 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-b92fc" Dec 02 22:58:17 crc kubenswrapper[4696]: I1202 22:58:17.951711 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-xxm5w" Dec 02 22:58:17 crc kubenswrapper[4696]: I1202 22:58:17.962326 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-rjlxh" Dec 02 22:58:17 crc kubenswrapper[4696]: I1202 22:58:17.973547 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-4crn9"] Dec 02 22:58:17 crc kubenswrapper[4696]: I1202 22:58:17.975097 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-4crn9" Dec 02 22:58:17 crc kubenswrapper[4696]: I1202 22:58:17.979056 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-ngvwh" Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.004972 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnrd6\" (UniqueName: \"kubernetes.io/projected/7d7b7caa-1ec3-4e66-9273-36cae02cbe8e-kube-api-access-cnrd6\") pod \"horizon-operator-controller-manager-68c6d99b8f-tng2n\" (UID: \"7d7b7caa-1ec3-4e66-9273-36cae02cbe8e\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-tng2n" Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.005018 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rkhg4\" (UniqueName: \"kubernetes.io/projected/e693a226-52c3-413c-b607-c0050ab5e553-kube-api-access-rkhg4\") pod \"glance-operator-controller-manager-77987cd8cd-nvqf8\" (UID: \"e693a226-52c3-413c-b607-c0050ab5e553\") " pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-nvqf8" Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.005087 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dglp\" (UniqueName: \"kubernetes.io/projected/5315d589-3bb7-4776-b842-ffc18e1a89e1-kube-api-access-2dglp\") pod \"heat-operator-controller-manager-5f64f6f8bb-s9pk4\" (UID: \"5315d589-3bb7-4776-b842-ffc18e1a89e1\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-s9pk4" Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.005137 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfh8h\" (UniqueName: \"kubernetes.io/projected/bbbbf87d-fa5c-4e9b-9bbc-b19ed043e094-kube-api-access-mfh8h\") pod 
\"ironic-operator-controller-manager-6c548fd776-4crn9\" (UID: \"bbbbf87d-fa5c-4e9b-9bbc-b19ed043e094\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-4crn9" Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.031507 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-tzgnf" Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.045277 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dglp\" (UniqueName: \"kubernetes.io/projected/5315d589-3bb7-4776-b842-ffc18e1a89e1-kube-api-access-2dglp\") pod \"heat-operator-controller-manager-5f64f6f8bb-s9pk4\" (UID: \"5315d589-3bb7-4776-b842-ffc18e1a89e1\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-s9pk4" Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.045375 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-kz6bs"] Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.046630 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-57548d458d-kz6bs" Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.049171 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.049471 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-98cdv" Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.049823 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkhg4\" (UniqueName: \"kubernetes.io/projected/e693a226-52c3-413c-b607-c0050ab5e553-kube-api-access-rkhg4\") pod \"glance-operator-controller-manager-77987cd8cd-nvqf8\" (UID: \"e693a226-52c3-413c-b607-c0050ab5e553\") " pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-nvqf8" Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.078195 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-kz6bs"] Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.081364 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-4crn9"] Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.085241 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-s9pk4" Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.088316 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-vg9kf"] Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.089805 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-vg9kf" Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.091984 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-6dv8b" Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.096799 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-7c79b5df47-k2p9j"] Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.098341 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-k2p9j" Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.103234 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-vg9kf"] Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.104083 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-grfd9" Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.105951 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2t5dt\" (UniqueName: \"kubernetes.io/projected/5436ce3c-34d6-47eb-81b1-3b4dc1c2d794-kube-api-access-2t5dt\") pod \"manila-operator-controller-manager-7c79b5df47-k2p9j\" (UID: \"5436ce3c-34d6-47eb-81b1-3b4dc1c2d794\") " pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-k2p9j" Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.105992 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f47t2\" (UniqueName: \"kubernetes.io/projected/66d51ef3-89ba-4653-ae46-5469bfc5232e-kube-api-access-f47t2\") pod \"infra-operator-controller-manager-57548d458d-kz6bs\" (UID: 
\"66d51ef3-89ba-4653-ae46-5469bfc5232e\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-kz6bs" Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.106018 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mfh8h\" (UniqueName: \"kubernetes.io/projected/bbbbf87d-fa5c-4e9b-9bbc-b19ed043e094-kube-api-access-mfh8h\") pod \"ironic-operator-controller-manager-6c548fd776-4crn9\" (UID: \"bbbbf87d-fa5c-4e9b-9bbc-b19ed043e094\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-4crn9" Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.106036 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cnrd6\" (UniqueName: \"kubernetes.io/projected/7d7b7caa-1ec3-4e66-9273-36cae02cbe8e-kube-api-access-cnrd6\") pod \"horizon-operator-controller-manager-68c6d99b8f-tng2n\" (UID: \"7d7b7caa-1ec3-4e66-9273-36cae02cbe8e\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-tng2n" Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.106060 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zx6xr\" (UniqueName: \"kubernetes.io/projected/9207b2f0-999a-45e4-8234-982f796f7801-kube-api-access-zx6xr\") pod \"keystone-operator-controller-manager-7765d96ddf-vg9kf\" (UID: \"9207b2f0-999a-45e4-8234-982f796f7801\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-vg9kf" Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.106081 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/66d51ef3-89ba-4653-ae46-5469bfc5232e-cert\") pod \"infra-operator-controller-manager-57548d458d-kz6bs\" (UID: \"66d51ef3-89ba-4653-ae46-5469bfc5232e\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-kz6bs" Dec 02 22:58:18 crc 
kubenswrapper[4696]: I1202 22:58:18.113734 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-tp7td"] Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.116476 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-tp7td" Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.123510 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-qptfs" Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.127005 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnrd6\" (UniqueName: \"kubernetes.io/projected/7d7b7caa-1ec3-4e66-9273-36cae02cbe8e-kube-api-access-cnrd6\") pod \"horizon-operator-controller-manager-68c6d99b8f-tng2n\" (UID: \"7d7b7caa-1ec3-4e66-9273-36cae02cbe8e\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-tng2n" Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.127208 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-nvqf8" Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.127516 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfh8h\" (UniqueName: \"kubernetes.io/projected/bbbbf87d-fa5c-4e9b-9bbc-b19ed043e094-kube-api-access-mfh8h\") pod \"ironic-operator-controller-manager-6c548fd776-4crn9\" (UID: \"bbbbf87d-fa5c-4e9b-9bbc-b19ed043e094\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-4crn9" Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.139932 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7c79b5df47-k2p9j"] Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.147969 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-tp7td"] Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.153494 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-d6qww"] Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.154859 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-d6qww" Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.158528 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-ndrjz" Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.159053 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-hfcc6"] Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.160997 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-hfcc6" Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.163553 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-w6jmh" Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.167652 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-d6qww"] Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.185419 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-hfcc6"] Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.211614 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2t5dt\" (UniqueName: \"kubernetes.io/projected/5436ce3c-34d6-47eb-81b1-3b4dc1c2d794-kube-api-access-2t5dt\") pod \"manila-operator-controller-manager-7c79b5df47-k2p9j\" (UID: \"5436ce3c-34d6-47eb-81b1-3b4dc1c2d794\") " pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-k2p9j" Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.211668 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f47t2\" (UniqueName: \"kubernetes.io/projected/66d51ef3-89ba-4653-ae46-5469bfc5232e-kube-api-access-f47t2\") pod \"infra-operator-controller-manager-57548d458d-kz6bs\" (UID: \"66d51ef3-89ba-4653-ae46-5469bfc5232e\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-kz6bs" Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.211704 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zx6xr\" (UniqueName: \"kubernetes.io/projected/9207b2f0-999a-45e4-8234-982f796f7801-kube-api-access-zx6xr\") pod \"keystone-operator-controller-manager-7765d96ddf-vg9kf\" (UID: \"9207b2f0-999a-45e4-8234-982f796f7801\") " 
pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-vg9kf" Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.211757 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/66d51ef3-89ba-4653-ae46-5469bfc5232e-cert\") pod \"infra-operator-controller-manager-57548d458d-kz6bs\" (UID: \"66d51ef3-89ba-4653-ae46-5469bfc5232e\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-kz6bs" Dec 02 22:58:18 crc kubenswrapper[4696]: E1202 22:58:18.211913 4696 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 02 22:58:18 crc kubenswrapper[4696]: E1202 22:58:18.211972 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/66d51ef3-89ba-4653-ae46-5469bfc5232e-cert podName:66d51ef3-89ba-4653-ae46-5469bfc5232e nodeName:}" failed. No retries permitted until 2025-12-02 22:58:18.71195363 +0000 UTC m=+961.592633631 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/66d51ef3-89ba-4653-ae46-5469bfc5232e-cert") pod "infra-operator-controller-manager-57548d458d-kz6bs" (UID: "66d51ef3-89ba-4653-ae46-5469bfc5232e") : secret "infra-operator-webhook-server-cert" not found Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.226357 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-bpk25"] Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.228007 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-bpk25" Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.235082 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zx6xr\" (UniqueName: \"kubernetes.io/projected/9207b2f0-999a-45e4-8234-982f796f7801-kube-api-access-zx6xr\") pod \"keystone-operator-controller-manager-7765d96ddf-vg9kf\" (UID: \"9207b2f0-999a-45e4-8234-982f796f7801\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-vg9kf" Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.236196 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-5tpwk" Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.238461 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f47t2\" (UniqueName: \"kubernetes.io/projected/66d51ef3-89ba-4653-ae46-5469bfc5232e-kube-api-access-f47t2\") pod \"infra-operator-controller-manager-57548d458d-kz6bs\" (UID: \"66d51ef3-89ba-4653-ae46-5469bfc5232e\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-kz6bs" Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.240997 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-bpk25"] Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.258439 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2t5dt\" (UniqueName: \"kubernetes.io/projected/5436ce3c-34d6-47eb-81b1-3b4dc1c2d794-kube-api-access-2t5dt\") pod \"manila-operator-controller-manager-7c79b5df47-k2p9j\" (UID: \"5436ce3c-34d6-47eb-81b1-3b4dc1c2d794\") " pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-k2p9j" Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.285233 4696 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4jnwwg"] Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.288217 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4jnwwg" Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.297553 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.297810 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-nts8s" Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.299302 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-pwblb"] Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.300871 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-pwblb" Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.306442 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-mr96r" Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.313201 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2cqd\" (UniqueName: \"kubernetes.io/projected/baad852a-374a-460e-9d5c-cb5418291849-kube-api-access-q2cqd\") pod \"mariadb-operator-controller-manager-56bbcc9d85-tp7td\" (UID: \"baad852a-374a-460e-9d5c-cb5418291849\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-tp7td" Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.313259 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4sc4\" (UniqueName: \"kubernetes.io/projected/1beb3e53-4faf-475f-b5b0-57b8cd32c529-kube-api-access-j4sc4\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-d6qww\" (UID: \"1beb3e53-4faf-475f-b5b0-57b8cd32c529\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-d6qww" Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.314915 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbvl2\" (UniqueName: \"kubernetes.io/projected/9c1744f3-fc58-4653-a7e0-4fcdfdfca485-kube-api-access-lbvl2\") pod \"nova-operator-controller-manager-697bc559fc-hfcc6\" (UID: \"9c1744f3-fc58-4653-a7e0-4fcdfdfca485\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-hfcc6" Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.371280 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-tng2n" Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.405264 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-g5pxw"] Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.406664 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-g5pxw" Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.409272 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-pwblb"] Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.413296 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-qd6f9" Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.420437 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbvl2\" (UniqueName: \"kubernetes.io/projected/9c1744f3-fc58-4653-a7e0-4fcdfdfca485-kube-api-access-lbvl2\") pod \"nova-operator-controller-manager-697bc559fc-hfcc6\" (UID: \"9c1744f3-fc58-4653-a7e0-4fcdfdfca485\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-hfcc6" Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.420515 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7sb2b\" (UniqueName: \"kubernetes.io/projected/77131fa7-a611-46bf-b0fe-d05d909dfd4c-kube-api-access-7sb2b\") pod \"ovn-operator-controller-manager-b6456fdb6-pwblb\" (UID: \"77131fa7-a611-46bf-b0fe-d05d909dfd4c\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-pwblb" Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.420547 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-tkq8p\" (UniqueName: \"kubernetes.io/projected/060c8046-7775-413d-9797-ef0edcee01dd-kube-api-access-tkq8p\") pod \"octavia-operator-controller-manager-998648c74-bpk25\" (UID: \"060c8046-7775-413d-9797-ef0edcee01dd\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-bpk25" Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.420568 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2cqd\" (UniqueName: \"kubernetes.io/projected/baad852a-374a-460e-9d5c-cb5418291849-kube-api-access-q2cqd\") pod \"mariadb-operator-controller-manager-56bbcc9d85-tp7td\" (UID: \"baad852a-374a-460e-9d5c-cb5418291849\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-tp7td" Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.420608 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4sc4\" (UniqueName: \"kubernetes.io/projected/1beb3e53-4faf-475f-b5b0-57b8cd32c529-kube-api-access-j4sc4\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-d6qww\" (UID: \"1beb3e53-4faf-475f-b5b0-57b8cd32c529\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-d6qww" Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.420634 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkkl8\" (UniqueName: \"kubernetes.io/projected/3d6a00c3-b537-414a-8ba4-2797d7bc88f8-kube-api-access-hkkl8\") pod \"placement-operator-controller-manager-78f8948974-g5pxw\" (UID: \"3d6a00c3-b537-414a-8ba4-2797d7bc88f8\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-g5pxw" Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.420666 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6e335d65-9d0f-4ace-97cc-70a4a2bb2291-cert\") pod 
\"openstack-baremetal-operator-controller-manager-64bc77cfd4jnwwg\" (UID: \"6e335d65-9d0f-4ace-97cc-70a4a2bb2291\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4jnwwg" Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.420701 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgttj\" (UniqueName: \"kubernetes.io/projected/6e335d65-9d0f-4ace-97cc-70a4a2bb2291-kube-api-access-pgttj\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4jnwwg\" (UID: \"6e335d65-9d0f-4ace-97cc-70a4a2bb2291\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4jnwwg" Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.441778 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4jnwwg"] Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.441956 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-4crn9" Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.471678 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbvl2\" (UniqueName: \"kubernetes.io/projected/9c1744f3-fc58-4653-a7e0-4fcdfdfca485-kube-api-access-lbvl2\") pod \"nova-operator-controller-manager-697bc559fc-hfcc6\" (UID: \"9c1744f3-fc58-4653-a7e0-4fcdfdfca485\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-hfcc6" Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.475345 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-k2p9j" Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.475475 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2cqd\" (UniqueName: \"kubernetes.io/projected/baad852a-374a-460e-9d5c-cb5418291849-kube-api-access-q2cqd\") pod \"mariadb-operator-controller-manager-56bbcc9d85-tp7td\" (UID: \"baad852a-374a-460e-9d5c-cb5418291849\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-tp7td" Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.477503 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-m472w"] Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.479409 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4sc4\" (UniqueName: \"kubernetes.io/projected/1beb3e53-4faf-475f-b5b0-57b8cd32c529-kube-api-access-j4sc4\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-d6qww\" (UID: \"1beb3e53-4faf-475f-b5b0-57b8cd32c529\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-d6qww" Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.481602 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-m472w" Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.487223 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-zfhrx" Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.490935 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-58fzc"] Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.491542 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-vg9kf" Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.492510 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-58fzc" Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.493955 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-djqnt" Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.497988 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-tp7td" Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.498586 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-g5pxw"] Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.523806 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-m472w"] Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.529291 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgttj\" (UniqueName: \"kubernetes.io/projected/6e335d65-9d0f-4ace-97cc-70a4a2bb2291-kube-api-access-pgttj\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4jnwwg\" (UID: \"6e335d65-9d0f-4ace-97cc-70a4a2bb2291\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4jnwwg" Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.529384 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvq4s\" (UniqueName: \"kubernetes.io/projected/45f0f590-24f6-4f01-98a0-a41508a59f5a-kube-api-access-gvq4s\") pod \"telemetry-operator-controller-manager-76cc84c6bb-58fzc\" (UID: 
\"45f0f590-24f6-4f01-98a0-a41508a59f5a\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-58fzc" Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.529413 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7sb2b\" (UniqueName: \"kubernetes.io/projected/77131fa7-a611-46bf-b0fe-d05d909dfd4c-kube-api-access-7sb2b\") pod \"ovn-operator-controller-manager-b6456fdb6-pwblb\" (UID: \"77131fa7-a611-46bf-b0fe-d05d909dfd4c\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-pwblb" Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.529450 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tkq8p\" (UniqueName: \"kubernetes.io/projected/060c8046-7775-413d-9797-ef0edcee01dd-kube-api-access-tkq8p\") pod \"octavia-operator-controller-manager-998648c74-bpk25\" (UID: \"060c8046-7775-413d-9797-ef0edcee01dd\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-bpk25" Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.529501 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkkl8\" (UniqueName: \"kubernetes.io/projected/3d6a00c3-b537-414a-8ba4-2797d7bc88f8-kube-api-access-hkkl8\") pod \"placement-operator-controller-manager-78f8948974-g5pxw\" (UID: \"3d6a00c3-b537-414a-8ba4-2797d7bc88f8\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-g5pxw" Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.529522 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrv8n\" (UniqueName: \"kubernetes.io/projected/9ab3fbd9-1610-4d0c-aa5e-2c298e5dcc3e-kube-api-access-nrv8n\") pod \"swift-operator-controller-manager-5f8c65bbfc-m472w\" (UID: \"9ab3fbd9-1610-4d0c-aa5e-2c298e5dcc3e\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-m472w" Dec 02 22:58:18 crc 
kubenswrapper[4696]: I1202 22:58:18.529550 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6e335d65-9d0f-4ace-97cc-70a4a2bb2291-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4jnwwg\" (UID: \"6e335d65-9d0f-4ace-97cc-70a4a2bb2291\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4jnwwg" Dec 02 22:58:18 crc kubenswrapper[4696]: E1202 22:58:18.529700 4696 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 02 22:58:18 crc kubenswrapper[4696]: E1202 22:58:18.529764 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6e335d65-9d0f-4ace-97cc-70a4a2bb2291-cert podName:6e335d65-9d0f-4ace-97cc-70a4a2bb2291 nodeName:}" failed. No retries permitted until 2025-12-02 22:58:19.029750046 +0000 UTC m=+961.910430047 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6e335d65-9d0f-4ace-97cc-70a4a2bb2291-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4jnwwg" (UID: "6e335d65-9d0f-4ace-97cc-70a4a2bb2291") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.538060 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-d6qww" Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.549518 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-58fzc"] Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.569184 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-hfcc6" Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.578384 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-jfcsj"] Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.580348 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-jfcsj" Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.590482 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkq8p\" (UniqueName: \"kubernetes.io/projected/060c8046-7775-413d-9797-ef0edcee01dd-kube-api-access-tkq8p\") pod \"octavia-operator-controller-manager-998648c74-bpk25\" (UID: \"060c8046-7775-413d-9797-ef0edcee01dd\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-bpk25" Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.590697 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-dv5k8" Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.591181 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgttj\" (UniqueName: \"kubernetes.io/projected/6e335d65-9d0f-4ace-97cc-70a4a2bb2291-kube-api-access-pgttj\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4jnwwg\" (UID: \"6e335d65-9d0f-4ace-97cc-70a4a2bb2291\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4jnwwg" Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.591203 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkkl8\" (UniqueName: \"kubernetes.io/projected/3d6a00c3-b537-414a-8ba4-2797d7bc88f8-kube-api-access-hkkl8\") pod \"placement-operator-controller-manager-78f8948974-g5pxw\" (UID: 
\"3d6a00c3-b537-414a-8ba4-2797d7bc88f8\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-g5pxw" Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.591302 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-jfcsj"] Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.591571 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7sb2b\" (UniqueName: \"kubernetes.io/projected/77131fa7-a611-46bf-b0fe-d05d909dfd4c-kube-api-access-7sb2b\") pod \"ovn-operator-controller-manager-b6456fdb6-pwblb\" (UID: \"77131fa7-a611-46bf-b0fe-d05d909dfd4c\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-pwblb" Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.634691 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6zn5\" (UniqueName: \"kubernetes.io/projected/796c18e3-0c33-4393-aba8-2ad03aad4b93-kube-api-access-n6zn5\") pod \"test-operator-controller-manager-5854674fcc-jfcsj\" (UID: \"796c18e3-0c33-4393-aba8-2ad03aad4b93\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-jfcsj" Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.634780 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrv8n\" (UniqueName: \"kubernetes.io/projected/9ab3fbd9-1610-4d0c-aa5e-2c298e5dcc3e-kube-api-access-nrv8n\") pod \"swift-operator-controller-manager-5f8c65bbfc-m472w\" (UID: \"9ab3fbd9-1610-4d0c-aa5e-2c298e5dcc3e\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-m472w" Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.634869 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvq4s\" (UniqueName: \"kubernetes.io/projected/45f0f590-24f6-4f01-98a0-a41508a59f5a-kube-api-access-gvq4s\") pod 
\"telemetry-operator-controller-manager-76cc84c6bb-58fzc\" (UID: \"45f0f590-24f6-4f01-98a0-a41508a59f5a\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-58fzc" Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.657124 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-pwblb" Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.680829 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-d4477bdf4-lxz2l"] Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.682371 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-d4477bdf4-lxz2l" Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.688168 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrv8n\" (UniqueName: \"kubernetes.io/projected/9ab3fbd9-1610-4d0c-aa5e-2c298e5dcc3e-kube-api-access-nrv8n\") pod \"swift-operator-controller-manager-5f8c65bbfc-m472w\" (UID: \"9ab3fbd9-1610-4d0c-aa5e-2c298e5dcc3e\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-m472w" Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.691174 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-wb6lm" Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.694451 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-d4477bdf4-lxz2l"] Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.696598 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvq4s\" (UniqueName: \"kubernetes.io/projected/45f0f590-24f6-4f01-98a0-a41508a59f5a-kube-api-access-gvq4s\") pod \"telemetry-operator-controller-manager-76cc84c6bb-58fzc\" (UID: 
\"45f0f590-24f6-4f01-98a0-a41508a59f5a\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-58fzc" Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.736156 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-g5pxw" Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.737362 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fswq5\" (UniqueName: \"kubernetes.io/projected/d417ecee-aebb-4154-ac0c-2c321bd78182-kube-api-access-fswq5\") pod \"watcher-operator-controller-manager-d4477bdf4-lxz2l\" (UID: \"d417ecee-aebb-4154-ac0c-2c321bd78182\") " pod="openstack-operators/watcher-operator-controller-manager-d4477bdf4-lxz2l" Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.737430 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/66d51ef3-89ba-4653-ae46-5469bfc5232e-cert\") pod \"infra-operator-controller-manager-57548d458d-kz6bs\" (UID: \"66d51ef3-89ba-4653-ae46-5469bfc5232e\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-kz6bs" Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.737456 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n6zn5\" (UniqueName: \"kubernetes.io/projected/796c18e3-0c33-4393-aba8-2ad03aad4b93-kube-api-access-n6zn5\") pod \"test-operator-controller-manager-5854674fcc-jfcsj\" (UID: \"796c18e3-0c33-4393-aba8-2ad03aad4b93\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-jfcsj" Dec 02 22:58:18 crc kubenswrapper[4696]: E1202 22:58:18.737611 4696 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 02 22:58:18 crc kubenswrapper[4696]: E1202 22:58:18.737680 4696 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/66d51ef3-89ba-4653-ae46-5469bfc5232e-cert podName:66d51ef3-89ba-4653-ae46-5469bfc5232e nodeName:}" failed. No retries permitted until 2025-12-02 22:58:19.737662122 +0000 UTC m=+962.618342123 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/66d51ef3-89ba-4653-ae46-5469bfc5232e-cert") pod "infra-operator-controller-manager-57548d458d-kz6bs" (UID: "66d51ef3-89ba-4653-ae46-5469bfc5232e") : secret "infra-operator-webhook-server-cert" not found Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.750143 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-744c6b777f-bjtk5"] Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.752629 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-744c6b777f-bjtk5" Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.759677 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.759997 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-x8gxs" Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.760116 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.765328 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-m472w" Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.769245 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-744c6b777f-bjtk5"] Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.797081 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-58fzc" Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.810142 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-b2wbs"] Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.811370 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-b2wbs" Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.822399 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-7zd8c" Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.824858 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6zn5\" (UniqueName: \"kubernetes.io/projected/796c18e3-0c33-4393-aba8-2ad03aad4b93-kube-api-access-n6zn5\") pod \"test-operator-controller-manager-5854674fcc-jfcsj\" (UID: \"796c18e3-0c33-4393-aba8-2ad03aad4b93\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-jfcsj" Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.839151 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-b2wbs"] Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.841237 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fswq5\" (UniqueName: 
\"kubernetes.io/projected/d417ecee-aebb-4154-ac0c-2c321bd78182-kube-api-access-fswq5\") pod \"watcher-operator-controller-manager-d4477bdf4-lxz2l\" (UID: \"d417ecee-aebb-4154-ac0c-2c321bd78182\") " pod="openstack-operators/watcher-operator-controller-manager-d4477bdf4-lxz2l" Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.841352 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0508daaa-b26a-4f05-9abc-f63ac69fd1d5-metrics-certs\") pod \"openstack-operator-controller-manager-744c6b777f-bjtk5\" (UID: \"0508daaa-b26a-4f05-9abc-f63ac69fd1d5\") " pod="openstack-operators/openstack-operator-controller-manager-744c6b777f-bjtk5" Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.841377 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0508daaa-b26a-4f05-9abc-f63ac69fd1d5-webhook-certs\") pod \"openstack-operator-controller-manager-744c6b777f-bjtk5\" (UID: \"0508daaa-b26a-4f05-9abc-f63ac69fd1d5\") " pod="openstack-operators/openstack-operator-controller-manager-744c6b777f-bjtk5" Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.841415 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fjbr\" (UniqueName: \"kubernetes.io/projected/e53eb416-2701-4080-b0a3-bbeae35013a4-kube-api-access-9fjbr\") pod \"rabbitmq-cluster-operator-manager-668c99d594-b2wbs\" (UID: \"e53eb416-2701-4080-b0a3-bbeae35013a4\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-b2wbs" Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.841456 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkbs9\" (UniqueName: \"kubernetes.io/projected/0508daaa-b26a-4f05-9abc-f63ac69fd1d5-kube-api-access-jkbs9\") pod 
\"openstack-operator-controller-manager-744c6b777f-bjtk5\" (UID: \"0508daaa-b26a-4f05-9abc-f63ac69fd1d5\") " pod="openstack-operators/openstack-operator-controller-manager-744c6b777f-bjtk5" Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.875841 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fswq5\" (UniqueName: \"kubernetes.io/projected/d417ecee-aebb-4154-ac0c-2c321bd78182-kube-api-access-fswq5\") pod \"watcher-operator-controller-manager-d4477bdf4-lxz2l\" (UID: \"d417ecee-aebb-4154-ac0c-2c321bd78182\") " pod="openstack-operators/watcher-operator-controller-manager-d4477bdf4-lxz2l" Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.879417 4696 generic.go:334] "Generic (PLEG): container finished" podID="3dcedd0c-cb54-4bc5-8c69-9506424a0b91" containerID="01cf7f435d069c3f2b220a7879e313771a3c425c388bc13671f98b080b295bd8" exitCode=0 Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.880812 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sp4gt" event={"ID":"3dcedd0c-cb54-4bc5-8c69-9506424a0b91","Type":"ContainerDied","Data":"01cf7f435d069c3f2b220a7879e313771a3c425c388bc13671f98b080b295bd8"} Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.883673 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-bpk25" Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.942600 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0508daaa-b26a-4f05-9abc-f63ac69fd1d5-metrics-certs\") pod \"openstack-operator-controller-manager-744c6b777f-bjtk5\" (UID: \"0508daaa-b26a-4f05-9abc-f63ac69fd1d5\") " pod="openstack-operators/openstack-operator-controller-manager-744c6b777f-bjtk5" Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.942652 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0508daaa-b26a-4f05-9abc-f63ac69fd1d5-webhook-certs\") pod \"openstack-operator-controller-manager-744c6b777f-bjtk5\" (UID: \"0508daaa-b26a-4f05-9abc-f63ac69fd1d5\") " pod="openstack-operators/openstack-operator-controller-manager-744c6b777f-bjtk5" Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.942688 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fjbr\" (UniqueName: \"kubernetes.io/projected/e53eb416-2701-4080-b0a3-bbeae35013a4-kube-api-access-9fjbr\") pod \"rabbitmq-cluster-operator-manager-668c99d594-b2wbs\" (UID: \"e53eb416-2701-4080-b0a3-bbeae35013a4\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-b2wbs" Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.942722 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkbs9\" (UniqueName: \"kubernetes.io/projected/0508daaa-b26a-4f05-9abc-f63ac69fd1d5-kube-api-access-jkbs9\") pod \"openstack-operator-controller-manager-744c6b777f-bjtk5\" (UID: \"0508daaa-b26a-4f05-9abc-f63ac69fd1d5\") " pod="openstack-operators/openstack-operator-controller-manager-744c6b777f-bjtk5" Dec 02 22:58:18 crc kubenswrapper[4696]: E1202 22:58:18.948639 4696 secret.go:188] 
Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 02 22:58:18 crc kubenswrapper[4696]: E1202 22:58:18.948713 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0508daaa-b26a-4f05-9abc-f63ac69fd1d5-metrics-certs podName:0508daaa-b26a-4f05-9abc-f63ac69fd1d5 nodeName:}" failed. No retries permitted until 2025-12-02 22:58:19.448697916 +0000 UTC m=+962.329377917 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0508daaa-b26a-4f05-9abc-f63ac69fd1d5-metrics-certs") pod "openstack-operator-controller-manager-744c6b777f-bjtk5" (UID: "0508daaa-b26a-4f05-9abc-f63ac69fd1d5") : secret "metrics-server-cert" not found Dec 02 22:58:18 crc kubenswrapper[4696]: E1202 22:58:18.948786 4696 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 02 22:58:18 crc kubenswrapper[4696]: E1202 22:58:18.948806 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0508daaa-b26a-4f05-9abc-f63ac69fd1d5-webhook-certs podName:0508daaa-b26a-4f05-9abc-f63ac69fd1d5 nodeName:}" failed. No retries permitted until 2025-12-02 22:58:19.448799779 +0000 UTC m=+962.329479780 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/0508daaa-b26a-4f05-9abc-f63ac69fd1d5-webhook-certs") pod "openstack-operator-controller-manager-744c6b777f-bjtk5" (UID: "0508daaa-b26a-4f05-9abc-f63ac69fd1d5") : secret "webhook-server-cert" not found Dec 02 22:58:18 crc kubenswrapper[4696]: I1202 22:58:18.978998 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-d4477bdf4-lxz2l" Dec 02 22:58:19 crc kubenswrapper[4696]: I1202 22:58:19.005955 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fjbr\" (UniqueName: \"kubernetes.io/projected/e53eb416-2701-4080-b0a3-bbeae35013a4-kube-api-access-9fjbr\") pod \"rabbitmq-cluster-operator-manager-668c99d594-b2wbs\" (UID: \"e53eb416-2701-4080-b0a3-bbeae35013a4\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-b2wbs" Dec 02 22:58:19 crc kubenswrapper[4696]: I1202 22:58:19.012951 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkbs9\" (UniqueName: \"kubernetes.io/projected/0508daaa-b26a-4f05-9abc-f63ac69fd1d5-kube-api-access-jkbs9\") pod \"openstack-operator-controller-manager-744c6b777f-bjtk5\" (UID: \"0508daaa-b26a-4f05-9abc-f63ac69fd1d5\") " pod="openstack-operators/openstack-operator-controller-manager-744c6b777f-bjtk5" Dec 02 22:58:19 crc kubenswrapper[4696]: I1202 22:58:19.053255 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6e335d65-9d0f-4ace-97cc-70a4a2bb2291-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4jnwwg\" (UID: \"6e335d65-9d0f-4ace-97cc-70a4a2bb2291\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4jnwwg" Dec 02 22:58:19 crc kubenswrapper[4696]: E1202 22:58:19.054057 4696 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 02 22:58:19 crc kubenswrapper[4696]: E1202 22:58:19.054150 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6e335d65-9d0f-4ace-97cc-70a4a2bb2291-cert podName:6e335d65-9d0f-4ace-97cc-70a4a2bb2291 nodeName:}" failed. 
No retries permitted until 2025-12-02 22:58:20.05412428 +0000 UTC m=+962.934804441 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6e335d65-9d0f-4ace-97cc-70a4a2bb2291-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4jnwwg" (UID: "6e335d65-9d0f-4ace-97cc-70a4a2bb2291") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 02 22:58:19 crc kubenswrapper[4696]: I1202 22:58:19.118788 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-b2wbs" Dec 02 22:58:19 crc kubenswrapper[4696]: I1202 22:58:19.128835 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987cd8cd-nvqf8"] Dec 02 22:58:19 crc kubenswrapper[4696]: I1202 22:58:19.129017 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-jfcsj" Dec 02 22:58:19 crc kubenswrapper[4696]: I1202 22:58:19.147172 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-xxm5w"] Dec 02 22:58:19 crc kubenswrapper[4696]: I1202 22:58:19.154039 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-tzgnf"] Dec 02 22:58:19 crc kubenswrapper[4696]: I1202 22:58:19.160849 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-rjlxh"] Dec 02 22:58:19 crc kubenswrapper[4696]: W1202 22:58:19.209945 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode693a226_52c3_413c_b607_c0050ab5e553.slice/crio-c505f40e8e5b825435eac153138f0f0e20b44a41a6e76c1b1fc963c621b45979 WatchSource:0}: Error finding container 
c505f40e8e5b825435eac153138f0f0e20b44a41a6e76c1b1fc963c621b45979: Status 404 returned error can't find the container with id c505f40e8e5b825435eac153138f0f0e20b44a41a6e76c1b1fc963c621b45979 Dec 02 22:58:19 crc kubenswrapper[4696]: W1202 22:58:19.213049 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5706d5c2_8bbe_40b3_8820_0d547363fa96.slice/crio-578d1f76b88e83cac6b86e760a78f6adb9e2b045916004ad2b5e4cc0250dd45b WatchSource:0}: Error finding container 578d1f76b88e83cac6b86e760a78f6adb9e2b045916004ad2b5e4cc0250dd45b: Status 404 returned error can't find the container with id 578d1f76b88e83cac6b86e760a78f6adb9e2b045916004ad2b5e4cc0250dd45b Dec 02 22:58:19 crc kubenswrapper[4696]: I1202 22:58:19.335309 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-b92fc"] Dec 02 22:58:19 crc kubenswrapper[4696]: I1202 22:58:19.426198 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-s9pk4"] Dec 02 22:58:19 crc kubenswrapper[4696]: I1202 22:58:19.464654 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0508daaa-b26a-4f05-9abc-f63ac69fd1d5-metrics-certs\") pod \"openstack-operator-controller-manager-744c6b777f-bjtk5\" (UID: \"0508daaa-b26a-4f05-9abc-f63ac69fd1d5\") " pod="openstack-operators/openstack-operator-controller-manager-744c6b777f-bjtk5" Dec 02 22:58:19 crc kubenswrapper[4696]: I1202 22:58:19.464699 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0508daaa-b26a-4f05-9abc-f63ac69fd1d5-webhook-certs\") pod \"openstack-operator-controller-manager-744c6b777f-bjtk5\" (UID: \"0508daaa-b26a-4f05-9abc-f63ac69fd1d5\") " pod="openstack-operators/openstack-operator-controller-manager-744c6b777f-bjtk5" Dec 02 
22:58:19 crc kubenswrapper[4696]: E1202 22:58:19.464870 4696 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 02 22:58:19 crc kubenswrapper[4696]: E1202 22:58:19.464946 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0508daaa-b26a-4f05-9abc-f63ac69fd1d5-metrics-certs podName:0508daaa-b26a-4f05-9abc-f63ac69fd1d5 nodeName:}" failed. No retries permitted until 2025-12-02 22:58:20.464924968 +0000 UTC m=+963.345604969 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0508daaa-b26a-4f05-9abc-f63ac69fd1d5-metrics-certs") pod "openstack-operator-controller-manager-744c6b777f-bjtk5" (UID: "0508daaa-b26a-4f05-9abc-f63ac69fd1d5") : secret "metrics-server-cert" not found Dec 02 22:58:19 crc kubenswrapper[4696]: E1202 22:58:19.464971 4696 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 02 22:58:19 crc kubenswrapper[4696]: E1202 22:58:19.465026 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0508daaa-b26a-4f05-9abc-f63ac69fd1d5-webhook-certs podName:0508daaa-b26a-4f05-9abc-f63ac69fd1d5 nodeName:}" failed. No retries permitted until 2025-12-02 22:58:20.465010171 +0000 UTC m=+963.345690172 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/0508daaa-b26a-4f05-9abc-f63ac69fd1d5-webhook-certs") pod "openstack-operator-controller-manager-744c6b777f-bjtk5" (UID: "0508daaa-b26a-4f05-9abc-f63ac69fd1d5") : secret "webhook-server-cert" not found Dec 02 22:58:19 crc kubenswrapper[4696]: I1202 22:58:19.683438 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-tp7td"] Dec 02 22:58:19 crc kubenswrapper[4696]: I1202 22:58:19.692541 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7c79b5df47-k2p9j"] Dec 02 22:58:19 crc kubenswrapper[4696]: W1202 22:58:19.707400 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d7b7caa_1ec3_4e66_9273_36cae02cbe8e.slice/crio-f338b3daa862a2848a97da8a9ec42a151ff209e43183690072608b1b58fef822 WatchSource:0}: Error finding container f338b3daa862a2848a97da8a9ec42a151ff209e43183690072608b1b58fef822: Status 404 returned error can't find the container with id f338b3daa862a2848a97da8a9ec42a151ff209e43183690072608b1b58fef822 Dec 02 22:58:19 crc kubenswrapper[4696]: W1202 22:58:19.709297 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbaad852a_374a_460e_9d5c_cb5418291849.slice/crio-220bd485d022d6ae80bd8a31f4d5c15beb063fe837d34f967b6ca5228e8ff0ae WatchSource:0}: Error finding container 220bd485d022d6ae80bd8a31f4d5c15beb063fe837d34f967b6ca5228e8ff0ae: Status 404 returned error can't find the container with id 220bd485d022d6ae80bd8a31f4d5c15beb063fe837d34f967b6ca5228e8ff0ae Dec 02 22:58:19 crc kubenswrapper[4696]: I1202 22:58:19.710607 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-pwblb"] Dec 02 22:58:19 crc 
kubenswrapper[4696]: W1202 22:58:19.716432 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9207b2f0_999a_45e4_8234_982f796f7801.slice/crio-488da82b52c4f00693e2dd3a662be7ff4c567af43130c55a37de48e7045a4547 WatchSource:0}: Error finding container 488da82b52c4f00693e2dd3a662be7ff4c567af43130c55a37de48e7045a4547: Status 404 returned error can't find the container with id 488da82b52c4f00693e2dd3a662be7ff4c567af43130c55a37de48e7045a4547 Dec 02 22:58:19 crc kubenswrapper[4696]: I1202 22:58:19.716863 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-tng2n"] Dec 02 22:58:19 crc kubenswrapper[4696]: I1202 22:58:19.722339 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-vg9kf"] Dec 02 22:58:19 crc kubenswrapper[4696]: I1202 22:58:19.773022 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/66d51ef3-89ba-4653-ae46-5469bfc5232e-cert\") pod \"infra-operator-controller-manager-57548d458d-kz6bs\" (UID: \"66d51ef3-89ba-4653-ae46-5469bfc5232e\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-kz6bs" Dec 02 22:58:19 crc kubenswrapper[4696]: E1202 22:58:19.773223 4696 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 02 22:58:19 crc kubenswrapper[4696]: E1202 22:58:19.773338 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/66d51ef3-89ba-4653-ae46-5469bfc5232e-cert podName:66d51ef3-89ba-4653-ae46-5469bfc5232e nodeName:}" failed. No retries permitted until 2025-12-02 22:58:21.773317158 +0000 UTC m=+964.653997159 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/66d51ef3-89ba-4653-ae46-5469bfc5232e-cert") pod "infra-operator-controller-manager-57548d458d-kz6bs" (UID: "66d51ef3-89ba-4653-ae46-5469bfc5232e") : secret "infra-operator-webhook-server-cert" not found Dec 02 22:58:19 crc kubenswrapper[4696]: I1202 22:58:19.810109 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-4crn9"] Dec 02 22:58:19 crc kubenswrapper[4696]: I1202 22:58:19.848962 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-bpk25"] Dec 02 22:58:19 crc kubenswrapper[4696]: I1202 22:58:19.892854 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-m472w"] Dec 02 22:58:19 crc kubenswrapper[4696]: I1202 22:58:19.904096 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-g5pxw"] Dec 02 22:58:19 crc kubenswrapper[4696]: W1202 22:58:19.904321 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d6a00c3_b537_414a_8ba4_2797d7bc88f8.slice/crio-37419b91bf936ac1746c4ea75088ea469ca0a09244945cc6666b2dc1c340a7bc WatchSource:0}: Error finding container 37419b91bf936ac1746c4ea75088ea469ca0a09244945cc6666b2dc1c340a7bc: Status 404 returned error can't find the container with id 37419b91bf936ac1746c4ea75088ea469ca0a09244945cc6666b2dc1c340a7bc Dec 02 22:58:19 crc kubenswrapper[4696]: I1202 22:58:19.914102 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-58fzc"] Dec 02 22:58:19 crc kubenswrapper[4696]: I1202 22:58:19.921705 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-b2wbs"] Dec 02 22:58:19 crc kubenswrapper[4696]: W1202 22:58:19.922112 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd417ecee_aebb_4154_ac0c_2c321bd78182.slice/crio-c609edd35cecd032d5232290c49659b76f6e9515fddcf5e22520240fd7965e21 WatchSource:0}: Error finding container c609edd35cecd032d5232290c49659b76f6e9515fddcf5e22520240fd7965e21: Status 404 returned error can't find the container with id c609edd35cecd032d5232290c49659b76f6e9515fddcf5e22520240fd7965e21 Dec 02 22:58:19 crc kubenswrapper[4696]: I1202 22:58:19.926300 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-hfcc6"] Dec 02 22:58:19 crc kubenswrapper[4696]: I1202 22:58:19.931247 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-d6qww"] Dec 02 22:58:19 crc kubenswrapper[4696]: I1202 22:58:19.935798 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-jfcsj"] Dec 02 22:58:19 crc kubenswrapper[4696]: W1202 22:58:19.938312 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ab3fbd9_1610_4d0c_aa5e_2c298e5dcc3e.slice/crio-36a3d81c737db664d4d186cc030376b1a7a05c916c921fbf22ead7b85ba432b1 WatchSource:0}: Error finding container 36a3d81c737db664d4d186cc030376b1a7a05c916c921fbf22ead7b85ba432b1: Status 404 returned error can't find the container with id 36a3d81c737db664d4d186cc030376b1a7a05c916c921fbf22ead7b85ba432b1 Dec 02 22:58:19 crc kubenswrapper[4696]: I1202 22:58:19.940533 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-d4477bdf4-lxz2l"] Dec 02 22:58:19 crc kubenswrapper[4696]: E1202 
22:58:19.942049 4696 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:0b3fb69f35c151895d3dffd514974a9f9fe1c77c3bca69b78b81efb183cf4557,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-j4sc4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-5fdfd5b6b5-d6qww_openstack-operators(1beb3e53-4faf-475f-b5b0-57b8cd32c529): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 02 22:58:19 crc kubenswrapper[4696]: E1202 22:58:19.944138 4696 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-j4sc4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-5fdfd5b6b5-d6qww_openstack-operators(1beb3e53-4faf-475f-b5b0-57b8cd32c529): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 02 22:58:19 crc kubenswrapper[4696]: E1202 22:58:19.944312 4696 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:38.102.83.45:5001/openstack-k8s-operators/watcher-operator:0e562967e0a192baf562500f49cef0abd8c6f6ec,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fswq5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-d4477bdf4-lxz2l_openstack-operators(d417ecee-aebb-4154-ac0c-2c321bd78182): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 02 22:58:19 crc kubenswrapper[4696]: E1202 22:58:19.945324 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" 
pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-d6qww" podUID="1beb3e53-4faf-475f-b5b0-57b8cd32c529" Dec 02 22:58:19 crc kubenswrapper[4696]: W1202 22:58:19.945945 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c1744f3_fc58_4653_a7e0_4fcdfdfca485.slice/crio-6514026fa128b10fa17a36b8bd66e206aeca81faaa855b758dc02aa11bf0191d WatchSource:0}: Error finding container 6514026fa128b10fa17a36b8bd66e206aeca81faaa855b758dc02aa11bf0191d: Status 404 returned error can't find the container with id 6514026fa128b10fa17a36b8bd66e206aeca81faaa855b758dc02aa11bf0191d Dec 02 22:58:19 crc kubenswrapper[4696]: E1202 22:58:19.947010 4696 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fswq5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-d4477bdf4-lxz2l_openstack-operators(d417ecee-aebb-4154-ac0c-2c321bd78182): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 02 22:58:19 crc kubenswrapper[4696]: E1202 22:58:19.947382 4696 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:2a3d21728a8bfb4e64617e63e61e2d1cb70a383ea3e8f846e0c3c3c02d2b0a9d,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-nrv8n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-5f8c65bbfc-m472w_openstack-operators(9ab3fbd9-1610-4d0c-aa5e-2c298e5dcc3e): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 02 22:58:19 crc kubenswrapper[4696]: E1202 22:58:19.948428 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" 
pod="openstack-operators/watcher-operator-controller-manager-d4477bdf4-lxz2l" podUID="d417ecee-aebb-4154-ac0c-2c321bd78182" Dec 02 22:58:19 crc kubenswrapper[4696]: E1202 22:58:19.949618 4696 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-nrv8n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-5f8c65bbfc-m472w_openstack-operators(9ab3fbd9-1610-4d0c-aa5e-2c298e5dcc3e): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 02 22:58:19 crc kubenswrapper[4696]: E1202 22:58:19.949920 4696 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-lbvl2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-697bc559fc-hfcc6_openstack-operators(9c1744f3-fc58-4653-a7e0-4fcdfdfca485): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 02 22:58:19 crc kubenswrapper[4696]: W1202 22:58:19.949965 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod796c18e3_0c33_4393_aba8_2ad03aad4b93.slice/crio-85ccf634674bcd0230a99d94e1db9db2ea05c9ec947a3b0346049c242e443f66 WatchSource:0}: Error finding container 85ccf634674bcd0230a99d94e1db9db2ea05c9ec947a3b0346049c242e443f66: Status 404 returned error can't find the container with id 85ccf634674bcd0230a99d94e1db9db2ea05c9ec947a3b0346049c242e443f66 Dec 02 22:58:19 crc kubenswrapper[4696]: E1202 22:58:19.952885 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-m472w" podUID="9ab3fbd9-1610-4d0c-aa5e-2c298e5dcc3e" Dec 02 22:58:19 crc kubenswrapper[4696]: E1202 
22:58:19.953624 4696 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-lbvl2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-697bc559fc-hfcc6_openstack-operators(9c1744f3-fc58-4653-a7e0-4fcdfdfca485): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 02 22:58:19 crc kubenswrapper[4696]: W1202 22:58:19.953791 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45f0f590_24f6_4f01_98a0_a41508a59f5a.slice/crio-55e9fb306228ca4d071927cedb4b7a617f2591f666e244823c2bca714807b02d WatchSource:0}: 
Error finding container 55e9fb306228ca4d071927cedb4b7a617f2591f666e244823c2bca714807b02d: Status 404 returned error can't find the container with id 55e9fb306228ca4d071927cedb4b7a617f2591f666e244823c2bca714807b02d Dec 02 22:58:19 crc kubenswrapper[4696]: E1202 22:58:19.954870 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-hfcc6" podUID="9c1744f3-fc58-4653-a7e0-4fcdfdfca485" Dec 02 22:58:19 crc kubenswrapper[4696]: I1202 22:58:19.956667 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-tp7td" event={"ID":"baad852a-374a-460e-9d5c-cb5418291849","Type":"ContainerStarted","Data":"220bd485d022d6ae80bd8a31f4d5c15beb063fe837d34f967b6ca5228e8ff0ae"} Dec 02 22:58:19 crc kubenswrapper[4696]: I1202 22:58:19.967460 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-vg9kf" event={"ID":"9207b2f0-999a-45e4-8234-982f796f7801","Type":"ContainerStarted","Data":"488da82b52c4f00693e2dd3a662be7ff4c567af43130c55a37de48e7045a4547"} Dec 02 22:58:19 crc kubenswrapper[4696]: I1202 22:58:19.969820 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-xxm5w" event={"ID":"351a13fb-8e8e-4393-adef-28523ab05ccb","Type":"ContainerStarted","Data":"3a630abc65a49d925642524de70a87a26ea51e4d55304a996cd5f4e62412d36d"} Dec 02 22:58:19 crc kubenswrapper[4696]: E1202 22:58:19.970336 4696 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:7d66757c0af67104f0389e851a7cc0daa44443ad202d157417bd86bbb57cc385,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-gvq4s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-76cc84c6bb-58fzc_openstack-operators(45f0f590-24f6-4f01-98a0-a41508a59f5a): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 02 22:58:19 crc kubenswrapper[4696]: E1202 22:58:19.970370 4696 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 
500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9fjbr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-b2wbs_openstack-operators(e53eb416-2701-4080-b0a3-bbeae35013a4): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 02 22:58:19 crc kubenswrapper[4696]: E1202 22:58:19.970631 4696 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} 
BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-n6zn5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5854674fcc-jfcsj_openstack-operators(796c18e3-0c33-4393-aba8-2ad03aad4b93): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 02 22:58:19 crc kubenswrapper[4696]: I1202 22:58:19.971785 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-rjlxh" 
event={"ID":"755f9574-a31b-430c-a2a2-92554020d96b","Type":"ContainerStarted","Data":"1fe3069f214e2fdba90561824d17953291e5a982a834ecc1a29d1236dce9ce8f"} Dec 02 22:58:19 crc kubenswrapper[4696]: E1202 22:58:19.971859 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-b2wbs" podUID="e53eb416-2701-4080-b0a3-bbeae35013a4" Dec 02 22:58:19 crc kubenswrapper[4696]: E1202 22:58:19.972554 4696 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-n6zn5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5854674fcc-jfcsj_openstack-operators(796c18e3-0c33-4393-aba8-2ad03aad4b93): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 02 22:58:19 crc kubenswrapper[4696]: E1202 22:58:19.972644 4696 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-gvq4s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-76cc84c6bb-58fzc_openstack-operators(45f0f590-24f6-4f01-98a0-a41508a59f5a): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 02 22:58:19 crc kubenswrapper[4696]: I1202 22:58:19.973292 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-s9pk4" event={"ID":"5315d589-3bb7-4776-b842-ffc18e1a89e1","Type":"ContainerStarted","Data":"2e938b448e4f9142a0dd6d188b92eebff9c87cf0a3deeef40339cac3cde89254"} Dec 02 22:58:19 crc kubenswrapper[4696]: E1202 22:58:19.973655 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/test-operator-controller-manager-5854674fcc-jfcsj" podUID="796c18e3-0c33-4393-aba8-2ad03aad4b93" Dec 02 22:58:19 crc kubenswrapper[4696]: E1202 22:58:19.973831 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for 
\"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-58fzc" podUID="45f0f590-24f6-4f01-98a0-a41508a59f5a" Dec 02 22:58:19 crc kubenswrapper[4696]: I1202 22:58:19.975705 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-tng2n" event={"ID":"7d7b7caa-1ec3-4e66-9273-36cae02cbe8e","Type":"ContainerStarted","Data":"f338b3daa862a2848a97da8a9ec42a151ff209e43183690072608b1b58fef822"} Dec 02 22:58:19 crc kubenswrapper[4696]: I1202 22:58:19.976757 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-nvqf8" event={"ID":"e693a226-52c3-413c-b607-c0050ab5e553","Type":"ContainerStarted","Data":"c505f40e8e5b825435eac153138f0f0e20b44a41a6e76c1b1fc963c621b45979"} Dec 02 22:58:19 crc kubenswrapper[4696]: I1202 22:58:19.978798 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-pwblb" event={"ID":"77131fa7-a611-46bf-b0fe-d05d909dfd4c","Type":"ContainerStarted","Data":"c5af4a78a349037568e3aaa0ee0aefb313afe7cf0e2fe2d3a7a1ec25daa6915b"} Dec 02 22:58:19 crc kubenswrapper[4696]: I1202 22:58:19.980627 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-bpk25" event={"ID":"060c8046-7775-413d-9797-ef0edcee01dd","Type":"ContainerStarted","Data":"bcbeb40f27f16736394890fc3bb6e40f537146917b84fa7c8ef39bb159b77c91"} Dec 02 22:58:19 crc kubenswrapper[4696]: I1202 22:58:19.982883 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-tzgnf" 
event={"ID":"5706d5c2-8bbe-40b3-8820-0d547363fa96","Type":"ContainerStarted","Data":"578d1f76b88e83cac6b86e760a78f6adb9e2b045916004ad2b5e4cc0250dd45b"} Dec 02 22:58:19 crc kubenswrapper[4696]: I1202 22:58:19.986222 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-4crn9" event={"ID":"bbbbf87d-fa5c-4e9b-9bbc-b19ed043e094","Type":"ContainerStarted","Data":"73302f668a78c87934fd722c0fd1698a11f9cd4e3a11a9a1c873ca50fe2257bb"} Dec 02 22:58:19 crc kubenswrapper[4696]: I1202 22:58:19.987715 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-b92fc" podUID="526e39ba-a8d3-4ac4-a90e-9867bc481a59" containerName="registry-server" containerID="cri-o://79217a48dfaa51d854e0d1f6286c16bbe09b4f4c9b0af8a8401cc082d3e22d6a" gracePeriod=2 Dec 02 22:58:19 crc kubenswrapper[4696]: I1202 22:58:19.987813 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-k2p9j" event={"ID":"5436ce3c-34d6-47eb-81b1-3b4dc1c2d794","Type":"ContainerStarted","Data":"264a299b223d3e0366c18b9e93347ce26698a2996b3129e3b79ab004e38b01e0"} Dec 02 22:58:20 crc kubenswrapper[4696]: I1202 22:58:20.080406 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6e335d65-9d0f-4ace-97cc-70a4a2bb2291-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4jnwwg\" (UID: \"6e335d65-9d0f-4ace-97cc-70a4a2bb2291\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4jnwwg" Dec 02 22:58:20 crc kubenswrapper[4696]: E1202 22:58:20.080629 4696 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 02 22:58:20 crc kubenswrapper[4696]: E1202 22:58:20.080717 4696 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6e335d65-9d0f-4ace-97cc-70a4a2bb2291-cert podName:6e335d65-9d0f-4ace-97cc-70a4a2bb2291 nodeName:}" failed. No retries permitted until 2025-12-02 22:58:22.080693109 +0000 UTC m=+964.961373110 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6e335d65-9d0f-4ace-97cc-70a4a2bb2291-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4jnwwg" (UID: "6e335d65-9d0f-4ace-97cc-70a4a2bb2291") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 02 22:58:20 crc kubenswrapper[4696]: I1202 22:58:20.472006 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-b92fc" Dec 02 22:58:20 crc kubenswrapper[4696]: I1202 22:58:20.491747 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0508daaa-b26a-4f05-9abc-f63ac69fd1d5-metrics-certs\") pod \"openstack-operator-controller-manager-744c6b777f-bjtk5\" (UID: \"0508daaa-b26a-4f05-9abc-f63ac69fd1d5\") " pod="openstack-operators/openstack-operator-controller-manager-744c6b777f-bjtk5" Dec 02 22:58:20 crc kubenswrapper[4696]: I1202 22:58:20.491795 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0508daaa-b26a-4f05-9abc-f63ac69fd1d5-webhook-certs\") pod \"openstack-operator-controller-manager-744c6b777f-bjtk5\" (UID: \"0508daaa-b26a-4f05-9abc-f63ac69fd1d5\") " pod="openstack-operators/openstack-operator-controller-manager-744c6b777f-bjtk5" Dec 02 22:58:20 crc kubenswrapper[4696]: E1202 22:58:20.491959 4696 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 02 22:58:20 crc kubenswrapper[4696]: E1202 22:58:20.492014 4696 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/0508daaa-b26a-4f05-9abc-f63ac69fd1d5-webhook-certs podName:0508daaa-b26a-4f05-9abc-f63ac69fd1d5 nodeName:}" failed. No retries permitted until 2025-12-02 22:58:22.491998503 +0000 UTC m=+965.372678504 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/0508daaa-b26a-4f05-9abc-f63ac69fd1d5-webhook-certs") pod "openstack-operator-controller-manager-744c6b777f-bjtk5" (UID: "0508daaa-b26a-4f05-9abc-f63ac69fd1d5") : secret "webhook-server-cert" not found Dec 02 22:58:20 crc kubenswrapper[4696]: E1202 22:58:20.492056 4696 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 02 22:58:20 crc kubenswrapper[4696]: E1202 22:58:20.492074 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0508daaa-b26a-4f05-9abc-f63ac69fd1d5-metrics-certs podName:0508daaa-b26a-4f05-9abc-f63ac69fd1d5 nodeName:}" failed. No retries permitted until 2025-12-02 22:58:22.492067995 +0000 UTC m=+965.372747996 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0508daaa-b26a-4f05-9abc-f63ac69fd1d5-metrics-certs") pod "openstack-operator-controller-manager-744c6b777f-bjtk5" (UID: "0508daaa-b26a-4f05-9abc-f63ac69fd1d5") : secret "metrics-server-cert" not found Dec 02 22:58:20 crc kubenswrapper[4696]: I1202 22:58:20.592921 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jw9p7\" (UniqueName: \"kubernetes.io/projected/526e39ba-a8d3-4ac4-a90e-9867bc481a59-kube-api-access-jw9p7\") pod \"526e39ba-a8d3-4ac4-a90e-9867bc481a59\" (UID: \"526e39ba-a8d3-4ac4-a90e-9867bc481a59\") " Dec 02 22:58:20 crc kubenswrapper[4696]: I1202 22:58:20.593086 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/526e39ba-a8d3-4ac4-a90e-9867bc481a59-catalog-content\") pod \"526e39ba-a8d3-4ac4-a90e-9867bc481a59\" (UID: \"526e39ba-a8d3-4ac4-a90e-9867bc481a59\") " Dec 02 22:58:20 crc kubenswrapper[4696]: I1202 22:58:20.593284 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/526e39ba-a8d3-4ac4-a90e-9867bc481a59-utilities\") pod \"526e39ba-a8d3-4ac4-a90e-9867bc481a59\" (UID: \"526e39ba-a8d3-4ac4-a90e-9867bc481a59\") " Dec 02 22:58:20 crc kubenswrapper[4696]: I1202 22:58:20.594379 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/526e39ba-a8d3-4ac4-a90e-9867bc481a59-utilities" (OuterVolumeSpecName: "utilities") pod "526e39ba-a8d3-4ac4-a90e-9867bc481a59" (UID: "526e39ba-a8d3-4ac4-a90e-9867bc481a59"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 22:58:20 crc kubenswrapper[4696]: I1202 22:58:20.601712 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/526e39ba-a8d3-4ac4-a90e-9867bc481a59-kube-api-access-jw9p7" (OuterVolumeSpecName: "kube-api-access-jw9p7") pod "526e39ba-a8d3-4ac4-a90e-9867bc481a59" (UID: "526e39ba-a8d3-4ac4-a90e-9867bc481a59"). InnerVolumeSpecName "kube-api-access-jw9p7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:58:20 crc kubenswrapper[4696]: I1202 22:58:20.701858 4696 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/526e39ba-a8d3-4ac4-a90e-9867bc481a59-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:20 crc kubenswrapper[4696]: I1202 22:58:20.701938 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jw9p7\" (UniqueName: \"kubernetes.io/projected/526e39ba-a8d3-4ac4-a90e-9867bc481a59-kube-api-access-jw9p7\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:20 crc kubenswrapper[4696]: I1202 22:58:20.733410 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/526e39ba-a8d3-4ac4-a90e-9867bc481a59-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "526e39ba-a8d3-4ac4-a90e-9867bc481a59" (UID: "526e39ba-a8d3-4ac4-a90e-9867bc481a59"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 22:58:20 crc kubenswrapper[4696]: I1202 22:58:20.802949 4696 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/526e39ba-a8d3-4ac4-a90e-9867bc481a59-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:21 crc kubenswrapper[4696]: I1202 22:58:21.006494 4696 generic.go:334] "Generic (PLEG): container finished" podID="3dcedd0c-cb54-4bc5-8c69-9506424a0b91" containerID="e96b76601f201a7eb5341302a7c61383ecb13c8d171940c83e969c6375b67219" exitCode=0 Dec 02 22:58:21 crc kubenswrapper[4696]: I1202 22:58:21.006617 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sp4gt" event={"ID":"3dcedd0c-cb54-4bc5-8c69-9506424a0b91","Type":"ContainerDied","Data":"e96b76601f201a7eb5341302a7c61383ecb13c8d171940c83e969c6375b67219"} Dec 02 22:58:21 crc kubenswrapper[4696]: I1202 22:58:21.011104 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-hfcc6" event={"ID":"9c1744f3-fc58-4653-a7e0-4fcdfdfca485","Type":"ContainerStarted","Data":"6514026fa128b10fa17a36b8bd66e206aeca81faaa855b758dc02aa11bf0191d"} Dec 02 22:58:21 crc kubenswrapper[4696]: E1202 22:58:21.014835 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-hfcc6" podUID="9c1744f3-fc58-4653-a7e0-4fcdfdfca485" Dec 02 22:58:21 crc kubenswrapper[4696]: I1202 22:58:21.017808 4696 generic.go:334] "Generic (PLEG): container 
finished" podID="526e39ba-a8d3-4ac4-a90e-9867bc481a59" containerID="79217a48dfaa51d854e0d1f6286c16bbe09b4f4c9b0af8a8401cc082d3e22d6a" exitCode=0 Dec 02 22:58:21 crc kubenswrapper[4696]: I1202 22:58:21.017856 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b92fc" event={"ID":"526e39ba-a8d3-4ac4-a90e-9867bc481a59","Type":"ContainerDied","Data":"79217a48dfaa51d854e0d1f6286c16bbe09b4f4c9b0af8a8401cc082d3e22d6a"} Dec 02 22:58:21 crc kubenswrapper[4696]: I1202 22:58:21.017881 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b92fc" event={"ID":"526e39ba-a8d3-4ac4-a90e-9867bc481a59","Type":"ContainerDied","Data":"a5304c3c842d96c881ab66b4ca48a79b7c46268cf18301ca2e5ad688c45611d5"} Dec 02 22:58:21 crc kubenswrapper[4696]: I1202 22:58:21.017900 4696 scope.go:117] "RemoveContainer" containerID="79217a48dfaa51d854e0d1f6286c16bbe09b4f4c9b0af8a8401cc082d3e22d6a" Dec 02 22:58:21 crc kubenswrapper[4696]: I1202 22:58:21.018031 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-b92fc" Dec 02 22:58:21 crc kubenswrapper[4696]: I1202 22:58:21.084651 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-d6qww" event={"ID":"1beb3e53-4faf-475f-b5b0-57b8cd32c529","Type":"ContainerStarted","Data":"c7ab53037a212af4d0625847b7f3b3c0af324d878fd59e2435dff92c7532c4eb"} Dec 02 22:58:21 crc kubenswrapper[4696]: I1202 22:58:21.087507 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-b92fc"] Dec 02 22:58:21 crc kubenswrapper[4696]: I1202 22:58:21.093491 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-jfcsj" event={"ID":"796c18e3-0c33-4393-aba8-2ad03aad4b93","Type":"ContainerStarted","Data":"85ccf634674bcd0230a99d94e1db9db2ea05c9ec947a3b0346049c242e443f66"} Dec 02 22:58:21 crc kubenswrapper[4696]: E1202 22:58:21.096835 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/test-operator-controller-manager-5854674fcc-jfcsj" podUID="796c18e3-0c33-4393-aba8-2ad03aad4b93" Dec 02 22:58:21 crc kubenswrapper[4696]: E1202 22:58:21.097390 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:0b3fb69f35c151895d3dffd514974a9f9fe1c77c3bca69b78b81efb183cf4557\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with 
ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-d6qww" podUID="1beb3e53-4faf-475f-b5b0-57b8cd32c529" Dec 02 22:58:21 crc kubenswrapper[4696]: I1202 22:58:21.098300 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-b92fc"] Dec 02 22:58:21 crc kubenswrapper[4696]: I1202 22:58:21.107488 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-d4477bdf4-lxz2l" event={"ID":"d417ecee-aebb-4154-ac0c-2c321bd78182","Type":"ContainerStarted","Data":"c609edd35cecd032d5232290c49659b76f6e9515fddcf5e22520240fd7965e21"} Dec 02 22:58:21 crc kubenswrapper[4696]: I1202 22:58:21.121342 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-m472w" event={"ID":"9ab3fbd9-1610-4d0c-aa5e-2c298e5dcc3e","Type":"ContainerStarted","Data":"36a3d81c737db664d4d186cc030376b1a7a05c916c921fbf22ead7b85ba432b1"} Dec 02 22:58:21 crc kubenswrapper[4696]: I1202 22:58:21.125688 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-58fzc" event={"ID":"45f0f590-24f6-4f01-98a0-a41508a59f5a","Type":"ContainerStarted","Data":"55e9fb306228ca4d071927cedb4b7a617f2591f666e244823c2bca714807b02d"} Dec 02 22:58:21 crc kubenswrapper[4696]: I1202 22:58:21.148370 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-b2wbs" event={"ID":"e53eb416-2701-4080-b0a3-bbeae35013a4","Type":"ContainerStarted","Data":"01eefcb96f132d57ec318962854af7e8ef6ea363da61bfe577216007defdde7d"} Dec 02 22:58:21 crc kubenswrapper[4696]: I1202 22:58:21.155118 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/placement-operator-controller-manager-78f8948974-g5pxw" event={"ID":"3d6a00c3-b537-414a-8ba4-2797d7bc88f8","Type":"ContainerStarted","Data":"37419b91bf936ac1746c4ea75088ea469ca0a09244945cc6666b2dc1c340a7bc"} Dec 02 22:58:21 crc kubenswrapper[4696]: I1202 22:58:21.173255 4696 scope.go:117] "RemoveContainer" containerID="94060245f3175ef5b2997ee75ce24bfbcb51c31b2b81435b7b5a0428422771ce" Dec 02 22:58:21 crc kubenswrapper[4696]: E1202 22:58:21.179067 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:2a3d21728a8bfb4e64617e63e61e2d1cb70a383ea3e8f846e0c3c3c02d2b0a9d\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-m472w" podUID="9ab3fbd9-1610-4d0c-aa5e-2c298e5dcc3e" Dec 02 22:58:21 crc kubenswrapper[4696]: E1202 22:58:21.179137 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-b2wbs" podUID="e53eb416-2701-4080-b0a3-bbeae35013a4" Dec 02 22:58:21 crc kubenswrapper[4696]: E1202 22:58:21.179137 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.45:5001/openstack-k8s-operators/watcher-operator:0e562967e0a192baf562500f49cef0abd8c6f6ec\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/watcher-operator-controller-manager-d4477bdf4-lxz2l" podUID="d417ecee-aebb-4154-ac0c-2c321bd78182" Dec 02 22:58:21 crc kubenswrapper[4696]: E1202 22:58:21.180378 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:7d66757c0af67104f0389e851a7cc0daa44443ad202d157417bd86bbb57cc385\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-58fzc" podUID="45f0f590-24f6-4f01-98a0-a41508a59f5a" Dec 02 22:58:21 crc kubenswrapper[4696]: I1202 22:58:21.210071 4696 scope.go:117] "RemoveContainer" containerID="d04003f8c88dbd05e8d79d7fb6d4314aa78e476d0c93764c68e7298fd1c56b35" Dec 02 22:58:21 crc kubenswrapper[4696]: I1202 22:58:21.264942 4696 scope.go:117] "RemoveContainer" containerID="79217a48dfaa51d854e0d1f6286c16bbe09b4f4c9b0af8a8401cc082d3e22d6a" Dec 02 22:58:21 crc kubenswrapper[4696]: E1202 22:58:21.266162 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79217a48dfaa51d854e0d1f6286c16bbe09b4f4c9b0af8a8401cc082d3e22d6a\": container with ID starting with 79217a48dfaa51d854e0d1f6286c16bbe09b4f4c9b0af8a8401cc082d3e22d6a not found: ID does not exist" containerID="79217a48dfaa51d854e0d1f6286c16bbe09b4f4c9b0af8a8401cc082d3e22d6a" Dec 02 22:58:21 crc kubenswrapper[4696]: I1202 22:58:21.266201 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79217a48dfaa51d854e0d1f6286c16bbe09b4f4c9b0af8a8401cc082d3e22d6a"} err="failed to get container status \"79217a48dfaa51d854e0d1f6286c16bbe09b4f4c9b0af8a8401cc082d3e22d6a\": 
rpc error: code = NotFound desc = could not find container \"79217a48dfaa51d854e0d1f6286c16bbe09b4f4c9b0af8a8401cc082d3e22d6a\": container with ID starting with 79217a48dfaa51d854e0d1f6286c16bbe09b4f4c9b0af8a8401cc082d3e22d6a not found: ID does not exist" Dec 02 22:58:21 crc kubenswrapper[4696]: I1202 22:58:21.266224 4696 scope.go:117] "RemoveContainer" containerID="94060245f3175ef5b2997ee75ce24bfbcb51c31b2b81435b7b5a0428422771ce" Dec 02 22:58:21 crc kubenswrapper[4696]: E1202 22:58:21.266728 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94060245f3175ef5b2997ee75ce24bfbcb51c31b2b81435b7b5a0428422771ce\": container with ID starting with 94060245f3175ef5b2997ee75ce24bfbcb51c31b2b81435b7b5a0428422771ce not found: ID does not exist" containerID="94060245f3175ef5b2997ee75ce24bfbcb51c31b2b81435b7b5a0428422771ce" Dec 02 22:58:21 crc kubenswrapper[4696]: I1202 22:58:21.266788 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94060245f3175ef5b2997ee75ce24bfbcb51c31b2b81435b7b5a0428422771ce"} err="failed to get container status \"94060245f3175ef5b2997ee75ce24bfbcb51c31b2b81435b7b5a0428422771ce\": rpc error: code = NotFound desc = could not find container \"94060245f3175ef5b2997ee75ce24bfbcb51c31b2b81435b7b5a0428422771ce\": container with ID starting with 94060245f3175ef5b2997ee75ce24bfbcb51c31b2b81435b7b5a0428422771ce not found: ID does not exist" Dec 02 22:58:21 crc kubenswrapper[4696]: I1202 22:58:21.266822 4696 scope.go:117] "RemoveContainer" containerID="d04003f8c88dbd05e8d79d7fb6d4314aa78e476d0c93764c68e7298fd1c56b35" Dec 02 22:58:21 crc kubenswrapper[4696]: E1202 22:58:21.267360 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d04003f8c88dbd05e8d79d7fb6d4314aa78e476d0c93764c68e7298fd1c56b35\": container with ID starting with 
d04003f8c88dbd05e8d79d7fb6d4314aa78e476d0c93764c68e7298fd1c56b35 not found: ID does not exist" containerID="d04003f8c88dbd05e8d79d7fb6d4314aa78e476d0c93764c68e7298fd1c56b35" Dec 02 22:58:21 crc kubenswrapper[4696]: I1202 22:58:21.267379 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d04003f8c88dbd05e8d79d7fb6d4314aa78e476d0c93764c68e7298fd1c56b35"} err="failed to get container status \"d04003f8c88dbd05e8d79d7fb6d4314aa78e476d0c93764c68e7298fd1c56b35\": rpc error: code = NotFound desc = could not find container \"d04003f8c88dbd05e8d79d7fb6d4314aa78e476d0c93764c68e7298fd1c56b35\": container with ID starting with d04003f8c88dbd05e8d79d7fb6d4314aa78e476d0c93764c68e7298fd1c56b35 not found: ID does not exist" Dec 02 22:58:21 crc kubenswrapper[4696]: I1202 22:58:21.445916 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="526e39ba-a8d3-4ac4-a90e-9867bc481a59" path="/var/lib/kubelet/pods/526e39ba-a8d3-4ac4-a90e-9867bc481a59/volumes" Dec 02 22:58:21 crc kubenswrapper[4696]: I1202 22:58:21.821857 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/66d51ef3-89ba-4653-ae46-5469bfc5232e-cert\") pod \"infra-operator-controller-manager-57548d458d-kz6bs\" (UID: \"66d51ef3-89ba-4653-ae46-5469bfc5232e\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-kz6bs" Dec 02 22:58:21 crc kubenswrapper[4696]: E1202 22:58:21.822281 4696 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 02 22:58:21 crc kubenswrapper[4696]: E1202 22:58:21.822337 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/66d51ef3-89ba-4653-ae46-5469bfc5232e-cert podName:66d51ef3-89ba-4653-ae46-5469bfc5232e nodeName:}" failed. 
No retries permitted until 2025-12-02 22:58:25.822321531 +0000 UTC m=+968.703001532 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/66d51ef3-89ba-4653-ae46-5469bfc5232e-cert") pod "infra-operator-controller-manager-57548d458d-kz6bs" (UID: "66d51ef3-89ba-4653-ae46-5469bfc5232e") : secret "infra-operator-webhook-server-cert" not found Dec 02 22:58:22 crc kubenswrapper[4696]: I1202 22:58:22.131099 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6e335d65-9d0f-4ace-97cc-70a4a2bb2291-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4jnwwg\" (UID: \"6e335d65-9d0f-4ace-97cc-70a4a2bb2291\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4jnwwg" Dec 02 22:58:22 crc kubenswrapper[4696]: E1202 22:58:22.131296 4696 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 02 22:58:22 crc kubenswrapper[4696]: E1202 22:58:22.131394 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6e335d65-9d0f-4ace-97cc-70a4a2bb2291-cert podName:6e335d65-9d0f-4ace-97cc-70a4a2bb2291 nodeName:}" failed. No retries permitted until 2025-12-02 22:58:26.13137281 +0000 UTC m=+969.012052811 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6e335d65-9d0f-4ace-97cc-70a4a2bb2291-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4jnwwg" (UID: "6e335d65-9d0f-4ace-97cc-70a4a2bb2291") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 02 22:58:22 crc kubenswrapper[4696]: E1202 22:58:22.179369 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-b2wbs" podUID="e53eb416-2701-4080-b0a3-bbeae35013a4" Dec 02 22:58:22 crc kubenswrapper[4696]: E1202 22:58:22.180210 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/test-operator-controller-manager-5854674fcc-jfcsj" podUID="796c18e3-0c33-4393-aba8-2ad03aad4b93" Dec 02 22:58:22 crc kubenswrapper[4696]: E1202 22:58:22.180321 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:0b3fb69f35c151895d3dffd514974a9f9fe1c77c3bca69b78b81efb183cf4557\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" 
pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-d6qww" podUID="1beb3e53-4faf-475f-b5b0-57b8cd32c529" Dec 02 22:58:22 crc kubenswrapper[4696]: E1202 22:58:22.180368 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:2a3d21728a8bfb4e64617e63e61e2d1cb70a383ea3e8f846e0c3c3c02d2b0a9d\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-m472w" podUID="9ab3fbd9-1610-4d0c-aa5e-2c298e5dcc3e" Dec 02 22:58:22 crc kubenswrapper[4696]: E1202 22:58:22.180391 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.45:5001/openstack-k8s-operators/watcher-operator:0e562967e0a192baf562500f49cef0abd8c6f6ec\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/watcher-operator-controller-manager-d4477bdf4-lxz2l" podUID="d417ecee-aebb-4154-ac0c-2c321bd78182" Dec 02 22:58:22 crc kubenswrapper[4696]: E1202 22:58:22.180676 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:7d66757c0af67104f0389e851a7cc0daa44443ad202d157417bd86bbb57cc385\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-58fzc" 
podUID="45f0f590-24f6-4f01-98a0-a41508a59f5a" Dec 02 22:58:22 crc kubenswrapper[4696]: E1202 22:58:22.181115 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-hfcc6" podUID="9c1744f3-fc58-4653-a7e0-4fcdfdfca485" Dec 02 22:58:22 crc kubenswrapper[4696]: I1202 22:58:22.537676 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0508daaa-b26a-4f05-9abc-f63ac69fd1d5-metrics-certs\") pod \"openstack-operator-controller-manager-744c6b777f-bjtk5\" (UID: \"0508daaa-b26a-4f05-9abc-f63ac69fd1d5\") " pod="openstack-operators/openstack-operator-controller-manager-744c6b777f-bjtk5" Dec 02 22:58:22 crc kubenswrapper[4696]: I1202 22:58:22.537776 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0508daaa-b26a-4f05-9abc-f63ac69fd1d5-webhook-certs\") pod \"openstack-operator-controller-manager-744c6b777f-bjtk5\" (UID: \"0508daaa-b26a-4f05-9abc-f63ac69fd1d5\") " pod="openstack-operators/openstack-operator-controller-manager-744c6b777f-bjtk5" Dec 02 22:58:22 crc kubenswrapper[4696]: E1202 22:58:22.537867 4696 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 02 22:58:22 crc kubenswrapper[4696]: E1202 22:58:22.537932 4696 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 02 22:58:22 crc kubenswrapper[4696]: E1202 
22:58:22.537964 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0508daaa-b26a-4f05-9abc-f63ac69fd1d5-metrics-certs podName:0508daaa-b26a-4f05-9abc-f63ac69fd1d5 nodeName:}" failed. No retries permitted until 2025-12-02 22:58:26.537944979 +0000 UTC m=+969.418624980 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0508daaa-b26a-4f05-9abc-f63ac69fd1d5-metrics-certs") pod "openstack-operator-controller-manager-744c6b777f-bjtk5" (UID: "0508daaa-b26a-4f05-9abc-f63ac69fd1d5") : secret "metrics-server-cert" not found Dec 02 22:58:22 crc kubenswrapper[4696]: E1202 22:58:22.537993 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0508daaa-b26a-4f05-9abc-f63ac69fd1d5-webhook-certs podName:0508daaa-b26a-4f05-9abc-f63ac69fd1d5 nodeName:}" failed. No retries permitted until 2025-12-02 22:58:26.53797494 +0000 UTC m=+969.418654941 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/0508daaa-b26a-4f05-9abc-f63ac69fd1d5-webhook-certs") pod "openstack-operator-controller-manager-744c6b777f-bjtk5" (UID: "0508daaa-b26a-4f05-9abc-f63ac69fd1d5") : secret "webhook-server-cert" not found Dec 02 22:58:25 crc kubenswrapper[4696]: I1202 22:58:25.903896 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/66d51ef3-89ba-4653-ae46-5469bfc5232e-cert\") pod \"infra-operator-controller-manager-57548d458d-kz6bs\" (UID: \"66d51ef3-89ba-4653-ae46-5469bfc5232e\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-kz6bs" Dec 02 22:58:25 crc kubenswrapper[4696]: E1202 22:58:25.904690 4696 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 02 22:58:25 crc kubenswrapper[4696]: E1202 22:58:25.906566 4696 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/66d51ef3-89ba-4653-ae46-5469bfc5232e-cert podName:66d51ef3-89ba-4653-ae46-5469bfc5232e nodeName:}" failed. No retries permitted until 2025-12-02 22:58:33.906532166 +0000 UTC m=+976.787212177 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/66d51ef3-89ba-4653-ae46-5469bfc5232e-cert") pod "infra-operator-controller-manager-57548d458d-kz6bs" (UID: "66d51ef3-89ba-4653-ae46-5469bfc5232e") : secret "infra-operator-webhook-server-cert" not found Dec 02 22:58:26 crc kubenswrapper[4696]: I1202 22:58:26.212729 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6e335d65-9d0f-4ace-97cc-70a4a2bb2291-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4jnwwg\" (UID: \"6e335d65-9d0f-4ace-97cc-70a4a2bb2291\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4jnwwg" Dec 02 22:58:26 crc kubenswrapper[4696]: E1202 22:58:26.212970 4696 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 02 22:58:26 crc kubenswrapper[4696]: E1202 22:58:26.213120 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6e335d65-9d0f-4ace-97cc-70a4a2bb2291-cert podName:6e335d65-9d0f-4ace-97cc-70a4a2bb2291 nodeName:}" failed. No retries permitted until 2025-12-02 22:58:34.213084173 +0000 UTC m=+977.093764204 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6e335d65-9d0f-4ace-97cc-70a4a2bb2291-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4jnwwg" (UID: "6e335d65-9d0f-4ace-97cc-70a4a2bb2291") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 02 22:58:26 crc kubenswrapper[4696]: I1202 22:58:26.619712 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0508daaa-b26a-4f05-9abc-f63ac69fd1d5-metrics-certs\") pod \"openstack-operator-controller-manager-744c6b777f-bjtk5\" (UID: \"0508daaa-b26a-4f05-9abc-f63ac69fd1d5\") " pod="openstack-operators/openstack-operator-controller-manager-744c6b777f-bjtk5" Dec 02 22:58:26 crc kubenswrapper[4696]: I1202 22:58:26.619799 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0508daaa-b26a-4f05-9abc-f63ac69fd1d5-webhook-certs\") pod \"openstack-operator-controller-manager-744c6b777f-bjtk5\" (UID: \"0508daaa-b26a-4f05-9abc-f63ac69fd1d5\") " pod="openstack-operators/openstack-operator-controller-manager-744c6b777f-bjtk5" Dec 02 22:58:26 crc kubenswrapper[4696]: E1202 22:58:26.619953 4696 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 02 22:58:26 crc kubenswrapper[4696]: E1202 22:58:26.620107 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0508daaa-b26a-4f05-9abc-f63ac69fd1d5-metrics-certs podName:0508daaa-b26a-4f05-9abc-f63ac69fd1d5 nodeName:}" failed. No retries permitted until 2025-12-02 22:58:34.620073723 +0000 UTC m=+977.500753764 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0508daaa-b26a-4f05-9abc-f63ac69fd1d5-metrics-certs") pod "openstack-operator-controller-manager-744c6b777f-bjtk5" (UID: "0508daaa-b26a-4f05-9abc-f63ac69fd1d5") : secret "metrics-server-cert" not found Dec 02 22:58:26 crc kubenswrapper[4696]: E1202 22:58:26.620120 4696 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 02 22:58:26 crc kubenswrapper[4696]: E1202 22:58:26.620259 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0508daaa-b26a-4f05-9abc-f63ac69fd1d5-webhook-certs podName:0508daaa-b26a-4f05-9abc-f63ac69fd1d5 nodeName:}" failed. No retries permitted until 2025-12-02 22:58:34.620222058 +0000 UTC m=+977.500902099 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/0508daaa-b26a-4f05-9abc-f63ac69fd1d5-webhook-certs") pod "openstack-operator-controller-manager-744c6b777f-bjtk5" (UID: "0508daaa-b26a-4f05-9abc-f63ac69fd1d5") : secret "webhook-server-cert" not found Dec 02 22:58:32 crc kubenswrapper[4696]: I1202 22:58:32.133076 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-cx7qm"] Dec 02 22:58:32 crc kubenswrapper[4696]: E1202 22:58:32.134700 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="526e39ba-a8d3-4ac4-a90e-9867bc481a59" containerName="extract-utilities" Dec 02 22:58:32 crc kubenswrapper[4696]: I1202 22:58:32.134719 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="526e39ba-a8d3-4ac4-a90e-9867bc481a59" containerName="extract-utilities" Dec 02 22:58:32 crc kubenswrapper[4696]: E1202 22:58:32.134730 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="526e39ba-a8d3-4ac4-a90e-9867bc481a59" containerName="registry-server" Dec 02 22:58:32 crc kubenswrapper[4696]: I1202 22:58:32.134753 4696 
state_mem.go:107] "Deleted CPUSet assignment" podUID="526e39ba-a8d3-4ac4-a90e-9867bc481a59" containerName="registry-server" Dec 02 22:58:32 crc kubenswrapper[4696]: E1202 22:58:32.134770 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="526e39ba-a8d3-4ac4-a90e-9867bc481a59" containerName="extract-content" Dec 02 22:58:32 crc kubenswrapper[4696]: I1202 22:58:32.134777 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="526e39ba-a8d3-4ac4-a90e-9867bc481a59" containerName="extract-content" Dec 02 22:58:32 crc kubenswrapper[4696]: I1202 22:58:32.134937 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="526e39ba-a8d3-4ac4-a90e-9867bc481a59" containerName="registry-server" Dec 02 22:58:32 crc kubenswrapper[4696]: I1202 22:58:32.136034 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cx7qm" Dec 02 22:58:32 crc kubenswrapper[4696]: I1202 22:58:32.158684 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cx7qm"] Dec 02 22:58:32 crc kubenswrapper[4696]: I1202 22:58:32.221176 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6x47w\" (UniqueName: \"kubernetes.io/projected/8f0ab939-d26c-42ca-972a-92e1ddbe4a60-kube-api-access-6x47w\") pod \"redhat-marketplace-cx7qm\" (UID: \"8f0ab939-d26c-42ca-972a-92e1ddbe4a60\") " pod="openshift-marketplace/redhat-marketplace-cx7qm" Dec 02 22:58:32 crc kubenswrapper[4696]: I1202 22:58:32.221586 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f0ab939-d26c-42ca-972a-92e1ddbe4a60-catalog-content\") pod \"redhat-marketplace-cx7qm\" (UID: \"8f0ab939-d26c-42ca-972a-92e1ddbe4a60\") " pod="openshift-marketplace/redhat-marketplace-cx7qm" Dec 02 22:58:32 crc kubenswrapper[4696]: I1202 22:58:32.221727 4696 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f0ab939-d26c-42ca-972a-92e1ddbe4a60-utilities\") pod \"redhat-marketplace-cx7qm\" (UID: \"8f0ab939-d26c-42ca-972a-92e1ddbe4a60\") " pod="openshift-marketplace/redhat-marketplace-cx7qm" Dec 02 22:58:32 crc kubenswrapper[4696]: I1202 22:58:32.323447 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f0ab939-d26c-42ca-972a-92e1ddbe4a60-utilities\") pod \"redhat-marketplace-cx7qm\" (UID: \"8f0ab939-d26c-42ca-972a-92e1ddbe4a60\") " pod="openshift-marketplace/redhat-marketplace-cx7qm" Dec 02 22:58:32 crc kubenswrapper[4696]: I1202 22:58:32.323978 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6x47w\" (UniqueName: \"kubernetes.io/projected/8f0ab939-d26c-42ca-972a-92e1ddbe4a60-kube-api-access-6x47w\") pod \"redhat-marketplace-cx7qm\" (UID: \"8f0ab939-d26c-42ca-972a-92e1ddbe4a60\") " pod="openshift-marketplace/redhat-marketplace-cx7qm" Dec 02 22:58:32 crc kubenswrapper[4696]: I1202 22:58:32.324510 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f0ab939-d26c-42ca-972a-92e1ddbe4a60-catalog-content\") pod \"redhat-marketplace-cx7qm\" (UID: \"8f0ab939-d26c-42ca-972a-92e1ddbe4a60\") " pod="openshift-marketplace/redhat-marketplace-cx7qm" Dec 02 22:58:32 crc kubenswrapper[4696]: I1202 22:58:32.325107 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f0ab939-d26c-42ca-972a-92e1ddbe4a60-catalog-content\") pod \"redhat-marketplace-cx7qm\" (UID: \"8f0ab939-d26c-42ca-972a-92e1ddbe4a60\") " pod="openshift-marketplace/redhat-marketplace-cx7qm" Dec 02 22:58:32 crc kubenswrapper[4696]: I1202 22:58:32.324185 4696 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f0ab939-d26c-42ca-972a-92e1ddbe4a60-utilities\") pod \"redhat-marketplace-cx7qm\" (UID: \"8f0ab939-d26c-42ca-972a-92e1ddbe4a60\") " pod="openshift-marketplace/redhat-marketplace-cx7qm" Dec 02 22:58:32 crc kubenswrapper[4696]: I1202 22:58:32.347210 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6x47w\" (UniqueName: \"kubernetes.io/projected/8f0ab939-d26c-42ca-972a-92e1ddbe4a60-kube-api-access-6x47w\") pod \"redhat-marketplace-cx7qm\" (UID: \"8f0ab939-d26c-42ca-972a-92e1ddbe4a60\") " pod="openshift-marketplace/redhat-marketplace-cx7qm" Dec 02 22:58:32 crc kubenswrapper[4696]: I1202 22:58:32.461688 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cx7qm" Dec 02 22:58:33 crc kubenswrapper[4696]: I1202 22:58:33.433689 4696 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 02 22:58:33 crc kubenswrapper[4696]: I1202 22:58:33.950195 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/66d51ef3-89ba-4653-ae46-5469bfc5232e-cert\") pod \"infra-operator-controller-manager-57548d458d-kz6bs\" (UID: \"66d51ef3-89ba-4653-ae46-5469bfc5232e\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-kz6bs" Dec 02 22:58:33 crc kubenswrapper[4696]: I1202 22:58:33.959012 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/66d51ef3-89ba-4653-ae46-5469bfc5232e-cert\") pod \"infra-operator-controller-manager-57548d458d-kz6bs\" (UID: \"66d51ef3-89ba-4653-ae46-5469bfc5232e\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-kz6bs" Dec 02 22:58:34 crc kubenswrapper[4696]: I1202 22:58:34.055205 4696 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-57548d458d-kz6bs" Dec 02 22:58:34 crc kubenswrapper[4696]: I1202 22:58:34.255048 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6e335d65-9d0f-4ace-97cc-70a4a2bb2291-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4jnwwg\" (UID: \"6e335d65-9d0f-4ace-97cc-70a4a2bb2291\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4jnwwg" Dec 02 22:58:34 crc kubenswrapper[4696]: I1202 22:58:34.260160 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6e335d65-9d0f-4ace-97cc-70a4a2bb2291-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4jnwwg\" (UID: \"6e335d65-9d0f-4ace-97cc-70a4a2bb2291\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4jnwwg" Dec 02 22:58:34 crc kubenswrapper[4696]: I1202 22:58:34.542434 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4jnwwg" Dec 02 22:58:34 crc kubenswrapper[4696]: I1202 22:58:34.661990 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0508daaa-b26a-4f05-9abc-f63ac69fd1d5-metrics-certs\") pod \"openstack-operator-controller-manager-744c6b777f-bjtk5\" (UID: \"0508daaa-b26a-4f05-9abc-f63ac69fd1d5\") " pod="openstack-operators/openstack-operator-controller-manager-744c6b777f-bjtk5" Dec 02 22:58:34 crc kubenswrapper[4696]: I1202 22:58:34.662063 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0508daaa-b26a-4f05-9abc-f63ac69fd1d5-webhook-certs\") pod \"openstack-operator-controller-manager-744c6b777f-bjtk5\" (UID: \"0508daaa-b26a-4f05-9abc-f63ac69fd1d5\") " pod="openstack-operators/openstack-operator-controller-manager-744c6b777f-bjtk5" Dec 02 22:58:34 crc kubenswrapper[4696]: E1202 22:58:34.662311 4696 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 02 22:58:34 crc kubenswrapper[4696]: E1202 22:58:34.662406 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0508daaa-b26a-4f05-9abc-f63ac69fd1d5-webhook-certs podName:0508daaa-b26a-4f05-9abc-f63ac69fd1d5 nodeName:}" failed. No retries permitted until 2025-12-02 22:58:50.662379772 +0000 UTC m=+993.543059813 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/0508daaa-b26a-4f05-9abc-f63ac69fd1d5-webhook-certs") pod "openstack-operator-controller-manager-744c6b777f-bjtk5" (UID: "0508daaa-b26a-4f05-9abc-f63ac69fd1d5") : secret "webhook-server-cert" not found Dec 02 22:58:34 crc kubenswrapper[4696]: I1202 22:58:34.667888 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0508daaa-b26a-4f05-9abc-f63ac69fd1d5-metrics-certs\") pod \"openstack-operator-controller-manager-744c6b777f-bjtk5\" (UID: \"0508daaa-b26a-4f05-9abc-f63ac69fd1d5\") " pod="openstack-operators/openstack-operator-controller-manager-744c6b777f-bjtk5" Dec 02 22:58:35 crc kubenswrapper[4696]: E1202 22:58:35.858384 4696 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/mariadb-operator@sha256:600ca007e493d3af0fcc2ebac92e8da5efd2afe812b62d7d3d4dd0115bdf05d7" Dec 02 22:58:35 crc kubenswrapper[4696]: E1202 22:58:35.859352 4696 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:600ca007e493d3af0fcc2ebac92e8da5efd2afe812b62d7d3d4dd0115bdf05d7,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-q2cqd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-56bbcc9d85-tp7td_openstack-operators(baad852a-374a-460e-9d5c-cb5418291849): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 22:58:36 crc kubenswrapper[4696]: E1202 22:58:36.721159 4696 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/openstack-k8s-operators/heat-operator@sha256:c4abfc148600dfa85915f3dc911d988ea2335f26cb6b8d749fe79bfe53e5e429" Dec 02 22:58:36 crc kubenswrapper[4696]: E1202 22:58:36.721408 4696 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/heat-operator@sha256:c4abfc148600dfa85915f3dc911d988ea2335f26cb6b8d749fe79bfe53e5e429,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2dglp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-operator-controller-manager-5f64f6f8bb-s9pk4_openstack-operators(5315d589-3bb7-4776-b842-ffc18e1a89e1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 22:58:42 crc kubenswrapper[4696]: E1202 22:58:42.005778 4696 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/designate-operator@sha256:9f68d7bc8c6bce38f46dee8a8272d5365c49fe7b32b2af52e8ac884e212f3a85" Dec 02 22:58:42 crc kubenswrapper[4696]: E1202 22:58:42.006963 4696 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/designate-operator@sha256:9f68d7bc8c6bce38f46dee8a8272d5365c49fe7b32b2af52e8ac884e212f3a85,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-52ktm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod designate-operator-controller-manager-78b4bc895b-tzgnf_openstack-operators(5706d5c2-8bbe-40b3-8820-0d547363fa96): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 22:58:42 crc kubenswrapper[4696]: E1202 22:58:42.940254 4696 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ironic-operator@sha256:0f523b7e2fa9e86fef986acf07d0c42d5658c475d565f11eaea926ebffcb6530" Dec 02 22:58:42 crc kubenswrapper[4696]: E1202 22:58:42.940616 4696 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ironic-operator@sha256:0f523b7e2fa9e86fef986acf07d0c42d5658c475d565f11eaea926ebffcb6530,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mfh8h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ironic-operator-controller-manager-6c548fd776-4crn9_openstack-operators(bbbbf87d-fa5c-4e9b-9bbc-b19ed043e094): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 22:58:43 crc kubenswrapper[4696]: E1202 22:58:43.808066 4696 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/cinder-operator@sha256:1d60701214b39cdb0fa70bbe5710f9b131139a9f4b482c2db4058a04daefb801" Dec 02 22:58:43 crc kubenswrapper[4696]: E1202 22:58:43.808338 4696 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/cinder-operator@sha256:1d60701214b39cdb0fa70bbe5710f9b131139a9f4b482c2db4058a04daefb801,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-z5j2b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-operator-controller-manager-859b6ccc6-rjlxh_openstack-operators(755f9574-a31b-430c-a2a2-92554020d96b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 22:58:44 crc kubenswrapper[4696]: E1202 22:58:44.544048 4696 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/barbican-operator@sha256:f6059a0fbf031d34dcf086d14ce8c0546caeaee23c5780e90b5037c5feee9fea" Dec 02 22:58:44 crc kubenswrapper[4696]: E1202 22:58:44.544336 4696 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/barbican-operator@sha256:f6059a0fbf031d34dcf086d14ce8c0546caeaee23c5780e90b5037c5feee9fea,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-llbzb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-operator-controller-manager-7d9dfd778-xxm5w_openstack-operators(351a13fb-8e8e-4393-adef-28523ab05ccb): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 22:58:45 crc kubenswrapper[4696]: E1202 22:58:45.473327 4696 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/horizon-operator@sha256:9e847f4dbdea19ab997f32a02b3680a9bd966f9c705911645c3866a19fda9ea5" Dec 02 22:58:45 crc kubenswrapper[4696]: E1202 22:58:45.474017 4696 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/horizon-operator@sha256:9e847f4dbdea19ab997f32a02b3680a9bd966f9c705911645c3866a19fda9ea5,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-cnrd6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-operator-controller-manager-68c6d99b8f-tng2n_openstack-operators(7d7b7caa-1ec3-4e66-9273-36cae02cbe8e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 22:58:50 crc kubenswrapper[4696]: I1202 22:58:50.664137 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0508daaa-b26a-4f05-9abc-f63ac69fd1d5-webhook-certs\") pod \"openstack-operator-controller-manager-744c6b777f-bjtk5\" (UID: \"0508daaa-b26a-4f05-9abc-f63ac69fd1d5\") " pod="openstack-operators/openstack-operator-controller-manager-744c6b777f-bjtk5" Dec 02 22:58:50 crc kubenswrapper[4696]: I1202 22:58:50.677551 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0508daaa-b26a-4f05-9abc-f63ac69fd1d5-webhook-certs\") pod \"openstack-operator-controller-manager-744c6b777f-bjtk5\" (UID: \"0508daaa-b26a-4f05-9abc-f63ac69fd1d5\") " pod="openstack-operators/openstack-operator-controller-manager-744c6b777f-bjtk5" Dec 02 22:58:50 crc kubenswrapper[4696]: I1202 22:58:50.912201 4696 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-744c6b777f-bjtk5" Dec 02 22:58:52 crc kubenswrapper[4696]: I1202 22:58:52.490994 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cx7qm"] Dec 02 22:58:52 crc kubenswrapper[4696]: I1202 22:58:52.636002 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-kz6bs"] Dec 02 22:58:52 crc kubenswrapper[4696]: I1202 22:58:52.769317 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4jnwwg"] Dec 02 22:58:52 crc kubenswrapper[4696]: I1202 22:58:52.973540 4696 patch_prober.go:28] interesting pod/machine-config-daemon-chq65 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 22:58:52 crc kubenswrapper[4696]: I1202 22:58:52.973628 4696 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 22:58:53 crc kubenswrapper[4696]: W1202 22:58:53.001484 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e335d65_9d0f_4ace_97cc_70a4a2bb2291.slice/crio-3e229c7725f2052a75795351cdb67be893d81eeef6513544fff6afd061286bff WatchSource:0}: Error finding container 3e229c7725f2052a75795351cdb67be893d81eeef6513544fff6afd061286bff: Status 404 returned error can't find the container with id 3e229c7725f2052a75795351cdb67be893d81eeef6513544fff6afd061286bff 
Dec 02 22:58:53 crc kubenswrapper[4696]: I1202 22:58:53.282327 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-744c6b777f-bjtk5"] Dec 02 22:58:53 crc kubenswrapper[4696]: I1202 22:58:53.460828 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-kz6bs" event={"ID":"66d51ef3-89ba-4653-ae46-5469bfc5232e","Type":"ContainerStarted","Data":"3ef44246e6b07c4a78970cd9a8dcb0a73f4195b4955321785db42566f4f2ce4f"} Dec 02 22:58:53 crc kubenswrapper[4696]: I1202 22:58:53.463437 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-g5pxw" event={"ID":"3d6a00c3-b537-414a-8ba4-2797d7bc88f8","Type":"ContainerStarted","Data":"57060a84b4901d69149c420e9cba55f7f447055060d2211ebe64b1e5413d1370"} Dec 02 22:58:53 crc kubenswrapper[4696]: I1202 22:58:53.467862 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-bpk25" event={"ID":"060c8046-7775-413d-9797-ef0edcee01dd","Type":"ContainerStarted","Data":"18141b20924d60caa6e23495d853b26050d324c093116f11b732b41da4523b8f"} Dec 02 22:58:53 crc kubenswrapper[4696]: I1202 22:58:53.469522 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cx7qm" event={"ID":"8f0ab939-d26c-42ca-972a-92e1ddbe4a60","Type":"ContainerStarted","Data":"8d29cca22c436672ad65f0670eaad5c78de410e5fcb967059cb431c4f2141574"} Dec 02 22:58:53 crc kubenswrapper[4696]: I1202 22:58:53.472144 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-nvqf8" event={"ID":"e693a226-52c3-413c-b607-c0050ab5e553","Type":"ContainerStarted","Data":"eca57bc5fd12e03fd531ffd283bbb32f42e349241083e544873fd59b7c261b10"} Dec 02 22:58:53 crc kubenswrapper[4696]: I1202 22:58:53.480603 4696 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-pwblb" event={"ID":"77131fa7-a611-46bf-b0fe-d05d909dfd4c","Type":"ContainerStarted","Data":"c4b3ca8530945e03dcb86dc063d660714d7dec982cc86053a68c2b0d8fbe7043"} Dec 02 22:58:53 crc kubenswrapper[4696]: I1202 22:58:53.484156 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sp4gt" event={"ID":"3dcedd0c-cb54-4bc5-8c69-9506424a0b91","Type":"ContainerStarted","Data":"953f44a62f6475831e6ba96c9c674abf5a72e8e69d0c3564ead353e48a6930c8"} Dec 02 22:58:53 crc kubenswrapper[4696]: I1202 22:58:53.486997 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-744c6b777f-bjtk5" event={"ID":"0508daaa-b26a-4f05-9abc-f63ac69fd1d5","Type":"ContainerStarted","Data":"9a36b44ab04310e2765ae95375f60c889828d7050ad12da00f3b950f8e3d9ab2"} Dec 02 22:58:53 crc kubenswrapper[4696]: I1202 22:58:53.490507 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4jnwwg" event={"ID":"6e335d65-9d0f-4ace-97cc-70a4a2bb2291","Type":"ContainerStarted","Data":"3e229c7725f2052a75795351cdb67be893d81eeef6513544fff6afd061286bff"} Dec 02 22:58:53 crc kubenswrapper[4696]: I1202 22:58:53.492770 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-k2p9j" event={"ID":"5436ce3c-34d6-47eb-81b1-3b4dc1c2d794","Type":"ContainerStarted","Data":"7796195d549b4630d9160671ffe88c0edbc6426f054474cad63d85ba33213195"} Dec 02 22:58:53 crc kubenswrapper[4696]: I1202 22:58:53.520876 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-vg9kf" 
event={"ID":"9207b2f0-999a-45e4-8234-982f796f7801","Type":"ContainerStarted","Data":"deaa88fc2e96a798a55060f7f1d12b33a7fb4e74c053e838520d4e4dba1b1932"} Dec 02 22:58:54 crc kubenswrapper[4696]: I1202 22:58:54.539872 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-hfcc6" event={"ID":"9c1744f3-fc58-4653-a7e0-4fcdfdfca485","Type":"ContainerStarted","Data":"eb137adcc5679bfb5cd0b9c2ccd9621ca805d4001a6c2e18fa86386c12fa3ca3"} Dec 02 22:58:54 crc kubenswrapper[4696]: I1202 22:58:54.542860 4696 generic.go:334] "Generic (PLEG): container finished" podID="8f0ab939-d26c-42ca-972a-92e1ddbe4a60" containerID="ea0cfbed3b80ff71434f7b59a6587ee97f4a08cc209a4bc70f10dbe1b662c9d9" exitCode=0 Dec 02 22:58:54 crc kubenswrapper[4696]: I1202 22:58:54.543161 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cx7qm" event={"ID":"8f0ab939-d26c-42ca-972a-92e1ddbe4a60","Type":"ContainerDied","Data":"ea0cfbed3b80ff71434f7b59a6587ee97f4a08cc209a4bc70f10dbe1b662c9d9"} Dec 02 22:58:54 crc kubenswrapper[4696]: I1202 22:58:54.548143 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-d6qww" event={"ID":"1beb3e53-4faf-475f-b5b0-57b8cd32c529","Type":"ContainerStarted","Data":"45c216d5a13997c02d20fcadb3a40b88bd7cb845f893b87305e6324490218d48"} Dec 02 22:58:54 crc kubenswrapper[4696]: I1202 22:58:54.551374 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-jfcsj" event={"ID":"796c18e3-0c33-4393-aba8-2ad03aad4b93","Type":"ContainerStarted","Data":"500ca8b8b01e32254bb5bd9b92362775464f5166aca28f5e9358ce5b82262eb6"} Dec 02 22:58:54 crc kubenswrapper[4696]: I1202 22:58:54.559131 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-m472w" 
event={"ID":"9ab3fbd9-1610-4d0c-aa5e-2c298e5dcc3e","Type":"ContainerStarted","Data":"a5630550bd68869fa7bd681bf223ca2d68168074d8b46a983140d2b4132331e5"} Dec 02 22:58:54 crc kubenswrapper[4696]: I1202 22:58:54.561645 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-58fzc" event={"ID":"45f0f590-24f6-4f01-98a0-a41508a59f5a","Type":"ContainerStarted","Data":"d578953650eed666a03656073fe240c807f7a0d6bc3d31604b93d78c51042532"} Dec 02 22:58:54 crc kubenswrapper[4696]: I1202 22:58:54.587871 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-sp4gt" podStartSLOduration=12.958328777 podStartE2EDuration="39.587828812s" podCreationTimestamp="2025-12-02 22:58:15 +0000 UTC" firstStartedPulling="2025-12-02 22:58:18.88171996 +0000 UTC m=+961.762399961" lastFinishedPulling="2025-12-02 22:58:45.511220005 +0000 UTC m=+988.391899996" observedRunningTime="2025-12-02 22:58:53.548364097 +0000 UTC m=+996.429044098" watchObservedRunningTime="2025-12-02 22:58:54.587828812 +0000 UTC m=+997.468508813" Dec 02 22:58:56 crc kubenswrapper[4696]: I1202 22:58:56.340067 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-sp4gt" Dec 02 22:58:56 crc kubenswrapper[4696]: I1202 22:58:56.341051 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-sp4gt" Dec 02 22:58:56 crc kubenswrapper[4696]: I1202 22:58:56.445194 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-sp4gt" Dec 02 22:58:58 crc kubenswrapper[4696]: I1202 22:58:58.600110 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-b2wbs" 
event={"ID":"e53eb416-2701-4080-b0a3-bbeae35013a4","Type":"ContainerStarted","Data":"88bdbad269dbc066a23910328e3d11554fe2e12157acc302596fd44c46cb05e4"} Dec 02 22:58:58 crc kubenswrapper[4696]: I1202 22:58:58.607302 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-vg9kf" event={"ID":"9207b2f0-999a-45e4-8234-982f796f7801","Type":"ContainerStarted","Data":"f82e2cd1044db2d1fd7c27412aab5b5fabac51f309de9f562b8da7211beaf4c0"} Dec 02 22:58:58 crc kubenswrapper[4696]: I1202 22:58:58.608296 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-vg9kf" Dec 02 22:58:58 crc kubenswrapper[4696]: I1202 22:58:58.610773 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-kz6bs" event={"ID":"66d51ef3-89ba-4653-ae46-5469bfc5232e","Type":"ContainerStarted","Data":"31fb014e156e1e68e194559e4e02f12ad93cb2aeabcadc0eeb5952df9949937a"} Dec 02 22:58:58 crc kubenswrapper[4696]: I1202 22:58:58.612542 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-vg9kf" Dec 02 22:58:58 crc kubenswrapper[4696]: I1202 22:58:58.624059 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-b2wbs" podStartSLOduration=7.622697609 podStartE2EDuration="40.624034446s" podCreationTimestamp="2025-12-02 22:58:18 +0000 UTC" firstStartedPulling="2025-12-02 22:58:19.970117039 +0000 UTC m=+962.850797040" lastFinishedPulling="2025-12-02 22:58:52.971453876 +0000 UTC m=+995.852133877" observedRunningTime="2025-12-02 22:58:58.618995464 +0000 UTC m=+1001.499675455" watchObservedRunningTime="2025-12-02 22:58:58.624034446 +0000 UTC m=+1001.504714448" Dec 02 22:58:58 crc kubenswrapper[4696]: I1202 22:58:58.633490 
4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-pwblb" event={"ID":"77131fa7-a611-46bf-b0fe-d05d909dfd4c","Type":"ContainerStarted","Data":"f68270ebf91079399734e87f27558a20f96497c3523d941a3ffa023f376750d4"} Dec 02 22:58:58 crc kubenswrapper[4696]: I1202 22:58:58.634402 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-pwblb" Dec 02 22:58:58 crc kubenswrapper[4696]: I1202 22:58:58.638523 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-pwblb" Dec 02 22:58:58 crc kubenswrapper[4696]: I1202 22:58:58.666393 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-744c6b777f-bjtk5" event={"ID":"0508daaa-b26a-4f05-9abc-f63ac69fd1d5","Type":"ContainerStarted","Data":"206e446bdad66ea97dc20256c85ad5ecec70f6f5aae584bd2749d8af92bbceea"} Dec 02 22:58:58 crc kubenswrapper[4696]: I1202 22:58:58.666513 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-744c6b777f-bjtk5" Dec 02 22:58:58 crc kubenswrapper[4696]: I1202 22:58:58.674731 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-pwblb" podStartSLOduration=3.376864792 podStartE2EDuration="41.674718251s" podCreationTimestamp="2025-12-02 22:58:17 +0000 UTC" firstStartedPulling="2025-12-02 22:58:19.697288476 +0000 UTC m=+962.577968477" lastFinishedPulling="2025-12-02 22:58:57.995141935 +0000 UTC m=+1000.875821936" observedRunningTime="2025-12-02 22:58:58.670730738 +0000 UTC m=+1001.551410739" watchObservedRunningTime="2025-12-02 22:58:58.674718251 +0000 UTC m=+1001.555398242" Dec 02 22:58:58 crc kubenswrapper[4696]: I1202 22:58:58.676595 4696 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-d4477bdf4-lxz2l" event={"ID":"d417ecee-aebb-4154-ac0c-2c321bd78182","Type":"ContainerStarted","Data":"3d7e841a373f59cb02fcb5025a26284ab2f5b0d048376339ca06ce93b806448b"} Dec 02 22:58:58 crc kubenswrapper[4696]: I1202 22:58:58.676848 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-vg9kf" podStartSLOduration=3.349785276 podStartE2EDuration="41.676841111s" podCreationTimestamp="2025-12-02 22:58:17 +0000 UTC" firstStartedPulling="2025-12-02 22:58:19.71932774 +0000 UTC m=+962.600007741" lastFinishedPulling="2025-12-02 22:58:58.046383545 +0000 UTC m=+1000.927063576" observedRunningTime="2025-12-02 22:58:58.645909086 +0000 UTC m=+1001.526589087" watchObservedRunningTime="2025-12-02 22:58:58.676841111 +0000 UTC m=+1001.557521112" Dec 02 22:58:58 crc kubenswrapper[4696]: I1202 22:58:58.697728 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4jnwwg" event={"ID":"6e335d65-9d0f-4ace-97cc-70a4a2bb2291","Type":"ContainerStarted","Data":"3fd670a7acaf2631e63bff2eedb5095ebc1d96654fd16088b4b1f4cf94785ef4"} Dec 02 22:58:58 crc kubenswrapper[4696]: E1202 22:58:58.850346 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-rjlxh" podUID="755f9574-a31b-430c-a2a2-92554020d96b" Dec 02 22:58:58 crc kubenswrapper[4696]: E1202 22:58:58.902945 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-s9pk4" podUID="5315d589-3bb7-4776-b842-ffc18e1a89e1" Dec 02 22:58:58 crc kubenswrapper[4696]: E1202 22:58:58.990267 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-4crn9" podUID="bbbbf87d-fa5c-4e9b-9bbc-b19ed043e094" Dec 02 22:58:59 crc kubenswrapper[4696]: E1202 22:58:59.112350 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-tzgnf" podUID="5706d5c2-8bbe-40b3-8820-0d547363fa96" Dec 02 22:58:59 crc kubenswrapper[4696]: E1202 22:58:59.570190 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-tp7td" podUID="baad852a-374a-460e-9d5c-cb5418291849" Dec 02 22:58:59 crc kubenswrapper[4696]: I1202 22:58:59.706816 4696 generic.go:334] "Generic (PLEG): container finished" podID="8f0ab939-d26c-42ca-972a-92e1ddbe4a60" containerID="f77f2170046220130e6d402ed1d87f12c6aa354c2ca785ddc77e9af8bae27573" exitCode=0 Dec 02 22:58:59 crc kubenswrapper[4696]: I1202 22:58:59.706954 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cx7qm" event={"ID":"8f0ab939-d26c-42ca-972a-92e1ddbe4a60","Type":"ContainerDied","Data":"f77f2170046220130e6d402ed1d87f12c6aa354c2ca785ddc77e9af8bae27573"} Dec 02 22:58:59 crc kubenswrapper[4696]: I1202 22:58:59.712082 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/placement-operator-controller-manager-78f8948974-g5pxw" event={"ID":"3d6a00c3-b537-414a-8ba4-2797d7bc88f8","Type":"ContainerStarted","Data":"c360ff0fba5c50229c81d9d537c839f26ce9bb52cbd50ec1a7a27ece00a23a03"} Dec 02 22:58:59 crc kubenswrapper[4696]: I1202 22:58:59.712511 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-78f8948974-g5pxw" Dec 02 22:58:59 crc kubenswrapper[4696]: I1202 22:58:59.723873 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-d4477bdf4-lxz2l" event={"ID":"d417ecee-aebb-4154-ac0c-2c321bd78182","Type":"ContainerStarted","Data":"19a3cb349928af8ecad5151ac219acd83c460ae5d19621e5f0f9946f73e1828d"} Dec 02 22:58:59 crc kubenswrapper[4696]: I1202 22:58:59.724128 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-78f8948974-g5pxw" Dec 02 22:58:59 crc kubenswrapper[4696]: I1202 22:58:59.724258 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-d4477bdf4-lxz2l" Dec 02 22:58:59 crc kubenswrapper[4696]: I1202 22:58:59.735334 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-tzgnf" event={"ID":"5706d5c2-8bbe-40b3-8820-0d547363fa96","Type":"ContainerStarted","Data":"722475f461f3bbda2f1ff82492b5415d0e3afdd3e053502cb1cf27d3fecac4ae"} Dec 02 22:58:59 crc kubenswrapper[4696]: I1202 22:58:59.750515 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-4crn9" event={"ID":"bbbbf87d-fa5c-4e9b-9bbc-b19ed043e094","Type":"ContainerStarted","Data":"eada39d24d7bb928c2cea3b1274fc6d686d387503881785a23859368afd340ce"} Dec 02 22:58:59 crc kubenswrapper[4696]: I1202 22:58:59.756490 4696 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-rjlxh" event={"ID":"755f9574-a31b-430c-a2a2-92554020d96b","Type":"ContainerStarted","Data":"234893de741527212b1532e9a94a108f474bf3ca6cae972aee36762b53dec3b9"} Dec 02 22:58:59 crc kubenswrapper[4696]: I1202 22:58:59.768650 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-s9pk4" event={"ID":"5315d589-3bb7-4776-b842-ffc18e1a89e1","Type":"ContainerStarted","Data":"0d49296afd49cf8f516447c3155de5dbd29b0c39f61140e24e8ce0b7a0fc397d"} Dec 02 22:58:59 crc kubenswrapper[4696]: I1202 22:58:59.769582 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-744c6b777f-bjtk5" podStartSLOduration=41.769564194 podStartE2EDuration="41.769564194s" podCreationTimestamp="2025-12-02 22:58:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 22:58:58.71637087 +0000 UTC m=+1001.597050871" watchObservedRunningTime="2025-12-02 22:58:59.769564194 +0000 UTC m=+1002.650244195" Dec 02 22:58:59 crc kubenswrapper[4696]: E1202 22:58:59.775420 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-tng2n" podUID="7d7b7caa-1ec3-4e66-9273-36cae02cbe8e" Dec 02 22:58:59 crc kubenswrapper[4696]: I1202 22:58:59.789259 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-tp7td" event={"ID":"baad852a-374a-460e-9d5c-cb5418291849","Type":"ContainerStarted","Data":"afc291c71aa4ebe76ce2764f7eea07288b47c209857919f7a02f990052305713"} Dec 02 22:58:59 crc 
kubenswrapper[4696]: I1202 22:58:59.819665 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-78f8948974-g5pxw" podStartSLOduration=4.531511878 podStartE2EDuration="42.819629101s" podCreationTimestamp="2025-12-02 22:58:17 +0000 UTC" firstStartedPulling="2025-12-02 22:58:19.932492604 +0000 UTC m=+962.813172605" lastFinishedPulling="2025-12-02 22:58:58.220609827 +0000 UTC m=+1001.101289828" observedRunningTime="2025-12-02 22:58:59.80579732 +0000 UTC m=+1002.686477321" watchObservedRunningTime="2025-12-02 22:58:59.819629101 +0000 UTC m=+1002.700309112" Dec 02 22:58:59 crc kubenswrapper[4696]: I1202 22:58:59.859388 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-d4477bdf4-lxz2l" podStartSLOduration=8.787565804 podStartE2EDuration="41.859361706s" podCreationTimestamp="2025-12-02 22:58:18 +0000 UTC" firstStartedPulling="2025-12-02 22:58:19.944255167 +0000 UTC m=+962.824935168" lastFinishedPulling="2025-12-02 22:58:53.016051069 +0000 UTC m=+995.896731070" observedRunningTime="2025-12-02 22:58:59.845327099 +0000 UTC m=+1002.726007100" watchObservedRunningTime="2025-12-02 22:58:59.859361706 +0000 UTC m=+1002.740041707" Dec 02 22:59:00 crc kubenswrapper[4696]: E1202 22:59:00.005381 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-xxm5w" podUID="351a13fb-8e8e-4393-adef-28523ab05ccb" Dec 02 22:59:00 crc kubenswrapper[4696]: I1202 22:59:00.799978 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cx7qm" 
event={"ID":"8f0ab939-d26c-42ca-972a-92e1ddbe4a60","Type":"ContainerStarted","Data":"c1dc391746b753d5fce8297ce4f880dc75cef5d71831796dc19d50f3a91cceb7"} Dec 02 22:59:00 crc kubenswrapper[4696]: I1202 22:59:00.802912 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-58fzc" event={"ID":"45f0f590-24f6-4f01-98a0-a41508a59f5a","Type":"ContainerStarted","Data":"1f48f6abd777c373ac927f4ffbe3c2d08671175e2d6161998fc43b53ed52eab7"} Dec 02 22:59:00 crc kubenswrapper[4696]: I1202 22:59:00.803166 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-58fzc" Dec 02 22:59:00 crc kubenswrapper[4696]: I1202 22:59:00.806750 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-hfcc6" event={"ID":"9c1744f3-fc58-4653-a7e0-4fcdfdfca485","Type":"ContainerStarted","Data":"d22df5d18546b18547fe7f4e87ccb4e655d49e0701dfd086c44aa5cf792eb631"} Dec 02 22:59:00 crc kubenswrapper[4696]: I1202 22:59:00.807442 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-hfcc6" Dec 02 22:59:00 crc kubenswrapper[4696]: I1202 22:59:00.808267 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-58fzc" Dec 02 22:59:00 crc kubenswrapper[4696]: I1202 22:59:00.810826 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-hfcc6" Dec 02 22:59:00 crc kubenswrapper[4696]: I1202 22:59:00.810982 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-4crn9" 
event={"ID":"bbbbf87d-fa5c-4e9b-9bbc-b19ed043e094","Type":"ContainerStarted","Data":"d6270b279b3ec21575bd0f6b27fd16a4b43e256c7a00e022a5464014402048de"} Dec 02 22:59:00 crc kubenswrapper[4696]: I1202 22:59:00.811077 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-4crn9" Dec 02 22:59:00 crc kubenswrapper[4696]: I1202 22:59:00.815186 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-rjlxh" event={"ID":"755f9574-a31b-430c-a2a2-92554020d96b","Type":"ContainerStarted","Data":"6e5cd310f831ce0800030aa9a5975dfc0c49ba125bd5b50ae2ba7d9379bfa970"} Dec 02 22:59:00 crc kubenswrapper[4696]: I1202 22:59:00.815325 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-rjlxh" Dec 02 22:59:00 crc kubenswrapper[4696]: I1202 22:59:00.817697 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-kz6bs" event={"ID":"66d51ef3-89ba-4653-ae46-5469bfc5232e","Type":"ContainerStarted","Data":"3f9aa85b2a3fec5926000ea58bf525e39cd4e45f1be929d40ad44ed35373b062"} Dec 02 22:59:00 crc kubenswrapper[4696]: I1202 22:59:00.817832 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-57548d458d-kz6bs" Dec 02 22:59:00 crc kubenswrapper[4696]: I1202 22:59:00.824424 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-jfcsj" event={"ID":"796c18e3-0c33-4393-aba8-2ad03aad4b93","Type":"ContainerStarted","Data":"8e1b974e6ec567bddc20dad8f661943a33b7bfe5bbbf0ce5f3be90e19daf6ef9"} Dec 02 22:59:00 crc kubenswrapper[4696]: I1202 22:59:00.826976 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/test-operator-controller-manager-5854674fcc-jfcsj" Dec 02 22:59:00 crc kubenswrapper[4696]: I1202 22:59:00.828880 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5854674fcc-jfcsj" Dec 02 22:59:00 crc kubenswrapper[4696]: I1202 22:59:00.829071 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-cx7qm" podStartSLOduration=23.13105357 podStartE2EDuration="28.829045886s" podCreationTimestamp="2025-12-02 22:58:32 +0000 UTC" firstStartedPulling="2025-12-02 22:58:54.630832179 +0000 UTC m=+997.511512181" lastFinishedPulling="2025-12-02 22:59:00.328824496 +0000 UTC m=+1003.209504497" observedRunningTime="2025-12-02 22:59:00.827246115 +0000 UTC m=+1003.707926116" watchObservedRunningTime="2025-12-02 22:59:00.829045886 +0000 UTC m=+1003.709725887" Dec 02 22:59:00 crc kubenswrapper[4696]: I1202 22:59:00.844141 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-m472w" event={"ID":"9ab3fbd9-1610-4d0c-aa5e-2c298e5dcc3e","Type":"ContainerStarted","Data":"db91a3d0042b76cb76206126ffc6c5b112abc28cdb28119390dac0e762b98d2e"} Dec 02 22:59:00 crc kubenswrapper[4696]: I1202 22:59:00.844890 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-m472w" Dec 02 22:59:00 crc kubenswrapper[4696]: I1202 22:59:00.847285 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-k2p9j" event={"ID":"5436ce3c-34d6-47eb-81b1-3b4dc1c2d794","Type":"ContainerStarted","Data":"38096b08b2e19aca51fe3aa15cd1cd617ae02d147d86b36f796da31f3da9bce8"} Dec 02 22:59:00 crc kubenswrapper[4696]: I1202 22:59:00.847659 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-k2p9j" Dec 02 22:59:00 crc kubenswrapper[4696]: I1202 22:59:00.848680 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-tng2n" event={"ID":"7d7b7caa-1ec3-4e66-9273-36cae02cbe8e","Type":"ContainerStarted","Data":"e9fcf9a6515a7ec5db8b80cacb7127a3e5589f69d331152e6347702395e79839"} Dec 02 22:59:00 crc kubenswrapper[4696]: I1202 22:59:00.853214 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-m472w" Dec 02 22:59:00 crc kubenswrapper[4696]: I1202 22:59:00.853527 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-k2p9j" Dec 02 22:59:00 crc kubenswrapper[4696]: I1202 22:59:00.854924 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-tp7td" event={"ID":"baad852a-374a-460e-9d5c-cb5418291849","Type":"ContainerStarted","Data":"8e3dc7d3affb4d41df150e076f22743c0851bb84ec76b628f10a6f3f8983c4f6"} Dec 02 22:59:00 crc kubenswrapper[4696]: I1202 22:59:00.855218 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-tp7td" Dec 02 22:59:00 crc kubenswrapper[4696]: I1202 22:59:00.864354 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-tzgnf" event={"ID":"5706d5c2-8bbe-40b3-8820-0d547363fa96","Type":"ContainerStarted","Data":"af1d8a2632e1b162c0dea5b58eb1fe2bb260ecf2b91f5fe1fd89caf7077f913c"} Dec 02 22:59:00 crc kubenswrapper[4696]: I1202 22:59:00.864968 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-tzgnf" Dec 02 22:59:00 crc 
kubenswrapper[4696]: I1202 22:59:00.865102 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-4crn9" podStartSLOduration=3.332209359 podStartE2EDuration="43.865080516s" podCreationTimestamp="2025-12-02 22:58:17 +0000 UTC" firstStartedPulling="2025-12-02 22:58:19.809904874 +0000 UTC m=+962.690584875" lastFinishedPulling="2025-12-02 22:59:00.342776041 +0000 UTC m=+1003.223456032" observedRunningTime="2025-12-02 22:59:00.854585729 +0000 UTC m=+1003.735265730" watchObservedRunningTime="2025-12-02 22:59:00.865080516 +0000 UTC m=+1003.745760517" Dec 02 22:59:00 crc kubenswrapper[4696]: I1202 22:59:00.868040 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-d6qww" event={"ID":"1beb3e53-4faf-475f-b5b0-57b8cd32c529","Type":"ContainerStarted","Data":"9233b8f5acb8b992d92c2c3136c033002a79901cfa91d325db6123f644e5ce41"} Dec 02 22:59:00 crc kubenswrapper[4696]: I1202 22:59:00.868263 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-d6qww" Dec 02 22:59:00 crc kubenswrapper[4696]: I1202 22:59:00.876252 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-d6qww" Dec 02 22:59:00 crc kubenswrapper[4696]: I1202 22:59:00.882597 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-nvqf8" event={"ID":"e693a226-52c3-413c-b607-c0050ab5e553","Type":"ContainerStarted","Data":"fb67b508b6acf443504a1178aff751f8875e850eb9e13141bc1623f4afdf9c42"} Dec 02 22:59:00 crc kubenswrapper[4696]: I1202 22:59:00.883438 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-nvqf8" Dec 02 22:59:00 crc 
kubenswrapper[4696]: I1202 22:59:00.886138 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-rjlxh" podStartSLOduration=2.7421032739999998 podStartE2EDuration="43.886116261s" podCreationTimestamp="2025-12-02 22:58:17 +0000 UTC" firstStartedPulling="2025-12-02 22:58:19.222159446 +0000 UTC m=+962.102839447" lastFinishedPulling="2025-12-02 22:59:00.366172433 +0000 UTC m=+1003.246852434" observedRunningTime="2025-12-02 22:59:00.877136007 +0000 UTC m=+1003.757816008" watchObservedRunningTime="2025-12-02 22:59:00.886116261 +0000 UTC m=+1003.766796252" Dec 02 22:59:00 crc kubenswrapper[4696]: I1202 22:59:00.888553 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-bpk25" event={"ID":"060c8046-7775-413d-9797-ef0edcee01dd","Type":"ContainerStarted","Data":"372de029967d8f7d904a468a5a060e30da1fb1a24934a180481c0e2d138ee9a8"} Dec 02 22:59:00 crc kubenswrapper[4696]: I1202 22:59:00.889871 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-998648c74-bpk25" Dec 02 22:59:00 crc kubenswrapper[4696]: I1202 22:59:00.894122 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-998648c74-bpk25" Dec 02 22:59:00 crc kubenswrapper[4696]: I1202 22:59:00.894241 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-nvqf8" Dec 02 22:59:00 crc kubenswrapper[4696]: I1202 22:59:00.905619 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-58fzc" podStartSLOduration=4.621492073 podStartE2EDuration="42.905605823s" podCreationTimestamp="2025-12-02 22:58:18 +0000 UTC" 
firstStartedPulling="2025-12-02 22:58:19.970113829 +0000 UTC m=+962.850793820" lastFinishedPulling="2025-12-02 22:58:58.254227569 +0000 UTC m=+1001.134907570" observedRunningTime="2025-12-02 22:59:00.904280145 +0000 UTC m=+1003.784960146" watchObservedRunningTime="2025-12-02 22:59:00.905605823 +0000 UTC m=+1003.786285824" Dec 02 22:59:00 crc kubenswrapper[4696]: I1202 22:59:00.907003 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4jnwwg" event={"ID":"6e335d65-9d0f-4ace-97cc-70a4a2bb2291","Type":"ContainerStarted","Data":"0e499705ac5a4cfedfdc58c16c26203c48685888b3e5a047a291333fdae76dbc"} Dec 02 22:59:00 crc kubenswrapper[4696]: I1202 22:59:00.907054 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4jnwwg" Dec 02 22:59:00 crc kubenswrapper[4696]: I1202 22:59:00.917185 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-xxm5w" event={"ID":"351a13fb-8e8e-4393-adef-28523ab05ccb","Type":"ContainerStarted","Data":"48e39c6c5a7491719e04655c5a56a379edaf6b58d7548d27133bbb2dd7835a47"} Dec 02 22:59:00 crc kubenswrapper[4696]: I1202 22:59:00.930831 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-s9pk4" event={"ID":"5315d589-3bb7-4776-b842-ffc18e1a89e1","Type":"ContainerStarted","Data":"61cc8a9640ec27ee1b236907d4ce5d839e51fae1768784c17d0a62953dac82ad"} Dec 02 22:59:00 crc kubenswrapper[4696]: I1202 22:59:00.946232 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-57548d458d-kz6bs" podStartSLOduration=39.032883448 podStartE2EDuration="43.946208082s" podCreationTimestamp="2025-12-02 22:58:17 +0000 UTC" firstStartedPulling="2025-12-02 22:58:53.021821182 +0000 UTC 
m=+995.902501173" lastFinishedPulling="2025-12-02 22:58:57.935145806 +0000 UTC m=+1000.815825807" observedRunningTime="2025-12-02 22:59:00.926083473 +0000 UTC m=+1003.806763474" watchObservedRunningTime="2025-12-02 22:59:00.946208082 +0000 UTC m=+1003.826888083" Dec 02 22:59:00 crc kubenswrapper[4696]: I1202 22:59:00.977398 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-hfcc6" podStartSLOduration=5.71569666 podStartE2EDuration="43.977374175s" podCreationTimestamp="2025-12-02 22:58:17 +0000 UTC" firstStartedPulling="2025-12-02 22:58:19.949763203 +0000 UTC m=+962.830443204" lastFinishedPulling="2025-12-02 22:58:58.211440718 +0000 UTC m=+1001.092120719" observedRunningTime="2025-12-02 22:59:00.964388437 +0000 UTC m=+1003.845068438" watchObservedRunningTime="2025-12-02 22:59:00.977374175 +0000 UTC m=+1003.858054176" Dec 02 22:59:00 crc kubenswrapper[4696]: I1202 22:59:00.999393 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5854674fcc-jfcsj" podStartSLOduration=4.714347111 podStartE2EDuration="42.999367147s" podCreationTimestamp="2025-12-02 22:58:18 +0000 UTC" firstStartedPulling="2025-12-02 22:58:19.97015879 +0000 UTC m=+962.850838791" lastFinishedPulling="2025-12-02 22:58:58.255178826 +0000 UTC m=+1001.135858827" observedRunningTime="2025-12-02 22:59:00.996091884 +0000 UTC m=+1003.876771885" watchObservedRunningTime="2025-12-02 22:59:00.999367147 +0000 UTC m=+1003.880047158" Dec 02 22:59:01 crc kubenswrapper[4696]: I1202 22:59:01.039071 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4jnwwg" podStartSLOduration=39.087755471 podStartE2EDuration="44.03904567s" podCreationTimestamp="2025-12-02 22:58:17 +0000 UTC" firstStartedPulling="2025-12-02 22:58:53.006675493 +0000 UTC 
m=+995.887355494" lastFinishedPulling="2025-12-02 22:58:57.957965692 +0000 UTC m=+1000.838645693" observedRunningTime="2025-12-02 22:59:01.028960885 +0000 UTC m=+1003.909640886" watchObservedRunningTime="2025-12-02 22:59:01.03904567 +0000 UTC m=+1003.919725671" Dec 02 22:59:01 crc kubenswrapper[4696]: I1202 22:59:01.126641 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-tzgnf" podStartSLOduration=2.9999856339999997 podStartE2EDuration="44.126613919s" podCreationTimestamp="2025-12-02 22:58:17 +0000 UTC" firstStartedPulling="2025-12-02 22:58:19.218496812 +0000 UTC m=+962.099176813" lastFinishedPulling="2025-12-02 22:59:00.345125097 +0000 UTC m=+1003.225805098" observedRunningTime="2025-12-02 22:59:01.126489746 +0000 UTC m=+1004.007169747" watchObservedRunningTime="2025-12-02 22:59:01.126613919 +0000 UTC m=+1004.007293920" Dec 02 22:59:01 crc kubenswrapper[4696]: I1202 22:59:01.151192 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-d6qww" podStartSLOduration=5.839937216 podStartE2EDuration="44.151169784s" podCreationTimestamp="2025-12-02 22:58:17 +0000 UTC" firstStartedPulling="2025-12-02 22:58:19.94189452 +0000 UTC m=+962.822574521" lastFinishedPulling="2025-12-02 22:58:58.253127088 +0000 UTC m=+1001.133807089" observedRunningTime="2025-12-02 22:59:01.147223903 +0000 UTC m=+1004.027903904" watchObservedRunningTime="2025-12-02 22:59:01.151169784 +0000 UTC m=+1004.031849785" Dec 02 22:59:01 crc kubenswrapper[4696]: I1202 22:59:01.220396 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-998648c74-bpk25" podStartSLOduration=5.91390452 podStartE2EDuration="44.220372123s" podCreationTimestamp="2025-12-02 22:58:17 +0000 UTC" firstStartedPulling="2025-12-02 22:58:19.907437165 +0000 UTC 
m=+962.788117166" lastFinishedPulling="2025-12-02 22:58:58.213904768 +0000 UTC m=+1001.094584769" observedRunningTime="2025-12-02 22:59:01.184144598 +0000 UTC m=+1004.064824599" watchObservedRunningTime="2025-12-02 22:59:01.220372123 +0000 UTC m=+1004.101052124" Dec 02 22:59:01 crc kubenswrapper[4696]: I1202 22:59:01.223390 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-nvqf8" podStartSLOduration=5.188355722 podStartE2EDuration="44.223378668s" podCreationTimestamp="2025-12-02 22:58:17 +0000 UTC" firstStartedPulling="2025-12-02 22:58:19.213068009 +0000 UTC m=+962.093748010" lastFinishedPulling="2025-12-02 22:58:58.248090955 +0000 UTC m=+1001.128770956" observedRunningTime="2025-12-02 22:59:01.221575447 +0000 UTC m=+1004.102255438" watchObservedRunningTime="2025-12-02 22:59:01.223378668 +0000 UTC m=+1004.104058659" Dec 02 22:59:01 crc kubenswrapper[4696]: I1202 22:59:01.260085 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-s9pk4" podStartSLOduration=3.325579652 podStartE2EDuration="44.260064727s" podCreationTimestamp="2025-12-02 22:58:17 +0000 UTC" firstStartedPulling="2025-12-02 22:58:19.431699988 +0000 UTC m=+962.312379989" lastFinishedPulling="2025-12-02 22:59:00.366185063 +0000 UTC m=+1003.246865064" observedRunningTime="2025-12-02 22:59:01.244767924 +0000 UTC m=+1004.125447925" watchObservedRunningTime="2025-12-02 22:59:01.260064727 +0000 UTC m=+1004.140744718" Dec 02 22:59:01 crc kubenswrapper[4696]: I1202 22:59:01.272646 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-m472w" podStartSLOduration=4.967504438 podStartE2EDuration="43.272626492s" podCreationTimestamp="2025-12-02 22:58:18 +0000 UTC" firstStartedPulling="2025-12-02 22:58:19.947234852 +0000 UTC m=+962.827914853" 
lastFinishedPulling="2025-12-02 22:58:58.252356906 +0000 UTC m=+1001.133036907" observedRunningTime="2025-12-02 22:59:01.269330349 +0000 UTC m=+1004.150010350" watchObservedRunningTime="2025-12-02 22:59:01.272626492 +0000 UTC m=+1004.153306493" Dec 02 22:59:01 crc kubenswrapper[4696]: I1202 22:59:01.324490 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-k2p9j" podStartSLOduration=5.781213904 podStartE2EDuration="44.32446226s" podCreationTimestamp="2025-12-02 22:58:17 +0000 UTC" firstStartedPulling="2025-12-02 22:58:19.696987287 +0000 UTC m=+962.577667288" lastFinishedPulling="2025-12-02 22:58:58.240235643 +0000 UTC m=+1001.120915644" observedRunningTime="2025-12-02 22:59:01.320635922 +0000 UTC m=+1004.201315923" watchObservedRunningTime="2025-12-02 22:59:01.32446226 +0000 UTC m=+1004.205142261" Dec 02 22:59:01 crc kubenswrapper[4696]: I1202 22:59:01.368952 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-tp7td" podStartSLOduration=3.738178002 podStartE2EDuration="44.368928239s" podCreationTimestamp="2025-12-02 22:58:17 +0000 UTC" firstStartedPulling="2025-12-02 22:58:19.714349269 +0000 UTC m=+962.595029270" lastFinishedPulling="2025-12-02 22:59:00.345099506 +0000 UTC m=+1003.225779507" observedRunningTime="2025-12-02 22:59:01.354098489 +0000 UTC m=+1004.234778490" watchObservedRunningTime="2025-12-02 22:59:01.368928239 +0000 UTC m=+1004.249608240" Dec 02 22:59:01 crc kubenswrapper[4696]: I1202 22:59:01.941574 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-xxm5w" event={"ID":"351a13fb-8e8e-4393-adef-28523ab05ccb","Type":"ContainerStarted","Data":"33dddba6d8cc18bd0f0a23903101075c6382f383b505cca73cfa4c0ac6c0c259"} Dec 02 22:59:01 crc kubenswrapper[4696]: I1202 22:59:01.942203 4696 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-xxm5w" Dec 02 22:59:01 crc kubenswrapper[4696]: I1202 22:59:01.944602 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-tng2n" event={"ID":"7d7b7caa-1ec3-4e66-9273-36cae02cbe8e","Type":"ContainerStarted","Data":"0e0a1445818532eee3333dbb8c6671c527efe1cb98629d64716c7142d29eb25a"} Dec 02 22:59:01 crc kubenswrapper[4696]: I1202 22:59:01.944656 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-tng2n" Dec 02 22:59:01 crc kubenswrapper[4696]: I1202 22:59:01.947418 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-s9pk4" Dec 02 22:59:01 crc kubenswrapper[4696]: I1202 22:59:01.963797 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-xxm5w" podStartSLOduration=2.845493771 podStartE2EDuration="44.963771357s" podCreationTimestamp="2025-12-02 22:58:17 +0000 UTC" firstStartedPulling="2025-12-02 22:58:19.218696878 +0000 UTC m=+962.099376879" lastFinishedPulling="2025-12-02 22:59:01.336974464 +0000 UTC m=+1004.217654465" observedRunningTime="2025-12-02 22:59:01.96069406 +0000 UTC m=+1004.841374061" watchObservedRunningTime="2025-12-02 22:59:01.963771357 +0000 UTC m=+1004.844451358" Dec 02 22:59:01 crc kubenswrapper[4696]: I1202 22:59:01.989813 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-tng2n" podStartSLOduration=3.4575011460000002 podStartE2EDuration="44.989789854s" podCreationTimestamp="2025-12-02 22:58:17 +0000 UTC" firstStartedPulling="2025-12-02 22:58:19.709839191 +0000 UTC m=+962.590519192" 
lastFinishedPulling="2025-12-02 22:59:01.242127899 +0000 UTC m=+1004.122807900" observedRunningTime="2025-12-02 22:59:01.989211407 +0000 UTC m=+1004.869891408" watchObservedRunningTime="2025-12-02 22:59:01.989789854 +0000 UTC m=+1004.870469855" Dec 02 22:59:02 crc kubenswrapper[4696]: I1202 22:59:02.461925 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-cx7qm" Dec 02 22:59:02 crc kubenswrapper[4696]: I1202 22:59:02.461978 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-cx7qm" Dec 02 22:59:02 crc kubenswrapper[4696]: I1202 22:59:02.516949 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-cx7qm" Dec 02 22:59:04 crc kubenswrapper[4696]: I1202 22:59:04.067981 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-57548d458d-kz6bs" Dec 02 22:59:04 crc kubenswrapper[4696]: I1202 22:59:04.549131 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4jnwwg" Dec 02 22:59:06 crc kubenswrapper[4696]: I1202 22:59:06.410514 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-sp4gt" Dec 02 22:59:06 crc kubenswrapper[4696]: I1202 22:59:06.479548 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sp4gt"] Dec 02 22:59:06 crc kubenswrapper[4696]: I1202 22:59:06.992941 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-sp4gt" podUID="3dcedd0c-cb54-4bc5-8c69-9506424a0b91" containerName="registry-server" containerID="cri-o://953f44a62f6475831e6ba96c9c674abf5a72e8e69d0c3564ead353e48a6930c8" gracePeriod=2 Dec 02 
22:59:07 crc kubenswrapper[4696]: I1202 22:59:07.955145 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-xxm5w" Dec 02 22:59:07 crc kubenswrapper[4696]: I1202 22:59:07.967080 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-rjlxh" Dec 02 22:59:08 crc kubenswrapper[4696]: I1202 22:59:08.040494 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-tzgnf" Dec 02 22:59:08 crc kubenswrapper[4696]: I1202 22:59:08.087457 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-s9pk4" Dec 02 22:59:08 crc kubenswrapper[4696]: I1202 22:59:08.376389 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-tng2n" Dec 02 22:59:08 crc kubenswrapper[4696]: I1202 22:59:08.447722 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-4crn9" Dec 02 22:59:08 crc kubenswrapper[4696]: I1202 22:59:08.502189 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-tp7td" Dec 02 22:59:08 crc kubenswrapper[4696]: I1202 22:59:08.985512 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-d4477bdf4-lxz2l" Dec 02 22:59:09 crc kubenswrapper[4696]: I1202 22:59:09.052389 4696 generic.go:334] "Generic (PLEG): container finished" podID="3dcedd0c-cb54-4bc5-8c69-9506424a0b91" containerID="953f44a62f6475831e6ba96c9c674abf5a72e8e69d0c3564ead353e48a6930c8" exitCode=0 Dec 02 22:59:09 crc 
kubenswrapper[4696]: I1202 22:59:09.052482 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sp4gt" event={"ID":"3dcedd0c-cb54-4bc5-8c69-9506424a0b91","Type":"ContainerDied","Data":"953f44a62f6475831e6ba96c9c674abf5a72e8e69d0c3564ead353e48a6930c8"} Dec 02 22:59:10 crc kubenswrapper[4696]: I1202 22:59:10.919862 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-744c6b777f-bjtk5" Dec 02 22:59:12 crc kubenswrapper[4696]: I1202 22:59:12.528310 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-cx7qm" Dec 02 22:59:12 crc kubenswrapper[4696]: I1202 22:59:12.586096 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cx7qm"] Dec 02 22:59:13 crc kubenswrapper[4696]: I1202 22:59:13.095096 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-cx7qm" podUID="8f0ab939-d26c-42ca-972a-92e1ddbe4a60" containerName="registry-server" containerID="cri-o://c1dc391746b753d5fce8297ce4f880dc75cef5d71831796dc19d50f3a91cceb7" gracePeriod=2 Dec 02 22:59:14 crc kubenswrapper[4696]: I1202 22:59:14.966396 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sp4gt" Dec 02 22:59:15 crc kubenswrapper[4696]: I1202 22:59:15.102642 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pp994\" (UniqueName: \"kubernetes.io/projected/3dcedd0c-cb54-4bc5-8c69-9506424a0b91-kube-api-access-pp994\") pod \"3dcedd0c-cb54-4bc5-8c69-9506424a0b91\" (UID: \"3dcedd0c-cb54-4bc5-8c69-9506424a0b91\") " Dec 02 22:59:15 crc kubenswrapper[4696]: I1202 22:59:15.102794 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3dcedd0c-cb54-4bc5-8c69-9506424a0b91-catalog-content\") pod \"3dcedd0c-cb54-4bc5-8c69-9506424a0b91\" (UID: \"3dcedd0c-cb54-4bc5-8c69-9506424a0b91\") " Dec 02 22:59:15 crc kubenswrapper[4696]: I1202 22:59:15.102934 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3dcedd0c-cb54-4bc5-8c69-9506424a0b91-utilities\") pod \"3dcedd0c-cb54-4bc5-8c69-9506424a0b91\" (UID: \"3dcedd0c-cb54-4bc5-8c69-9506424a0b91\") " Dec 02 22:59:15 crc kubenswrapper[4696]: I1202 22:59:15.104147 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3dcedd0c-cb54-4bc5-8c69-9506424a0b91-utilities" (OuterVolumeSpecName: "utilities") pod "3dcedd0c-cb54-4bc5-8c69-9506424a0b91" (UID: "3dcedd0c-cb54-4bc5-8c69-9506424a0b91"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 22:59:15 crc kubenswrapper[4696]: I1202 22:59:15.113556 4696 generic.go:334] "Generic (PLEG): container finished" podID="8f0ab939-d26c-42ca-972a-92e1ddbe4a60" containerID="c1dc391746b753d5fce8297ce4f880dc75cef5d71831796dc19d50f3a91cceb7" exitCode=0 Dec 02 22:59:15 crc kubenswrapper[4696]: I1202 22:59:15.113629 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cx7qm" event={"ID":"8f0ab939-d26c-42ca-972a-92e1ddbe4a60","Type":"ContainerDied","Data":"c1dc391746b753d5fce8297ce4f880dc75cef5d71831796dc19d50f3a91cceb7"} Dec 02 22:59:15 crc kubenswrapper[4696]: I1202 22:59:15.113831 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3dcedd0c-cb54-4bc5-8c69-9506424a0b91-kube-api-access-pp994" (OuterVolumeSpecName: "kube-api-access-pp994") pod "3dcedd0c-cb54-4bc5-8c69-9506424a0b91" (UID: "3dcedd0c-cb54-4bc5-8c69-9506424a0b91"). InnerVolumeSpecName "kube-api-access-pp994". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:59:15 crc kubenswrapper[4696]: I1202 22:59:15.116460 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sp4gt" event={"ID":"3dcedd0c-cb54-4bc5-8c69-9506424a0b91","Type":"ContainerDied","Data":"b862c0816a6b1fb505663b78545dfe42015a3ecb93d9a201178083438a623556"} Dec 02 22:59:15 crc kubenswrapper[4696]: I1202 22:59:15.116529 4696 scope.go:117] "RemoveContainer" containerID="953f44a62f6475831e6ba96c9c674abf5a72e8e69d0c3564ead353e48a6930c8" Dec 02 22:59:15 crc kubenswrapper[4696]: I1202 22:59:15.116579 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sp4gt" Dec 02 22:59:15 crc kubenswrapper[4696]: I1202 22:59:15.163541 4696 scope.go:117] "RemoveContainer" containerID="e96b76601f201a7eb5341302a7c61383ecb13c8d171940c83e969c6375b67219" Dec 02 22:59:15 crc kubenswrapper[4696]: I1202 22:59:15.166248 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3dcedd0c-cb54-4bc5-8c69-9506424a0b91-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3dcedd0c-cb54-4bc5-8c69-9506424a0b91" (UID: "3dcedd0c-cb54-4bc5-8c69-9506424a0b91"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 22:59:15 crc kubenswrapper[4696]: I1202 22:59:15.196632 4696 scope.go:117] "RemoveContainer" containerID="01cf7f435d069c3f2b220a7879e313771a3c425c388bc13671f98b080b295bd8" Dec 02 22:59:15 crc kubenswrapper[4696]: I1202 22:59:15.204806 4696 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3dcedd0c-cb54-4bc5-8c69-9506424a0b91-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 22:59:15 crc kubenswrapper[4696]: I1202 22:59:15.204904 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pp994\" (UniqueName: \"kubernetes.io/projected/3dcedd0c-cb54-4bc5-8c69-9506424a0b91-kube-api-access-pp994\") on node \"crc\" DevicePath \"\"" Dec 02 22:59:15 crc kubenswrapper[4696]: I1202 22:59:15.204928 4696 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3dcedd0c-cb54-4bc5-8c69-9506424a0b91-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 22:59:15 crc kubenswrapper[4696]: I1202 22:59:15.468966 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sp4gt"] Dec 02 22:59:15 crc kubenswrapper[4696]: I1202 22:59:15.480577 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/community-operators-sp4gt"] Dec 02 22:59:16 crc kubenswrapper[4696]: I1202 22:59:16.482653 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cx7qm" Dec 02 22:59:16 crc kubenswrapper[4696]: I1202 22:59:16.527992 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6x47w\" (UniqueName: \"kubernetes.io/projected/8f0ab939-d26c-42ca-972a-92e1ddbe4a60-kube-api-access-6x47w\") pod \"8f0ab939-d26c-42ca-972a-92e1ddbe4a60\" (UID: \"8f0ab939-d26c-42ca-972a-92e1ddbe4a60\") " Dec 02 22:59:16 crc kubenswrapper[4696]: I1202 22:59:16.528150 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f0ab939-d26c-42ca-972a-92e1ddbe4a60-catalog-content\") pod \"8f0ab939-d26c-42ca-972a-92e1ddbe4a60\" (UID: \"8f0ab939-d26c-42ca-972a-92e1ddbe4a60\") " Dec 02 22:59:16 crc kubenswrapper[4696]: I1202 22:59:16.528207 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f0ab939-d26c-42ca-972a-92e1ddbe4a60-utilities\") pod \"8f0ab939-d26c-42ca-972a-92e1ddbe4a60\" (UID: \"8f0ab939-d26c-42ca-972a-92e1ddbe4a60\") " Dec 02 22:59:16 crc kubenswrapper[4696]: I1202 22:59:16.529287 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f0ab939-d26c-42ca-972a-92e1ddbe4a60-utilities" (OuterVolumeSpecName: "utilities") pod "8f0ab939-d26c-42ca-972a-92e1ddbe4a60" (UID: "8f0ab939-d26c-42ca-972a-92e1ddbe4a60"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 22:59:16 crc kubenswrapper[4696]: I1202 22:59:16.537239 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f0ab939-d26c-42ca-972a-92e1ddbe4a60-kube-api-access-6x47w" (OuterVolumeSpecName: "kube-api-access-6x47w") pod "8f0ab939-d26c-42ca-972a-92e1ddbe4a60" (UID: "8f0ab939-d26c-42ca-972a-92e1ddbe4a60"). InnerVolumeSpecName "kube-api-access-6x47w". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:59:16 crc kubenswrapper[4696]: I1202 22:59:16.549364 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f0ab939-d26c-42ca-972a-92e1ddbe4a60-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8f0ab939-d26c-42ca-972a-92e1ddbe4a60" (UID: "8f0ab939-d26c-42ca-972a-92e1ddbe4a60"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 22:59:16 crc kubenswrapper[4696]: I1202 22:59:16.630668 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6x47w\" (UniqueName: \"kubernetes.io/projected/8f0ab939-d26c-42ca-972a-92e1ddbe4a60-kube-api-access-6x47w\") on node \"crc\" DevicePath \"\"" Dec 02 22:59:16 crc kubenswrapper[4696]: I1202 22:59:16.630727 4696 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f0ab939-d26c-42ca-972a-92e1ddbe4a60-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 22:59:16 crc kubenswrapper[4696]: I1202 22:59:16.630767 4696 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f0ab939-d26c-42ca-972a-92e1ddbe4a60-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 22:59:17 crc kubenswrapper[4696]: I1202 22:59:17.140718 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cx7qm" 
event={"ID":"8f0ab939-d26c-42ca-972a-92e1ddbe4a60","Type":"ContainerDied","Data":"8d29cca22c436672ad65f0670eaad5c78de410e5fcb967059cb431c4f2141574"} Dec 02 22:59:17 crc kubenswrapper[4696]: I1202 22:59:17.141004 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cx7qm" Dec 02 22:59:17 crc kubenswrapper[4696]: I1202 22:59:17.141467 4696 scope.go:117] "RemoveContainer" containerID="c1dc391746b753d5fce8297ce4f880dc75cef5d71831796dc19d50f3a91cceb7" Dec 02 22:59:17 crc kubenswrapper[4696]: I1202 22:59:17.180446 4696 scope.go:117] "RemoveContainer" containerID="f77f2170046220130e6d402ed1d87f12c6aa354c2ca785ddc77e9af8bae27573" Dec 02 22:59:17 crc kubenswrapper[4696]: I1202 22:59:17.188463 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cx7qm"] Dec 02 22:59:17 crc kubenswrapper[4696]: I1202 22:59:17.195955 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-cx7qm"] Dec 02 22:59:17 crc kubenswrapper[4696]: I1202 22:59:17.217148 4696 scope.go:117] "RemoveContainer" containerID="ea0cfbed3b80ff71434f7b59a6587ee97f4a08cc209a4bc70f10dbe1b662c9d9" Dec 02 22:59:17 crc kubenswrapper[4696]: I1202 22:59:17.453498 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3dcedd0c-cb54-4bc5-8c69-9506424a0b91" path="/var/lib/kubelet/pods/3dcedd0c-cb54-4bc5-8c69-9506424a0b91/volumes" Dec 02 22:59:17 crc kubenswrapper[4696]: I1202 22:59:17.454409 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f0ab939-d26c-42ca-972a-92e1ddbe4a60" path="/var/lib/kubelet/pods/8f0ab939-d26c-42ca-972a-92e1ddbe4a60/volumes" Dec 02 22:59:22 crc kubenswrapper[4696]: I1202 22:59:22.974115 4696 patch_prober.go:28] interesting pod/machine-config-daemon-chq65 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 22:59:22 crc kubenswrapper[4696]: I1202 22:59:22.975106 4696 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 22:59:29 crc kubenswrapper[4696]: I1202 22:59:29.218546 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-2bftf"] Dec 02 22:59:29 crc kubenswrapper[4696]: E1202 22:59:29.230345 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3dcedd0c-cb54-4bc5-8c69-9506424a0b91" containerName="extract-content" Dec 02 22:59:29 crc kubenswrapper[4696]: I1202 22:59:29.230370 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dcedd0c-cb54-4bc5-8c69-9506424a0b91" containerName="extract-content" Dec 02 22:59:29 crc kubenswrapper[4696]: E1202 22:59:29.230387 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f0ab939-d26c-42ca-972a-92e1ddbe4a60" containerName="extract-content" Dec 02 22:59:29 crc kubenswrapper[4696]: I1202 22:59:29.230393 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f0ab939-d26c-42ca-972a-92e1ddbe4a60" containerName="extract-content" Dec 02 22:59:29 crc kubenswrapper[4696]: E1202 22:59:29.230418 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f0ab939-d26c-42ca-972a-92e1ddbe4a60" containerName="registry-server" Dec 02 22:59:29 crc kubenswrapper[4696]: I1202 22:59:29.230424 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f0ab939-d26c-42ca-972a-92e1ddbe4a60" containerName="registry-server" Dec 02 22:59:29 crc kubenswrapper[4696]: E1202 22:59:29.230447 4696 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="3dcedd0c-cb54-4bc5-8c69-9506424a0b91" containerName="extract-utilities" Dec 02 22:59:29 crc kubenswrapper[4696]: I1202 22:59:29.230453 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dcedd0c-cb54-4bc5-8c69-9506424a0b91" containerName="extract-utilities" Dec 02 22:59:29 crc kubenswrapper[4696]: E1202 22:59:29.230469 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f0ab939-d26c-42ca-972a-92e1ddbe4a60" containerName="extract-utilities" Dec 02 22:59:29 crc kubenswrapper[4696]: I1202 22:59:29.230478 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f0ab939-d26c-42ca-972a-92e1ddbe4a60" containerName="extract-utilities" Dec 02 22:59:29 crc kubenswrapper[4696]: E1202 22:59:29.230491 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3dcedd0c-cb54-4bc5-8c69-9506424a0b91" containerName="registry-server" Dec 02 22:59:29 crc kubenswrapper[4696]: I1202 22:59:29.230497 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dcedd0c-cb54-4bc5-8c69-9506424a0b91" containerName="registry-server" Dec 02 22:59:29 crc kubenswrapper[4696]: I1202 22:59:29.230679 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="3dcedd0c-cb54-4bc5-8c69-9506424a0b91" containerName="registry-server" Dec 02 22:59:29 crc kubenswrapper[4696]: I1202 22:59:29.230694 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f0ab939-d26c-42ca-972a-92e1ddbe4a60" containerName="registry-server" Dec 02 22:59:29 crc kubenswrapper[4696]: I1202 22:59:29.231623 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-2bftf" Dec 02 22:59:29 crc kubenswrapper[4696]: I1202 22:59:29.238284 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-d9rxn" Dec 02 22:59:29 crc kubenswrapper[4696]: I1202 22:59:29.238688 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Dec 02 22:59:29 crc kubenswrapper[4696]: I1202 22:59:29.238871 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Dec 02 22:59:29 crc kubenswrapper[4696]: I1202 22:59:29.239000 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Dec 02 22:59:29 crc kubenswrapper[4696]: I1202 22:59:29.245097 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-2bftf"] Dec 02 22:59:29 crc kubenswrapper[4696]: I1202 22:59:29.296750 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-dcqvm"] Dec 02 22:59:29 crc kubenswrapper[4696]: I1202 22:59:29.298218 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-dcqvm" Dec 02 22:59:29 crc kubenswrapper[4696]: I1202 22:59:29.300398 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Dec 02 22:59:29 crc kubenswrapper[4696]: I1202 22:59:29.303672 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-dcqvm"] Dec 02 22:59:29 crc kubenswrapper[4696]: I1202 22:59:29.357359 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mv57x\" (UniqueName: \"kubernetes.io/projected/d245d496-d80f-4455-a66e-cc788aff5b35-kube-api-access-mv57x\") pod \"dnsmasq-dns-675f4bcbfc-2bftf\" (UID: \"d245d496-d80f-4455-a66e-cc788aff5b35\") " pod="openstack/dnsmasq-dns-675f4bcbfc-2bftf" Dec 02 22:59:29 crc kubenswrapper[4696]: I1202 22:59:29.357423 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d245d496-d80f-4455-a66e-cc788aff5b35-config\") pod \"dnsmasq-dns-675f4bcbfc-2bftf\" (UID: \"d245d496-d80f-4455-a66e-cc788aff5b35\") " pod="openstack/dnsmasq-dns-675f4bcbfc-2bftf" Dec 02 22:59:29 crc kubenswrapper[4696]: I1202 22:59:29.357629 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-756sc\" (UniqueName: \"kubernetes.io/projected/e5273a22-1865-4f93-ae2f-cc7046c708cd-kube-api-access-756sc\") pod \"dnsmasq-dns-78dd6ddcc-dcqvm\" (UID: \"e5273a22-1865-4f93-ae2f-cc7046c708cd\") " pod="openstack/dnsmasq-dns-78dd6ddcc-dcqvm" Dec 02 22:59:29 crc kubenswrapper[4696]: I1202 22:59:29.357800 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5273a22-1865-4f93-ae2f-cc7046c708cd-config\") pod \"dnsmasq-dns-78dd6ddcc-dcqvm\" (UID: \"e5273a22-1865-4f93-ae2f-cc7046c708cd\") " 
pod="openstack/dnsmasq-dns-78dd6ddcc-dcqvm" Dec 02 22:59:29 crc kubenswrapper[4696]: I1202 22:59:29.359023 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e5273a22-1865-4f93-ae2f-cc7046c708cd-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-dcqvm\" (UID: \"e5273a22-1865-4f93-ae2f-cc7046c708cd\") " pod="openstack/dnsmasq-dns-78dd6ddcc-dcqvm" Dec 02 22:59:29 crc kubenswrapper[4696]: I1202 22:59:29.460506 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e5273a22-1865-4f93-ae2f-cc7046c708cd-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-dcqvm\" (UID: \"e5273a22-1865-4f93-ae2f-cc7046c708cd\") " pod="openstack/dnsmasq-dns-78dd6ddcc-dcqvm" Dec 02 22:59:29 crc kubenswrapper[4696]: I1202 22:59:29.460618 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mv57x\" (UniqueName: \"kubernetes.io/projected/d245d496-d80f-4455-a66e-cc788aff5b35-kube-api-access-mv57x\") pod \"dnsmasq-dns-675f4bcbfc-2bftf\" (UID: \"d245d496-d80f-4455-a66e-cc788aff5b35\") " pod="openstack/dnsmasq-dns-675f4bcbfc-2bftf" Dec 02 22:59:29 crc kubenswrapper[4696]: I1202 22:59:29.460645 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d245d496-d80f-4455-a66e-cc788aff5b35-config\") pod \"dnsmasq-dns-675f4bcbfc-2bftf\" (UID: \"d245d496-d80f-4455-a66e-cc788aff5b35\") " pod="openstack/dnsmasq-dns-675f4bcbfc-2bftf" Dec 02 22:59:29 crc kubenswrapper[4696]: I1202 22:59:29.460674 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-756sc\" (UniqueName: \"kubernetes.io/projected/e5273a22-1865-4f93-ae2f-cc7046c708cd-kube-api-access-756sc\") pod \"dnsmasq-dns-78dd6ddcc-dcqvm\" (UID: \"e5273a22-1865-4f93-ae2f-cc7046c708cd\") " pod="openstack/dnsmasq-dns-78dd6ddcc-dcqvm" Dec 02 
22:59:29 crc kubenswrapper[4696]: I1202 22:59:29.460705 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5273a22-1865-4f93-ae2f-cc7046c708cd-config\") pod \"dnsmasq-dns-78dd6ddcc-dcqvm\" (UID: \"e5273a22-1865-4f93-ae2f-cc7046c708cd\") " pod="openstack/dnsmasq-dns-78dd6ddcc-dcqvm" Dec 02 22:59:29 crc kubenswrapper[4696]: I1202 22:59:29.461620 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d245d496-d80f-4455-a66e-cc788aff5b35-config\") pod \"dnsmasq-dns-675f4bcbfc-2bftf\" (UID: \"d245d496-d80f-4455-a66e-cc788aff5b35\") " pod="openstack/dnsmasq-dns-675f4bcbfc-2bftf" Dec 02 22:59:29 crc kubenswrapper[4696]: I1202 22:59:29.461650 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5273a22-1865-4f93-ae2f-cc7046c708cd-config\") pod \"dnsmasq-dns-78dd6ddcc-dcqvm\" (UID: \"e5273a22-1865-4f93-ae2f-cc7046c708cd\") " pod="openstack/dnsmasq-dns-78dd6ddcc-dcqvm" Dec 02 22:59:29 crc kubenswrapper[4696]: I1202 22:59:29.462173 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e5273a22-1865-4f93-ae2f-cc7046c708cd-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-dcqvm\" (UID: \"e5273a22-1865-4f93-ae2f-cc7046c708cd\") " pod="openstack/dnsmasq-dns-78dd6ddcc-dcqvm" Dec 02 22:59:29 crc kubenswrapper[4696]: I1202 22:59:29.494860 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mv57x\" (UniqueName: \"kubernetes.io/projected/d245d496-d80f-4455-a66e-cc788aff5b35-kube-api-access-mv57x\") pod \"dnsmasq-dns-675f4bcbfc-2bftf\" (UID: \"d245d496-d80f-4455-a66e-cc788aff5b35\") " pod="openstack/dnsmasq-dns-675f4bcbfc-2bftf" Dec 02 22:59:29 crc kubenswrapper[4696]: I1202 22:59:29.495742 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-756sc\" (UniqueName: \"kubernetes.io/projected/e5273a22-1865-4f93-ae2f-cc7046c708cd-kube-api-access-756sc\") pod \"dnsmasq-dns-78dd6ddcc-dcqvm\" (UID: \"e5273a22-1865-4f93-ae2f-cc7046c708cd\") " pod="openstack/dnsmasq-dns-78dd6ddcc-dcqvm" Dec 02 22:59:29 crc kubenswrapper[4696]: I1202 22:59:29.556055 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-2bftf" Dec 02 22:59:29 crc kubenswrapper[4696]: I1202 22:59:29.617044 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-dcqvm" Dec 02 22:59:30 crc kubenswrapper[4696]: I1202 22:59:30.124208 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-dcqvm"] Dec 02 22:59:30 crc kubenswrapper[4696]: W1202 22:59:30.134529 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode5273a22_1865_4f93_ae2f_cc7046c708cd.slice/crio-6159af933813b8732d03d45cade9197aaacf662a2f2a069c6cbb802120aefae3 WatchSource:0}: Error finding container 6159af933813b8732d03d45cade9197aaacf662a2f2a069c6cbb802120aefae3: Status 404 returned error can't find the container with id 6159af933813b8732d03d45cade9197aaacf662a2f2a069c6cbb802120aefae3 Dec 02 22:59:30 crc kubenswrapper[4696]: W1202 22:59:30.203969 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd245d496_d80f_4455_a66e_cc788aff5b35.slice/crio-f6e9678be081afab8b58067fe8bb81b12f26d93289411399695fd2149a0f1dab WatchSource:0}: Error finding container f6e9678be081afab8b58067fe8bb81b12f26d93289411399695fd2149a0f1dab: Status 404 returned error can't find the container with id f6e9678be081afab8b58067fe8bb81b12f26d93289411399695fd2149a0f1dab Dec 02 22:59:30 crc kubenswrapper[4696]: I1202 22:59:30.205575 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/dnsmasq-dns-675f4bcbfc-2bftf"] Dec 02 22:59:30 crc kubenswrapper[4696]: I1202 22:59:30.271121 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-dcqvm" event={"ID":"e5273a22-1865-4f93-ae2f-cc7046c708cd","Type":"ContainerStarted","Data":"6159af933813b8732d03d45cade9197aaacf662a2f2a069c6cbb802120aefae3"} Dec 02 22:59:30 crc kubenswrapper[4696]: I1202 22:59:30.273066 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-2bftf" event={"ID":"d245d496-d80f-4455-a66e-cc788aff5b35","Type":"ContainerStarted","Data":"f6e9678be081afab8b58067fe8bb81b12f26d93289411399695fd2149a0f1dab"} Dec 02 22:59:32 crc kubenswrapper[4696]: I1202 22:59:32.110773 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-2bftf"] Dec 02 22:59:32 crc kubenswrapper[4696]: I1202 22:59:32.152580 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-vrq4f"] Dec 02 22:59:32 crc kubenswrapper[4696]: I1202 22:59:32.154230 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-vrq4f" Dec 02 22:59:32 crc kubenswrapper[4696]: I1202 22:59:32.167914 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-vrq4f"] Dec 02 22:59:32 crc kubenswrapper[4696]: I1202 22:59:32.231538 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t44s2\" (UniqueName: \"kubernetes.io/projected/058142c0-4700-4160-857b-0b016c768a72-kube-api-access-t44s2\") pod \"dnsmasq-dns-666b6646f7-vrq4f\" (UID: \"058142c0-4700-4160-857b-0b016c768a72\") " pod="openstack/dnsmasq-dns-666b6646f7-vrq4f" Dec 02 22:59:32 crc kubenswrapper[4696]: I1202 22:59:32.231657 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/058142c0-4700-4160-857b-0b016c768a72-config\") pod \"dnsmasq-dns-666b6646f7-vrq4f\" (UID: \"058142c0-4700-4160-857b-0b016c768a72\") " pod="openstack/dnsmasq-dns-666b6646f7-vrq4f" Dec 02 22:59:32 crc kubenswrapper[4696]: I1202 22:59:32.231731 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/058142c0-4700-4160-857b-0b016c768a72-dns-svc\") pod \"dnsmasq-dns-666b6646f7-vrq4f\" (UID: \"058142c0-4700-4160-857b-0b016c768a72\") " pod="openstack/dnsmasq-dns-666b6646f7-vrq4f" Dec 02 22:59:32 crc kubenswrapper[4696]: I1202 22:59:32.333782 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/058142c0-4700-4160-857b-0b016c768a72-config\") pod \"dnsmasq-dns-666b6646f7-vrq4f\" (UID: \"058142c0-4700-4160-857b-0b016c768a72\") " pod="openstack/dnsmasq-dns-666b6646f7-vrq4f" Dec 02 22:59:32 crc kubenswrapper[4696]: I1202 22:59:32.335724 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/058142c0-4700-4160-857b-0b016c768a72-dns-svc\") pod \"dnsmasq-dns-666b6646f7-vrq4f\" (UID: \"058142c0-4700-4160-857b-0b016c768a72\") " pod="openstack/dnsmasq-dns-666b6646f7-vrq4f" Dec 02 22:59:32 crc kubenswrapper[4696]: I1202 22:59:32.335942 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t44s2\" (UniqueName: \"kubernetes.io/projected/058142c0-4700-4160-857b-0b016c768a72-kube-api-access-t44s2\") pod \"dnsmasq-dns-666b6646f7-vrq4f\" (UID: \"058142c0-4700-4160-857b-0b016c768a72\") " pod="openstack/dnsmasq-dns-666b6646f7-vrq4f" Dec 02 22:59:32 crc kubenswrapper[4696]: I1202 22:59:32.337806 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/058142c0-4700-4160-857b-0b016c768a72-config\") pod \"dnsmasq-dns-666b6646f7-vrq4f\" (UID: \"058142c0-4700-4160-857b-0b016c768a72\") " pod="openstack/dnsmasq-dns-666b6646f7-vrq4f" Dec 02 22:59:32 crc kubenswrapper[4696]: I1202 22:59:32.338224 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/058142c0-4700-4160-857b-0b016c768a72-dns-svc\") pod \"dnsmasq-dns-666b6646f7-vrq4f\" (UID: \"058142c0-4700-4160-857b-0b016c768a72\") " pod="openstack/dnsmasq-dns-666b6646f7-vrq4f" Dec 02 22:59:32 crc kubenswrapper[4696]: I1202 22:59:32.362579 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t44s2\" (UniqueName: \"kubernetes.io/projected/058142c0-4700-4160-857b-0b016c768a72-kube-api-access-t44s2\") pod \"dnsmasq-dns-666b6646f7-vrq4f\" (UID: \"058142c0-4700-4160-857b-0b016c768a72\") " pod="openstack/dnsmasq-dns-666b6646f7-vrq4f" Dec 02 22:59:32 crc kubenswrapper[4696]: I1202 22:59:32.497486 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-vrq4f" Dec 02 22:59:32 crc kubenswrapper[4696]: I1202 22:59:32.506448 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-dcqvm"] Dec 02 22:59:32 crc kubenswrapper[4696]: I1202 22:59:32.524631 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-njzlw"] Dec 02 22:59:32 crc kubenswrapper[4696]: I1202 22:59:32.529420 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-njzlw" Dec 02 22:59:32 crc kubenswrapper[4696]: I1202 22:59:32.550721 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-njzlw"] Dec 02 22:59:32 crc kubenswrapper[4696]: I1202 22:59:32.648914 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/62dbb214-98c9-4d52-884b-45804bcc612c-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-njzlw\" (UID: \"62dbb214-98c9-4d52-884b-45804bcc612c\") " pod="openstack/dnsmasq-dns-57d769cc4f-njzlw" Dec 02 22:59:32 crc kubenswrapper[4696]: I1202 22:59:32.649064 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62dbb214-98c9-4d52-884b-45804bcc612c-config\") pod \"dnsmasq-dns-57d769cc4f-njzlw\" (UID: \"62dbb214-98c9-4d52-884b-45804bcc612c\") " pod="openstack/dnsmasq-dns-57d769cc4f-njzlw" Dec 02 22:59:32 crc kubenswrapper[4696]: I1202 22:59:32.649123 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrg2t\" (UniqueName: \"kubernetes.io/projected/62dbb214-98c9-4d52-884b-45804bcc612c-kube-api-access-hrg2t\") pod \"dnsmasq-dns-57d769cc4f-njzlw\" (UID: \"62dbb214-98c9-4d52-884b-45804bcc612c\") " pod="openstack/dnsmasq-dns-57d769cc4f-njzlw" Dec 02 22:59:32 crc kubenswrapper[4696]: I1202 
22:59:32.750943 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/62dbb214-98c9-4d52-884b-45804bcc612c-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-njzlw\" (UID: \"62dbb214-98c9-4d52-884b-45804bcc612c\") " pod="openstack/dnsmasq-dns-57d769cc4f-njzlw" Dec 02 22:59:32 crc kubenswrapper[4696]: I1202 22:59:32.751810 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62dbb214-98c9-4d52-884b-45804bcc612c-config\") pod \"dnsmasq-dns-57d769cc4f-njzlw\" (UID: \"62dbb214-98c9-4d52-884b-45804bcc612c\") " pod="openstack/dnsmasq-dns-57d769cc4f-njzlw" Dec 02 22:59:32 crc kubenswrapper[4696]: I1202 22:59:32.753044 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/62dbb214-98c9-4d52-884b-45804bcc612c-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-njzlw\" (UID: \"62dbb214-98c9-4d52-884b-45804bcc612c\") " pod="openstack/dnsmasq-dns-57d769cc4f-njzlw" Dec 02 22:59:32 crc kubenswrapper[4696]: I1202 22:59:32.753145 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62dbb214-98c9-4d52-884b-45804bcc612c-config\") pod \"dnsmasq-dns-57d769cc4f-njzlw\" (UID: \"62dbb214-98c9-4d52-884b-45804bcc612c\") " pod="openstack/dnsmasq-dns-57d769cc4f-njzlw" Dec 02 22:59:32 crc kubenswrapper[4696]: I1202 22:59:32.753215 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrg2t\" (UniqueName: \"kubernetes.io/projected/62dbb214-98c9-4d52-884b-45804bcc612c-kube-api-access-hrg2t\") pod \"dnsmasq-dns-57d769cc4f-njzlw\" (UID: \"62dbb214-98c9-4d52-884b-45804bcc612c\") " pod="openstack/dnsmasq-dns-57d769cc4f-njzlw" Dec 02 22:59:32 crc kubenswrapper[4696]: I1202 22:59:32.820594 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrg2t\" 
(UniqueName: \"kubernetes.io/projected/62dbb214-98c9-4d52-884b-45804bcc612c-kube-api-access-hrg2t\") pod \"dnsmasq-dns-57d769cc4f-njzlw\" (UID: \"62dbb214-98c9-4d52-884b-45804bcc612c\") " pod="openstack/dnsmasq-dns-57d769cc4f-njzlw" Dec 02 22:59:32 crc kubenswrapper[4696]: I1202 22:59:32.909120 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-njzlw" Dec 02 22:59:33 crc kubenswrapper[4696]: I1202 22:59:33.255842 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-vrq4f"] Dec 02 22:59:33 crc kubenswrapper[4696]: I1202 22:59:33.324439 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 02 22:59:33 crc kubenswrapper[4696]: I1202 22:59:33.333904 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 02 22:59:33 crc kubenswrapper[4696]: I1202 22:59:33.334372 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 02 22:59:33 crc kubenswrapper[4696]: I1202 22:59:33.342996 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 02 22:59:33 crc kubenswrapper[4696]: I1202 22:59:33.343524 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Dec 02 22:59:33 crc kubenswrapper[4696]: I1202 22:59:33.343592 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 02 22:59:33 crc kubenswrapper[4696]: I1202 22:59:33.343991 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Dec 02 22:59:33 crc kubenswrapper[4696]: I1202 22:59:33.344123 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 02 22:59:33 crc kubenswrapper[4696]: I1202 22:59:33.344165 4696 reflector.go:368] Caches populated for 
*v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-nd7wm" Dec 02 22:59:33 crc kubenswrapper[4696]: I1202 22:59:33.346225 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-vrq4f" event={"ID":"058142c0-4700-4160-857b-0b016c768a72","Type":"ContainerStarted","Data":"9c0c84a318a1782facba5e183682c37b84fc298b3240b08de16f340c6aef4d96"} Dec 02 22:59:33 crc kubenswrapper[4696]: I1202 22:59:33.349176 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 02 22:59:33 crc kubenswrapper[4696]: I1202 22:59:33.468582 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/aa9b93b0-4131-4a4b-a1a8-27ccf68716c0-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"aa9b93b0-4131-4a4b-a1a8-27ccf68716c0\") " pod="openstack/rabbitmq-server-0" Dec 02 22:59:33 crc kubenswrapper[4696]: I1202 22:59:33.468641 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/aa9b93b0-4131-4a4b-a1a8-27ccf68716c0-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"aa9b93b0-4131-4a4b-a1a8-27ccf68716c0\") " pod="openstack/rabbitmq-server-0" Dec 02 22:59:33 crc kubenswrapper[4696]: I1202 22:59:33.468668 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/aa9b93b0-4131-4a4b-a1a8-27ccf68716c0-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"aa9b93b0-4131-4a4b-a1a8-27ccf68716c0\") " pod="openstack/rabbitmq-server-0" Dec 02 22:59:33 crc kubenswrapper[4696]: I1202 22:59:33.468695 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/aa9b93b0-4131-4a4b-a1a8-27ccf68716c0-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"aa9b93b0-4131-4a4b-a1a8-27ccf68716c0\") " pod="openstack/rabbitmq-server-0" Dec 02 22:59:33 crc kubenswrapper[4696]: I1202 22:59:33.468718 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/aa9b93b0-4131-4a4b-a1a8-27ccf68716c0-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"aa9b93b0-4131-4a4b-a1a8-27ccf68716c0\") " pod="openstack/rabbitmq-server-0" Dec 02 22:59:33 crc kubenswrapper[4696]: I1202 22:59:33.468740 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vn9k\" (UniqueName: \"kubernetes.io/projected/aa9b93b0-4131-4a4b-a1a8-27ccf68716c0-kube-api-access-5vn9k\") pod \"rabbitmq-server-0\" (UID: \"aa9b93b0-4131-4a4b-a1a8-27ccf68716c0\") " pod="openstack/rabbitmq-server-0" Dec 02 22:59:33 crc kubenswrapper[4696]: I1202 22:59:33.468770 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/aa9b93b0-4131-4a4b-a1a8-27ccf68716c0-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"aa9b93b0-4131-4a4b-a1a8-27ccf68716c0\") " pod="openstack/rabbitmq-server-0" Dec 02 22:59:33 crc kubenswrapper[4696]: I1202 22:59:33.468804 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/aa9b93b0-4131-4a4b-a1a8-27ccf68716c0-config-data\") pod \"rabbitmq-server-0\" (UID: \"aa9b93b0-4131-4a4b-a1a8-27ccf68716c0\") " pod="openstack/rabbitmq-server-0" Dec 02 22:59:33 crc kubenswrapper[4696]: I1202 22:59:33.468835 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/aa9b93b0-4131-4a4b-a1a8-27ccf68716c0-pod-info\") pod \"rabbitmq-server-0\" (UID: \"aa9b93b0-4131-4a4b-a1a8-27ccf68716c0\") " pod="openstack/rabbitmq-server-0" Dec 02 22:59:33 crc kubenswrapper[4696]: I1202 22:59:33.468857 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"aa9b93b0-4131-4a4b-a1a8-27ccf68716c0\") " pod="openstack/rabbitmq-server-0" Dec 02 22:59:33 crc kubenswrapper[4696]: I1202 22:59:33.468880 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/aa9b93b0-4131-4a4b-a1a8-27ccf68716c0-server-conf\") pod \"rabbitmq-server-0\" (UID: \"aa9b93b0-4131-4a4b-a1a8-27ccf68716c0\") " pod="openstack/rabbitmq-server-0" Dec 02 22:59:33 crc kubenswrapper[4696]: I1202 22:59:33.535887 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-njzlw"] Dec 02 22:59:33 crc kubenswrapper[4696]: W1202 22:59:33.557317 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod62dbb214_98c9_4d52_884b_45804bcc612c.slice/crio-0785993c643c6e8b5fba249ca02dc471648602e7cc0718997c84e526fdf79fb6 WatchSource:0}: Error finding container 0785993c643c6e8b5fba249ca02dc471648602e7cc0718997c84e526fdf79fb6: Status 404 returned error can't find the container with id 0785993c643c6e8b5fba249ca02dc471648602e7cc0718997c84e526fdf79fb6 Dec 02 22:59:33 crc kubenswrapper[4696]: I1202 22:59:33.570926 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/aa9b93b0-4131-4a4b-a1a8-27ccf68716c0-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"aa9b93b0-4131-4a4b-a1a8-27ccf68716c0\") " 
pod="openstack/rabbitmq-server-0" Dec 02 22:59:33 crc kubenswrapper[4696]: I1202 22:59:33.570985 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/aa9b93b0-4131-4a4b-a1a8-27ccf68716c0-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"aa9b93b0-4131-4a4b-a1a8-27ccf68716c0\") " pod="openstack/rabbitmq-server-0" Dec 02 22:59:33 crc kubenswrapper[4696]: I1202 22:59:33.571024 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/aa9b93b0-4131-4a4b-a1a8-27ccf68716c0-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"aa9b93b0-4131-4a4b-a1a8-27ccf68716c0\") " pod="openstack/rabbitmq-server-0" Dec 02 22:59:33 crc kubenswrapper[4696]: I1202 22:59:33.571059 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/aa9b93b0-4131-4a4b-a1a8-27ccf68716c0-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"aa9b93b0-4131-4a4b-a1a8-27ccf68716c0\") " pod="openstack/rabbitmq-server-0" Dec 02 22:59:33 crc kubenswrapper[4696]: I1202 22:59:33.571088 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vn9k\" (UniqueName: \"kubernetes.io/projected/aa9b93b0-4131-4a4b-a1a8-27ccf68716c0-kube-api-access-5vn9k\") pod \"rabbitmq-server-0\" (UID: \"aa9b93b0-4131-4a4b-a1a8-27ccf68716c0\") " pod="openstack/rabbitmq-server-0" Dec 02 22:59:33 crc kubenswrapper[4696]: I1202 22:59:33.571113 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/aa9b93b0-4131-4a4b-a1a8-27ccf68716c0-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"aa9b93b0-4131-4a4b-a1a8-27ccf68716c0\") " pod="openstack/rabbitmq-server-0" Dec 02 22:59:33 crc kubenswrapper[4696]: I1202 22:59:33.571168 4696 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/aa9b93b0-4131-4a4b-a1a8-27ccf68716c0-config-data\") pod \"rabbitmq-server-0\" (UID: \"aa9b93b0-4131-4a4b-a1a8-27ccf68716c0\") " pod="openstack/rabbitmq-server-0" Dec 02 22:59:33 crc kubenswrapper[4696]: I1202 22:59:33.571211 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/aa9b93b0-4131-4a4b-a1a8-27ccf68716c0-pod-info\") pod \"rabbitmq-server-0\" (UID: \"aa9b93b0-4131-4a4b-a1a8-27ccf68716c0\") " pod="openstack/rabbitmq-server-0" Dec 02 22:59:33 crc kubenswrapper[4696]: I1202 22:59:33.571244 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"aa9b93b0-4131-4a4b-a1a8-27ccf68716c0\") " pod="openstack/rabbitmq-server-0" Dec 02 22:59:33 crc kubenswrapper[4696]: I1202 22:59:33.571287 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/aa9b93b0-4131-4a4b-a1a8-27ccf68716c0-server-conf\") pod \"rabbitmq-server-0\" (UID: \"aa9b93b0-4131-4a4b-a1a8-27ccf68716c0\") " pod="openstack/rabbitmq-server-0" Dec 02 22:59:33 crc kubenswrapper[4696]: I1202 22:59:33.571364 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/aa9b93b0-4131-4a4b-a1a8-27ccf68716c0-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"aa9b93b0-4131-4a4b-a1a8-27ccf68716c0\") " pod="openstack/rabbitmq-server-0" Dec 02 22:59:33 crc kubenswrapper[4696]: I1202 22:59:33.572005 4696 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: 
\"aa9b93b0-4131-4a4b-a1a8-27ccf68716c0\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/rabbitmq-server-0" Dec 02 22:59:33 crc kubenswrapper[4696]: I1202 22:59:33.572654 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/aa9b93b0-4131-4a4b-a1a8-27ccf68716c0-server-conf\") pod \"rabbitmq-server-0\" (UID: \"aa9b93b0-4131-4a4b-a1a8-27ccf68716c0\") " pod="openstack/rabbitmq-server-0" Dec 02 22:59:33 crc kubenswrapper[4696]: I1202 22:59:33.573348 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/aa9b93b0-4131-4a4b-a1a8-27ccf68716c0-config-data\") pod \"rabbitmq-server-0\" (UID: \"aa9b93b0-4131-4a4b-a1a8-27ccf68716c0\") " pod="openstack/rabbitmq-server-0" Dec 02 22:59:33 crc kubenswrapper[4696]: I1202 22:59:33.573573 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/aa9b93b0-4131-4a4b-a1a8-27ccf68716c0-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"aa9b93b0-4131-4a4b-a1a8-27ccf68716c0\") " pod="openstack/rabbitmq-server-0" Dec 02 22:59:33 crc kubenswrapper[4696]: I1202 22:59:33.575930 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/aa9b93b0-4131-4a4b-a1a8-27ccf68716c0-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"aa9b93b0-4131-4a4b-a1a8-27ccf68716c0\") " pod="openstack/rabbitmq-server-0" Dec 02 22:59:33 crc kubenswrapper[4696]: I1202 22:59:33.576485 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/aa9b93b0-4131-4a4b-a1a8-27ccf68716c0-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"aa9b93b0-4131-4a4b-a1a8-27ccf68716c0\") " pod="openstack/rabbitmq-server-0" Dec 02 22:59:33 crc kubenswrapper[4696]: I1202 22:59:33.579849 4696 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/aa9b93b0-4131-4a4b-a1a8-27ccf68716c0-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"aa9b93b0-4131-4a4b-a1a8-27ccf68716c0\") " pod="openstack/rabbitmq-server-0" Dec 02 22:59:33 crc kubenswrapper[4696]: I1202 22:59:33.594035 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/aa9b93b0-4131-4a4b-a1a8-27ccf68716c0-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"aa9b93b0-4131-4a4b-a1a8-27ccf68716c0\") " pod="openstack/rabbitmq-server-0" Dec 02 22:59:33 crc kubenswrapper[4696]: I1202 22:59:33.594171 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/aa9b93b0-4131-4a4b-a1a8-27ccf68716c0-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"aa9b93b0-4131-4a4b-a1a8-27ccf68716c0\") " pod="openstack/rabbitmq-server-0" Dec 02 22:59:33 crc kubenswrapper[4696]: I1202 22:59:33.594406 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/aa9b93b0-4131-4a4b-a1a8-27ccf68716c0-pod-info\") pod \"rabbitmq-server-0\" (UID: \"aa9b93b0-4131-4a4b-a1a8-27ccf68716c0\") " pod="openstack/rabbitmq-server-0" Dec 02 22:59:33 crc kubenswrapper[4696]: I1202 22:59:33.597210 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vn9k\" (UniqueName: \"kubernetes.io/projected/aa9b93b0-4131-4a4b-a1a8-27ccf68716c0-kube-api-access-5vn9k\") pod \"rabbitmq-server-0\" (UID: \"aa9b93b0-4131-4a4b-a1a8-27ccf68716c0\") " pod="openstack/rabbitmq-server-0" Dec 02 22:59:33 crc kubenswrapper[4696]: I1202 22:59:33.623868 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: 
\"aa9b93b0-4131-4a4b-a1a8-27ccf68716c0\") " pod="openstack/rabbitmq-server-0" Dec 02 22:59:33 crc kubenswrapper[4696]: I1202 22:59:33.662618 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 02 22:59:33 crc kubenswrapper[4696]: I1202 22:59:33.738999 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 02 22:59:33 crc kubenswrapper[4696]: I1202 22:59:33.743179 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 02 22:59:33 crc kubenswrapper[4696]: I1202 22:59:33.748971 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Dec 02 22:59:33 crc kubenswrapper[4696]: I1202 22:59:33.749120 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 02 22:59:33 crc kubenswrapper[4696]: I1202 22:59:33.749532 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 02 22:59:33 crc kubenswrapper[4696]: I1202 22:59:33.749787 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 02 22:59:33 crc kubenswrapper[4696]: I1202 22:59:33.749956 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Dec 02 22:59:33 crc kubenswrapper[4696]: I1202 22:59:33.750116 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 02 22:59:33 crc kubenswrapper[4696]: I1202 22:59:33.750438 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-wr8xb" Dec 02 22:59:33 crc kubenswrapper[4696]: I1202 22:59:33.752615 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 02 22:59:33 crc 
kubenswrapper[4696]: I1202 22:59:33.774403 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/aa29c81c-0a87-47f5-be45-8a0e5b083758-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"aa29c81c-0a87-47f5-be45-8a0e5b083758\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 02 22:59:33 crc kubenswrapper[4696]: I1202 22:59:33.774477 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/aa29c81c-0a87-47f5-be45-8a0e5b083758-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"aa29c81c-0a87-47f5-be45-8a0e5b083758\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 02 22:59:33 crc kubenswrapper[4696]: I1202 22:59:33.774518 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/aa29c81c-0a87-47f5-be45-8a0e5b083758-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"aa29c81c-0a87-47f5-be45-8a0e5b083758\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 02 22:59:33 crc kubenswrapper[4696]: I1202 22:59:33.774552 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/aa29c81c-0a87-47f5-be45-8a0e5b083758-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"aa29c81c-0a87-47f5-be45-8a0e5b083758\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 02 22:59:33 crc kubenswrapper[4696]: I1202 22:59:33.774607 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/aa29c81c-0a87-47f5-be45-8a0e5b083758-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"aa29c81c-0a87-47f5-be45-8a0e5b083758\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 02 22:59:33 crc kubenswrapper[4696]: I1202 22:59:33.774628 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24wt8\" (UniqueName: \"kubernetes.io/projected/aa29c81c-0a87-47f5-be45-8a0e5b083758-kube-api-access-24wt8\") pod \"rabbitmq-cell1-server-0\" (UID: \"aa29c81c-0a87-47f5-be45-8a0e5b083758\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 02 22:59:33 crc kubenswrapper[4696]: I1202 22:59:33.774664 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"aa29c81c-0a87-47f5-be45-8a0e5b083758\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 02 22:59:33 crc kubenswrapper[4696]: I1202 22:59:33.776617 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/aa29c81c-0a87-47f5-be45-8a0e5b083758-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"aa29c81c-0a87-47f5-be45-8a0e5b083758\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 02 22:59:33 crc kubenswrapper[4696]: I1202 22:59:33.776641 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/aa29c81c-0a87-47f5-be45-8a0e5b083758-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"aa29c81c-0a87-47f5-be45-8a0e5b083758\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 02 22:59:33 crc kubenswrapper[4696]: I1202 22:59:33.776686 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/aa29c81c-0a87-47f5-be45-8a0e5b083758-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"aa29c81c-0a87-47f5-be45-8a0e5b083758\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 02 22:59:33 crc kubenswrapper[4696]: I1202 22:59:33.776708 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/aa29c81c-0a87-47f5-be45-8a0e5b083758-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"aa29c81c-0a87-47f5-be45-8a0e5b083758\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 02 22:59:33 crc kubenswrapper[4696]: I1202 22:59:33.881197 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/aa29c81c-0a87-47f5-be45-8a0e5b083758-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"aa29c81c-0a87-47f5-be45-8a0e5b083758\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 02 22:59:33 crc kubenswrapper[4696]: I1202 22:59:33.881293 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/aa29c81c-0a87-47f5-be45-8a0e5b083758-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"aa29c81c-0a87-47f5-be45-8a0e5b083758\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 02 22:59:33 crc kubenswrapper[4696]: I1202 22:59:33.881345 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/aa29c81c-0a87-47f5-be45-8a0e5b083758-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"aa29c81c-0a87-47f5-be45-8a0e5b083758\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 02 22:59:33 crc kubenswrapper[4696]: I1202 22:59:33.881366 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24wt8\" (UniqueName: \"kubernetes.io/projected/aa29c81c-0a87-47f5-be45-8a0e5b083758-kube-api-access-24wt8\") pod \"rabbitmq-cell1-server-0\" (UID: \"aa29c81c-0a87-47f5-be45-8a0e5b083758\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 02 22:59:33 crc kubenswrapper[4696]: I1202 22:59:33.881392 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"aa29c81c-0a87-47f5-be45-8a0e5b083758\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 02 22:59:33 crc kubenswrapper[4696]: I1202 22:59:33.881417 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/aa29c81c-0a87-47f5-be45-8a0e5b083758-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"aa29c81c-0a87-47f5-be45-8a0e5b083758\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 02 22:59:33 crc kubenswrapper[4696]: I1202 22:59:33.881437 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/aa29c81c-0a87-47f5-be45-8a0e5b083758-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"aa29c81c-0a87-47f5-be45-8a0e5b083758\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 02 22:59:33 crc kubenswrapper[4696]: I1202 22:59:33.881470 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/aa29c81c-0a87-47f5-be45-8a0e5b083758-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"aa29c81c-0a87-47f5-be45-8a0e5b083758\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 02 22:59:33 crc kubenswrapper[4696]: I1202 22:59:33.881487 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/aa29c81c-0a87-47f5-be45-8a0e5b083758-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"aa29c81c-0a87-47f5-be45-8a0e5b083758\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 02 22:59:33 crc kubenswrapper[4696]: I1202 22:59:33.881522 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/aa29c81c-0a87-47f5-be45-8a0e5b083758-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"aa29c81c-0a87-47f5-be45-8a0e5b083758\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 02 22:59:33 crc kubenswrapper[4696]: I1202 22:59:33.881581 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/aa29c81c-0a87-47f5-be45-8a0e5b083758-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"aa29c81c-0a87-47f5-be45-8a0e5b083758\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 02 22:59:33 crc kubenswrapper[4696]: I1202 22:59:33.883147 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/aa29c81c-0a87-47f5-be45-8a0e5b083758-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"aa29c81c-0a87-47f5-be45-8a0e5b083758\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 02 22:59:33 crc kubenswrapper[4696]: I1202 22:59:33.883510 4696 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"aa29c81c-0a87-47f5-be45-8a0e5b083758\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/rabbitmq-cell1-server-0"
Dec 02 22:59:33 crc kubenswrapper[4696]: I1202 22:59:33.885677 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/aa29c81c-0a87-47f5-be45-8a0e5b083758-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"aa29c81c-0a87-47f5-be45-8a0e5b083758\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 02 22:59:33 crc kubenswrapper[4696]: I1202 22:59:33.885944 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/aa29c81c-0a87-47f5-be45-8a0e5b083758-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"aa29c81c-0a87-47f5-be45-8a0e5b083758\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 02 22:59:33 crc kubenswrapper[4696]: I1202 22:59:33.886308 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/aa29c81c-0a87-47f5-be45-8a0e5b083758-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"aa29c81c-0a87-47f5-be45-8a0e5b083758\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 02 22:59:33 crc kubenswrapper[4696]: I1202 22:59:33.888511 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/aa29c81c-0a87-47f5-be45-8a0e5b083758-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"aa29c81c-0a87-47f5-be45-8a0e5b083758\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 02 22:59:33 crc kubenswrapper[4696]: I1202 22:59:33.894628 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/aa29c81c-0a87-47f5-be45-8a0e5b083758-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"aa29c81c-0a87-47f5-be45-8a0e5b083758\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 02 22:59:33 crc kubenswrapper[4696]: I1202 22:59:33.896098 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/aa29c81c-0a87-47f5-be45-8a0e5b083758-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"aa29c81c-0a87-47f5-be45-8a0e5b083758\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 02 22:59:33 crc kubenswrapper[4696]: I1202 22:59:33.905331 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/aa29c81c-0a87-47f5-be45-8a0e5b083758-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"aa29c81c-0a87-47f5-be45-8a0e5b083758\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 02 22:59:33 crc kubenswrapper[4696]: I1202 22:59:33.905889 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24wt8\" (UniqueName: \"kubernetes.io/projected/aa29c81c-0a87-47f5-be45-8a0e5b083758-kube-api-access-24wt8\") pod \"rabbitmq-cell1-server-0\" (UID: \"aa29c81c-0a87-47f5-be45-8a0e5b083758\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 02 22:59:33 crc kubenswrapper[4696]: I1202 22:59:33.906009 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/aa29c81c-0a87-47f5-be45-8a0e5b083758-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"aa29c81c-0a87-47f5-be45-8a0e5b083758\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 02 22:59:33 crc kubenswrapper[4696]: I1202 22:59:33.923220 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"aa29c81c-0a87-47f5-be45-8a0e5b083758\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 02 22:59:34 crc kubenswrapper[4696]: I1202 22:59:34.091725 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Dec 02 22:59:34 crc kubenswrapper[4696]: I1202 22:59:34.283581 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Dec 02 22:59:34 crc kubenswrapper[4696]: I1202 22:59:34.369989 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"aa9b93b0-4131-4a4b-a1a8-27ccf68716c0","Type":"ContainerStarted","Data":"2b76c58fbc961bfd186687cd4e13818ca91c224d9b75f3a495116953a54b9ffc"}
Dec 02 22:59:34 crc kubenswrapper[4696]: I1202 22:59:34.379529 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-njzlw" event={"ID":"62dbb214-98c9-4d52-884b-45804bcc612c","Type":"ContainerStarted","Data":"0785993c643c6e8b5fba249ca02dc471648602e7cc0718997c84e526fdf79fb6"}
Dec 02 22:59:34 crc kubenswrapper[4696]: I1202 22:59:34.730096 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Dec 02 22:59:35 crc kubenswrapper[4696]: I1202 22:59:35.004546 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"]
Dec 02 22:59:35 crc kubenswrapper[4696]: I1202 22:59:35.006293 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0"
Dec 02 22:59:35 crc kubenswrapper[4696]: I1202 22:59:35.015822 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-l8d6n"
Dec 02 22:59:35 crc kubenswrapper[4696]: I1202 22:59:35.015985 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc"
Dec 02 22:59:35 crc kubenswrapper[4696]: I1202 22:59:35.019883 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts"
Dec 02 22:59:35 crc kubenswrapper[4696]: I1202 22:59:35.032180 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"]
Dec 02 22:59:35 crc kubenswrapper[4696]: I1202 22:59:35.032718 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data"
Dec 02 22:59:35 crc kubenswrapper[4696]: I1202 22:59:35.033034 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle"
Dec 02 22:59:35 crc kubenswrapper[4696]: I1202 22:59:35.119032 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/af9932e1-c721-45b3-a213-93da4e130d05-kolla-config\") pod \"openstack-galera-0\" (UID: \"af9932e1-c721-45b3-a213-93da4e130d05\") " pod="openstack/openstack-galera-0"
Dec 02 22:59:35 crc kubenswrapper[4696]: I1202 22:59:35.119606 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"af9932e1-c721-45b3-a213-93da4e130d05\") " pod="openstack/openstack-galera-0"
Dec 02 22:59:35 crc kubenswrapper[4696]: I1202 22:59:35.119645 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/af9932e1-c721-45b3-a213-93da4e130d05-operator-scripts\") pod \"openstack-galera-0\" (UID: \"af9932e1-c721-45b3-a213-93da4e130d05\") " pod="openstack/openstack-galera-0"
Dec 02 22:59:35 crc kubenswrapper[4696]: I1202 22:59:35.119674 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/af9932e1-c721-45b3-a213-93da4e130d05-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"af9932e1-c721-45b3-a213-93da4e130d05\") " pod="openstack/openstack-galera-0"
Dec 02 22:59:35 crc kubenswrapper[4696]: I1202 22:59:35.119700 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pg5jp\" (UniqueName: \"kubernetes.io/projected/af9932e1-c721-45b3-a213-93da4e130d05-kube-api-access-pg5jp\") pod \"openstack-galera-0\" (UID: \"af9932e1-c721-45b3-a213-93da4e130d05\") " pod="openstack/openstack-galera-0"
Dec 02 22:59:35 crc kubenswrapper[4696]: I1202 22:59:35.119742 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af9932e1-c721-45b3-a213-93da4e130d05-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"af9932e1-c721-45b3-a213-93da4e130d05\") " pod="openstack/openstack-galera-0"
Dec 02 22:59:35 crc kubenswrapper[4696]: I1202 22:59:35.119890 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/af9932e1-c721-45b3-a213-93da4e130d05-config-data-generated\") pod \"openstack-galera-0\" (UID: \"af9932e1-c721-45b3-a213-93da4e130d05\") " pod="openstack/openstack-galera-0"
Dec 02 22:59:35 crc kubenswrapper[4696]: I1202 22:59:35.119932 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/af9932e1-c721-45b3-a213-93da4e130d05-config-data-default\") pod \"openstack-galera-0\" (UID: \"af9932e1-c721-45b3-a213-93da4e130d05\") " pod="openstack/openstack-galera-0"
Dec 02 22:59:35 crc kubenswrapper[4696]: I1202 22:59:35.222825 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/af9932e1-c721-45b3-a213-93da4e130d05-config-data-generated\") pod \"openstack-galera-0\" (UID: \"af9932e1-c721-45b3-a213-93da4e130d05\") " pod="openstack/openstack-galera-0"
Dec 02 22:59:35 crc kubenswrapper[4696]: I1202 22:59:35.222881 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/af9932e1-c721-45b3-a213-93da4e130d05-config-data-default\") pod \"openstack-galera-0\" (UID: \"af9932e1-c721-45b3-a213-93da4e130d05\") " pod="openstack/openstack-galera-0"
Dec 02 22:59:35 crc kubenswrapper[4696]: I1202 22:59:35.222927 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/af9932e1-c721-45b3-a213-93da4e130d05-kolla-config\") pod \"openstack-galera-0\" (UID: \"af9932e1-c721-45b3-a213-93da4e130d05\") " pod="openstack/openstack-galera-0"
Dec 02 22:59:35 crc kubenswrapper[4696]: I1202 22:59:35.222978 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"af9932e1-c721-45b3-a213-93da4e130d05\") " pod="openstack/openstack-galera-0"
Dec 02 22:59:35 crc kubenswrapper[4696]: I1202 22:59:35.222997 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/af9932e1-c721-45b3-a213-93da4e130d05-operator-scripts\") pod \"openstack-galera-0\" (UID: \"af9932e1-c721-45b3-a213-93da4e130d05\") " pod="openstack/openstack-galera-0"
Dec 02 22:59:35 crc kubenswrapper[4696]: I1202 22:59:35.223019 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/af9932e1-c721-45b3-a213-93da4e130d05-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"af9932e1-c721-45b3-a213-93da4e130d05\") " pod="openstack/openstack-galera-0"
Dec 02 22:59:35 crc kubenswrapper[4696]: I1202 22:59:35.223034 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pg5jp\" (UniqueName: \"kubernetes.io/projected/af9932e1-c721-45b3-a213-93da4e130d05-kube-api-access-pg5jp\") pod \"openstack-galera-0\" (UID: \"af9932e1-c721-45b3-a213-93da4e130d05\") " pod="openstack/openstack-galera-0"
Dec 02 22:59:35 crc kubenswrapper[4696]: I1202 22:59:35.223065 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af9932e1-c721-45b3-a213-93da4e130d05-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"af9932e1-c721-45b3-a213-93da4e130d05\") " pod="openstack/openstack-galera-0"
Dec 02 22:59:35 crc kubenswrapper[4696]: I1202 22:59:35.223515 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/af9932e1-c721-45b3-a213-93da4e130d05-config-data-generated\") pod \"openstack-galera-0\" (UID: \"af9932e1-c721-45b3-a213-93da4e130d05\") " pod="openstack/openstack-galera-0"
Dec 02 22:59:35 crc kubenswrapper[4696]: I1202 22:59:35.223976 4696 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"af9932e1-c721-45b3-a213-93da4e130d05\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/openstack-galera-0"
Dec 02 22:59:35 crc kubenswrapper[4696]: I1202 22:59:35.227028 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/af9932e1-c721-45b3-a213-93da4e130d05-kolla-config\") pod \"openstack-galera-0\" (UID: \"af9932e1-c721-45b3-a213-93da4e130d05\") " pod="openstack/openstack-galera-0"
Dec 02 22:59:35 crc kubenswrapper[4696]: I1202 22:59:35.227981 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/af9932e1-c721-45b3-a213-93da4e130d05-operator-scripts\") pod \"openstack-galera-0\" (UID: \"af9932e1-c721-45b3-a213-93da4e130d05\") " pod="openstack/openstack-galera-0"
Dec 02 22:59:35 crc kubenswrapper[4696]: I1202 22:59:35.230148 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/af9932e1-c721-45b3-a213-93da4e130d05-config-data-default\") pod \"openstack-galera-0\" (UID: \"af9932e1-c721-45b3-a213-93da4e130d05\") " pod="openstack/openstack-galera-0"
Dec 02 22:59:35 crc kubenswrapper[4696]: I1202 22:59:35.243887 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/af9932e1-c721-45b3-a213-93da4e130d05-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"af9932e1-c721-45b3-a213-93da4e130d05\") " pod="openstack/openstack-galera-0"
Dec 02 22:59:35 crc kubenswrapper[4696]: I1202 22:59:35.245942 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af9932e1-c721-45b3-a213-93da4e130d05-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"af9932e1-c721-45b3-a213-93da4e130d05\") " pod="openstack/openstack-galera-0"
Dec 02 22:59:35 crc kubenswrapper[4696]: I1202 22:59:35.252386 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"af9932e1-c721-45b3-a213-93da4e130d05\") " pod="openstack/openstack-galera-0"
Dec 02 22:59:35 crc kubenswrapper[4696]: I1202 22:59:35.253254 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pg5jp\" (UniqueName: \"kubernetes.io/projected/af9932e1-c721-45b3-a213-93da4e130d05-kube-api-access-pg5jp\") pod \"openstack-galera-0\" (UID: \"af9932e1-c721-45b3-a213-93da4e130d05\") " pod="openstack/openstack-galera-0"
Dec 02 22:59:35 crc kubenswrapper[4696]: I1202 22:59:35.356370 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0"
Dec 02 22:59:35 crc kubenswrapper[4696]: I1202 22:59:35.401890 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"aa29c81c-0a87-47f5-be45-8a0e5b083758","Type":"ContainerStarted","Data":"ab48093ee8cde2ab7545077bde8edcbe66f76ab1abedef263fe9077cf09f066d"}
Dec 02 22:59:36 crc kubenswrapper[4696]: I1202 22:59:36.333243 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"]
Dec 02 22:59:36 crc kubenswrapper[4696]: I1202 22:59:36.335466 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Dec 02 22:59:36 crc kubenswrapper[4696]: I1202 22:59:36.341224 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data"
Dec 02 22:59:36 crc kubenswrapper[4696]: I1202 22:59:36.341378 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc"
Dec 02 22:59:36 crc kubenswrapper[4696]: I1202 22:59:36.341463 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-fn4ln"
Dec 02 22:59:36 crc kubenswrapper[4696]: I1202 22:59:36.342030 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts"
Dec 02 22:59:36 crc kubenswrapper[4696]: I1202 22:59:36.349559 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"]
Dec 02 22:59:36 crc kubenswrapper[4696]: I1202 22:59:36.448995 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/dff19c4d-2106-4034-8c29-39429553a062-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"dff19c4d-2106-4034-8c29-39429553a062\") " pod="openstack/openstack-cell1-galera-0"
Dec 02 22:59:36 crc kubenswrapper[4696]: I1202 22:59:36.449177 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/dff19c4d-2106-4034-8c29-39429553a062-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"dff19c4d-2106-4034-8c29-39429553a062\") " pod="openstack/openstack-cell1-galera-0"
Dec 02 22:59:36 crc kubenswrapper[4696]: I1202 22:59:36.449231 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/dff19c4d-2106-4034-8c29-39429553a062-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"dff19c4d-2106-4034-8c29-39429553a062\") " pod="openstack/openstack-cell1-galera-0"
Dec 02 22:59:36 crc kubenswrapper[4696]: I1202 22:59:36.449288 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"dff19c4d-2106-4034-8c29-39429553a062\") " pod="openstack/openstack-cell1-galera-0"
Dec 02 22:59:36 crc kubenswrapper[4696]: I1202 22:59:36.449371 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dff19c4d-2106-4034-8c29-39429553a062-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"dff19c4d-2106-4034-8c29-39429553a062\") " pod="openstack/openstack-cell1-galera-0"
Dec 02 22:59:36 crc kubenswrapper[4696]: I1202 22:59:36.449389 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dff19c4d-2106-4034-8c29-39429553a062-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"dff19c4d-2106-4034-8c29-39429553a062\") " pod="openstack/openstack-cell1-galera-0"
Dec 02 22:59:36 crc kubenswrapper[4696]: I1202 22:59:36.449478 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/dff19c4d-2106-4034-8c29-39429553a062-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"dff19c4d-2106-4034-8c29-39429553a062\") " pod="openstack/openstack-cell1-galera-0"
Dec 02 22:59:36 crc kubenswrapper[4696]: I1202 22:59:36.449695 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xp46\" (UniqueName: \"kubernetes.io/projected/dff19c4d-2106-4034-8c29-39429553a062-kube-api-access-6xp46\") pod \"openstack-cell1-galera-0\" (UID: \"dff19c4d-2106-4034-8c29-39429553a062\") " pod="openstack/openstack-cell1-galera-0"
Dec 02 22:59:36 crc kubenswrapper[4696]: I1202 22:59:36.551673 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xp46\" (UniqueName: \"kubernetes.io/projected/dff19c4d-2106-4034-8c29-39429553a062-kube-api-access-6xp46\") pod \"openstack-cell1-galera-0\" (UID: \"dff19c4d-2106-4034-8c29-39429553a062\") " pod="openstack/openstack-cell1-galera-0"
Dec 02 22:59:36 crc kubenswrapper[4696]: I1202 22:59:36.551837 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/dff19c4d-2106-4034-8c29-39429553a062-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"dff19c4d-2106-4034-8c29-39429553a062\") " pod="openstack/openstack-cell1-galera-0"
Dec 02 22:59:36 crc kubenswrapper[4696]: I1202 22:59:36.551927 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/dff19c4d-2106-4034-8c29-39429553a062-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"dff19c4d-2106-4034-8c29-39429553a062\") " pod="openstack/openstack-cell1-galera-0"
Dec 02 22:59:36 crc kubenswrapper[4696]: I1202 22:59:36.552144 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/dff19c4d-2106-4034-8c29-39429553a062-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"dff19c4d-2106-4034-8c29-39429553a062\") " pod="openstack/openstack-cell1-galera-0"
Dec 02 22:59:36 crc kubenswrapper[4696]: I1202 22:59:36.552210 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"dff19c4d-2106-4034-8c29-39429553a062\") " pod="openstack/openstack-cell1-galera-0"
Dec 02 22:59:36 crc kubenswrapper[4696]: I1202 22:59:36.552248 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dff19c4d-2106-4034-8c29-39429553a062-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"dff19c4d-2106-4034-8c29-39429553a062\") " pod="openstack/openstack-cell1-galera-0"
Dec 02 22:59:36 crc kubenswrapper[4696]: I1202 22:59:36.552266 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dff19c4d-2106-4034-8c29-39429553a062-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"dff19c4d-2106-4034-8c29-39429553a062\") " pod="openstack/openstack-cell1-galera-0"
Dec 02 22:59:36 crc kubenswrapper[4696]: I1202 22:59:36.552309 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/dff19c4d-2106-4034-8c29-39429553a062-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"dff19c4d-2106-4034-8c29-39429553a062\") " pod="openstack/openstack-cell1-galera-0"
Dec 02 22:59:36 crc kubenswrapper[4696]: I1202 22:59:36.552962 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/dff19c4d-2106-4034-8c29-39429553a062-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"dff19c4d-2106-4034-8c29-39429553a062\") " pod="openstack/openstack-cell1-galera-0"
Dec 02 22:59:36 crc kubenswrapper[4696]: I1202 22:59:36.553678 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/dff19c4d-2106-4034-8c29-39429553a062-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"dff19c4d-2106-4034-8c29-39429553a062\") " pod="openstack/openstack-cell1-galera-0"
Dec 02 22:59:36 crc kubenswrapper[4696]: I1202 22:59:36.554311 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/dff19c4d-2106-4034-8c29-39429553a062-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"dff19c4d-2106-4034-8c29-39429553a062\") " pod="openstack/openstack-cell1-galera-0"
Dec 02 22:59:36 crc kubenswrapper[4696]: I1202 22:59:36.554590 4696 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"dff19c4d-2106-4034-8c29-39429553a062\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/openstack-cell1-galera-0"
Dec 02 22:59:36 crc kubenswrapper[4696]: I1202 22:59:36.557791 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dff19c4d-2106-4034-8c29-39429553a062-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"dff19c4d-2106-4034-8c29-39429553a062\") " pod="openstack/openstack-cell1-galera-0"
Dec 02 22:59:36 crc kubenswrapper[4696]: I1202 22:59:36.562377 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/dff19c4d-2106-4034-8c29-39429553a062-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"dff19c4d-2106-4034-8c29-39429553a062\") " pod="openstack/openstack-cell1-galera-0"
Dec 02 22:59:36 crc kubenswrapper[4696]: I1202 22:59:36.569070 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dff19c4d-2106-4034-8c29-39429553a062-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"dff19c4d-2106-4034-8c29-39429553a062\") " pod="openstack/openstack-cell1-galera-0"
Dec 02 22:59:36 crc kubenswrapper[4696]: I1202 22:59:36.598508 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xp46\" (UniqueName: \"kubernetes.io/projected/dff19c4d-2106-4034-8c29-39429553a062-kube-api-access-6xp46\") pod \"openstack-cell1-galera-0\" (UID: \"dff19c4d-2106-4034-8c29-39429553a062\") " pod="openstack/openstack-cell1-galera-0"
Dec 02 22:59:36 crc kubenswrapper[4696]: I1202 22:59:36.606197 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"dff19c4d-2106-4034-8c29-39429553a062\") " pod="openstack/openstack-cell1-galera-0"
Dec 02 22:59:36 crc kubenswrapper[4696]: I1202 22:59:36.638825 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"]
Dec 02 22:59:36 crc kubenswrapper[4696]: I1202 22:59:36.640372 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0"
Dec 02 22:59:36 crc kubenswrapper[4696]: I1202 22:59:36.646087 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data"
Dec 02 22:59:36 crc kubenswrapper[4696]: I1202 22:59:36.646377 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc"
Dec 02 22:59:36 crc kubenswrapper[4696]: I1202 22:59:36.646510 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-kpjxv"
Dec 02 22:59:36 crc kubenswrapper[4696]: I1202 22:59:36.653304 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"]
Dec 02 22:59:36 crc kubenswrapper[4696]: I1202 22:59:36.684624 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Dec 02 22:59:36 crc kubenswrapper[4696]: I1202 22:59:36.763391 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87c53ac9-38ab-43a7-b99e-29c47a69f818-combined-ca-bundle\") pod \"memcached-0\" (UID: \"87c53ac9-38ab-43a7-b99e-29c47a69f818\") " pod="openstack/memcached-0"
Dec 02 22:59:36 crc kubenswrapper[4696]: I1202 22:59:36.763473 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/87c53ac9-38ab-43a7-b99e-29c47a69f818-config-data\") pod \"memcached-0\" (UID: \"87c53ac9-38ab-43a7-b99e-29c47a69f818\") " pod="openstack/memcached-0"
Dec 02 22:59:36 crc kubenswrapper[4696]: I1202 22:59:36.763517 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/87c53ac9-38ab-43a7-b99e-29c47a69f818-kolla-config\") pod \"memcached-0\" (UID: \"87c53ac9-38ab-43a7-b99e-29c47a69f818\") " pod="openstack/memcached-0"
Dec 02 22:59:36 crc kubenswrapper[4696]: I1202 22:59:36.763539 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fw6pk\" (UniqueName: \"kubernetes.io/projected/87c53ac9-38ab-43a7-b99e-29c47a69f818-kube-api-access-fw6pk\") pod \"memcached-0\" (UID: \"87c53ac9-38ab-43a7-b99e-29c47a69f818\") " pod="openstack/memcached-0"
Dec 02 22:59:36 crc kubenswrapper[4696]: I1202 22:59:36.763588 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/87c53ac9-38ab-43a7-b99e-29c47a69f818-memcached-tls-certs\") pod \"memcached-0\" (UID: \"87c53ac9-38ab-43a7-b99e-29c47a69f818\") " pod="openstack/memcached-0"
Dec 02 22:59:36 crc kubenswrapper[4696]: I1202
22:59:36.867808 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87c53ac9-38ab-43a7-b99e-29c47a69f818-combined-ca-bundle\") pod \"memcached-0\" (UID: \"87c53ac9-38ab-43a7-b99e-29c47a69f818\") " pod="openstack/memcached-0" Dec 02 22:59:36 crc kubenswrapper[4696]: I1202 22:59:36.867922 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/87c53ac9-38ab-43a7-b99e-29c47a69f818-config-data\") pod \"memcached-0\" (UID: \"87c53ac9-38ab-43a7-b99e-29c47a69f818\") " pod="openstack/memcached-0" Dec 02 22:59:36 crc kubenswrapper[4696]: I1202 22:59:36.867987 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/87c53ac9-38ab-43a7-b99e-29c47a69f818-kolla-config\") pod \"memcached-0\" (UID: \"87c53ac9-38ab-43a7-b99e-29c47a69f818\") " pod="openstack/memcached-0" Dec 02 22:59:36 crc kubenswrapper[4696]: I1202 22:59:36.868020 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fw6pk\" (UniqueName: \"kubernetes.io/projected/87c53ac9-38ab-43a7-b99e-29c47a69f818-kube-api-access-fw6pk\") pod \"memcached-0\" (UID: \"87c53ac9-38ab-43a7-b99e-29c47a69f818\") " pod="openstack/memcached-0" Dec 02 22:59:36 crc kubenswrapper[4696]: I1202 22:59:36.868112 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/87c53ac9-38ab-43a7-b99e-29c47a69f818-memcached-tls-certs\") pod \"memcached-0\" (UID: \"87c53ac9-38ab-43a7-b99e-29c47a69f818\") " pod="openstack/memcached-0" Dec 02 22:59:36 crc kubenswrapper[4696]: I1202 22:59:36.869606 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/87c53ac9-38ab-43a7-b99e-29c47a69f818-kolla-config\") pod 
\"memcached-0\" (UID: \"87c53ac9-38ab-43a7-b99e-29c47a69f818\") " pod="openstack/memcached-0" Dec 02 22:59:36 crc kubenswrapper[4696]: I1202 22:59:36.869606 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/87c53ac9-38ab-43a7-b99e-29c47a69f818-config-data\") pod \"memcached-0\" (UID: \"87c53ac9-38ab-43a7-b99e-29c47a69f818\") " pod="openstack/memcached-0" Dec 02 22:59:36 crc kubenswrapper[4696]: I1202 22:59:36.873238 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/87c53ac9-38ab-43a7-b99e-29c47a69f818-memcached-tls-certs\") pod \"memcached-0\" (UID: \"87c53ac9-38ab-43a7-b99e-29c47a69f818\") " pod="openstack/memcached-0" Dec 02 22:59:36 crc kubenswrapper[4696]: I1202 22:59:36.896514 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87c53ac9-38ab-43a7-b99e-29c47a69f818-combined-ca-bundle\") pod \"memcached-0\" (UID: \"87c53ac9-38ab-43a7-b99e-29c47a69f818\") " pod="openstack/memcached-0" Dec 02 22:59:36 crc kubenswrapper[4696]: I1202 22:59:36.902931 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fw6pk\" (UniqueName: \"kubernetes.io/projected/87c53ac9-38ab-43a7-b99e-29c47a69f818-kube-api-access-fw6pk\") pod \"memcached-0\" (UID: \"87c53ac9-38ab-43a7-b99e-29c47a69f818\") " pod="openstack/memcached-0" Dec 02 22:59:37 crc kubenswrapper[4696]: I1202 22:59:37.010620 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Dec 02 22:59:38 crc kubenswrapper[4696]: I1202 22:59:38.658015 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 02 22:59:38 crc kubenswrapper[4696]: I1202 22:59:38.659568 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 02 22:59:38 crc kubenswrapper[4696]: I1202 22:59:38.675840 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-b97qv" Dec 02 22:59:38 crc kubenswrapper[4696]: I1202 22:59:38.702057 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 02 22:59:38 crc kubenswrapper[4696]: I1202 22:59:38.832251 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhkp7\" (UniqueName: \"kubernetes.io/projected/89fbc0cf-3e41-4b34-bdd3-b415552fd1e6-kube-api-access-mhkp7\") pod \"kube-state-metrics-0\" (UID: \"89fbc0cf-3e41-4b34-bdd3-b415552fd1e6\") " pod="openstack/kube-state-metrics-0" Dec 02 22:59:38 crc kubenswrapper[4696]: I1202 22:59:38.944947 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhkp7\" (UniqueName: \"kubernetes.io/projected/89fbc0cf-3e41-4b34-bdd3-b415552fd1e6-kube-api-access-mhkp7\") pod \"kube-state-metrics-0\" (UID: \"89fbc0cf-3e41-4b34-bdd3-b415552fd1e6\") " pod="openstack/kube-state-metrics-0" Dec 02 22:59:38 crc kubenswrapper[4696]: I1202 22:59:38.994157 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhkp7\" (UniqueName: \"kubernetes.io/projected/89fbc0cf-3e41-4b34-bdd3-b415552fd1e6-kube-api-access-mhkp7\") pod \"kube-state-metrics-0\" (UID: \"89fbc0cf-3e41-4b34-bdd3-b415552fd1e6\") " pod="openstack/kube-state-metrics-0" Dec 02 22:59:39 crc kubenswrapper[4696]: I1202 22:59:39.010381 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 02 22:59:40 crc kubenswrapper[4696]: I1202 22:59:40.021499 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 02 22:59:40 crc kubenswrapper[4696]: I1202 22:59:40.025942 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 02 22:59:40 crc kubenswrapper[4696]: I1202 22:59:40.034914 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Dec 02 22:59:40 crc kubenswrapper[4696]: I1202 22:59:40.035177 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Dec 02 22:59:40 crc kubenswrapper[4696]: I1202 22:59:40.035541 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Dec 02 22:59:40 crc kubenswrapper[4696]: I1202 22:59:40.037215 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-2dcqn" Dec 02 22:59:40 crc kubenswrapper[4696]: I1202 22:59:40.048420 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Dec 02 22:59:40 crc kubenswrapper[4696]: I1202 22:59:40.048868 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Dec 02 22:59:40 crc kubenswrapper[4696]: I1202 22:59:40.065216 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 02 22:59:40 crc kubenswrapper[4696]: I1202 22:59:40.179750 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f5b9aee9-4e9a-4d60-be32-f25d230622bc-config-out\") pod \"prometheus-metric-storage-0\" (UID: 
\"f5b9aee9-4e9a-4d60-be32-f25d230622bc\") " pod="openstack/prometheus-metric-storage-0" Dec 02 22:59:40 crc kubenswrapper[4696]: I1202 22:59:40.180225 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-ab2c851f-258a-4469-a351-d04930617bdc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ab2c851f-258a-4469-a351-d04930617bdc\") pod \"prometheus-metric-storage-0\" (UID: \"f5b9aee9-4e9a-4d60-be32-f25d230622bc\") " pod="openstack/prometheus-metric-storage-0" Dec 02 22:59:40 crc kubenswrapper[4696]: I1202 22:59:40.180258 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f5b9aee9-4e9a-4d60-be32-f25d230622bc-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"f5b9aee9-4e9a-4d60-be32-f25d230622bc\") " pod="openstack/prometheus-metric-storage-0" Dec 02 22:59:40 crc kubenswrapper[4696]: I1202 22:59:40.180276 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f5b9aee9-4e9a-4d60-be32-f25d230622bc-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"f5b9aee9-4e9a-4d60-be32-f25d230622bc\") " pod="openstack/prometheus-metric-storage-0" Dec 02 22:59:40 crc kubenswrapper[4696]: I1202 22:59:40.180337 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f5b9aee9-4e9a-4d60-be32-f25d230622bc-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"f5b9aee9-4e9a-4d60-be32-f25d230622bc\") " pod="openstack/prometheus-metric-storage-0" Dec 02 22:59:40 crc kubenswrapper[4696]: I1202 22:59:40.180357 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-w4f75\" (UniqueName: \"kubernetes.io/projected/f5b9aee9-4e9a-4d60-be32-f25d230622bc-kube-api-access-w4f75\") pod \"prometheus-metric-storage-0\" (UID: \"f5b9aee9-4e9a-4d60-be32-f25d230622bc\") " pod="openstack/prometheus-metric-storage-0" Dec 02 22:59:40 crc kubenswrapper[4696]: I1202 22:59:40.180385 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f5b9aee9-4e9a-4d60-be32-f25d230622bc-config\") pod \"prometheus-metric-storage-0\" (UID: \"f5b9aee9-4e9a-4d60-be32-f25d230622bc\") " pod="openstack/prometheus-metric-storage-0" Dec 02 22:59:40 crc kubenswrapper[4696]: I1202 22:59:40.180412 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f5b9aee9-4e9a-4d60-be32-f25d230622bc-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"f5b9aee9-4e9a-4d60-be32-f25d230622bc\") " pod="openstack/prometheus-metric-storage-0" Dec 02 22:59:40 crc kubenswrapper[4696]: I1202 22:59:40.281504 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-ab2c851f-258a-4469-a351-d04930617bdc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ab2c851f-258a-4469-a351-d04930617bdc\") pod \"prometheus-metric-storage-0\" (UID: \"f5b9aee9-4e9a-4d60-be32-f25d230622bc\") " pod="openstack/prometheus-metric-storage-0" Dec 02 22:59:40 crc kubenswrapper[4696]: I1202 22:59:40.281575 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f5b9aee9-4e9a-4d60-be32-f25d230622bc-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"f5b9aee9-4e9a-4d60-be32-f25d230622bc\") " pod="openstack/prometheus-metric-storage-0" Dec 02 22:59:40 crc kubenswrapper[4696]: I1202 22:59:40.281595 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f5b9aee9-4e9a-4d60-be32-f25d230622bc-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"f5b9aee9-4e9a-4d60-be32-f25d230622bc\") " pod="openstack/prometheus-metric-storage-0" Dec 02 22:59:40 crc kubenswrapper[4696]: I1202 22:59:40.281660 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f5b9aee9-4e9a-4d60-be32-f25d230622bc-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"f5b9aee9-4e9a-4d60-be32-f25d230622bc\") " pod="openstack/prometheus-metric-storage-0" Dec 02 22:59:40 crc kubenswrapper[4696]: I1202 22:59:40.281685 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4f75\" (UniqueName: \"kubernetes.io/projected/f5b9aee9-4e9a-4d60-be32-f25d230622bc-kube-api-access-w4f75\") pod \"prometheus-metric-storage-0\" (UID: \"f5b9aee9-4e9a-4d60-be32-f25d230622bc\") " pod="openstack/prometheus-metric-storage-0" Dec 02 22:59:40 crc kubenswrapper[4696]: I1202 22:59:40.281710 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f5b9aee9-4e9a-4d60-be32-f25d230622bc-config\") pod \"prometheus-metric-storage-0\" (UID: \"f5b9aee9-4e9a-4d60-be32-f25d230622bc\") " pod="openstack/prometheus-metric-storage-0" Dec 02 22:59:40 crc kubenswrapper[4696]: I1202 22:59:40.281740 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f5b9aee9-4e9a-4d60-be32-f25d230622bc-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"f5b9aee9-4e9a-4d60-be32-f25d230622bc\") " pod="openstack/prometheus-metric-storage-0" Dec 02 22:59:40 crc kubenswrapper[4696]: I1202 22:59:40.281782 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f5b9aee9-4e9a-4d60-be32-f25d230622bc-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"f5b9aee9-4e9a-4d60-be32-f25d230622bc\") " pod="openstack/prometheus-metric-storage-0" Dec 02 22:59:40 crc kubenswrapper[4696]: I1202 22:59:40.283050 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f5b9aee9-4e9a-4d60-be32-f25d230622bc-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"f5b9aee9-4e9a-4d60-be32-f25d230622bc\") " pod="openstack/prometheus-metric-storage-0" Dec 02 22:59:40 crc kubenswrapper[4696]: I1202 22:59:40.289842 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f5b9aee9-4e9a-4d60-be32-f25d230622bc-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"f5b9aee9-4e9a-4d60-be32-f25d230622bc\") " pod="openstack/prometheus-metric-storage-0" Dec 02 22:59:40 crc kubenswrapper[4696]: I1202 22:59:40.290141 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f5b9aee9-4e9a-4d60-be32-f25d230622bc-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"f5b9aee9-4e9a-4d60-be32-f25d230622bc\") " pod="openstack/prometheus-metric-storage-0" Dec 02 22:59:40 crc kubenswrapper[4696]: I1202 22:59:40.291349 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f5b9aee9-4e9a-4d60-be32-f25d230622bc-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"f5b9aee9-4e9a-4d60-be32-f25d230622bc\") " pod="openstack/prometheus-metric-storage-0" Dec 02 22:59:40 crc kubenswrapper[4696]: I1202 22:59:40.303449 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: 
\"kubernetes.io/secret/f5b9aee9-4e9a-4d60-be32-f25d230622bc-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"f5b9aee9-4e9a-4d60-be32-f25d230622bc\") " pod="openstack/prometheus-metric-storage-0" Dec 02 22:59:40 crc kubenswrapper[4696]: I1202 22:59:40.304142 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/f5b9aee9-4e9a-4d60-be32-f25d230622bc-config\") pod \"prometheus-metric-storage-0\" (UID: \"f5b9aee9-4e9a-4d60-be32-f25d230622bc\") " pod="openstack/prometheus-metric-storage-0" Dec 02 22:59:40 crc kubenswrapper[4696]: I1202 22:59:40.307463 4696 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 02 22:59:40 crc kubenswrapper[4696]: I1202 22:59:40.307529 4696 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-ab2c851f-258a-4469-a351-d04930617bdc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ab2c851f-258a-4469-a351-d04930617bdc\") pod \"prometheus-metric-storage-0\" (UID: \"f5b9aee9-4e9a-4d60-be32-f25d230622bc\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/5fca820ef489b1541daddc3ea9aef396303f957b46bb94f2636f2ae9edc8d588/globalmount\"" pod="openstack/prometheus-metric-storage-0" Dec 02 22:59:40 crc kubenswrapper[4696]: I1202 22:59:40.316692 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4f75\" (UniqueName: \"kubernetes.io/projected/f5b9aee9-4e9a-4d60-be32-f25d230622bc-kube-api-access-w4f75\") pod \"prometheus-metric-storage-0\" (UID: \"f5b9aee9-4e9a-4d60-be32-f25d230622bc\") " pod="openstack/prometheus-metric-storage-0" Dec 02 22:59:40 crc kubenswrapper[4696]: I1202 22:59:40.482688 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-ab2c851f-258a-4469-a351-d04930617bdc\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ab2c851f-258a-4469-a351-d04930617bdc\") pod \"prometheus-metric-storage-0\" (UID: \"f5b9aee9-4e9a-4d60-be32-f25d230622bc\") " pod="openstack/prometheus-metric-storage-0" Dec 02 22:59:40 crc kubenswrapper[4696]: I1202 22:59:40.681955 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 02 22:59:41 crc kubenswrapper[4696]: I1202 22:59:41.831152 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-btsm6"] Dec 02 22:59:41 crc kubenswrapper[4696]: I1202 22:59:41.832638 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-btsm6" Dec 02 22:59:41 crc kubenswrapper[4696]: I1202 22:59:41.840401 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Dec 02 22:59:41 crc kubenswrapper[4696]: I1202 22:59:41.843240 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Dec 02 22:59:41 crc kubenswrapper[4696]: I1202 22:59:41.848348 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-ptsrz" Dec 02 22:59:41 crc kubenswrapper[4696]: I1202 22:59:41.854245 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-h54st"] Dec 02 22:59:41 crc kubenswrapper[4696]: I1202 22:59:41.856593 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-h54st" Dec 02 22:59:41 crc kubenswrapper[4696]: I1202 22:59:41.864970 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-btsm6"] Dec 02 22:59:41 crc kubenswrapper[4696]: I1202 22:59:41.876293 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-h54st"] Dec 02 22:59:41 crc kubenswrapper[4696]: I1202 22:59:41.915048 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvftc\" (UniqueName: \"kubernetes.io/projected/b13b6998-c04a-4ac8-9615-5078f1169ecb-kube-api-access-zvftc\") pod \"ovn-controller-btsm6\" (UID: \"b13b6998-c04a-4ac8-9615-5078f1169ecb\") " pod="openstack/ovn-controller-btsm6" Dec 02 22:59:41 crc kubenswrapper[4696]: I1202 22:59:41.915124 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b13b6998-c04a-4ac8-9615-5078f1169ecb-var-run-ovn\") pod \"ovn-controller-btsm6\" (UID: \"b13b6998-c04a-4ac8-9615-5078f1169ecb\") " pod="openstack/ovn-controller-btsm6" Dec 02 22:59:41 crc kubenswrapper[4696]: I1202 22:59:41.915153 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b13b6998-c04a-4ac8-9615-5078f1169ecb-var-run\") pod \"ovn-controller-btsm6\" (UID: \"b13b6998-c04a-4ac8-9615-5078f1169ecb\") " pod="openstack/ovn-controller-btsm6" Dec 02 22:59:41 crc kubenswrapper[4696]: I1202 22:59:41.915178 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b13b6998-c04a-4ac8-9615-5078f1169ecb-var-log-ovn\") pod \"ovn-controller-btsm6\" (UID: \"b13b6998-c04a-4ac8-9615-5078f1169ecb\") " pod="openstack/ovn-controller-btsm6" Dec 02 22:59:41 crc kubenswrapper[4696]: I1202 
22:59:41.915198 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b13b6998-c04a-4ac8-9615-5078f1169ecb-scripts\") pod \"ovn-controller-btsm6\" (UID: \"b13b6998-c04a-4ac8-9615-5078f1169ecb\") " pod="openstack/ovn-controller-btsm6" Dec 02 22:59:41 crc kubenswrapper[4696]: I1202 22:59:41.915228 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b13b6998-c04a-4ac8-9615-5078f1169ecb-combined-ca-bundle\") pod \"ovn-controller-btsm6\" (UID: \"b13b6998-c04a-4ac8-9615-5078f1169ecb\") " pod="openstack/ovn-controller-btsm6" Dec 02 22:59:41 crc kubenswrapper[4696]: I1202 22:59:41.915266 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/b13b6998-c04a-4ac8-9615-5078f1169ecb-ovn-controller-tls-certs\") pod \"ovn-controller-btsm6\" (UID: \"b13b6998-c04a-4ac8-9615-5078f1169ecb\") " pod="openstack/ovn-controller-btsm6" Dec 02 22:59:42 crc kubenswrapper[4696]: I1202 22:59:42.017047 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/d004daa2-5ad8-49b8-9f27-cc0552d409de-var-lib\") pod \"ovn-controller-ovs-h54st\" (UID: \"d004daa2-5ad8-49b8-9f27-cc0552d409de\") " pod="openstack/ovn-controller-ovs-h54st" Dec 02 22:59:42 crc kubenswrapper[4696]: I1202 22:59:42.017097 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b13b6998-c04a-4ac8-9615-5078f1169ecb-var-run-ovn\") pod \"ovn-controller-btsm6\" (UID: \"b13b6998-c04a-4ac8-9615-5078f1169ecb\") " pod="openstack/ovn-controller-btsm6" Dec 02 22:59:42 crc kubenswrapper[4696]: I1202 22:59:42.017221 4696 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b13b6998-c04a-4ac8-9615-5078f1169ecb-var-run\") pod \"ovn-controller-btsm6\" (UID: \"b13b6998-c04a-4ac8-9615-5078f1169ecb\") " pod="openstack/ovn-controller-btsm6" Dec 02 22:59:42 crc kubenswrapper[4696]: I1202 22:59:42.017396 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/b13b6998-c04a-4ac8-9615-5078f1169ecb-ovn-controller-tls-certs\") pod \"ovn-controller-btsm6\" (UID: \"b13b6998-c04a-4ac8-9615-5078f1169ecb\") " pod="openstack/ovn-controller-btsm6" Dec 02 22:59:42 crc kubenswrapper[4696]: I1202 22:59:42.017428 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/d004daa2-5ad8-49b8-9f27-cc0552d409de-var-log\") pod \"ovn-controller-ovs-h54st\" (UID: \"d004daa2-5ad8-49b8-9f27-cc0552d409de\") " pod="openstack/ovn-controller-ovs-h54st" Dec 02 22:59:42 crc kubenswrapper[4696]: I1202 22:59:42.017476 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d004daa2-5ad8-49b8-9f27-cc0552d409de-scripts\") pod \"ovn-controller-ovs-h54st\" (UID: \"d004daa2-5ad8-49b8-9f27-cc0552d409de\") " pod="openstack/ovn-controller-ovs-h54st" Dec 02 22:59:42 crc kubenswrapper[4696]: I1202 22:59:42.017540 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d004daa2-5ad8-49b8-9f27-cc0552d409de-var-run\") pod \"ovn-controller-ovs-h54st\" (UID: \"d004daa2-5ad8-49b8-9f27-cc0552d409de\") " pod="openstack/ovn-controller-ovs-h54st" Dec 02 22:59:42 crc kubenswrapper[4696]: I1202 22:59:42.017596 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-jbsr2\" (UniqueName: \"kubernetes.io/projected/d004daa2-5ad8-49b8-9f27-cc0552d409de-kube-api-access-jbsr2\") pod \"ovn-controller-ovs-h54st\" (UID: \"d004daa2-5ad8-49b8-9f27-cc0552d409de\") " pod="openstack/ovn-controller-ovs-h54st" Dec 02 22:59:42 crc kubenswrapper[4696]: I1202 22:59:42.017651 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b13b6998-c04a-4ac8-9615-5078f1169ecb-var-log-ovn\") pod \"ovn-controller-btsm6\" (UID: \"b13b6998-c04a-4ac8-9615-5078f1169ecb\") " pod="openstack/ovn-controller-btsm6" Dec 02 22:59:42 crc kubenswrapper[4696]: I1202 22:59:42.017690 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b13b6998-c04a-4ac8-9615-5078f1169ecb-scripts\") pod \"ovn-controller-btsm6\" (UID: \"b13b6998-c04a-4ac8-9615-5078f1169ecb\") " pod="openstack/ovn-controller-btsm6" Dec 02 22:59:42 crc kubenswrapper[4696]: I1202 22:59:42.017720 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b13b6998-c04a-4ac8-9615-5078f1169ecb-var-run-ovn\") pod \"ovn-controller-btsm6\" (UID: \"b13b6998-c04a-4ac8-9615-5078f1169ecb\") " pod="openstack/ovn-controller-btsm6" Dec 02 22:59:42 crc kubenswrapper[4696]: I1202 22:59:42.017723 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/d004daa2-5ad8-49b8-9f27-cc0552d409de-etc-ovs\") pod \"ovn-controller-ovs-h54st\" (UID: \"d004daa2-5ad8-49b8-9f27-cc0552d409de\") " pod="openstack/ovn-controller-ovs-h54st" Dec 02 22:59:42 crc kubenswrapper[4696]: I1202 22:59:42.017880 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b13b6998-c04a-4ac8-9615-5078f1169ecb-var-run\") pod \"ovn-controller-btsm6\" (UID: 
\"b13b6998-c04a-4ac8-9615-5078f1169ecb\") " pod="openstack/ovn-controller-btsm6" Dec 02 22:59:42 crc kubenswrapper[4696]: I1202 22:59:42.018063 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b13b6998-c04a-4ac8-9615-5078f1169ecb-var-log-ovn\") pod \"ovn-controller-btsm6\" (UID: \"b13b6998-c04a-4ac8-9615-5078f1169ecb\") " pod="openstack/ovn-controller-btsm6" Dec 02 22:59:42 crc kubenswrapper[4696]: I1202 22:59:42.018105 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b13b6998-c04a-4ac8-9615-5078f1169ecb-combined-ca-bundle\") pod \"ovn-controller-btsm6\" (UID: \"b13b6998-c04a-4ac8-9615-5078f1169ecb\") " pod="openstack/ovn-controller-btsm6" Dec 02 22:59:42 crc kubenswrapper[4696]: I1202 22:59:42.018292 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvftc\" (UniqueName: \"kubernetes.io/projected/b13b6998-c04a-4ac8-9615-5078f1169ecb-kube-api-access-zvftc\") pod \"ovn-controller-btsm6\" (UID: \"b13b6998-c04a-4ac8-9615-5078f1169ecb\") " pod="openstack/ovn-controller-btsm6" Dec 02 22:59:42 crc kubenswrapper[4696]: I1202 22:59:42.019635 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b13b6998-c04a-4ac8-9615-5078f1169ecb-scripts\") pod \"ovn-controller-btsm6\" (UID: \"b13b6998-c04a-4ac8-9615-5078f1169ecb\") " pod="openstack/ovn-controller-btsm6" Dec 02 22:59:42 crc kubenswrapper[4696]: I1202 22:59:42.024893 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/b13b6998-c04a-4ac8-9615-5078f1169ecb-ovn-controller-tls-certs\") pod \"ovn-controller-btsm6\" (UID: \"b13b6998-c04a-4ac8-9615-5078f1169ecb\") " pod="openstack/ovn-controller-btsm6" Dec 02 22:59:42 crc kubenswrapper[4696]: I1202 
22:59:42.031547 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b13b6998-c04a-4ac8-9615-5078f1169ecb-combined-ca-bundle\") pod \"ovn-controller-btsm6\" (UID: \"b13b6998-c04a-4ac8-9615-5078f1169ecb\") " pod="openstack/ovn-controller-btsm6" Dec 02 22:59:42 crc kubenswrapper[4696]: I1202 22:59:42.044862 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvftc\" (UniqueName: \"kubernetes.io/projected/b13b6998-c04a-4ac8-9615-5078f1169ecb-kube-api-access-zvftc\") pod \"ovn-controller-btsm6\" (UID: \"b13b6998-c04a-4ac8-9615-5078f1169ecb\") " pod="openstack/ovn-controller-btsm6" Dec 02 22:59:42 crc kubenswrapper[4696]: I1202 22:59:42.120146 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/d004daa2-5ad8-49b8-9f27-cc0552d409de-etc-ovs\") pod \"ovn-controller-ovs-h54st\" (UID: \"d004daa2-5ad8-49b8-9f27-cc0552d409de\") " pod="openstack/ovn-controller-ovs-h54st" Dec 02 22:59:42 crc kubenswrapper[4696]: I1202 22:59:42.120267 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/d004daa2-5ad8-49b8-9f27-cc0552d409de-var-lib\") pod \"ovn-controller-ovs-h54st\" (UID: \"d004daa2-5ad8-49b8-9f27-cc0552d409de\") " pod="openstack/ovn-controller-ovs-h54st" Dec 02 22:59:42 crc kubenswrapper[4696]: I1202 22:59:42.120337 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/d004daa2-5ad8-49b8-9f27-cc0552d409de-var-log\") pod \"ovn-controller-ovs-h54st\" (UID: \"d004daa2-5ad8-49b8-9f27-cc0552d409de\") " pod="openstack/ovn-controller-ovs-h54st" Dec 02 22:59:42 crc kubenswrapper[4696]: I1202 22:59:42.120370 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/d004daa2-5ad8-49b8-9f27-cc0552d409de-scripts\") pod \"ovn-controller-ovs-h54st\" (UID: \"d004daa2-5ad8-49b8-9f27-cc0552d409de\") " pod="openstack/ovn-controller-ovs-h54st" Dec 02 22:59:42 crc kubenswrapper[4696]: I1202 22:59:42.120404 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d004daa2-5ad8-49b8-9f27-cc0552d409de-var-run\") pod \"ovn-controller-ovs-h54st\" (UID: \"d004daa2-5ad8-49b8-9f27-cc0552d409de\") " pod="openstack/ovn-controller-ovs-h54st" Dec 02 22:59:42 crc kubenswrapper[4696]: I1202 22:59:42.120441 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbsr2\" (UniqueName: \"kubernetes.io/projected/d004daa2-5ad8-49b8-9f27-cc0552d409de-kube-api-access-jbsr2\") pod \"ovn-controller-ovs-h54st\" (UID: \"d004daa2-5ad8-49b8-9f27-cc0552d409de\") " pod="openstack/ovn-controller-ovs-h54st" Dec 02 22:59:42 crc kubenswrapper[4696]: I1202 22:59:42.120626 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/d004daa2-5ad8-49b8-9f27-cc0552d409de-etc-ovs\") pod \"ovn-controller-ovs-h54st\" (UID: \"d004daa2-5ad8-49b8-9f27-cc0552d409de\") " pod="openstack/ovn-controller-ovs-h54st" Dec 02 22:59:42 crc kubenswrapper[4696]: I1202 22:59:42.120785 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/d004daa2-5ad8-49b8-9f27-cc0552d409de-var-log\") pod \"ovn-controller-ovs-h54st\" (UID: \"d004daa2-5ad8-49b8-9f27-cc0552d409de\") " pod="openstack/ovn-controller-ovs-h54st" Dec 02 22:59:42 crc kubenswrapper[4696]: I1202 22:59:42.120957 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/d004daa2-5ad8-49b8-9f27-cc0552d409de-var-lib\") pod \"ovn-controller-ovs-h54st\" (UID: \"d004daa2-5ad8-49b8-9f27-cc0552d409de\") " 
pod="openstack/ovn-controller-ovs-h54st" Dec 02 22:59:42 crc kubenswrapper[4696]: I1202 22:59:42.121405 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d004daa2-5ad8-49b8-9f27-cc0552d409de-var-run\") pod \"ovn-controller-ovs-h54st\" (UID: \"d004daa2-5ad8-49b8-9f27-cc0552d409de\") " pod="openstack/ovn-controller-ovs-h54st" Dec 02 22:59:42 crc kubenswrapper[4696]: I1202 22:59:42.126909 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d004daa2-5ad8-49b8-9f27-cc0552d409de-scripts\") pod \"ovn-controller-ovs-h54st\" (UID: \"d004daa2-5ad8-49b8-9f27-cc0552d409de\") " pod="openstack/ovn-controller-ovs-h54st" Dec 02 22:59:42 crc kubenswrapper[4696]: I1202 22:59:42.148876 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbsr2\" (UniqueName: \"kubernetes.io/projected/d004daa2-5ad8-49b8-9f27-cc0552d409de-kube-api-access-jbsr2\") pod \"ovn-controller-ovs-h54st\" (UID: \"d004daa2-5ad8-49b8-9f27-cc0552d409de\") " pod="openstack/ovn-controller-ovs-h54st" Dec 02 22:59:42 crc kubenswrapper[4696]: I1202 22:59:42.151011 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-btsm6" Dec 02 22:59:42 crc kubenswrapper[4696]: I1202 22:59:42.173121 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-h54st" Dec 02 22:59:42 crc kubenswrapper[4696]: I1202 22:59:42.721525 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 02 22:59:42 crc kubenswrapper[4696]: I1202 22:59:42.725598 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 02 22:59:42 crc kubenswrapper[4696]: I1202 22:59:42.735092 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Dec 02 22:59:42 crc kubenswrapper[4696]: I1202 22:59:42.736428 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-2mj9p" Dec 02 22:59:42 crc kubenswrapper[4696]: I1202 22:59:42.737812 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Dec 02 22:59:42 crc kubenswrapper[4696]: I1202 22:59:42.738049 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Dec 02 22:59:42 crc kubenswrapper[4696]: I1202 22:59:42.743398 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Dec 02 22:59:42 crc kubenswrapper[4696]: I1202 22:59:42.777380 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 02 22:59:42 crc kubenswrapper[4696]: I1202 22:59:42.847043 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd18044c-bd73-4166-83ac-e555f2a587b3-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"bd18044c-bd73-4166-83ac-e555f2a587b3\") " pod="openstack/ovsdbserver-nb-0" Dec 02 22:59:42 crc kubenswrapper[4696]: I1202 22:59:42.848335 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bd18044c-bd73-4166-83ac-e555f2a587b3-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"bd18044c-bd73-4166-83ac-e555f2a587b3\") " pod="openstack/ovsdbserver-nb-0" Dec 02 22:59:42 crc kubenswrapper[4696]: I1202 22:59:42.848481 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"bd18044c-bd73-4166-83ac-e555f2a587b3\") " pod="openstack/ovsdbserver-nb-0" Dec 02 22:59:42 crc kubenswrapper[4696]: I1202 22:59:42.848571 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxl56\" (UniqueName: \"kubernetes.io/projected/bd18044c-bd73-4166-83ac-e555f2a587b3-kube-api-access-vxl56\") pod \"ovsdbserver-nb-0\" (UID: \"bd18044c-bd73-4166-83ac-e555f2a587b3\") " pod="openstack/ovsdbserver-nb-0" Dec 02 22:59:42 crc kubenswrapper[4696]: I1202 22:59:42.848704 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd18044c-bd73-4166-83ac-e555f2a587b3-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"bd18044c-bd73-4166-83ac-e555f2a587b3\") " pod="openstack/ovsdbserver-nb-0" Dec 02 22:59:42 crc kubenswrapper[4696]: I1202 22:59:42.848843 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd18044c-bd73-4166-83ac-e555f2a587b3-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"bd18044c-bd73-4166-83ac-e555f2a587b3\") " pod="openstack/ovsdbserver-nb-0" Dec 02 22:59:42 crc kubenswrapper[4696]: I1202 22:59:42.848921 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/bd18044c-bd73-4166-83ac-e555f2a587b3-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"bd18044c-bd73-4166-83ac-e555f2a587b3\") " pod="openstack/ovsdbserver-nb-0" Dec 02 22:59:42 crc kubenswrapper[4696]: I1202 22:59:42.849022 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/bd18044c-bd73-4166-83ac-e555f2a587b3-config\") pod \"ovsdbserver-nb-0\" (UID: \"bd18044c-bd73-4166-83ac-e555f2a587b3\") " pod="openstack/ovsdbserver-nb-0" Dec 02 22:59:42 crc kubenswrapper[4696]: I1202 22:59:42.950788 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd18044c-bd73-4166-83ac-e555f2a587b3-config\") pod \"ovsdbserver-nb-0\" (UID: \"bd18044c-bd73-4166-83ac-e555f2a587b3\") " pod="openstack/ovsdbserver-nb-0" Dec 02 22:59:42 crc kubenswrapper[4696]: I1202 22:59:42.951304 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd18044c-bd73-4166-83ac-e555f2a587b3-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"bd18044c-bd73-4166-83ac-e555f2a587b3\") " pod="openstack/ovsdbserver-nb-0" Dec 02 22:59:42 crc kubenswrapper[4696]: I1202 22:59:42.951359 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bd18044c-bd73-4166-83ac-e555f2a587b3-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"bd18044c-bd73-4166-83ac-e555f2a587b3\") " pod="openstack/ovsdbserver-nb-0" Dec 02 22:59:42 crc kubenswrapper[4696]: I1202 22:59:42.951387 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"bd18044c-bd73-4166-83ac-e555f2a587b3\") " pod="openstack/ovsdbserver-nb-0" Dec 02 22:59:42 crc kubenswrapper[4696]: I1202 22:59:42.951404 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxl56\" (UniqueName: \"kubernetes.io/projected/bd18044c-bd73-4166-83ac-e555f2a587b3-kube-api-access-vxl56\") pod \"ovsdbserver-nb-0\" (UID: \"bd18044c-bd73-4166-83ac-e555f2a587b3\") " pod="openstack/ovsdbserver-nb-0" Dec 02 22:59:42 crc 
kubenswrapper[4696]: I1202 22:59:42.951448 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd18044c-bd73-4166-83ac-e555f2a587b3-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"bd18044c-bd73-4166-83ac-e555f2a587b3\") " pod="openstack/ovsdbserver-nb-0" Dec 02 22:59:42 crc kubenswrapper[4696]: I1202 22:59:42.951473 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd18044c-bd73-4166-83ac-e555f2a587b3-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"bd18044c-bd73-4166-83ac-e555f2a587b3\") " pod="openstack/ovsdbserver-nb-0" Dec 02 22:59:42 crc kubenswrapper[4696]: I1202 22:59:42.951491 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/bd18044c-bd73-4166-83ac-e555f2a587b3-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"bd18044c-bd73-4166-83ac-e555f2a587b3\") " pod="openstack/ovsdbserver-nb-0" Dec 02 22:59:42 crc kubenswrapper[4696]: I1202 22:59:42.951942 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd18044c-bd73-4166-83ac-e555f2a587b3-config\") pod \"ovsdbserver-nb-0\" (UID: \"bd18044c-bd73-4166-83ac-e555f2a587b3\") " pod="openstack/ovsdbserver-nb-0" Dec 02 22:59:42 crc kubenswrapper[4696]: I1202 22:59:42.952021 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/bd18044c-bd73-4166-83ac-e555f2a587b3-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"bd18044c-bd73-4166-83ac-e555f2a587b3\") " pod="openstack/ovsdbserver-nb-0" Dec 02 22:59:42 crc kubenswrapper[4696]: I1202 22:59:42.952368 4696 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"bd18044c-bd73-4166-83ac-e555f2a587b3\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/ovsdbserver-nb-0" Dec 02 22:59:42 crc kubenswrapper[4696]: I1202 22:59:42.954077 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bd18044c-bd73-4166-83ac-e555f2a587b3-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"bd18044c-bd73-4166-83ac-e555f2a587b3\") " pod="openstack/ovsdbserver-nb-0" Dec 02 22:59:42 crc kubenswrapper[4696]: I1202 22:59:42.957656 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd18044c-bd73-4166-83ac-e555f2a587b3-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"bd18044c-bd73-4166-83ac-e555f2a587b3\") " pod="openstack/ovsdbserver-nb-0" Dec 02 22:59:42 crc kubenswrapper[4696]: I1202 22:59:42.957656 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd18044c-bd73-4166-83ac-e555f2a587b3-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"bd18044c-bd73-4166-83ac-e555f2a587b3\") " pod="openstack/ovsdbserver-nb-0" Dec 02 22:59:42 crc kubenswrapper[4696]: I1202 22:59:42.958258 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd18044c-bd73-4166-83ac-e555f2a587b3-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"bd18044c-bd73-4166-83ac-e555f2a587b3\") " pod="openstack/ovsdbserver-nb-0" Dec 02 22:59:42 crc kubenswrapper[4696]: I1202 22:59:42.974778 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxl56\" (UniqueName: \"kubernetes.io/projected/bd18044c-bd73-4166-83ac-e555f2a587b3-kube-api-access-vxl56\") pod \"ovsdbserver-nb-0\" (UID: \"bd18044c-bd73-4166-83ac-e555f2a587b3\") " 
pod="openstack/ovsdbserver-nb-0" Dec 02 22:59:42 crc kubenswrapper[4696]: I1202 22:59:42.978060 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"bd18044c-bd73-4166-83ac-e555f2a587b3\") " pod="openstack/ovsdbserver-nb-0" Dec 02 22:59:43 crc kubenswrapper[4696]: I1202 22:59:43.066052 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 02 22:59:45 crc kubenswrapper[4696]: I1202 22:59:45.713348 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 02 22:59:45 crc kubenswrapper[4696]: I1202 22:59:45.715399 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 02 22:59:45 crc kubenswrapper[4696]: I1202 22:59:45.717884 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Dec 02 22:59:45 crc kubenswrapper[4696]: I1202 22:59:45.717989 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Dec 02 22:59:45 crc kubenswrapper[4696]: I1202 22:59:45.721860 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-q7686" Dec 02 22:59:45 crc kubenswrapper[4696]: I1202 22:59:45.726526 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 02 22:59:45 crc kubenswrapper[4696]: I1202 22:59:45.729260 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Dec 02 22:59:45 crc kubenswrapper[4696]: I1202 22:59:45.811962 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/bd863731-0190-4818-90bb-a7b5b781e616-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"bd863731-0190-4818-90bb-a7b5b781e616\") " pod="openstack/ovsdbserver-sb-0" Dec 02 22:59:45 crc kubenswrapper[4696]: I1202 22:59:45.812378 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd863731-0190-4818-90bb-a7b5b781e616-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"bd863731-0190-4818-90bb-a7b5b781e616\") " pod="openstack/ovsdbserver-sb-0" Dec 02 22:59:45 crc kubenswrapper[4696]: I1202 22:59:45.812482 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd863731-0190-4818-90bb-a7b5b781e616-config\") pod \"ovsdbserver-sb-0\" (UID: \"bd863731-0190-4818-90bb-a7b5b781e616\") " pod="openstack/ovsdbserver-sb-0" Dec 02 22:59:45 crc kubenswrapper[4696]: I1202 22:59:45.812581 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd863731-0190-4818-90bb-a7b5b781e616-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"bd863731-0190-4818-90bb-a7b5b781e616\") " pod="openstack/ovsdbserver-sb-0" Dec 02 22:59:45 crc kubenswrapper[4696]: I1202 22:59:45.812675 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/bd863731-0190-4818-90bb-a7b5b781e616-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"bd863731-0190-4818-90bb-a7b5b781e616\") " pod="openstack/ovsdbserver-sb-0" Dec 02 22:59:45 crc kubenswrapper[4696]: I1202 22:59:45.812838 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod 
\"ovsdbserver-sb-0\" (UID: \"bd863731-0190-4818-90bb-a7b5b781e616\") " pod="openstack/ovsdbserver-sb-0" Dec 02 22:59:45 crc kubenswrapper[4696]: I1202 22:59:45.812953 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7lc9\" (UniqueName: \"kubernetes.io/projected/bd863731-0190-4818-90bb-a7b5b781e616-kube-api-access-t7lc9\") pod \"ovsdbserver-sb-0\" (UID: \"bd863731-0190-4818-90bb-a7b5b781e616\") " pod="openstack/ovsdbserver-sb-0" Dec 02 22:59:45 crc kubenswrapper[4696]: I1202 22:59:45.813043 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bd863731-0190-4818-90bb-a7b5b781e616-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"bd863731-0190-4818-90bb-a7b5b781e616\") " pod="openstack/ovsdbserver-sb-0" Dec 02 22:59:45 crc kubenswrapper[4696]: I1202 22:59:45.914673 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-sb-0\" (UID: \"bd863731-0190-4818-90bb-a7b5b781e616\") " pod="openstack/ovsdbserver-sb-0" Dec 02 22:59:45 crc kubenswrapper[4696]: I1202 22:59:45.914758 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7lc9\" (UniqueName: \"kubernetes.io/projected/bd863731-0190-4818-90bb-a7b5b781e616-kube-api-access-t7lc9\") pod \"ovsdbserver-sb-0\" (UID: \"bd863731-0190-4818-90bb-a7b5b781e616\") " pod="openstack/ovsdbserver-sb-0" Dec 02 22:59:45 crc kubenswrapper[4696]: I1202 22:59:45.914790 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bd863731-0190-4818-90bb-a7b5b781e616-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"bd863731-0190-4818-90bb-a7b5b781e616\") " pod="openstack/ovsdbserver-sb-0" Dec 02 22:59:45 crc kubenswrapper[4696]: I1202 
22:59:45.914836 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd863731-0190-4818-90bb-a7b5b781e616-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"bd863731-0190-4818-90bb-a7b5b781e616\") " pod="openstack/ovsdbserver-sb-0" Dec 02 22:59:45 crc kubenswrapper[4696]: I1202 22:59:45.914881 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd863731-0190-4818-90bb-a7b5b781e616-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"bd863731-0190-4818-90bb-a7b5b781e616\") " pod="openstack/ovsdbserver-sb-0" Dec 02 22:59:45 crc kubenswrapper[4696]: I1202 22:59:45.914905 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd863731-0190-4818-90bb-a7b5b781e616-config\") pod \"ovsdbserver-sb-0\" (UID: \"bd863731-0190-4818-90bb-a7b5b781e616\") " pod="openstack/ovsdbserver-sb-0" Dec 02 22:59:45 crc kubenswrapper[4696]: I1202 22:59:45.914922 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd863731-0190-4818-90bb-a7b5b781e616-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"bd863731-0190-4818-90bb-a7b5b781e616\") " pod="openstack/ovsdbserver-sb-0" Dec 02 22:59:45 crc kubenswrapper[4696]: I1202 22:59:45.914941 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/bd863731-0190-4818-90bb-a7b5b781e616-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"bd863731-0190-4818-90bb-a7b5b781e616\") " pod="openstack/ovsdbserver-sb-0" Dec 02 22:59:45 crc kubenswrapper[4696]: I1202 22:59:45.914992 4696 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-sb-0\" (UID: \"bd863731-0190-4818-90bb-a7b5b781e616\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/ovsdbserver-sb-0" Dec 02 22:59:45 crc kubenswrapper[4696]: I1202 22:59:45.915485 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/bd863731-0190-4818-90bb-a7b5b781e616-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"bd863731-0190-4818-90bb-a7b5b781e616\") " pod="openstack/ovsdbserver-sb-0" Dec 02 22:59:45 crc kubenswrapper[4696]: I1202 22:59:45.916310 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd863731-0190-4818-90bb-a7b5b781e616-config\") pod \"ovsdbserver-sb-0\" (UID: \"bd863731-0190-4818-90bb-a7b5b781e616\") " pod="openstack/ovsdbserver-sb-0" Dec 02 22:59:45 crc kubenswrapper[4696]: I1202 22:59:45.916969 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bd863731-0190-4818-90bb-a7b5b781e616-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"bd863731-0190-4818-90bb-a7b5b781e616\") " pod="openstack/ovsdbserver-sb-0" Dec 02 22:59:45 crc kubenswrapper[4696]: I1202 22:59:45.922473 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd863731-0190-4818-90bb-a7b5b781e616-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"bd863731-0190-4818-90bb-a7b5b781e616\") " pod="openstack/ovsdbserver-sb-0" Dec 02 22:59:45 crc kubenswrapper[4696]: I1202 22:59:45.923085 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd863731-0190-4818-90bb-a7b5b781e616-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"bd863731-0190-4818-90bb-a7b5b781e616\") " pod="openstack/ovsdbserver-sb-0" Dec 02 22:59:45 
crc kubenswrapper[4696]: I1202 22:59:45.923156 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd863731-0190-4818-90bb-a7b5b781e616-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"bd863731-0190-4818-90bb-a7b5b781e616\") " pod="openstack/ovsdbserver-sb-0" Dec 02 22:59:45 crc kubenswrapper[4696]: I1202 22:59:45.932266 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7lc9\" (UniqueName: \"kubernetes.io/projected/bd863731-0190-4818-90bb-a7b5b781e616-kube-api-access-t7lc9\") pod \"ovsdbserver-sb-0\" (UID: \"bd863731-0190-4818-90bb-a7b5b781e616\") " pod="openstack/ovsdbserver-sb-0" Dec 02 22:59:45 crc kubenswrapper[4696]: I1202 22:59:45.939165 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-sb-0\" (UID: \"bd863731-0190-4818-90bb-a7b5b781e616\") " pod="openstack/ovsdbserver-sb-0" Dec 02 22:59:46 crc kubenswrapper[4696]: I1202 22:59:46.043613 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 02 22:59:52 crc kubenswrapper[4696]: I1202 22:59:52.974397 4696 patch_prober.go:28] interesting pod/machine-config-daemon-chq65 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 22:59:52 crc kubenswrapper[4696]: I1202 22:59:52.975350 4696 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 22:59:52 crc kubenswrapper[4696]: I1202 22:59:52.975406 4696 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-chq65" Dec 02 22:59:52 crc kubenswrapper[4696]: I1202 22:59:52.976041 4696 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d851463a087e8da5113eee7095bcc5e11085a475884d43c64676423d484437b6"} pod="openshift-machine-config-operator/machine-config-daemon-chq65" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 22:59:52 crc kubenswrapper[4696]: I1202 22:59:52.976100 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" containerName="machine-config-daemon" containerID="cri-o://d851463a087e8da5113eee7095bcc5e11085a475884d43c64676423d484437b6" gracePeriod=600 Dec 02 22:59:53 crc kubenswrapper[4696]: I1202 22:59:53.744932 4696 generic.go:334] "Generic (PLEG): container finished" 
podID="53353260-c7c9-435c-91eb-3d5a1b441c4a" containerID="d851463a087e8da5113eee7095bcc5e11085a475884d43c64676423d484437b6" exitCode=0 Dec 02 22:59:53 crc kubenswrapper[4696]: I1202 22:59:53.745315 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-chq65" event={"ID":"53353260-c7c9-435c-91eb-3d5a1b441c4a","Type":"ContainerDied","Data":"d851463a087e8da5113eee7095bcc5e11085a475884d43c64676423d484437b6"} Dec 02 22:59:53 crc kubenswrapper[4696]: I1202 22:59:53.745366 4696 scope.go:117] "RemoveContainer" containerID="488c3298a630d75021615076f70747ecaa2bb06970c4d5f097346d0dc1a68976" Dec 02 22:59:58 crc kubenswrapper[4696]: E1202 22:59:58.002362 4696 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified" Dec 02 22:59:58 crc kubenswrapper[4696]: E1202 22:59:58.003319 4696 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5vn9k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-server-0_openstack(aa9b93b0-4131-4a4b-a1a8-27ccf68716c0): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 22:59:58 crc 
kubenswrapper[4696]: E1202 22:59:58.004706 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-server-0" podUID="aa9b93b0-4131-4a4b-a1a8-27ccf68716c0" Dec 02 22:59:58 crc kubenswrapper[4696]: E1202 22:59:58.795650 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified\\\"\"" pod="openstack/rabbitmq-server-0" podUID="aa9b93b0-4131-4a4b-a1a8-27ccf68716c0" Dec 02 22:59:58 crc kubenswrapper[4696]: E1202 22:59:58.869133 4696 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 02 22:59:58 crc kubenswrapper[4696]: E1202 22:59:58.869431 4696 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-756sc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-dcqvm_openstack(e5273a22-1865-4f93-ae2f-cc7046c708cd): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 22:59:58 crc kubenswrapper[4696]: E1202 22:59:58.870888 4696 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-dcqvm" podUID="e5273a22-1865-4f93-ae2f-cc7046c708cd" Dec 02 22:59:58 crc kubenswrapper[4696]: E1202 22:59:58.932007 4696 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 02 22:59:58 crc kubenswrapper[4696]: E1202 22:59:58.932226 4696 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hrg2t,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-57d769cc4f-njzlw_openstack(62dbb214-98c9-4d52-884b-45804bcc612c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 22:59:58 crc kubenswrapper[4696]: E1202 22:59:58.933403 4696 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-57d769cc4f-njzlw" podUID="62dbb214-98c9-4d52-884b-45804bcc612c" Dec 02 22:59:58 crc kubenswrapper[4696]: E1202 22:59:58.938356 4696 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 02 22:59:58 crc kubenswrapper[4696]: E1202 22:59:58.938555 4696 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-t44s2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-666b6646f7-vrq4f_openstack(058142c0-4700-4160-857b-0b016c768a72): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 22:59:58 crc kubenswrapper[4696]: E1202 22:59:58.940238 4696 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-666b6646f7-vrq4f" podUID="058142c0-4700-4160-857b-0b016c768a72" Dec 02 22:59:58 crc kubenswrapper[4696]: E1202 22:59:58.941943 4696 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 02 22:59:58 crc kubenswrapper[4696]: E1202 22:59:58.944714 4696 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mv57x,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-2bftf_openstack(d245d496-d80f-4455-a66e-cc788aff5b35): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 22:59:58 crc kubenswrapper[4696]: E1202 22:59:58.946776 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-675f4bcbfc-2bftf" podUID="d245d496-d80f-4455-a66e-cc788aff5b35" Dec 02 22:59:59 crc kubenswrapper[4696]: I1202 22:59:59.426584 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 02 22:59:59 crc kubenswrapper[4696]: I1202 22:59:59.468123 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 02 22:59:59 crc kubenswrapper[4696]: I1202 22:59:59.615368 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 02 22:59:59 crc kubenswrapper[4696]: I1202 22:59:59.689409 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 02 22:59:59 crc kubenswrapper[4696]: I1202 22:59:59.767683 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 02 22:59:59 crc kubenswrapper[4696]: I1202 22:59:59.773841 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-btsm6"] Dec 02 22:59:59 crc kubenswrapper[4696]: W1202 22:59:59.783788 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb13b6998_c04a_4ac8_9615_5078f1169ecb.slice/crio-565587b76ccd1a920b65d1720c9340dfc07925a197550b2c328932888e67366f WatchSource:0}: Error finding container 565587b76ccd1a920b65d1720c9340dfc07925a197550b2c328932888e67366f: Status 404 returned error can't find the container with id 565587b76ccd1a920b65d1720c9340dfc07925a197550b2c328932888e67366f Dec 02 22:59:59 crc kubenswrapper[4696]: I1202 22:59:59.805888 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"87c53ac9-38ab-43a7-b99e-29c47a69f818","Type":"ContainerStarted","Data":"064749a7b2309fb46dacd05806364c3ad626911be86db4c849ce359127a9894b"} Dec 02 22:59:59 crc kubenswrapper[4696]: I1202 22:59:59.809443 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/openstack-galera-0" event={"ID":"af9932e1-c721-45b3-a213-93da4e130d05","Type":"ContainerStarted","Data":"92358e219be4c7836bfa5461d996b153a19b0a3b4a5b812ea68aebb44dd3a0f9"} Dec 02 22:59:59 crc kubenswrapper[4696]: I1202 22:59:59.810835 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-btsm6" event={"ID":"b13b6998-c04a-4ac8-9615-5078f1169ecb","Type":"ContainerStarted","Data":"565587b76ccd1a920b65d1720c9340dfc07925a197550b2c328932888e67366f"} Dec 02 22:59:59 crc kubenswrapper[4696]: I1202 22:59:59.814011 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"dff19c4d-2106-4034-8c29-39429553a062","Type":"ContainerStarted","Data":"26f955f92ca63f3ef96a737dc169687881ef5a3b12e6b80d8fbbd88e402e0b94"} Dec 02 22:59:59 crc kubenswrapper[4696]: I1202 22:59:59.816642 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-chq65" event={"ID":"53353260-c7c9-435c-91eb-3d5a1b441c4a","Type":"ContainerStarted","Data":"3ad056c5d440f52aaf3e529aaaa0adb5466b2661f6219a6364c0d70692a5e85b"} Dec 02 22:59:59 crc kubenswrapper[4696]: I1202 22:59:59.818099 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"f5b9aee9-4e9a-4d60-be32-f25d230622bc","Type":"ContainerStarted","Data":"ed87a9917924ecfc5f3b8f8bbb6a8494314470b03a6c2a43f1bc5b1b189531e8"} Dec 02 22:59:59 crc kubenswrapper[4696]: I1202 22:59:59.819079 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"89fbc0cf-3e41-4b34-bdd3-b415552fd1e6","Type":"ContainerStarted","Data":"0cf3bff85eb2b138daafe92d1c9d1ea9b0bbde6a676933eb5ccf7f912c231d37"} Dec 02 22:59:59 crc kubenswrapper[4696]: E1202 22:59:59.820896 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-57d769cc4f-njzlw" podUID="62dbb214-98c9-4d52-884b-45804bcc612c" Dec 02 22:59:59 crc kubenswrapper[4696]: E1202 22:59:59.820919 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-666b6646f7-vrq4f" podUID="058142c0-4700-4160-857b-0b016c768a72" Dec 02 22:59:59 crc kubenswrapper[4696]: I1202 22:59:59.857971 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 02 22:59:59 crc kubenswrapper[4696]: W1202 22:59:59.870195 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbd863731_0190_4818_90bb_a7b5b781e616.slice/crio-5dd749b4b6bb7380d82ef4d156ce92ec23803970d257d150ab9a1b69cef6ae5b WatchSource:0}: Error finding container 5dd749b4b6bb7380d82ef4d156ce92ec23803970d257d150ab9a1b69cef6ae5b: Status 404 returned error can't find the container with id 5dd749b4b6bb7380d82ef4d156ce92ec23803970d257d150ab9a1b69cef6ae5b Dec 02 23:00:00 crc kubenswrapper[4696]: I1202 23:00:00.175064 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411940-z9xq4"] Dec 02 23:00:00 crc kubenswrapper[4696]: I1202 23:00:00.177472 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411940-z9xq4" Dec 02 23:00:00 crc kubenswrapper[4696]: I1202 23:00:00.180951 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 02 23:00:00 crc kubenswrapper[4696]: I1202 23:00:00.180968 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 02 23:00:00 crc kubenswrapper[4696]: I1202 23:00:00.194205 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411940-z9xq4"] Dec 02 23:00:00 crc kubenswrapper[4696]: I1202 23:00:00.258362 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5cbz\" (UniqueName: \"kubernetes.io/projected/c442b172-a329-4974-a896-e36bd604cf10-kube-api-access-g5cbz\") pod \"collect-profiles-29411940-z9xq4\" (UID: \"c442b172-a329-4974-a896-e36bd604cf10\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411940-z9xq4" Dec 02 23:00:00 crc kubenswrapper[4696]: I1202 23:00:00.258433 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c442b172-a329-4974-a896-e36bd604cf10-secret-volume\") pod \"collect-profiles-29411940-z9xq4\" (UID: \"c442b172-a329-4974-a896-e36bd604cf10\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411940-z9xq4" Dec 02 23:00:00 crc kubenswrapper[4696]: I1202 23:00:00.258549 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c442b172-a329-4974-a896-e36bd604cf10-config-volume\") pod \"collect-profiles-29411940-z9xq4\" (UID: \"c442b172-a329-4974-a896-e36bd604cf10\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29411940-z9xq4" Dec 02 23:00:00 crc kubenswrapper[4696]: I1202 23:00:00.360015 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c442b172-a329-4974-a896-e36bd604cf10-config-volume\") pod \"collect-profiles-29411940-z9xq4\" (UID: \"c442b172-a329-4974-a896-e36bd604cf10\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411940-z9xq4" Dec 02 23:00:00 crc kubenswrapper[4696]: I1202 23:00:00.360088 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5cbz\" (UniqueName: \"kubernetes.io/projected/c442b172-a329-4974-a896-e36bd604cf10-kube-api-access-g5cbz\") pod \"collect-profiles-29411940-z9xq4\" (UID: \"c442b172-a329-4974-a896-e36bd604cf10\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411940-z9xq4" Dec 02 23:00:00 crc kubenswrapper[4696]: I1202 23:00:00.360456 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c442b172-a329-4974-a896-e36bd604cf10-secret-volume\") pod \"collect-profiles-29411940-z9xq4\" (UID: \"c442b172-a329-4974-a896-e36bd604cf10\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411940-z9xq4" Dec 02 23:00:00 crc kubenswrapper[4696]: I1202 23:00:00.364813 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c442b172-a329-4974-a896-e36bd604cf10-config-volume\") pod \"collect-profiles-29411940-z9xq4\" (UID: \"c442b172-a329-4974-a896-e36bd604cf10\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411940-z9xq4" Dec 02 23:00:00 crc kubenswrapper[4696]: I1202 23:00:00.375989 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/c442b172-a329-4974-a896-e36bd604cf10-secret-volume\") pod \"collect-profiles-29411940-z9xq4\" (UID: \"c442b172-a329-4974-a896-e36bd604cf10\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411940-z9xq4" Dec 02 23:00:00 crc kubenswrapper[4696]: I1202 23:00:00.379277 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5cbz\" (UniqueName: \"kubernetes.io/projected/c442b172-a329-4974-a896-e36bd604cf10-kube-api-access-g5cbz\") pod \"collect-profiles-29411940-z9xq4\" (UID: \"c442b172-a329-4974-a896-e36bd604cf10\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411940-z9xq4" Dec 02 23:00:00 crc kubenswrapper[4696]: I1202 23:00:00.441347 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411940-z9xq4" Dec 02 23:00:00 crc kubenswrapper[4696]: I1202 23:00:00.455090 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-2bftf" Dec 02 23:00:00 crc kubenswrapper[4696]: I1202 23:00:00.469279 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-dcqvm" Dec 02 23:00:00 crc kubenswrapper[4696]: I1202 23:00:00.568209 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d245d496-d80f-4455-a66e-cc788aff5b35-config\") pod \"d245d496-d80f-4455-a66e-cc788aff5b35\" (UID: \"d245d496-d80f-4455-a66e-cc788aff5b35\") " Dec 02 23:00:00 crc kubenswrapper[4696]: I1202 23:00:00.568278 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-756sc\" (UniqueName: \"kubernetes.io/projected/e5273a22-1865-4f93-ae2f-cc7046c708cd-kube-api-access-756sc\") pod \"e5273a22-1865-4f93-ae2f-cc7046c708cd\" (UID: \"e5273a22-1865-4f93-ae2f-cc7046c708cd\") " Dec 02 23:00:00 crc kubenswrapper[4696]: I1202 23:00:00.568324 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mv57x\" (UniqueName: \"kubernetes.io/projected/d245d496-d80f-4455-a66e-cc788aff5b35-kube-api-access-mv57x\") pod \"d245d496-d80f-4455-a66e-cc788aff5b35\" (UID: \"d245d496-d80f-4455-a66e-cc788aff5b35\") " Dec 02 23:00:00 crc kubenswrapper[4696]: I1202 23:00:00.568374 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e5273a22-1865-4f93-ae2f-cc7046c708cd-dns-svc\") pod \"e5273a22-1865-4f93-ae2f-cc7046c708cd\" (UID: \"e5273a22-1865-4f93-ae2f-cc7046c708cd\") " Dec 02 23:00:00 crc kubenswrapper[4696]: I1202 23:00:00.568732 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d245d496-d80f-4455-a66e-cc788aff5b35-config" (OuterVolumeSpecName: "config") pod "d245d496-d80f-4455-a66e-cc788aff5b35" (UID: "d245d496-d80f-4455-a66e-cc788aff5b35"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:00:00 crc kubenswrapper[4696]: I1202 23:00:00.569423 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5273a22-1865-4f93-ae2f-cc7046c708cd-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e5273a22-1865-4f93-ae2f-cc7046c708cd" (UID: "e5273a22-1865-4f93-ae2f-cc7046c708cd"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:00:00 crc kubenswrapper[4696]: I1202 23:00:00.569528 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5273a22-1865-4f93-ae2f-cc7046c708cd-config\") pod \"e5273a22-1865-4f93-ae2f-cc7046c708cd\" (UID: \"e5273a22-1865-4f93-ae2f-cc7046c708cd\") " Dec 02 23:00:00 crc kubenswrapper[4696]: I1202 23:00:00.570053 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5273a22-1865-4f93-ae2f-cc7046c708cd-config" (OuterVolumeSpecName: "config") pod "e5273a22-1865-4f93-ae2f-cc7046c708cd" (UID: "e5273a22-1865-4f93-ae2f-cc7046c708cd"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:00:00 crc kubenswrapper[4696]: I1202 23:00:00.570482 4696 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5273a22-1865-4f93-ae2f-cc7046c708cd-config\") on node \"crc\" DevicePath \"\"" Dec 02 23:00:00 crc kubenswrapper[4696]: I1202 23:00:00.570837 4696 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d245d496-d80f-4455-a66e-cc788aff5b35-config\") on node \"crc\" DevicePath \"\"" Dec 02 23:00:00 crc kubenswrapper[4696]: I1202 23:00:00.570847 4696 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e5273a22-1865-4f93-ae2f-cc7046c708cd-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 23:00:00 crc kubenswrapper[4696]: I1202 23:00:00.574976 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5273a22-1865-4f93-ae2f-cc7046c708cd-kube-api-access-756sc" (OuterVolumeSpecName: "kube-api-access-756sc") pod "e5273a22-1865-4f93-ae2f-cc7046c708cd" (UID: "e5273a22-1865-4f93-ae2f-cc7046c708cd"). InnerVolumeSpecName "kube-api-access-756sc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:00:00 crc kubenswrapper[4696]: I1202 23:00:00.575378 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d245d496-d80f-4455-a66e-cc788aff5b35-kube-api-access-mv57x" (OuterVolumeSpecName: "kube-api-access-mv57x") pod "d245d496-d80f-4455-a66e-cc788aff5b35" (UID: "d245d496-d80f-4455-a66e-cc788aff5b35"). InnerVolumeSpecName "kube-api-access-mv57x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:00:00 crc kubenswrapper[4696]: I1202 23:00:00.673553 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-756sc\" (UniqueName: \"kubernetes.io/projected/e5273a22-1865-4f93-ae2f-cc7046c708cd-kube-api-access-756sc\") on node \"crc\" DevicePath \"\"" Dec 02 23:00:00 crc kubenswrapper[4696]: I1202 23:00:00.673591 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mv57x\" (UniqueName: \"kubernetes.io/projected/d245d496-d80f-4455-a66e-cc788aff5b35-kube-api-access-mv57x\") on node \"crc\" DevicePath \"\"" Dec 02 23:00:00 crc kubenswrapper[4696]: I1202 23:00:00.679387 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 02 23:00:00 crc kubenswrapper[4696]: I1202 23:00:00.782521 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-h54st"] Dec 02 23:00:00 crc kubenswrapper[4696]: I1202 23:00:00.829622 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-2bftf" event={"ID":"d245d496-d80f-4455-a66e-cc788aff5b35","Type":"ContainerDied","Data":"f6e9678be081afab8b58067fe8bb81b12f26d93289411399695fd2149a0f1dab"} Dec 02 23:00:00 crc kubenswrapper[4696]: I1202 23:00:00.829927 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-2bftf" Dec 02 23:00:00 crc kubenswrapper[4696]: I1202 23:00:00.832399 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-dcqvm" Dec 02 23:00:00 crc kubenswrapper[4696]: I1202 23:00:00.832463 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-dcqvm" event={"ID":"e5273a22-1865-4f93-ae2f-cc7046c708cd","Type":"ContainerDied","Data":"6159af933813b8732d03d45cade9197aaacf662a2f2a069c6cbb802120aefae3"} Dec 02 23:00:00 crc kubenswrapper[4696]: I1202 23:00:00.834796 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"aa29c81c-0a87-47f5-be45-8a0e5b083758","Type":"ContainerStarted","Data":"72dadaa892b741f7a71d45cdf9cad76ee8227c1f1237e87e17a38b7792ae3aa3"} Dec 02 23:00:00 crc kubenswrapper[4696]: I1202 23:00:00.839512 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"bd863731-0190-4818-90bb-a7b5b781e616","Type":"ContainerStarted","Data":"5dd749b4b6bb7380d82ef4d156ce92ec23803970d257d150ab9a1b69cef6ae5b"} Dec 02 23:00:00 crc kubenswrapper[4696]: I1202 23:00:00.936361 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-2bftf"] Dec 02 23:00:00 crc kubenswrapper[4696]: I1202 23:00:00.960238 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-2bftf"] Dec 02 23:00:00 crc kubenswrapper[4696]: I1202 23:00:00.975498 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-dcqvm"] Dec 02 23:00:00 crc kubenswrapper[4696]: I1202 23:00:00.984108 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-dcqvm"] Dec 02 23:00:01 crc kubenswrapper[4696]: I1202 23:00:01.462220 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d245d496-d80f-4455-a66e-cc788aff5b35" path="/var/lib/kubelet/pods/d245d496-d80f-4455-a66e-cc788aff5b35/volumes" Dec 02 23:00:01 crc kubenswrapper[4696]: I1202 23:00:01.462661 4696 kubelet_volumes.go:163] "Cleaned 
up orphaned pod volumes dir" podUID="e5273a22-1865-4f93-ae2f-cc7046c708cd" path="/var/lib/kubelet/pods/e5273a22-1865-4f93-ae2f-cc7046c708cd/volumes" Dec 02 23:00:01 crc kubenswrapper[4696]: W1202 23:00:01.762027 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd004daa2_5ad8_49b8_9f27_cc0552d409de.slice/crio-8a01352f6ab9d85819820da32592bfa09e1bc65881fb4c5aa4121766edbe8b79 WatchSource:0}: Error finding container 8a01352f6ab9d85819820da32592bfa09e1bc65881fb4c5aa4121766edbe8b79: Status 404 returned error can't find the container with id 8a01352f6ab9d85819820da32592bfa09e1bc65881fb4c5aa4121766edbe8b79 Dec 02 23:00:01 crc kubenswrapper[4696]: I1202 23:00:01.848875 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"bd18044c-bd73-4166-83ac-e555f2a587b3","Type":"ContainerStarted","Data":"81f499618ef2764d8e07f165db944d1f916e06c93b13931b1ace7d07a9721811"} Dec 02 23:00:01 crc kubenswrapper[4696]: I1202 23:00:01.850348 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-h54st" event={"ID":"d004daa2-5ad8-49b8-9f27-cc0552d409de","Type":"ContainerStarted","Data":"8a01352f6ab9d85819820da32592bfa09e1bc65881fb4c5aa4121766edbe8b79"} Dec 02 23:00:02 crc kubenswrapper[4696]: I1202 23:00:02.282563 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411940-z9xq4"] Dec 02 23:00:04 crc kubenswrapper[4696]: I1202 23:00:04.877123 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411940-z9xq4" event={"ID":"c442b172-a329-4974-a896-e36bd604cf10","Type":"ContainerStarted","Data":"3c9ba3e7530e48e956edd4455edbec463521d4607f0d5f25ead9d2c4e38a7e2d"} Dec 02 23:00:08 crc kubenswrapper[4696]: I1202 23:00:08.959552 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/openstack-cell1-galera-0" event={"ID":"dff19c4d-2106-4034-8c29-39429553a062","Type":"ContainerStarted","Data":"b737fb1a6f35663f038fe7a947ce1249ca579567ca680dab8694d76892dd285f"} Dec 02 23:00:08 crc kubenswrapper[4696]: I1202 23:00:08.967781 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411940-z9xq4" event={"ID":"c442b172-a329-4974-a896-e36bd604cf10","Type":"ContainerStarted","Data":"b475b52b8fa8826639dce84ea06bb0dae3cfbe33afdf08411b93969ae748ee08"} Dec 02 23:00:08 crc kubenswrapper[4696]: I1202 23:00:08.974712 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"87c53ac9-38ab-43a7-b99e-29c47a69f818","Type":"ContainerStarted","Data":"22a2718862810b9fbedc45c7d5518262660afadd38623c45b2743558371fcd5f"} Dec 02 23:00:08 crc kubenswrapper[4696]: I1202 23:00:08.975417 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Dec 02 23:00:09 crc kubenswrapper[4696]: I1202 23:00:09.026513 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29411940-z9xq4" podStartSLOduration=9.026492339 podStartE2EDuration="9.026492339s" podCreationTimestamp="2025-12-02 23:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:00:09.019932294 +0000 UTC m=+1071.900612285" watchObservedRunningTime="2025-12-02 23:00:09.026492339 +0000 UTC m=+1071.907172340" Dec 02 23:00:09 crc kubenswrapper[4696]: I1202 23:00:09.996572 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"af9932e1-c721-45b3-a213-93da4e130d05","Type":"ContainerStarted","Data":"539cc2b0b3877dc0ea869ac3406577eaf242100e064017bb960c96d522acd2d3"} Dec 02 23:00:09 crc kubenswrapper[4696]: I1202 23:00:09.999665 4696 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"bd863731-0190-4818-90bb-a7b5b781e616","Type":"ContainerStarted","Data":"1a5eb43a4c4ac0463e5159dcb4793254ee029178f0c36b0fd8c5e1f909f60c6e"} Dec 02 23:00:10 crc kubenswrapper[4696]: I1202 23:00:10.001002 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"bd18044c-bd73-4166-83ac-e555f2a587b3","Type":"ContainerStarted","Data":"637ab6637a1fcd2177b9ad38b0b0c3e2b3931a68733ce27c500556a5b08cd738"} Dec 02 23:00:10 crc kubenswrapper[4696]: I1202 23:00:10.010267 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-btsm6" event={"ID":"b13b6998-c04a-4ac8-9615-5078f1169ecb","Type":"ContainerStarted","Data":"14d8301e6d5c707e0c3ff16927c3a1c0c34e52fc9fc24fa689ca8a8d9cfad491"} Dec 02 23:00:10 crc kubenswrapper[4696]: I1202 23:00:10.010440 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-btsm6" Dec 02 23:00:10 crc kubenswrapper[4696]: I1202 23:00:10.015532 4696 generic.go:334] "Generic (PLEG): container finished" podID="d004daa2-5ad8-49b8-9f27-cc0552d409de" containerID="faa12af3378e74a36c8eb98ad5f8609d91ca701001a19892f0b75d3e0550d965" exitCode=0 Dec 02 23:00:10 crc kubenswrapper[4696]: I1202 23:00:10.015629 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-h54st" event={"ID":"d004daa2-5ad8-49b8-9f27-cc0552d409de","Type":"ContainerDied","Data":"faa12af3378e74a36c8eb98ad5f8609d91ca701001a19892f0b75d3e0550d965"} Dec 02 23:00:10 crc kubenswrapper[4696]: I1202 23:00:10.023261 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"89fbc0cf-3e41-4b34-bdd3-b415552fd1e6","Type":"ContainerStarted","Data":"9ea4fe4d79119b702b85dc57f5455fe3da7151c76af7e78aec68d6f8390427f9"} Dec 02 23:00:10 crc kubenswrapper[4696]: I1202 23:00:10.023380 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/kube-state-metrics-0" Dec 02 23:00:10 crc kubenswrapper[4696]: I1202 23:00:10.036992 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=28.049851038 podStartE2EDuration="34.036972968s" podCreationTimestamp="2025-12-02 22:59:36 +0000 UTC" firstStartedPulling="2025-12-02 22:59:59.440269392 +0000 UTC m=+1062.320949393" lastFinishedPulling="2025-12-02 23:00:05.427391312 +0000 UTC m=+1068.308071323" observedRunningTime="2025-12-02 23:00:09.045964769 +0000 UTC m=+1071.926644770" watchObservedRunningTime="2025-12-02 23:00:10.036972968 +0000 UTC m=+1072.917652969" Dec 02 23:00:10 crc kubenswrapper[4696]: I1202 23:00:10.048116 4696 generic.go:334] "Generic (PLEG): container finished" podID="c442b172-a329-4974-a896-e36bd604cf10" containerID="b475b52b8fa8826639dce84ea06bb0dae3cfbe33afdf08411b93969ae748ee08" exitCode=0 Dec 02 23:00:10 crc kubenswrapper[4696]: I1202 23:00:10.048482 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411940-z9xq4" event={"ID":"c442b172-a329-4974-a896-e36bd604cf10","Type":"ContainerDied","Data":"b475b52b8fa8826639dce84ea06bb0dae3cfbe33afdf08411b93969ae748ee08"} Dec 02 23:00:10 crc kubenswrapper[4696]: I1202 23:00:10.084097 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=23.002651148 podStartE2EDuration="32.084075198s" podCreationTimestamp="2025-12-02 22:59:38 +0000 UTC" firstStartedPulling="2025-12-02 22:59:59.628988302 +0000 UTC m=+1062.509668303" lastFinishedPulling="2025-12-02 23:00:08.710412342 +0000 UTC m=+1071.591092353" observedRunningTime="2025-12-02 23:00:10.080134337 +0000 UTC m=+1072.960814328" watchObservedRunningTime="2025-12-02 23:00:10.084075198 +0000 UTC m=+1072.964755199" Dec 02 23:00:10 crc kubenswrapper[4696]: I1202 23:00:10.115697 4696 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/ovn-controller-btsm6" podStartSLOduration=20.730819033 podStartE2EDuration="29.1156677s" podCreationTimestamp="2025-12-02 22:59:41 +0000 UTC" firstStartedPulling="2025-12-02 22:59:59.785336608 +0000 UTC m=+1062.666016609" lastFinishedPulling="2025-12-02 23:00:08.170185275 +0000 UTC m=+1071.050865276" observedRunningTime="2025-12-02 23:00:10.111909514 +0000 UTC m=+1072.992589515" watchObservedRunningTime="2025-12-02 23:00:10.1156677 +0000 UTC m=+1072.996347701" Dec 02 23:00:11 crc kubenswrapper[4696]: I1202 23:00:11.064368 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-h54st" event={"ID":"d004daa2-5ad8-49b8-9f27-cc0552d409de","Type":"ContainerStarted","Data":"baf11a689271fb1e132b86781d48879de27a718a5b24d7441eac2d609362f57b"} Dec 02 23:00:11 crc kubenswrapper[4696]: I1202 23:00:11.064427 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-h54st" event={"ID":"d004daa2-5ad8-49b8-9f27-cc0552d409de","Type":"ContainerStarted","Data":"749610c11ad94022b9e240266ab77afdf5dfb0c67fafe278c1ced9466596e749"} Dec 02 23:00:11 crc kubenswrapper[4696]: I1202 23:00:11.066332 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-h54st" Dec 02 23:00:11 crc kubenswrapper[4696]: I1202 23:00:11.066440 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-h54st" Dec 02 23:00:11 crc kubenswrapper[4696]: I1202 23:00:11.103993 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-h54st" podStartSLOduration=23.302307217 podStartE2EDuration="30.103967181s" podCreationTimestamp="2025-12-02 22:59:41 +0000 UTC" firstStartedPulling="2025-12-02 23:00:01.797726463 +0000 UTC m=+1064.678406464" lastFinishedPulling="2025-12-02 23:00:08.599386417 +0000 UTC m=+1071.480066428" observedRunningTime="2025-12-02 23:00:11.092434306 +0000 UTC m=+1073.973114317" 
watchObservedRunningTime="2025-12-02 23:00:11.103967181 +0000 UTC m=+1073.984647182" Dec 02 23:00:12 crc kubenswrapper[4696]: I1202 23:00:12.076982 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"f5b9aee9-4e9a-4d60-be32-f25d230622bc","Type":"ContainerStarted","Data":"61faf3799cade00d76f6f556cecd8a6d1e003ea4b965e0f6b5ec2c6dd2210bee"} Dec 02 23:00:12 crc kubenswrapper[4696]: I1202 23:00:12.932078 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411940-z9xq4" Dec 02 23:00:13 crc kubenswrapper[4696]: I1202 23:00:13.056609 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c442b172-a329-4974-a896-e36bd604cf10-secret-volume\") pod \"c442b172-a329-4974-a896-e36bd604cf10\" (UID: \"c442b172-a329-4974-a896-e36bd604cf10\") " Dec 02 23:00:13 crc kubenswrapper[4696]: I1202 23:00:13.056710 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g5cbz\" (UniqueName: \"kubernetes.io/projected/c442b172-a329-4974-a896-e36bd604cf10-kube-api-access-g5cbz\") pod \"c442b172-a329-4974-a896-e36bd604cf10\" (UID: \"c442b172-a329-4974-a896-e36bd604cf10\") " Dec 02 23:00:13 crc kubenswrapper[4696]: I1202 23:00:13.056845 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c442b172-a329-4974-a896-e36bd604cf10-config-volume\") pod \"c442b172-a329-4974-a896-e36bd604cf10\" (UID: \"c442b172-a329-4974-a896-e36bd604cf10\") " Dec 02 23:00:13 crc kubenswrapper[4696]: I1202 23:00:13.057793 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c442b172-a329-4974-a896-e36bd604cf10-config-volume" (OuterVolumeSpecName: "config-volume") pod "c442b172-a329-4974-a896-e36bd604cf10" (UID: 
"c442b172-a329-4974-a896-e36bd604cf10"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:00:13 crc kubenswrapper[4696]: I1202 23:00:13.080787 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c442b172-a329-4974-a896-e36bd604cf10-kube-api-access-g5cbz" (OuterVolumeSpecName: "kube-api-access-g5cbz") pod "c442b172-a329-4974-a896-e36bd604cf10" (UID: "c442b172-a329-4974-a896-e36bd604cf10"). InnerVolumeSpecName "kube-api-access-g5cbz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:00:13 crc kubenswrapper[4696]: I1202 23:00:13.081966 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c442b172-a329-4974-a896-e36bd604cf10-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "c442b172-a329-4974-a896-e36bd604cf10" (UID: "c442b172-a329-4974-a896-e36bd604cf10"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:00:13 crc kubenswrapper[4696]: I1202 23:00:13.089336 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411940-z9xq4" Dec 02 23:00:13 crc kubenswrapper[4696]: I1202 23:00:13.089323 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411940-z9xq4" event={"ID":"c442b172-a329-4974-a896-e36bd604cf10","Type":"ContainerDied","Data":"3c9ba3e7530e48e956edd4455edbec463521d4607f0d5f25ead9d2c4e38a7e2d"} Dec 02 23:00:13 crc kubenswrapper[4696]: I1202 23:00:13.089464 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c9ba3e7530e48e956edd4455edbec463521d4607f0d5f25ead9d2c4e38a7e2d" Dec 02 23:00:13 crc kubenswrapper[4696]: I1202 23:00:13.093319 4696 generic.go:334] "Generic (PLEG): container finished" podID="af9932e1-c721-45b3-a213-93da4e130d05" containerID="539cc2b0b3877dc0ea869ac3406577eaf242100e064017bb960c96d522acd2d3" exitCode=0 Dec 02 23:00:13 crc kubenswrapper[4696]: I1202 23:00:13.093421 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"af9932e1-c721-45b3-a213-93da4e130d05","Type":"ContainerDied","Data":"539cc2b0b3877dc0ea869ac3406577eaf242100e064017bb960c96d522acd2d3"} Dec 02 23:00:13 crc kubenswrapper[4696]: I1202 23:00:13.097398 4696 generic.go:334] "Generic (PLEG): container finished" podID="dff19c4d-2106-4034-8c29-39429553a062" containerID="b737fb1a6f35663f038fe7a947ce1249ca579567ca680dab8694d76892dd285f" exitCode=0 Dec 02 23:00:13 crc kubenswrapper[4696]: I1202 23:00:13.098649 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"dff19c4d-2106-4034-8c29-39429553a062","Type":"ContainerDied","Data":"b737fb1a6f35663f038fe7a947ce1249ca579567ca680dab8694d76892dd285f"} Dec 02 23:00:13 crc kubenswrapper[4696]: I1202 23:00:13.159500 4696 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/c442b172-a329-4974-a896-e36bd604cf10-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 02 23:00:13 crc kubenswrapper[4696]: I1202 23:00:13.159537 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g5cbz\" (UniqueName: \"kubernetes.io/projected/c442b172-a329-4974-a896-e36bd604cf10-kube-api-access-g5cbz\") on node \"crc\" DevicePath \"\"" Dec 02 23:00:13 crc kubenswrapper[4696]: I1202 23:00:13.159547 4696 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c442b172-a329-4974-a896-e36bd604cf10-config-volume\") on node \"crc\" DevicePath \"\"" Dec 02 23:00:14 crc kubenswrapper[4696]: I1202 23:00:14.109353 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"bd18044c-bd73-4166-83ac-e555f2a587b3","Type":"ContainerStarted","Data":"0374661794b91719b74bf8bd82940534b7066e047793ee0f2912539faffb8647"} Dec 02 23:00:14 crc kubenswrapper[4696]: I1202 23:00:14.116849 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"dff19c4d-2106-4034-8c29-39429553a062","Type":"ContainerStarted","Data":"c37a87cd321d0143aac29b1783db91f1bdd6a0dddee7cb8fc43e91682aa2c121"} Dec 02 23:00:14 crc kubenswrapper[4696]: I1202 23:00:14.120525 4696 generic.go:334] "Generic (PLEG): container finished" podID="058142c0-4700-4160-857b-0b016c768a72" containerID="efbc5ec233e6075dce422e9325ab9d86f887d1f103d3111c6f94236da9da8e1a" exitCode=0 Dec 02 23:00:14 crc kubenswrapper[4696]: I1202 23:00:14.120644 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-vrq4f" event={"ID":"058142c0-4700-4160-857b-0b016c768a72","Type":"ContainerDied","Data":"efbc5ec233e6075dce422e9325ab9d86f887d1f103d3111c6f94236da9da8e1a"} Dec 02 23:00:14 crc kubenswrapper[4696]: I1202 23:00:14.125944 4696 generic.go:334] "Generic (PLEG): container finished" 
podID="62dbb214-98c9-4d52-884b-45804bcc612c" containerID="a02154da98abb16e5f162006130a32db64d195cef1fea918b3cea553da0e9764" exitCode=0 Dec 02 23:00:14 crc kubenswrapper[4696]: I1202 23:00:14.125977 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-njzlw" event={"ID":"62dbb214-98c9-4d52-884b-45804bcc612c","Type":"ContainerDied","Data":"a02154da98abb16e5f162006130a32db64d195cef1fea918b3cea553da0e9764"} Dec 02 23:00:14 crc kubenswrapper[4696]: I1202 23:00:14.136215 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=21.660637343 podStartE2EDuration="33.136189228s" podCreationTimestamp="2025-12-02 22:59:41 +0000 UTC" firstStartedPulling="2025-12-02 23:00:01.797805835 +0000 UTC m=+1064.678485836" lastFinishedPulling="2025-12-02 23:00:13.27335772 +0000 UTC m=+1076.154037721" observedRunningTime="2025-12-02 23:00:14.130725414 +0000 UTC m=+1077.011405485" watchObservedRunningTime="2025-12-02 23:00:14.136189228 +0000 UTC m=+1077.016869229" Dec 02 23:00:14 crc kubenswrapper[4696]: I1202 23:00:14.137352 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"af9932e1-c721-45b3-a213-93da4e130d05","Type":"ContainerStarted","Data":"255290ff819b73614b910675375f43a82841e88098de09819f535ac5e319f658"} Dec 02 23:00:14 crc kubenswrapper[4696]: I1202 23:00:14.141922 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"bd863731-0190-4818-90bb-a7b5b781e616","Type":"ContainerStarted","Data":"64613d02f475b91304b93f8abcc41641053f6f30ff1a16bc9182a4eee2a599ed"} Dec 02 23:00:14 crc kubenswrapper[4696]: I1202 23:00:14.170516 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=30.974900634 podStartE2EDuration="39.170467586s" podCreationTimestamp="2025-12-02 22:59:35 +0000 UTC" 
firstStartedPulling="2025-12-02 22:59:59.699315978 +0000 UTC m=+1062.579995979" lastFinishedPulling="2025-12-02 23:00:07.89488291 +0000 UTC m=+1070.775562931" observedRunningTime="2025-12-02 23:00:14.157081138 +0000 UTC m=+1077.037761139" watchObservedRunningTime="2025-12-02 23:00:14.170467586 +0000 UTC m=+1077.051147597" Dec 02 23:00:14 crc kubenswrapper[4696]: I1202 23:00:14.240850 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=32.581077752 podStartE2EDuration="41.240826203s" podCreationTimestamp="2025-12-02 22:59:33 +0000 UTC" firstStartedPulling="2025-12-02 22:59:59.510373032 +0000 UTC m=+1062.391053033" lastFinishedPulling="2025-12-02 23:00:08.170121473 +0000 UTC m=+1071.050801484" observedRunningTime="2025-12-02 23:00:14.231218022 +0000 UTC m=+1077.111898033" watchObservedRunningTime="2025-12-02 23:00:14.240826203 +0000 UTC m=+1077.121506214" Dec 02 23:00:14 crc kubenswrapper[4696]: I1202 23:00:14.256414 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=16.844737797 podStartE2EDuration="30.256400003s" podCreationTimestamp="2025-12-02 22:59:44 +0000 UTC" firstStartedPulling="2025-12-02 22:59:59.876935995 +0000 UTC m=+1062.757615996" lastFinishedPulling="2025-12-02 23:00:13.288598201 +0000 UTC m=+1076.169278202" observedRunningTime="2025-12-02 23:00:14.255111426 +0000 UTC m=+1077.135791467" watchObservedRunningTime="2025-12-02 23:00:14.256400003 +0000 UTC m=+1077.137080004" Dec 02 23:00:15 crc kubenswrapper[4696]: I1202 23:00:15.159478 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"aa9b93b0-4131-4a4b-a1a8-27ccf68716c0","Type":"ContainerStarted","Data":"b1b6c8104bd2c4548eaf0e047605a944629556d000273f410634d938caa54ca8"} Dec 02 23:00:15 crc kubenswrapper[4696]: I1202 23:00:15.162633 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-57d769cc4f-njzlw" event={"ID":"62dbb214-98c9-4d52-884b-45804bcc612c","Type":"ContainerStarted","Data":"d1e9c45eff515960ade5e8b56fbc94cda0dd01e3ce103da0cc0388952b086a5f"} Dec 02 23:00:15 crc kubenswrapper[4696]: I1202 23:00:15.162939 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-njzlw" Dec 02 23:00:15 crc kubenswrapper[4696]: I1202 23:00:15.166195 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-vrq4f" event={"ID":"058142c0-4700-4160-857b-0b016c768a72","Type":"ContainerStarted","Data":"256064e930c01772e5e7da068d7eb524a2364890ef4b92f327ada11472f710f2"} Dec 02 23:00:15 crc kubenswrapper[4696]: I1202 23:00:15.233423 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-njzlw" podStartSLOduration=3.539402635 podStartE2EDuration="43.233394216s" podCreationTimestamp="2025-12-02 22:59:32 +0000 UTC" firstStartedPulling="2025-12-02 22:59:33.561307079 +0000 UTC m=+1036.441987080" lastFinishedPulling="2025-12-02 23:00:13.25529866 +0000 UTC m=+1076.135978661" observedRunningTime="2025-12-02 23:00:15.223719682 +0000 UTC m=+1078.104399693" watchObservedRunningTime="2025-12-02 23:00:15.233394216 +0000 UTC m=+1078.114074227" Dec 02 23:00:15 crc kubenswrapper[4696]: I1202 23:00:15.254858 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-666b6646f7-vrq4f" podStartSLOduration=3.219076388 podStartE2EDuration="43.254821721s" podCreationTimestamp="2025-12-02 22:59:32 +0000 UTC" firstStartedPulling="2025-12-02 22:59:33.267881602 +0000 UTC m=+1036.148561603" lastFinishedPulling="2025-12-02 23:00:13.303626935 +0000 UTC m=+1076.184306936" observedRunningTime="2025-12-02 23:00:15.251338973 +0000 UTC m=+1078.132019014" watchObservedRunningTime="2025-12-02 23:00:15.254821721 +0000 UTC m=+1078.135501752" Dec 02 23:00:15 crc kubenswrapper[4696]: I1202 
23:00:15.357822 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Dec 02 23:00:15 crc kubenswrapper[4696]: I1202 23:00:15.357906 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Dec 02 23:00:16 crc kubenswrapper[4696]: I1202 23:00:16.044336 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Dec 02 23:00:16 crc kubenswrapper[4696]: I1202 23:00:16.044588 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Dec 02 23:00:16 crc kubenswrapper[4696]: I1202 23:00:16.067008 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Dec 02 23:00:16 crc kubenswrapper[4696]: I1202 23:00:16.123895 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Dec 02 23:00:16 crc kubenswrapper[4696]: I1202 23:00:16.139968 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Dec 02 23:00:16 crc kubenswrapper[4696]: I1202 23:00:16.174215 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Dec 02 23:00:16 crc kubenswrapper[4696]: I1202 23:00:16.219763 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Dec 02 23:00:16 crc kubenswrapper[4696]: I1202 23:00:16.222461 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Dec 02 23:00:16 crc kubenswrapper[4696]: I1202 23:00:16.418929 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-njzlw"] Dec 02 23:00:16 crc kubenswrapper[4696]: I1202 23:00:16.487270 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-8gns4"] Dec 02 23:00:16 crc 
kubenswrapper[4696]: E1202 23:00:16.487664 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c442b172-a329-4974-a896-e36bd604cf10" containerName="collect-profiles" Dec 02 23:00:16 crc kubenswrapper[4696]: I1202 23:00:16.487681 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="c442b172-a329-4974-a896-e36bd604cf10" containerName="collect-profiles" Dec 02 23:00:16 crc kubenswrapper[4696]: I1202 23:00:16.487874 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="c442b172-a329-4974-a896-e36bd604cf10" containerName="collect-profiles" Dec 02 23:00:16 crc kubenswrapper[4696]: I1202 23:00:16.488845 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-8gns4" Dec 02 23:00:16 crc kubenswrapper[4696]: I1202 23:00:16.491235 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Dec 02 23:00:16 crc kubenswrapper[4696]: I1202 23:00:16.509656 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-8gns4"] Dec 02 23:00:16 crc kubenswrapper[4696]: I1202 23:00:16.541944 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/989c3427-3369-49e2-b8a9-d9706298c06a-ovsdbserver-sb\") pod \"dnsmasq-dns-6bc7876d45-8gns4\" (UID: \"989c3427-3369-49e2-b8a9-d9706298c06a\") " pod="openstack/dnsmasq-dns-6bc7876d45-8gns4" Dec 02 23:00:16 crc kubenswrapper[4696]: I1202 23:00:16.542075 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h585f\" (UniqueName: \"kubernetes.io/projected/989c3427-3369-49e2-b8a9-d9706298c06a-kube-api-access-h585f\") pod \"dnsmasq-dns-6bc7876d45-8gns4\" (UID: \"989c3427-3369-49e2-b8a9-d9706298c06a\") " pod="openstack/dnsmasq-dns-6bc7876d45-8gns4" Dec 02 23:00:16 crc kubenswrapper[4696]: I1202 23:00:16.542148 4696 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/989c3427-3369-49e2-b8a9-d9706298c06a-config\") pod \"dnsmasq-dns-6bc7876d45-8gns4\" (UID: \"989c3427-3369-49e2-b8a9-d9706298c06a\") " pod="openstack/dnsmasq-dns-6bc7876d45-8gns4" Dec 02 23:00:16 crc kubenswrapper[4696]: I1202 23:00:16.542167 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/989c3427-3369-49e2-b8a9-d9706298c06a-dns-svc\") pod \"dnsmasq-dns-6bc7876d45-8gns4\" (UID: \"989c3427-3369-49e2-b8a9-d9706298c06a\") " pod="openstack/dnsmasq-dns-6bc7876d45-8gns4" Dec 02 23:00:16 crc kubenswrapper[4696]: I1202 23:00:16.590755 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-f6gwc"] Dec 02 23:00:16 crc kubenswrapper[4696]: I1202 23:00:16.602635 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-f6gwc" Dec 02 23:00:16 crc kubenswrapper[4696]: I1202 23:00:16.608686 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Dec 02 23:00:16 crc kubenswrapper[4696]: I1202 23:00:16.609387 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-f6gwc"] Dec 02 23:00:16 crc kubenswrapper[4696]: I1202 23:00:16.644306 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h585f\" (UniqueName: \"kubernetes.io/projected/989c3427-3369-49e2-b8a9-d9706298c06a-kube-api-access-h585f\") pod \"dnsmasq-dns-6bc7876d45-8gns4\" (UID: \"989c3427-3369-49e2-b8a9-d9706298c06a\") " pod="openstack/dnsmasq-dns-6bc7876d45-8gns4" Dec 02 23:00:16 crc kubenswrapper[4696]: I1202 23:00:16.644593 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/5f108f43-c0d4-4026-9f97-3a2fc3698626-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-f6gwc\" (UID: \"5f108f43-c0d4-4026-9f97-3a2fc3698626\") " pod="openstack/ovn-controller-metrics-f6gwc" Dec 02 23:00:16 crc kubenswrapper[4696]: I1202 23:00:16.644674 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/5f108f43-c0d4-4026-9f97-3a2fc3698626-ovs-rundir\") pod \"ovn-controller-metrics-f6gwc\" (UID: \"5f108f43-c0d4-4026-9f97-3a2fc3698626\") " pod="openstack/ovn-controller-metrics-f6gwc" Dec 02 23:00:16 crc kubenswrapper[4696]: I1202 23:00:16.644883 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f108f43-c0d4-4026-9f97-3a2fc3698626-combined-ca-bundle\") pod \"ovn-controller-metrics-f6gwc\" (UID: \"5f108f43-c0d4-4026-9f97-3a2fc3698626\") " pod="openstack/ovn-controller-metrics-f6gwc" Dec 02 23:00:16 crc kubenswrapper[4696]: I1202 23:00:16.645297 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/989c3427-3369-49e2-b8a9-d9706298c06a-config\") pod \"dnsmasq-dns-6bc7876d45-8gns4\" (UID: \"989c3427-3369-49e2-b8a9-d9706298c06a\") " pod="openstack/dnsmasq-dns-6bc7876d45-8gns4" Dec 02 23:00:16 crc kubenswrapper[4696]: I1202 23:00:16.645349 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/989c3427-3369-49e2-b8a9-d9706298c06a-dns-svc\") pod \"dnsmasq-dns-6bc7876d45-8gns4\" (UID: \"989c3427-3369-49e2-b8a9-d9706298c06a\") " pod="openstack/dnsmasq-dns-6bc7876d45-8gns4" Dec 02 23:00:16 crc kubenswrapper[4696]: I1202 23:00:16.645393 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/5f108f43-c0d4-4026-9f97-3a2fc3698626-config\") pod \"ovn-controller-metrics-f6gwc\" (UID: \"5f108f43-c0d4-4026-9f97-3a2fc3698626\") " pod="openstack/ovn-controller-metrics-f6gwc" Dec 02 23:00:16 crc kubenswrapper[4696]: I1202 23:00:16.645504 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/5f108f43-c0d4-4026-9f97-3a2fc3698626-ovn-rundir\") pod \"ovn-controller-metrics-f6gwc\" (UID: \"5f108f43-c0d4-4026-9f97-3a2fc3698626\") " pod="openstack/ovn-controller-metrics-f6gwc" Dec 02 23:00:16 crc kubenswrapper[4696]: I1202 23:00:16.645535 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/989c3427-3369-49e2-b8a9-d9706298c06a-ovsdbserver-sb\") pod \"dnsmasq-dns-6bc7876d45-8gns4\" (UID: \"989c3427-3369-49e2-b8a9-d9706298c06a\") " pod="openstack/dnsmasq-dns-6bc7876d45-8gns4" Dec 02 23:00:16 crc kubenswrapper[4696]: I1202 23:00:16.645618 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zg27l\" (UniqueName: \"kubernetes.io/projected/5f108f43-c0d4-4026-9f97-3a2fc3698626-kube-api-access-zg27l\") pod \"ovn-controller-metrics-f6gwc\" (UID: \"5f108f43-c0d4-4026-9f97-3a2fc3698626\") " pod="openstack/ovn-controller-metrics-f6gwc" Dec 02 23:00:16 crc kubenswrapper[4696]: I1202 23:00:16.646622 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/989c3427-3369-49e2-b8a9-d9706298c06a-config\") pod \"dnsmasq-dns-6bc7876d45-8gns4\" (UID: \"989c3427-3369-49e2-b8a9-d9706298c06a\") " pod="openstack/dnsmasq-dns-6bc7876d45-8gns4" Dec 02 23:00:16 crc kubenswrapper[4696]: I1202 23:00:16.647221 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/989c3427-3369-49e2-b8a9-d9706298c06a-dns-svc\") pod \"dnsmasq-dns-6bc7876d45-8gns4\" (UID: \"989c3427-3369-49e2-b8a9-d9706298c06a\") " pod="openstack/dnsmasq-dns-6bc7876d45-8gns4" Dec 02 23:00:16 crc kubenswrapper[4696]: I1202 23:00:16.648070 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/989c3427-3369-49e2-b8a9-d9706298c06a-ovsdbserver-sb\") pod \"dnsmasq-dns-6bc7876d45-8gns4\" (UID: \"989c3427-3369-49e2-b8a9-d9706298c06a\") " pod="openstack/dnsmasq-dns-6bc7876d45-8gns4" Dec 02 23:00:16 crc kubenswrapper[4696]: I1202 23:00:16.692397 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Dec 02 23:00:16 crc kubenswrapper[4696]: I1202 23:00:16.694380 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Dec 02 23:00:16 crc kubenswrapper[4696]: I1202 23:00:16.706159 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-svg82" Dec 02 23:00:16 crc kubenswrapper[4696]: I1202 23:00:16.707505 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Dec 02 23:00:16 crc kubenswrapper[4696]: I1202 23:00:16.707582 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Dec 02 23:00:16 crc kubenswrapper[4696]: I1202 23:00:16.707719 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Dec 02 23:00:16 crc kubenswrapper[4696]: I1202 23:00:16.708704 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Dec 02 23:00:16 crc kubenswrapper[4696]: I1202 23:00:16.710409 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Dec 02 23:00:16 crc kubenswrapper[4696]: I1202 23:00:16.726705 4696 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-vrq4f"] Dec 02 23:00:16 crc kubenswrapper[4696]: I1202 23:00:16.727257 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-666b6646f7-vrq4f" podUID="058142c0-4700-4160-857b-0b016c768a72" containerName="dnsmasq-dns" containerID="cri-o://256064e930c01772e5e7da068d7eb524a2364890ef4b92f327ada11472f710f2" gracePeriod=10 Dec 02 23:00:16 crc kubenswrapper[4696]: I1202 23:00:16.728326 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-666b6646f7-vrq4f" Dec 02 23:00:16 crc kubenswrapper[4696]: I1202 23:00:16.747503 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f108f43-c0d4-4026-9f97-3a2fc3698626-config\") pod \"ovn-controller-metrics-f6gwc\" (UID: \"5f108f43-c0d4-4026-9f97-3a2fc3698626\") " pod="openstack/ovn-controller-metrics-f6gwc" Dec 02 23:00:16 crc kubenswrapper[4696]: I1202 23:00:16.747599 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/5f108f43-c0d4-4026-9f97-3a2fc3698626-ovn-rundir\") pod \"ovn-controller-metrics-f6gwc\" (UID: \"5f108f43-c0d4-4026-9f97-3a2fc3698626\") " pod="openstack/ovn-controller-metrics-f6gwc" Dec 02 23:00:16 crc kubenswrapper[4696]: I1202 23:00:16.747650 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zg27l\" (UniqueName: \"kubernetes.io/projected/5f108f43-c0d4-4026-9f97-3a2fc3698626-kube-api-access-zg27l\") pod \"ovn-controller-metrics-f6gwc\" (UID: \"5f108f43-c0d4-4026-9f97-3a2fc3698626\") " pod="openstack/ovn-controller-metrics-f6gwc" Dec 02 23:00:16 crc kubenswrapper[4696]: I1202 23:00:16.747774 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/5f108f43-c0d4-4026-9f97-3a2fc3698626-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-f6gwc\" (UID: \"5f108f43-c0d4-4026-9f97-3a2fc3698626\") " pod="openstack/ovn-controller-metrics-f6gwc" Dec 02 23:00:16 crc kubenswrapper[4696]: I1202 23:00:16.747814 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/5f108f43-c0d4-4026-9f97-3a2fc3698626-ovs-rundir\") pod \"ovn-controller-metrics-f6gwc\" (UID: \"5f108f43-c0d4-4026-9f97-3a2fc3698626\") " pod="openstack/ovn-controller-metrics-f6gwc" Dec 02 23:00:16 crc kubenswrapper[4696]: I1202 23:00:16.747846 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f108f43-c0d4-4026-9f97-3a2fc3698626-combined-ca-bundle\") pod \"ovn-controller-metrics-f6gwc\" (UID: \"5f108f43-c0d4-4026-9f97-3a2fc3698626\") " pod="openstack/ovn-controller-metrics-f6gwc" Dec 02 23:00:16 crc kubenswrapper[4696]: I1202 23:00:16.757195 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f108f43-c0d4-4026-9f97-3a2fc3698626-combined-ca-bundle\") pod \"ovn-controller-metrics-f6gwc\" (UID: \"5f108f43-c0d4-4026-9f97-3a2fc3698626\") " pod="openstack/ovn-controller-metrics-f6gwc" Dec 02 23:00:16 crc kubenswrapper[4696]: I1202 23:00:16.757641 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 02 23:00:16 crc kubenswrapper[4696]: I1202 23:00:16.758252 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h585f\" (UniqueName: \"kubernetes.io/projected/989c3427-3369-49e2-b8a9-d9706298c06a-kube-api-access-h585f\") pod \"dnsmasq-dns-6bc7876d45-8gns4\" (UID: \"989c3427-3369-49e2-b8a9-d9706298c06a\") " pod="openstack/dnsmasq-dns-6bc7876d45-8gns4" Dec 02 23:00:16 crc kubenswrapper[4696]: I1202 23:00:16.758430 4696 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/5f108f43-c0d4-4026-9f97-3a2fc3698626-ovn-rundir\") pod \"ovn-controller-metrics-f6gwc\" (UID: \"5f108f43-c0d4-4026-9f97-3a2fc3698626\") " pod="openstack/ovn-controller-metrics-f6gwc" Dec 02 23:00:16 crc kubenswrapper[4696]: I1202 23:00:16.758535 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/5f108f43-c0d4-4026-9f97-3a2fc3698626-ovs-rundir\") pod \"ovn-controller-metrics-f6gwc\" (UID: \"5f108f43-c0d4-4026-9f97-3a2fc3698626\") " pod="openstack/ovn-controller-metrics-f6gwc" Dec 02 23:00:16 crc kubenswrapper[4696]: I1202 23:00:16.760474 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f108f43-c0d4-4026-9f97-3a2fc3698626-config\") pod \"ovn-controller-metrics-f6gwc\" (UID: \"5f108f43-c0d4-4026-9f97-3a2fc3698626\") " pod="openstack/ovn-controller-metrics-f6gwc" Dec 02 23:00:16 crc kubenswrapper[4696]: I1202 23:00:16.782672 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f108f43-c0d4-4026-9f97-3a2fc3698626-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-f6gwc\" (UID: \"5f108f43-c0d4-4026-9f97-3a2fc3698626\") " pod="openstack/ovn-controller-metrics-f6gwc" Dec 02 23:00:16 crc kubenswrapper[4696]: I1202 23:00:16.783391 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8554648995-742ff"] Dec 02 23:00:16 crc kubenswrapper[4696]: I1202 23:00:16.784891 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-742ff" Dec 02 23:00:16 crc kubenswrapper[4696]: I1202 23:00:16.790022 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Dec 02 23:00:16 crc kubenswrapper[4696]: I1202 23:00:16.790186 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zg27l\" (UniqueName: \"kubernetes.io/projected/5f108f43-c0d4-4026-9f97-3a2fc3698626-kube-api-access-zg27l\") pod \"ovn-controller-metrics-f6gwc\" (UID: \"5f108f43-c0d4-4026-9f97-3a2fc3698626\") " pod="openstack/ovn-controller-metrics-f6gwc" Dec 02 23:00:16 crc kubenswrapper[4696]: I1202 23:00:16.795074 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-742ff"] Dec 02 23:00:16 crc kubenswrapper[4696]: I1202 23:00:16.813415 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-8gns4" Dec 02 23:00:16 crc kubenswrapper[4696]: I1202 23:00:16.849793 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87dfe190-5b7f-48c2-bfa0-97ca227eabb2-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"87dfe190-5b7f-48c2-bfa0-97ca227eabb2\") " pod="openstack/ovn-northd-0" Dec 02 23:00:16 crc kubenswrapper[4696]: I1202 23:00:16.849880 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/87dfe190-5b7f-48c2-bfa0-97ca227eabb2-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"87dfe190-5b7f-48c2-bfa0-97ca227eabb2\") " pod="openstack/ovn-northd-0" Dec 02 23:00:16 crc kubenswrapper[4696]: I1202 23:00:16.849910 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87dfe190-5b7f-48c2-bfa0-97ca227eabb2-config\") pod 
\"ovn-northd-0\" (UID: \"87dfe190-5b7f-48c2-bfa0-97ca227eabb2\") " pod="openstack/ovn-northd-0" Dec 02 23:00:16 crc kubenswrapper[4696]: I1202 23:00:16.849966 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/87dfe190-5b7f-48c2-bfa0-97ca227eabb2-scripts\") pod \"ovn-northd-0\" (UID: \"87dfe190-5b7f-48c2-bfa0-97ca227eabb2\") " pod="openstack/ovn-northd-0" Dec 02 23:00:16 crc kubenswrapper[4696]: I1202 23:00:16.849996 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dl2g\" (UniqueName: \"kubernetes.io/projected/87dfe190-5b7f-48c2-bfa0-97ca227eabb2-kube-api-access-5dl2g\") pod \"ovn-northd-0\" (UID: \"87dfe190-5b7f-48c2-bfa0-97ca227eabb2\") " pod="openstack/ovn-northd-0" Dec 02 23:00:16 crc kubenswrapper[4696]: I1202 23:00:16.850019 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/87dfe190-5b7f-48c2-bfa0-97ca227eabb2-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"87dfe190-5b7f-48c2-bfa0-97ca227eabb2\") " pod="openstack/ovn-northd-0" Dec 02 23:00:16 crc kubenswrapper[4696]: I1202 23:00:16.850043 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/87dfe190-5b7f-48c2-bfa0-97ca227eabb2-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"87dfe190-5b7f-48c2-bfa0-97ca227eabb2\") " pod="openstack/ovn-northd-0" Dec 02 23:00:16 crc kubenswrapper[4696]: I1202 23:00:16.932046 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-f6gwc" Dec 02 23:00:16 crc kubenswrapper[4696]: I1202 23:00:16.951593 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/da62a579-5b70-47cb-8666-2b6a785a2052-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-742ff\" (UID: \"da62a579-5b70-47cb-8666-2b6a785a2052\") " pod="openstack/dnsmasq-dns-8554648995-742ff" Dec 02 23:00:16 crc kubenswrapper[4696]: I1202 23:00:16.952061 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87dfe190-5b7f-48c2-bfa0-97ca227eabb2-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"87dfe190-5b7f-48c2-bfa0-97ca227eabb2\") " pod="openstack/ovn-northd-0" Dec 02 23:00:16 crc kubenswrapper[4696]: I1202 23:00:16.952088 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/da62a579-5b70-47cb-8666-2b6a785a2052-dns-svc\") pod \"dnsmasq-dns-8554648995-742ff\" (UID: \"da62a579-5b70-47cb-8666-2b6a785a2052\") " pod="openstack/dnsmasq-dns-8554648995-742ff" Dec 02 23:00:16 crc kubenswrapper[4696]: I1202 23:00:16.952110 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/da62a579-5b70-47cb-8666-2b6a785a2052-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-742ff\" (UID: \"da62a579-5b70-47cb-8666-2b6a785a2052\") " pod="openstack/dnsmasq-dns-8554648995-742ff" Dec 02 23:00:16 crc kubenswrapper[4696]: I1202 23:00:16.952145 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/87dfe190-5b7f-48c2-bfa0-97ca227eabb2-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"87dfe190-5b7f-48c2-bfa0-97ca227eabb2\") " pod="openstack/ovn-northd-0" Dec 02 
23:00:16 crc kubenswrapper[4696]: I1202 23:00:16.952168 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87dfe190-5b7f-48c2-bfa0-97ca227eabb2-config\") pod \"ovn-northd-0\" (UID: \"87dfe190-5b7f-48c2-bfa0-97ca227eabb2\") " pod="openstack/ovn-northd-0" Dec 02 23:00:16 crc kubenswrapper[4696]: I1202 23:00:16.952198 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da62a579-5b70-47cb-8666-2b6a785a2052-config\") pod \"dnsmasq-dns-8554648995-742ff\" (UID: \"da62a579-5b70-47cb-8666-2b6a785a2052\") " pod="openstack/dnsmasq-dns-8554648995-742ff" Dec 02 23:00:16 crc kubenswrapper[4696]: I1202 23:00:16.952226 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgtn9\" (UniqueName: \"kubernetes.io/projected/da62a579-5b70-47cb-8666-2b6a785a2052-kube-api-access-dgtn9\") pod \"dnsmasq-dns-8554648995-742ff\" (UID: \"da62a579-5b70-47cb-8666-2b6a785a2052\") " pod="openstack/dnsmasq-dns-8554648995-742ff" Dec 02 23:00:16 crc kubenswrapper[4696]: I1202 23:00:16.952250 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/87dfe190-5b7f-48c2-bfa0-97ca227eabb2-scripts\") pod \"ovn-northd-0\" (UID: \"87dfe190-5b7f-48c2-bfa0-97ca227eabb2\") " pod="openstack/ovn-northd-0" Dec 02 23:00:16 crc kubenswrapper[4696]: I1202 23:00:16.952275 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dl2g\" (UniqueName: \"kubernetes.io/projected/87dfe190-5b7f-48c2-bfa0-97ca227eabb2-kube-api-access-5dl2g\") pod \"ovn-northd-0\" (UID: \"87dfe190-5b7f-48c2-bfa0-97ca227eabb2\") " pod="openstack/ovn-northd-0" Dec 02 23:00:16 crc kubenswrapper[4696]: I1202 23:00:16.952298 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/87dfe190-5b7f-48c2-bfa0-97ca227eabb2-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"87dfe190-5b7f-48c2-bfa0-97ca227eabb2\") " pod="openstack/ovn-northd-0" Dec 02 23:00:16 crc kubenswrapper[4696]: I1202 23:00:16.952317 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/87dfe190-5b7f-48c2-bfa0-97ca227eabb2-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"87dfe190-5b7f-48c2-bfa0-97ca227eabb2\") " pod="openstack/ovn-northd-0" Dec 02 23:00:16 crc kubenswrapper[4696]: I1202 23:00:16.953927 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87dfe190-5b7f-48c2-bfa0-97ca227eabb2-config\") pod \"ovn-northd-0\" (UID: \"87dfe190-5b7f-48c2-bfa0-97ca227eabb2\") " pod="openstack/ovn-northd-0" Dec 02 23:00:16 crc kubenswrapper[4696]: I1202 23:00:16.955565 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/87dfe190-5b7f-48c2-bfa0-97ca227eabb2-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"87dfe190-5b7f-48c2-bfa0-97ca227eabb2\") " pod="openstack/ovn-northd-0" Dec 02 23:00:16 crc kubenswrapper[4696]: I1202 23:00:16.958331 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/87dfe190-5b7f-48c2-bfa0-97ca227eabb2-scripts\") pod \"ovn-northd-0\" (UID: \"87dfe190-5b7f-48c2-bfa0-97ca227eabb2\") " pod="openstack/ovn-northd-0" Dec 02 23:00:16 crc kubenswrapper[4696]: I1202 23:00:16.959457 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87dfe190-5b7f-48c2-bfa0-97ca227eabb2-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"87dfe190-5b7f-48c2-bfa0-97ca227eabb2\") " pod="openstack/ovn-northd-0" Dec 02 23:00:16 crc 
kubenswrapper[4696]: I1202 23:00:16.961712 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/87dfe190-5b7f-48c2-bfa0-97ca227eabb2-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"87dfe190-5b7f-48c2-bfa0-97ca227eabb2\") " pod="openstack/ovn-northd-0" Dec 02 23:00:16 crc kubenswrapper[4696]: I1202 23:00:16.962559 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/87dfe190-5b7f-48c2-bfa0-97ca227eabb2-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"87dfe190-5b7f-48c2-bfa0-97ca227eabb2\") " pod="openstack/ovn-northd-0" Dec 02 23:00:16 crc kubenswrapper[4696]: I1202 23:00:16.982171 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dl2g\" (UniqueName: \"kubernetes.io/projected/87dfe190-5b7f-48c2-bfa0-97ca227eabb2-kube-api-access-5dl2g\") pod \"ovn-northd-0\" (UID: \"87dfe190-5b7f-48c2-bfa0-97ca227eabb2\") " pod="openstack/ovn-northd-0" Dec 02 23:00:17 crc kubenswrapper[4696]: I1202 23:00:17.016092 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Dec 02 23:00:17 crc kubenswrapper[4696]: I1202 23:00:17.055305 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/da62a579-5b70-47cb-8666-2b6a785a2052-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-742ff\" (UID: \"da62a579-5b70-47cb-8666-2b6a785a2052\") " pod="openstack/dnsmasq-dns-8554648995-742ff" Dec 02 23:00:17 crc kubenswrapper[4696]: I1202 23:00:17.055403 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da62a579-5b70-47cb-8666-2b6a785a2052-config\") pod \"dnsmasq-dns-8554648995-742ff\" (UID: \"da62a579-5b70-47cb-8666-2b6a785a2052\") " pod="openstack/dnsmasq-dns-8554648995-742ff" Dec 02 
23:00:17 crc kubenswrapper[4696]: I1202 23:00:17.055431 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dgtn9\" (UniqueName: \"kubernetes.io/projected/da62a579-5b70-47cb-8666-2b6a785a2052-kube-api-access-dgtn9\") pod \"dnsmasq-dns-8554648995-742ff\" (UID: \"da62a579-5b70-47cb-8666-2b6a785a2052\") " pod="openstack/dnsmasq-dns-8554648995-742ff" Dec 02 23:00:17 crc kubenswrapper[4696]: I1202 23:00:17.055533 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/da62a579-5b70-47cb-8666-2b6a785a2052-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-742ff\" (UID: \"da62a579-5b70-47cb-8666-2b6a785a2052\") " pod="openstack/dnsmasq-dns-8554648995-742ff" Dec 02 23:00:17 crc kubenswrapper[4696]: I1202 23:00:17.055592 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/da62a579-5b70-47cb-8666-2b6a785a2052-dns-svc\") pod \"dnsmasq-dns-8554648995-742ff\" (UID: \"da62a579-5b70-47cb-8666-2b6a785a2052\") " pod="openstack/dnsmasq-dns-8554648995-742ff" Dec 02 23:00:17 crc kubenswrapper[4696]: I1202 23:00:17.057963 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/da62a579-5b70-47cb-8666-2b6a785a2052-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-742ff\" (UID: \"da62a579-5b70-47cb-8666-2b6a785a2052\") " pod="openstack/dnsmasq-dns-8554648995-742ff" Dec 02 23:00:17 crc kubenswrapper[4696]: I1202 23:00:17.058773 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da62a579-5b70-47cb-8666-2b6a785a2052-config\") pod \"dnsmasq-dns-8554648995-742ff\" (UID: \"da62a579-5b70-47cb-8666-2b6a785a2052\") " pod="openstack/dnsmasq-dns-8554648995-742ff" Dec 02 23:00:17 crc kubenswrapper[4696]: I1202 23:00:17.059341 4696 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/da62a579-5b70-47cb-8666-2b6a785a2052-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-742ff\" (UID: \"da62a579-5b70-47cb-8666-2b6a785a2052\") " pod="openstack/dnsmasq-dns-8554648995-742ff" Dec 02 23:00:17 crc kubenswrapper[4696]: I1202 23:00:17.061074 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/da62a579-5b70-47cb-8666-2b6a785a2052-dns-svc\") pod \"dnsmasq-dns-8554648995-742ff\" (UID: \"da62a579-5b70-47cb-8666-2b6a785a2052\") " pod="openstack/dnsmasq-dns-8554648995-742ff" Dec 02 23:00:17 crc kubenswrapper[4696]: I1202 23:00:17.105627 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgtn9\" (UniqueName: \"kubernetes.io/projected/da62a579-5b70-47cb-8666-2b6a785a2052-kube-api-access-dgtn9\") pod \"dnsmasq-dns-8554648995-742ff\" (UID: \"da62a579-5b70-47cb-8666-2b6a785a2052\") " pod="openstack/dnsmasq-dns-8554648995-742ff" Dec 02 23:00:17 crc kubenswrapper[4696]: I1202 23:00:17.250284 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Dec 02 23:00:17 crc kubenswrapper[4696]: I1202 23:00:17.262864 4696 generic.go:334] "Generic (PLEG): container finished" podID="058142c0-4700-4160-857b-0b016c768a72" containerID="256064e930c01772e5e7da068d7eb524a2364890ef4b92f327ada11472f710f2" exitCode=0 Dec 02 23:00:17 crc kubenswrapper[4696]: I1202 23:00:17.263143 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-njzlw" podUID="62dbb214-98c9-4d52-884b-45804bcc612c" containerName="dnsmasq-dns" containerID="cri-o://d1e9c45eff515960ade5e8b56fbc94cda0dd01e3ce103da0cc0388952b086a5f" gracePeriod=10 Dec 02 23:00:17 crc kubenswrapper[4696]: I1202 23:00:17.263293 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-vrq4f" event={"ID":"058142c0-4700-4160-857b-0b016c768a72","Type":"ContainerDied","Data":"256064e930c01772e5e7da068d7eb524a2364890ef4b92f327ada11472f710f2"} Dec 02 23:00:17 crc kubenswrapper[4696]: I1202 23:00:17.359626 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-742ff" Dec 02 23:00:17 crc kubenswrapper[4696]: I1202 23:00:17.505572 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-8gns4"] Dec 02 23:00:17 crc kubenswrapper[4696]: I1202 23:00:17.964259 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-f6gwc"] Dec 02 23:00:18 crc kubenswrapper[4696]: I1202 23:00:18.152691 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-vrq4f" Dec 02 23:00:18 crc kubenswrapper[4696]: I1202 23:00:18.197100 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-njzlw" Dec 02 23:00:18 crc kubenswrapper[4696]: I1202 23:00:18.282449 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-njzlw" Dec 02 23:00:18 crc kubenswrapper[4696]: I1202 23:00:18.282294 4696 generic.go:334] "Generic (PLEG): container finished" podID="62dbb214-98c9-4d52-884b-45804bcc612c" containerID="d1e9c45eff515960ade5e8b56fbc94cda0dd01e3ce103da0cc0388952b086a5f" exitCode=0 Dec 02 23:00:18 crc kubenswrapper[4696]: I1202 23:00:18.282576 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-njzlw" event={"ID":"62dbb214-98c9-4d52-884b-45804bcc612c","Type":"ContainerDied","Data":"d1e9c45eff515960ade5e8b56fbc94cda0dd01e3ce103da0cc0388952b086a5f"} Dec 02 23:00:18 crc kubenswrapper[4696]: I1202 23:00:18.284145 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-njzlw" event={"ID":"62dbb214-98c9-4d52-884b-45804bcc612c","Type":"ContainerDied","Data":"0785993c643c6e8b5fba249ca02dc471648602e7cc0718997c84e526fdf79fb6"} Dec 02 23:00:18 crc kubenswrapper[4696]: I1202 23:00:18.284175 4696 scope.go:117] "RemoveContainer" containerID="d1e9c45eff515960ade5e8b56fbc94cda0dd01e3ce103da0cc0388952b086a5f" Dec 02 23:00:18 crc kubenswrapper[4696]: I1202 23:00:18.298923 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-f6gwc" event={"ID":"5f108f43-c0d4-4026-9f97-3a2fc3698626","Type":"ContainerStarted","Data":"4aa1ee8362f6493c55fe318c6e0b3724e702973538b04c5b9c795983f4d31695"} Dec 02 23:00:18 crc kubenswrapper[4696]: I1202 23:00:18.305799 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-8gns4" event={"ID":"989c3427-3369-49e2-b8a9-d9706298c06a","Type":"ContainerStarted","Data":"d710e642773512bb0d7651c5cd16fbc51d564437bdfe7c38ecba46f1bdebac62"} Dec 02 23:00:18 crc kubenswrapper[4696]: I1202 23:00:18.305855 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-8gns4" 
event={"ID":"989c3427-3369-49e2-b8a9-d9706298c06a","Type":"ContainerStarted","Data":"1f44f3c46adb6f83b05146df450595f7fce200d53545b0cab38e7f03fec76209"} Dec 02 23:00:18 crc kubenswrapper[4696]: I1202 23:00:18.316849 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-vrq4f" Dec 02 23:00:18 crc kubenswrapper[4696]: I1202 23:00:18.317022 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-vrq4f" event={"ID":"058142c0-4700-4160-857b-0b016c768a72","Type":"ContainerDied","Data":"9c0c84a318a1782facba5e183682c37b84fc298b3240b08de16f340c6aef4d96"} Dec 02 23:00:18 crc kubenswrapper[4696]: I1202 23:00:18.329331 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hrg2t\" (UniqueName: \"kubernetes.io/projected/62dbb214-98c9-4d52-884b-45804bcc612c-kube-api-access-hrg2t\") pod \"62dbb214-98c9-4d52-884b-45804bcc612c\" (UID: \"62dbb214-98c9-4d52-884b-45804bcc612c\") " Dec 02 23:00:18 crc kubenswrapper[4696]: I1202 23:00:18.329375 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62dbb214-98c9-4d52-884b-45804bcc612c-config\") pod \"62dbb214-98c9-4d52-884b-45804bcc612c\" (UID: \"62dbb214-98c9-4d52-884b-45804bcc612c\") " Dec 02 23:00:18 crc kubenswrapper[4696]: I1202 23:00:18.329528 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/058142c0-4700-4160-857b-0b016c768a72-dns-svc\") pod \"058142c0-4700-4160-857b-0b016c768a72\" (UID: \"058142c0-4700-4160-857b-0b016c768a72\") " Dec 02 23:00:18 crc kubenswrapper[4696]: I1202 23:00:18.329567 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/62dbb214-98c9-4d52-884b-45804bcc612c-dns-svc\") pod \"62dbb214-98c9-4d52-884b-45804bcc612c\" (UID: 
\"62dbb214-98c9-4d52-884b-45804bcc612c\") " Dec 02 23:00:18 crc kubenswrapper[4696]: I1202 23:00:18.329596 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/058142c0-4700-4160-857b-0b016c768a72-config\") pod \"058142c0-4700-4160-857b-0b016c768a72\" (UID: \"058142c0-4700-4160-857b-0b016c768a72\") " Dec 02 23:00:18 crc kubenswrapper[4696]: I1202 23:00:18.329720 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t44s2\" (UniqueName: \"kubernetes.io/projected/058142c0-4700-4160-857b-0b016c768a72-kube-api-access-t44s2\") pod \"058142c0-4700-4160-857b-0b016c768a72\" (UID: \"058142c0-4700-4160-857b-0b016c768a72\") " Dec 02 23:00:18 crc kubenswrapper[4696]: I1202 23:00:18.353792 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62dbb214-98c9-4d52-884b-45804bcc612c-kube-api-access-hrg2t" (OuterVolumeSpecName: "kube-api-access-hrg2t") pod "62dbb214-98c9-4d52-884b-45804bcc612c" (UID: "62dbb214-98c9-4d52-884b-45804bcc612c"). InnerVolumeSpecName "kube-api-access-hrg2t". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:00:18 crc kubenswrapper[4696]: I1202 23:00:18.358464 4696 scope.go:117] "RemoveContainer" containerID="a02154da98abb16e5f162006130a32db64d195cef1fea918b3cea553da0e9764" Dec 02 23:00:18 crc kubenswrapper[4696]: I1202 23:00:18.380863 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/058142c0-4700-4160-857b-0b016c768a72-kube-api-access-t44s2" (OuterVolumeSpecName: "kube-api-access-t44s2") pod "058142c0-4700-4160-857b-0b016c768a72" (UID: "058142c0-4700-4160-857b-0b016c768a72"). InnerVolumeSpecName "kube-api-access-t44s2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:00:18 crc kubenswrapper[4696]: I1202 23:00:18.388399 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-742ff"] Dec 02 23:00:18 crc kubenswrapper[4696]: I1202 23:00:18.415302 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 02 23:00:18 crc kubenswrapper[4696]: I1202 23:00:18.417603 4696 scope.go:117] "RemoveContainer" containerID="d1e9c45eff515960ade5e8b56fbc94cda0dd01e3ce103da0cc0388952b086a5f" Dec 02 23:00:18 crc kubenswrapper[4696]: I1202 23:00:18.418537 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/058142c0-4700-4160-857b-0b016c768a72-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "058142c0-4700-4160-857b-0b016c768a72" (UID: "058142c0-4700-4160-857b-0b016c768a72"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:00:18 crc kubenswrapper[4696]: E1202 23:00:18.431323 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1e9c45eff515960ade5e8b56fbc94cda0dd01e3ce103da0cc0388952b086a5f\": container with ID starting with d1e9c45eff515960ade5e8b56fbc94cda0dd01e3ce103da0cc0388952b086a5f not found: ID does not exist" containerID="d1e9c45eff515960ade5e8b56fbc94cda0dd01e3ce103da0cc0388952b086a5f" Dec 02 23:00:18 crc kubenswrapper[4696]: I1202 23:00:18.431387 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1e9c45eff515960ade5e8b56fbc94cda0dd01e3ce103da0cc0388952b086a5f"} err="failed to get container status \"d1e9c45eff515960ade5e8b56fbc94cda0dd01e3ce103da0cc0388952b086a5f\": rpc error: code = NotFound desc = could not find container \"d1e9c45eff515960ade5e8b56fbc94cda0dd01e3ce103da0cc0388952b086a5f\": container with ID starting with d1e9c45eff515960ade5e8b56fbc94cda0dd01e3ce103da0cc0388952b086a5f not 
found: ID does not exist" Dec 02 23:00:18 crc kubenswrapper[4696]: I1202 23:00:18.431425 4696 scope.go:117] "RemoveContainer" containerID="a02154da98abb16e5f162006130a32db64d195cef1fea918b3cea553da0e9764" Dec 02 23:00:18 crc kubenswrapper[4696]: I1202 23:00:18.432888 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t44s2\" (UniqueName: \"kubernetes.io/projected/058142c0-4700-4160-857b-0b016c768a72-kube-api-access-t44s2\") on node \"crc\" DevicePath \"\"" Dec 02 23:00:18 crc kubenswrapper[4696]: E1202 23:00:18.433625 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a02154da98abb16e5f162006130a32db64d195cef1fea918b3cea553da0e9764\": container with ID starting with a02154da98abb16e5f162006130a32db64d195cef1fea918b3cea553da0e9764 not found: ID does not exist" containerID="a02154da98abb16e5f162006130a32db64d195cef1fea918b3cea553da0e9764" Dec 02 23:00:18 crc kubenswrapper[4696]: I1202 23:00:18.433726 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a02154da98abb16e5f162006130a32db64d195cef1fea918b3cea553da0e9764"} err="failed to get container status \"a02154da98abb16e5f162006130a32db64d195cef1fea918b3cea553da0e9764\": rpc error: code = NotFound desc = could not find container \"a02154da98abb16e5f162006130a32db64d195cef1fea918b3cea553da0e9764\": container with ID starting with a02154da98abb16e5f162006130a32db64d195cef1fea918b3cea553da0e9764 not found: ID does not exist" Dec 02 23:00:18 crc kubenswrapper[4696]: I1202 23:00:18.433832 4696 scope.go:117] "RemoveContainer" containerID="256064e930c01772e5e7da068d7eb524a2364890ef4b92f327ada11472f710f2" Dec 02 23:00:18 crc kubenswrapper[4696]: I1202 23:00:18.434167 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hrg2t\" (UniqueName: \"kubernetes.io/projected/62dbb214-98c9-4d52-884b-45804bcc612c-kube-api-access-hrg2t\") on node \"crc\" DevicePath 
\"\"" Dec 02 23:00:18 crc kubenswrapper[4696]: I1202 23:00:18.434207 4696 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/058142c0-4700-4160-857b-0b016c768a72-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 23:00:18 crc kubenswrapper[4696]: I1202 23:00:18.459557 4696 scope.go:117] "RemoveContainer" containerID="efbc5ec233e6075dce422e9325ab9d86f887d1f103d3111c6f94236da9da8e1a" Dec 02 23:00:18 crc kubenswrapper[4696]: I1202 23:00:18.460772 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62dbb214-98c9-4d52-884b-45804bcc612c-config" (OuterVolumeSpecName: "config") pod "62dbb214-98c9-4d52-884b-45804bcc612c" (UID: "62dbb214-98c9-4d52-884b-45804bcc612c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:00:18 crc kubenswrapper[4696]: I1202 23:00:18.468731 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/058142c0-4700-4160-857b-0b016c768a72-config" (OuterVolumeSpecName: "config") pod "058142c0-4700-4160-857b-0b016c768a72" (UID: "058142c0-4700-4160-857b-0b016c768a72"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:00:18 crc kubenswrapper[4696]: I1202 23:00:18.475559 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62dbb214-98c9-4d52-884b-45804bcc612c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "62dbb214-98c9-4d52-884b-45804bcc612c" (UID: "62dbb214-98c9-4d52-884b-45804bcc612c"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:00:18 crc kubenswrapper[4696]: I1202 23:00:18.536012 4696 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62dbb214-98c9-4d52-884b-45804bcc612c-config\") on node \"crc\" DevicePath \"\"" Dec 02 23:00:18 crc kubenswrapper[4696]: I1202 23:00:18.536173 4696 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/62dbb214-98c9-4d52-884b-45804bcc612c-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 23:00:18 crc kubenswrapper[4696]: I1202 23:00:18.536235 4696 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/058142c0-4700-4160-857b-0b016c768a72-config\") on node \"crc\" DevicePath \"\"" Dec 02 23:00:18 crc kubenswrapper[4696]: I1202 23:00:18.629767 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-njzlw"] Dec 02 23:00:18 crc kubenswrapper[4696]: I1202 23:00:18.638100 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-njzlw"] Dec 02 23:00:18 crc kubenswrapper[4696]: I1202 23:00:18.654924 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-vrq4f"] Dec 02 23:00:18 crc kubenswrapper[4696]: I1202 23:00:18.663582 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-vrq4f"] Dec 02 23:00:19 crc kubenswrapper[4696]: I1202 23:00:19.029482 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 02 23:00:19 crc kubenswrapper[4696]: I1202 23:00:19.165110 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-8gns4"] Dec 02 23:00:19 crc kubenswrapper[4696]: I1202 23:00:19.183330 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-btlkn"] Dec 02 23:00:19 crc 
kubenswrapper[4696]: E1202 23:00:19.183799 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="058142c0-4700-4160-857b-0b016c768a72" containerName="init" Dec 02 23:00:19 crc kubenswrapper[4696]: I1202 23:00:19.183811 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="058142c0-4700-4160-857b-0b016c768a72" containerName="init" Dec 02 23:00:19 crc kubenswrapper[4696]: E1202 23:00:19.183826 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="058142c0-4700-4160-857b-0b016c768a72" containerName="dnsmasq-dns" Dec 02 23:00:19 crc kubenswrapper[4696]: I1202 23:00:19.183832 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="058142c0-4700-4160-857b-0b016c768a72" containerName="dnsmasq-dns" Dec 02 23:00:19 crc kubenswrapper[4696]: E1202 23:00:19.183864 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62dbb214-98c9-4d52-884b-45804bcc612c" containerName="init" Dec 02 23:00:19 crc kubenswrapper[4696]: I1202 23:00:19.183870 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="62dbb214-98c9-4d52-884b-45804bcc612c" containerName="init" Dec 02 23:00:19 crc kubenswrapper[4696]: E1202 23:00:19.183885 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62dbb214-98c9-4d52-884b-45804bcc612c" containerName="dnsmasq-dns" Dec 02 23:00:19 crc kubenswrapper[4696]: I1202 23:00:19.183891 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="62dbb214-98c9-4d52-884b-45804bcc612c" containerName="dnsmasq-dns" Dec 02 23:00:19 crc kubenswrapper[4696]: I1202 23:00:19.184080 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="058142c0-4700-4160-857b-0b016c768a72" containerName="dnsmasq-dns" Dec 02 23:00:19 crc kubenswrapper[4696]: I1202 23:00:19.184096 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="62dbb214-98c9-4d52-884b-45804bcc612c" containerName="dnsmasq-dns" Dec 02 23:00:19 crc kubenswrapper[4696]: I1202 23:00:19.185091 4696 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-btlkn" Dec 02 23:00:19 crc kubenswrapper[4696]: I1202 23:00:19.202904 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-btlkn"] Dec 02 23:00:19 crc kubenswrapper[4696]: I1202 23:00:19.253912 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f2069fd-8da0-44f4-8e80-38f2ed4f5afb-config\") pod \"dnsmasq-dns-b8fbc5445-btlkn\" (UID: \"0f2069fd-8da0-44f4-8e80-38f2ed4f5afb\") " pod="openstack/dnsmasq-dns-b8fbc5445-btlkn" Dec 02 23:00:19 crc kubenswrapper[4696]: I1202 23:00:19.254064 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0f2069fd-8da0-44f4-8e80-38f2ed4f5afb-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-btlkn\" (UID: \"0f2069fd-8da0-44f4-8e80-38f2ed4f5afb\") " pod="openstack/dnsmasq-dns-b8fbc5445-btlkn" Dec 02 23:00:19 crc kubenswrapper[4696]: I1202 23:00:19.254218 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89rcf\" (UniqueName: \"kubernetes.io/projected/0f2069fd-8da0-44f4-8e80-38f2ed4f5afb-kube-api-access-89rcf\") pod \"dnsmasq-dns-b8fbc5445-btlkn\" (UID: \"0f2069fd-8da0-44f4-8e80-38f2ed4f5afb\") " pod="openstack/dnsmasq-dns-b8fbc5445-btlkn" Dec 02 23:00:19 crc kubenswrapper[4696]: I1202 23:00:19.254275 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0f2069fd-8da0-44f4-8e80-38f2ed4f5afb-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-btlkn\" (UID: \"0f2069fd-8da0-44f4-8e80-38f2ed4f5afb\") " pod="openstack/dnsmasq-dns-b8fbc5445-btlkn" Dec 02 23:00:19 crc kubenswrapper[4696]: I1202 23:00:19.254299 4696 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0f2069fd-8da0-44f4-8e80-38f2ed4f5afb-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-btlkn\" (UID: \"0f2069fd-8da0-44f4-8e80-38f2ed4f5afb\") " pod="openstack/dnsmasq-dns-b8fbc5445-btlkn" Dec 02 23:00:19 crc kubenswrapper[4696]: I1202 23:00:19.326107 4696 generic.go:334] "Generic (PLEG): container finished" podID="da62a579-5b70-47cb-8666-2b6a785a2052" containerID="082cb5c7fb9fbddee7810c5efd7e7bc1062252c769e8bd1d203f6f3a714a151b" exitCode=0 Dec 02 23:00:19 crc kubenswrapper[4696]: I1202 23:00:19.326176 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-742ff" event={"ID":"da62a579-5b70-47cb-8666-2b6a785a2052","Type":"ContainerDied","Data":"082cb5c7fb9fbddee7810c5efd7e7bc1062252c769e8bd1d203f6f3a714a151b"} Dec 02 23:00:19 crc kubenswrapper[4696]: I1202 23:00:19.326206 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-742ff" event={"ID":"da62a579-5b70-47cb-8666-2b6a785a2052","Type":"ContainerStarted","Data":"acbb3ea80be119b5bc913a193b4a0b6109ea9649cc7cafd859e6d4d3a4770798"} Dec 02 23:00:19 crc kubenswrapper[4696]: I1202 23:00:19.334725 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"87dfe190-5b7f-48c2-bfa0-97ca227eabb2","Type":"ContainerStarted","Data":"d7ab73f3bb025b2fa48d08effa5ff153a43417ebb5546e67032d3a2bfa52f58e"} Dec 02 23:00:19 crc kubenswrapper[4696]: I1202 23:00:19.357081 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f2069fd-8da0-44f4-8e80-38f2ed4f5afb-config\") pod \"dnsmasq-dns-b8fbc5445-btlkn\" (UID: \"0f2069fd-8da0-44f4-8e80-38f2ed4f5afb\") " pod="openstack/dnsmasq-dns-b8fbc5445-btlkn" Dec 02 23:00:19 crc kubenswrapper[4696]: I1202 23:00:19.357145 4696 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0f2069fd-8da0-44f4-8e80-38f2ed4f5afb-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-btlkn\" (UID: \"0f2069fd-8da0-44f4-8e80-38f2ed4f5afb\") " pod="openstack/dnsmasq-dns-b8fbc5445-btlkn" Dec 02 23:00:19 crc kubenswrapper[4696]: I1202 23:00:19.357210 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-89rcf\" (UniqueName: \"kubernetes.io/projected/0f2069fd-8da0-44f4-8e80-38f2ed4f5afb-kube-api-access-89rcf\") pod \"dnsmasq-dns-b8fbc5445-btlkn\" (UID: \"0f2069fd-8da0-44f4-8e80-38f2ed4f5afb\") " pod="openstack/dnsmasq-dns-b8fbc5445-btlkn" Dec 02 23:00:19 crc kubenswrapper[4696]: I1202 23:00:19.357239 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0f2069fd-8da0-44f4-8e80-38f2ed4f5afb-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-btlkn\" (UID: \"0f2069fd-8da0-44f4-8e80-38f2ed4f5afb\") " pod="openstack/dnsmasq-dns-b8fbc5445-btlkn" Dec 02 23:00:19 crc kubenswrapper[4696]: I1202 23:00:19.357260 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0f2069fd-8da0-44f4-8e80-38f2ed4f5afb-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-btlkn\" (UID: \"0f2069fd-8da0-44f4-8e80-38f2ed4f5afb\") " pod="openstack/dnsmasq-dns-b8fbc5445-btlkn" Dec 02 23:00:19 crc kubenswrapper[4696]: I1202 23:00:19.357282 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-f6gwc" event={"ID":"5f108f43-c0d4-4026-9f97-3a2fc3698626","Type":"ContainerStarted","Data":"1f33eabc4ec77efaadd12b8fd11c20012f57f641a7a93a41aa38821ff77280b7"} Dec 02 23:00:19 crc kubenswrapper[4696]: I1202 23:00:19.358212 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0f2069fd-8da0-44f4-8e80-38f2ed4f5afb-ovsdbserver-nb\") pod 
\"dnsmasq-dns-b8fbc5445-btlkn\" (UID: \"0f2069fd-8da0-44f4-8e80-38f2ed4f5afb\") " pod="openstack/dnsmasq-dns-b8fbc5445-btlkn" Dec 02 23:00:19 crc kubenswrapper[4696]: I1202 23:00:19.362497 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0f2069fd-8da0-44f4-8e80-38f2ed4f5afb-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-btlkn\" (UID: \"0f2069fd-8da0-44f4-8e80-38f2ed4f5afb\") " pod="openstack/dnsmasq-dns-b8fbc5445-btlkn" Dec 02 23:00:19 crc kubenswrapper[4696]: I1202 23:00:19.363268 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f2069fd-8da0-44f4-8e80-38f2ed4f5afb-config\") pod \"dnsmasq-dns-b8fbc5445-btlkn\" (UID: \"0f2069fd-8da0-44f4-8e80-38f2ed4f5afb\") " pod="openstack/dnsmasq-dns-b8fbc5445-btlkn" Dec 02 23:00:19 crc kubenswrapper[4696]: I1202 23:00:19.365578 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0f2069fd-8da0-44f4-8e80-38f2ed4f5afb-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-btlkn\" (UID: \"0f2069fd-8da0-44f4-8e80-38f2ed4f5afb\") " pod="openstack/dnsmasq-dns-b8fbc5445-btlkn" Dec 02 23:00:19 crc kubenswrapper[4696]: I1202 23:00:19.397035 4696 generic.go:334] "Generic (PLEG): container finished" podID="989c3427-3369-49e2-b8a9-d9706298c06a" containerID="d710e642773512bb0d7651c5cd16fbc51d564437bdfe7c38ecba46f1bdebac62" exitCode=0 Dec 02 23:00:19 crc kubenswrapper[4696]: I1202 23:00:19.397104 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-8gns4" event={"ID":"989c3427-3369-49e2-b8a9-d9706298c06a","Type":"ContainerDied","Data":"d710e642773512bb0d7651c5cd16fbc51d564437bdfe7c38ecba46f1bdebac62"} Dec 02 23:00:19 crc kubenswrapper[4696]: I1202 23:00:19.397140 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-8gns4" 
event={"ID":"989c3427-3369-49e2-b8a9-d9706298c06a","Type":"ContainerStarted","Data":"bd818a20495a5ecb2da95a17f482fc4f6e397906eed52b84edee25606dd0b329"} Dec 02 23:00:19 crc kubenswrapper[4696]: I1202 23:00:19.398059 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6bc7876d45-8gns4" Dec 02 23:00:19 crc kubenswrapper[4696]: I1202 23:00:19.418208 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-89rcf\" (UniqueName: \"kubernetes.io/projected/0f2069fd-8da0-44f4-8e80-38f2ed4f5afb-kube-api-access-89rcf\") pod \"dnsmasq-dns-b8fbc5445-btlkn\" (UID: \"0f2069fd-8da0-44f4-8e80-38f2ed4f5afb\") " pod="openstack/dnsmasq-dns-b8fbc5445-btlkn" Dec 02 23:00:19 crc kubenswrapper[4696]: I1202 23:00:19.474180 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-f6gwc" podStartSLOduration=3.474155735 podStartE2EDuration="3.474155735s" podCreationTimestamp="2025-12-02 23:00:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:00:19.413518312 +0000 UTC m=+1082.294198313" watchObservedRunningTime="2025-12-02 23:00:19.474155735 +0000 UTC m=+1082.354835736" Dec 02 23:00:19 crc kubenswrapper[4696]: I1202 23:00:19.501821 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="058142c0-4700-4160-857b-0b016c768a72" path="/var/lib/kubelet/pods/058142c0-4700-4160-857b-0b016c768a72/volumes" Dec 02 23:00:19 crc kubenswrapper[4696]: I1202 23:00:19.502554 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62dbb214-98c9-4d52-884b-45804bcc612c" path="/var/lib/kubelet/pods/62dbb214-98c9-4d52-884b-45804bcc612c/volumes" Dec 02 23:00:19 crc kubenswrapper[4696]: I1202 23:00:19.535198 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-btlkn" Dec 02 23:00:19 crc kubenswrapper[4696]: E1202 23:00:19.716903 4696 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Dec 02 23:00:19 crc kubenswrapper[4696]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/da62a579-5b70-47cb-8666-2b6a785a2052/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Dec 02 23:00:19 crc kubenswrapper[4696]: > podSandboxID="acbb3ea80be119b5bc913a193b4a0b6109ea9649cc7cafd859e6d4d3a4770798" Dec 02 23:00:19 crc kubenswrapper[4696]: E1202 23:00:19.717433 4696 kuberuntime_manager.go:1274] "Unhandled Error" err=< Dec 02 23:00:19 crc kubenswrapper[4696]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv 
--log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n654h99h64ch5dbh6dh555h587h64bh5cfh647h5fdh57ch679h9h597h5f5hbch59bh54fh575h566h667h586h5f5h65ch5bch57h68h65ch58bh694h5cfq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-nb,SubPath:ovsdbserver-nb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-sb,SubPath:ovsdbserver-sb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dgtn9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 
},Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-8554648995-742ff_openstack(da62a579-5b70-47cb-8666-2b6a785a2052): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/da62a579-5b70-47cb-8666-2b6a785a2052/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Dec 02 23:00:19 crc kubenswrapper[4696]: > logger="UnhandledError" Dec 02 23:00:19 crc kubenswrapper[4696]: E1202 23:00:19.722850 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/da62a579-5b70-47cb-8666-2b6a785a2052/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-8554648995-742ff" podUID="da62a579-5b70-47cb-8666-2b6a785a2052" Dec 02 23:00:19 crc kubenswrapper[4696]: I1202 23:00:19.770612 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Dec 02 23:00:19 crc kubenswrapper[4696]: I1202 23:00:19.803933 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/dnsmasq-dns-6bc7876d45-8gns4" podStartSLOduration=3.803910368 podStartE2EDuration="3.803910368s" podCreationTimestamp="2025-12-02 23:00:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:00:19.482965544 +0000 UTC m=+1082.363645545" watchObservedRunningTime="2025-12-02 23:00:19.803910368 +0000 UTC m=+1082.684590369" Dec 02 23:00:19 crc kubenswrapper[4696]: I1202 23:00:19.947472 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Dec 02 23:00:20 crc kubenswrapper[4696]: I1202 23:00:20.087722 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Dec 02 23:00:20 crc kubenswrapper[4696]: I1202 23:00:20.094391 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Dec 02 23:00:20 crc kubenswrapper[4696]: I1202 23:00:20.099485 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Dec 02 23:00:20 crc kubenswrapper[4696]: I1202 23:00:20.099600 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-qvplc" Dec 02 23:00:20 crc kubenswrapper[4696]: I1202 23:00:20.099829 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Dec 02 23:00:20 crc kubenswrapper[4696]: I1202 23:00:20.100068 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Dec 02 23:00:20 crc kubenswrapper[4696]: I1202 23:00:20.123541 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Dec 02 23:00:20 crc kubenswrapper[4696]: I1202 23:00:20.174655 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-kq468"] Dec 02 23:00:20 crc kubenswrapper[4696]: I1202 23:00:20.177717 4696 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-kq468" Dec 02 23:00:20 crc kubenswrapper[4696]: I1202 23:00:20.178919 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcv9t\" (UniqueName: \"kubernetes.io/projected/d5594f21-8f1d-4105-ad47-c065a9fc468b-kube-api-access-qcv9t\") pod \"swift-storage-0\" (UID: \"d5594f21-8f1d-4105-ad47-c065a9fc468b\") " pod="openstack/swift-storage-0" Dec 02 23:00:20 crc kubenswrapper[4696]: I1202 23:00:20.179001 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/d5594f21-8f1d-4105-ad47-c065a9fc468b-lock\") pod \"swift-storage-0\" (UID: \"d5594f21-8f1d-4105-ad47-c065a9fc468b\") " pod="openstack/swift-storage-0" Dec 02 23:00:20 crc kubenswrapper[4696]: I1202 23:00:20.179097 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d5594f21-8f1d-4105-ad47-c065a9fc468b-etc-swift\") pod \"swift-storage-0\" (UID: \"d5594f21-8f1d-4105-ad47-c065a9fc468b\") " pod="openstack/swift-storage-0" Dec 02 23:00:20 crc kubenswrapper[4696]: I1202 23:00:20.179126 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/d5594f21-8f1d-4105-ad47-c065a9fc468b-cache\") pod \"swift-storage-0\" (UID: \"d5594f21-8f1d-4105-ad47-c065a9fc468b\") " pod="openstack/swift-storage-0" Dec 02 23:00:20 crc kubenswrapper[4696]: I1202 23:00:20.179147 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"swift-storage-0\" (UID: \"d5594f21-8f1d-4105-ad47-c065a9fc468b\") " pod="openstack/swift-storage-0" Dec 02 23:00:20 crc kubenswrapper[4696]: I1202 
23:00:20.181545 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Dec 02 23:00:20 crc kubenswrapper[4696]: I1202 23:00:20.181853 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Dec 02 23:00:20 crc kubenswrapper[4696]: I1202 23:00:20.182103 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Dec 02 23:00:20 crc kubenswrapper[4696]: I1202 23:00:20.198349 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-kq468"] Dec 02 23:00:20 crc kubenswrapper[4696]: I1202 23:00:20.215519 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-btlkn"] Dec 02 23:00:20 crc kubenswrapper[4696]: I1202 23:00:20.281286 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/955c99b3-ad42-4e65-a391-47eda1c4130a-swiftconf\") pod \"swift-ring-rebalance-kq468\" (UID: \"955c99b3-ad42-4e65-a391-47eda1c4130a\") " pod="openstack/swift-ring-rebalance-kq468" Dec 02 23:00:20 crc kubenswrapper[4696]: I1202 23:00:20.281372 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znpgw\" (UniqueName: \"kubernetes.io/projected/955c99b3-ad42-4e65-a391-47eda1c4130a-kube-api-access-znpgw\") pod \"swift-ring-rebalance-kq468\" (UID: \"955c99b3-ad42-4e65-a391-47eda1c4130a\") " pod="openstack/swift-ring-rebalance-kq468" Dec 02 23:00:20 crc kubenswrapper[4696]: I1202 23:00:20.281435 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/955c99b3-ad42-4e65-a391-47eda1c4130a-scripts\") pod \"swift-ring-rebalance-kq468\" (UID: \"955c99b3-ad42-4e65-a391-47eda1c4130a\") " pod="openstack/swift-ring-rebalance-kq468" Dec 02 23:00:20 
crc kubenswrapper[4696]: I1202 23:00:20.281457 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/955c99b3-ad42-4e65-a391-47eda1c4130a-dispersionconf\") pod \"swift-ring-rebalance-kq468\" (UID: \"955c99b3-ad42-4e65-a391-47eda1c4130a\") " pod="openstack/swift-ring-rebalance-kq468" Dec 02 23:00:20 crc kubenswrapper[4696]: I1202 23:00:20.281486 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d5594f21-8f1d-4105-ad47-c065a9fc468b-etc-swift\") pod \"swift-storage-0\" (UID: \"d5594f21-8f1d-4105-ad47-c065a9fc468b\") " pod="openstack/swift-storage-0" Dec 02 23:00:20 crc kubenswrapper[4696]: I1202 23:00:20.281536 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/d5594f21-8f1d-4105-ad47-c065a9fc468b-cache\") pod \"swift-storage-0\" (UID: \"d5594f21-8f1d-4105-ad47-c065a9fc468b\") " pod="openstack/swift-storage-0" Dec 02 23:00:20 crc kubenswrapper[4696]: I1202 23:00:20.281558 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"swift-storage-0\" (UID: \"d5594f21-8f1d-4105-ad47-c065a9fc468b\") " pod="openstack/swift-storage-0" Dec 02 23:00:20 crc kubenswrapper[4696]: I1202 23:00:20.281615 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qcv9t\" (UniqueName: \"kubernetes.io/projected/d5594f21-8f1d-4105-ad47-c065a9fc468b-kube-api-access-qcv9t\") pod \"swift-storage-0\" (UID: \"d5594f21-8f1d-4105-ad47-c065a9fc468b\") " pod="openstack/swift-storage-0" Dec 02 23:00:20 crc kubenswrapper[4696]: I1202 23:00:20.281639 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: 
\"kubernetes.io/configmap/955c99b3-ad42-4e65-a391-47eda1c4130a-ring-data-devices\") pod \"swift-ring-rebalance-kq468\" (UID: \"955c99b3-ad42-4e65-a391-47eda1c4130a\") " pod="openstack/swift-ring-rebalance-kq468" Dec 02 23:00:20 crc kubenswrapper[4696]: I1202 23:00:20.281662 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/955c99b3-ad42-4e65-a391-47eda1c4130a-combined-ca-bundle\") pod \"swift-ring-rebalance-kq468\" (UID: \"955c99b3-ad42-4e65-a391-47eda1c4130a\") " pod="openstack/swift-ring-rebalance-kq468" Dec 02 23:00:20 crc kubenswrapper[4696]: I1202 23:00:20.281698 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/955c99b3-ad42-4e65-a391-47eda1c4130a-etc-swift\") pod \"swift-ring-rebalance-kq468\" (UID: \"955c99b3-ad42-4e65-a391-47eda1c4130a\") " pod="openstack/swift-ring-rebalance-kq468" Dec 02 23:00:20 crc kubenswrapper[4696]: I1202 23:00:20.281724 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/d5594f21-8f1d-4105-ad47-c065a9fc468b-lock\") pod \"swift-storage-0\" (UID: \"d5594f21-8f1d-4105-ad47-c065a9fc468b\") " pod="openstack/swift-storage-0" Dec 02 23:00:20 crc kubenswrapper[4696]: E1202 23:00:20.282111 4696 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 02 23:00:20 crc kubenswrapper[4696]: E1202 23:00:20.282146 4696 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 02 23:00:20 crc kubenswrapper[4696]: I1202 23:00:20.282164 4696 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod 
\"swift-storage-0\" (UID: \"d5594f21-8f1d-4105-ad47-c065a9fc468b\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/swift-storage-0" Dec 02 23:00:20 crc kubenswrapper[4696]: E1202 23:00:20.282217 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d5594f21-8f1d-4105-ad47-c065a9fc468b-etc-swift podName:d5594f21-8f1d-4105-ad47-c065a9fc468b nodeName:}" failed. No retries permitted until 2025-12-02 23:00:20.782195646 +0000 UTC m=+1083.662875647 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/d5594f21-8f1d-4105-ad47-c065a9fc468b-etc-swift") pod "swift-storage-0" (UID: "d5594f21-8f1d-4105-ad47-c065a9fc468b") : configmap "swift-ring-files" not found Dec 02 23:00:20 crc kubenswrapper[4696]: I1202 23:00:20.282245 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/d5594f21-8f1d-4105-ad47-c065a9fc468b-cache\") pod \"swift-storage-0\" (UID: \"d5594f21-8f1d-4105-ad47-c065a9fc468b\") " pod="openstack/swift-storage-0" Dec 02 23:00:20 crc kubenswrapper[4696]: I1202 23:00:20.283123 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/d5594f21-8f1d-4105-ad47-c065a9fc468b-lock\") pod \"swift-storage-0\" (UID: \"d5594f21-8f1d-4105-ad47-c065a9fc468b\") " pod="openstack/swift-storage-0" Dec 02 23:00:20 crc kubenswrapper[4696]: I1202 23:00:20.305996 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcv9t\" (UniqueName: \"kubernetes.io/projected/d5594f21-8f1d-4105-ad47-c065a9fc468b-kube-api-access-qcv9t\") pod \"swift-storage-0\" (UID: \"d5594f21-8f1d-4105-ad47-c065a9fc468b\") " pod="openstack/swift-storage-0" Dec 02 23:00:20 crc kubenswrapper[4696]: I1202 23:00:20.308808 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage01-crc\") pod \"swift-storage-0\" (UID: \"d5594f21-8f1d-4105-ad47-c065a9fc468b\") " pod="openstack/swift-storage-0" Dec 02 23:00:20 crc kubenswrapper[4696]: I1202 23:00:20.383176 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/955c99b3-ad42-4e65-a391-47eda1c4130a-scripts\") pod \"swift-ring-rebalance-kq468\" (UID: \"955c99b3-ad42-4e65-a391-47eda1c4130a\") " pod="openstack/swift-ring-rebalance-kq468" Dec 02 23:00:20 crc kubenswrapper[4696]: I1202 23:00:20.383223 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/955c99b3-ad42-4e65-a391-47eda1c4130a-dispersionconf\") pod \"swift-ring-rebalance-kq468\" (UID: \"955c99b3-ad42-4e65-a391-47eda1c4130a\") " pod="openstack/swift-ring-rebalance-kq468" Dec 02 23:00:20 crc kubenswrapper[4696]: I1202 23:00:20.383304 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/955c99b3-ad42-4e65-a391-47eda1c4130a-ring-data-devices\") pod \"swift-ring-rebalance-kq468\" (UID: \"955c99b3-ad42-4e65-a391-47eda1c4130a\") " pod="openstack/swift-ring-rebalance-kq468" Dec 02 23:00:20 crc kubenswrapper[4696]: I1202 23:00:20.383323 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/955c99b3-ad42-4e65-a391-47eda1c4130a-combined-ca-bundle\") pod \"swift-ring-rebalance-kq468\" (UID: \"955c99b3-ad42-4e65-a391-47eda1c4130a\") " pod="openstack/swift-ring-rebalance-kq468" Dec 02 23:00:20 crc kubenswrapper[4696]: I1202 23:00:20.383351 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/955c99b3-ad42-4e65-a391-47eda1c4130a-etc-swift\") pod \"swift-ring-rebalance-kq468\" (UID: 
\"955c99b3-ad42-4e65-a391-47eda1c4130a\") " pod="openstack/swift-ring-rebalance-kq468" Dec 02 23:00:20 crc kubenswrapper[4696]: I1202 23:00:20.383385 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/955c99b3-ad42-4e65-a391-47eda1c4130a-swiftconf\") pod \"swift-ring-rebalance-kq468\" (UID: \"955c99b3-ad42-4e65-a391-47eda1c4130a\") " pod="openstack/swift-ring-rebalance-kq468" Dec 02 23:00:20 crc kubenswrapper[4696]: I1202 23:00:20.383417 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-znpgw\" (UniqueName: \"kubernetes.io/projected/955c99b3-ad42-4e65-a391-47eda1c4130a-kube-api-access-znpgw\") pod \"swift-ring-rebalance-kq468\" (UID: \"955c99b3-ad42-4e65-a391-47eda1c4130a\") " pod="openstack/swift-ring-rebalance-kq468" Dec 02 23:00:20 crc kubenswrapper[4696]: I1202 23:00:20.384324 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/955c99b3-ad42-4e65-a391-47eda1c4130a-etc-swift\") pod \"swift-ring-rebalance-kq468\" (UID: \"955c99b3-ad42-4e65-a391-47eda1c4130a\") " pod="openstack/swift-ring-rebalance-kq468" Dec 02 23:00:20 crc kubenswrapper[4696]: I1202 23:00:20.384709 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/955c99b3-ad42-4e65-a391-47eda1c4130a-ring-data-devices\") pod \"swift-ring-rebalance-kq468\" (UID: \"955c99b3-ad42-4e65-a391-47eda1c4130a\") " pod="openstack/swift-ring-rebalance-kq468" Dec 02 23:00:20 crc kubenswrapper[4696]: I1202 23:00:20.384833 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/955c99b3-ad42-4e65-a391-47eda1c4130a-scripts\") pod \"swift-ring-rebalance-kq468\" (UID: \"955c99b3-ad42-4e65-a391-47eda1c4130a\") " pod="openstack/swift-ring-rebalance-kq468" Dec 02 23:00:20 crc 
kubenswrapper[4696]: I1202 23:00:20.388095 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/955c99b3-ad42-4e65-a391-47eda1c4130a-dispersionconf\") pod \"swift-ring-rebalance-kq468\" (UID: \"955c99b3-ad42-4e65-a391-47eda1c4130a\") " pod="openstack/swift-ring-rebalance-kq468" Dec 02 23:00:20 crc kubenswrapper[4696]: I1202 23:00:20.389128 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/955c99b3-ad42-4e65-a391-47eda1c4130a-combined-ca-bundle\") pod \"swift-ring-rebalance-kq468\" (UID: \"955c99b3-ad42-4e65-a391-47eda1c4130a\") " pod="openstack/swift-ring-rebalance-kq468" Dec 02 23:00:20 crc kubenswrapper[4696]: I1202 23:00:20.390300 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/955c99b3-ad42-4e65-a391-47eda1c4130a-swiftconf\") pod \"swift-ring-rebalance-kq468\" (UID: \"955c99b3-ad42-4e65-a391-47eda1c4130a\") " pod="openstack/swift-ring-rebalance-kq468" Dec 02 23:00:20 crc kubenswrapper[4696]: W1202 23:00:20.395149 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0f2069fd_8da0_44f4_8e80_38f2ed4f5afb.slice/crio-9cc015df0fbd428e2c75bee1066f29908ef74641fdf8b33fc3a469a50346e175 WatchSource:0}: Error finding container 9cc015df0fbd428e2c75bee1066f29908ef74641fdf8b33fc3a469a50346e175: Status 404 returned error can't find the container with id 9cc015df0fbd428e2c75bee1066f29908ef74641fdf8b33fc3a469a50346e175 Dec 02 23:00:20 crc kubenswrapper[4696]: I1202 23:00:20.422353 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-znpgw\" (UniqueName: \"kubernetes.io/projected/955c99b3-ad42-4e65-a391-47eda1c4130a-kube-api-access-znpgw\") pod \"swift-ring-rebalance-kq468\" (UID: \"955c99b3-ad42-4e65-a391-47eda1c4130a\") " 
pod="openstack/swift-ring-rebalance-kq468" Dec 02 23:00:20 crc kubenswrapper[4696]: I1202 23:00:20.429330 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-btlkn" event={"ID":"0f2069fd-8da0-44f4-8e80-38f2ed4f5afb","Type":"ContainerStarted","Data":"9cc015df0fbd428e2c75bee1066f29908ef74641fdf8b33fc3a469a50346e175"} Dec 02 23:00:20 crc kubenswrapper[4696]: I1202 23:00:20.430844 4696 generic.go:334] "Generic (PLEG): container finished" podID="f5b9aee9-4e9a-4d60-be32-f25d230622bc" containerID="61faf3799cade00d76f6f556cecd8a6d1e003ea4b965e0f6b5ec2c6dd2210bee" exitCode=0 Dec 02 23:00:20 crc kubenswrapper[4696]: I1202 23:00:20.430941 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"f5b9aee9-4e9a-4d60-be32-f25d230622bc","Type":"ContainerDied","Data":"61faf3799cade00d76f6f556cecd8a6d1e003ea4b965e0f6b5ec2c6dd2210bee"} Dec 02 23:00:20 crc kubenswrapper[4696]: I1202 23:00:20.431761 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6bc7876d45-8gns4" podUID="989c3427-3369-49e2-b8a9-d9706298c06a" containerName="dnsmasq-dns" containerID="cri-o://bd818a20495a5ecb2da95a17f482fc4f6e397906eed52b84edee25606dd0b329" gracePeriod=10 Dec 02 23:00:20 crc kubenswrapper[4696]: I1202 23:00:20.513079 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-kq468" Dec 02 23:00:20 crc kubenswrapper[4696]: I1202 23:00:20.798238 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d5594f21-8f1d-4105-ad47-c065a9fc468b-etc-swift\") pod \"swift-storage-0\" (UID: \"d5594f21-8f1d-4105-ad47-c065a9fc468b\") " pod="openstack/swift-storage-0" Dec 02 23:00:20 crc kubenswrapper[4696]: E1202 23:00:20.798824 4696 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 02 23:00:20 crc kubenswrapper[4696]: E1202 23:00:20.798840 4696 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 02 23:00:20 crc kubenswrapper[4696]: E1202 23:00:20.798889 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d5594f21-8f1d-4105-ad47-c065a9fc468b-etc-swift podName:d5594f21-8f1d-4105-ad47-c065a9fc468b nodeName:}" failed. No retries permitted until 2025-12-02 23:00:21.798870288 +0000 UTC m=+1084.679550289 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/d5594f21-8f1d-4105-ad47-c065a9fc468b-etc-swift") pod "swift-storage-0" (UID: "d5594f21-8f1d-4105-ad47-c065a9fc468b") : configmap "swift-ring-files" not found Dec 02 23:00:21 crc kubenswrapper[4696]: I1202 23:00:21.006365 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-8gns4" Dec 02 23:00:21 crc kubenswrapper[4696]: I1202 23:00:21.105349 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h585f\" (UniqueName: \"kubernetes.io/projected/989c3427-3369-49e2-b8a9-d9706298c06a-kube-api-access-h585f\") pod \"989c3427-3369-49e2-b8a9-d9706298c06a\" (UID: \"989c3427-3369-49e2-b8a9-d9706298c06a\") " Dec 02 23:00:21 crc kubenswrapper[4696]: I1202 23:00:21.105504 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/989c3427-3369-49e2-b8a9-d9706298c06a-dns-svc\") pod \"989c3427-3369-49e2-b8a9-d9706298c06a\" (UID: \"989c3427-3369-49e2-b8a9-d9706298c06a\") " Dec 02 23:00:21 crc kubenswrapper[4696]: I1202 23:00:21.105693 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/989c3427-3369-49e2-b8a9-d9706298c06a-ovsdbserver-sb\") pod \"989c3427-3369-49e2-b8a9-d9706298c06a\" (UID: \"989c3427-3369-49e2-b8a9-d9706298c06a\") " Dec 02 23:00:21 crc kubenswrapper[4696]: I1202 23:00:21.105748 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/989c3427-3369-49e2-b8a9-d9706298c06a-config\") pod \"989c3427-3369-49e2-b8a9-d9706298c06a\" (UID: \"989c3427-3369-49e2-b8a9-d9706298c06a\") " Dec 02 23:00:21 crc kubenswrapper[4696]: I1202 23:00:21.145422 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/989c3427-3369-49e2-b8a9-d9706298c06a-kube-api-access-h585f" (OuterVolumeSpecName: "kube-api-access-h585f") pod "989c3427-3369-49e2-b8a9-d9706298c06a" (UID: "989c3427-3369-49e2-b8a9-d9706298c06a"). InnerVolumeSpecName "kube-api-access-h585f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:00:21 crc kubenswrapper[4696]: I1202 23:00:21.155990 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-kq468"] Dec 02 23:00:21 crc kubenswrapper[4696]: W1202 23:00:21.164200 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod955c99b3_ad42_4e65_a391_47eda1c4130a.slice/crio-51419262f9369e9884f9a9edbe5b90656f63d6adda7ef722035f7f3c839dd24c WatchSource:0}: Error finding container 51419262f9369e9884f9a9edbe5b90656f63d6adda7ef722035f7f3c839dd24c: Status 404 returned error can't find the container with id 51419262f9369e9884f9a9edbe5b90656f63d6adda7ef722035f7f3c839dd24c Dec 02 23:00:21 crc kubenswrapper[4696]: I1202 23:00:21.202991 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/989c3427-3369-49e2-b8a9-d9706298c06a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "989c3427-3369-49e2-b8a9-d9706298c06a" (UID: "989c3427-3369-49e2-b8a9-d9706298c06a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:00:21 crc kubenswrapper[4696]: I1202 23:00:21.205031 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/989c3427-3369-49e2-b8a9-d9706298c06a-config" (OuterVolumeSpecName: "config") pod "989c3427-3369-49e2-b8a9-d9706298c06a" (UID: "989c3427-3369-49e2-b8a9-d9706298c06a"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:00:21 crc kubenswrapper[4696]: I1202 23:00:21.208081 4696 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/989c3427-3369-49e2-b8a9-d9706298c06a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 02 23:00:21 crc kubenswrapper[4696]: I1202 23:00:21.208107 4696 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/989c3427-3369-49e2-b8a9-d9706298c06a-config\") on node \"crc\" DevicePath \"\"" Dec 02 23:00:21 crc kubenswrapper[4696]: I1202 23:00:21.208119 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h585f\" (UniqueName: \"kubernetes.io/projected/989c3427-3369-49e2-b8a9-d9706298c06a-kube-api-access-h585f\") on node \"crc\" DevicePath \"\"" Dec 02 23:00:21 crc kubenswrapper[4696]: I1202 23:00:21.208431 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/989c3427-3369-49e2-b8a9-d9706298c06a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "989c3427-3369-49e2-b8a9-d9706298c06a" (UID: "989c3427-3369-49e2-b8a9-d9706298c06a"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:00:21 crc kubenswrapper[4696]: I1202 23:00:21.310238 4696 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/989c3427-3369-49e2-b8a9-d9706298c06a-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 23:00:21 crc kubenswrapper[4696]: I1202 23:00:21.444651 4696 generic.go:334] "Generic (PLEG): container finished" podID="0f2069fd-8da0-44f4-8e80-38f2ed4f5afb" containerID="78dc9e001fa155c0aeda1f4103ed0335cc2c9c14afdd03437d05ffe6b31edbb3" exitCode=0 Dec 02 23:00:21 crc kubenswrapper[4696]: I1202 23:00:21.445674 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-kq468" event={"ID":"955c99b3-ad42-4e65-a391-47eda1c4130a","Type":"ContainerStarted","Data":"51419262f9369e9884f9a9edbe5b90656f63d6adda7ef722035f7f3c839dd24c"} Dec 02 23:00:21 crc kubenswrapper[4696]: I1202 23:00:21.445720 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-btlkn" event={"ID":"0f2069fd-8da0-44f4-8e80-38f2ed4f5afb","Type":"ContainerDied","Data":"78dc9e001fa155c0aeda1f4103ed0335cc2c9c14afdd03437d05ffe6b31edbb3"} Dec 02 23:00:21 crc kubenswrapper[4696]: I1202 23:00:21.447711 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"87dfe190-5b7f-48c2-bfa0-97ca227eabb2","Type":"ContainerStarted","Data":"40c74f539fb2288fc59ad6cb037ae13a093bb32dd2a2fc040932132991bed487"} Dec 02 23:00:21 crc kubenswrapper[4696]: I1202 23:00:21.447860 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"87dfe190-5b7f-48c2-bfa0-97ca227eabb2","Type":"ContainerStarted","Data":"6da3265ffe49541fa14541dbd24d88cabe1e93dc5239d8bfb3c68de967397cdb"} Dec 02 23:00:21 crc kubenswrapper[4696]: I1202 23:00:21.447912 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Dec 02 23:00:21 crc kubenswrapper[4696]: I1202 
23:00:21.458393 4696 generic.go:334] "Generic (PLEG): container finished" podID="989c3427-3369-49e2-b8a9-d9706298c06a" containerID="bd818a20495a5ecb2da95a17f482fc4f6e397906eed52b84edee25606dd0b329" exitCode=0 Dec 02 23:00:21 crc kubenswrapper[4696]: I1202 23:00:21.458586 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-8gns4" Dec 02 23:00:21 crc kubenswrapper[4696]: I1202 23:00:21.459557 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-8gns4" event={"ID":"989c3427-3369-49e2-b8a9-d9706298c06a","Type":"ContainerDied","Data":"bd818a20495a5ecb2da95a17f482fc4f6e397906eed52b84edee25606dd0b329"} Dec 02 23:00:21 crc kubenswrapper[4696]: I1202 23:00:21.459616 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-8gns4" event={"ID":"989c3427-3369-49e2-b8a9-d9706298c06a","Type":"ContainerDied","Data":"1f44f3c46adb6f83b05146df450595f7fce200d53545b0cab38e7f03fec76209"} Dec 02 23:00:21 crc kubenswrapper[4696]: I1202 23:00:21.459643 4696 scope.go:117] "RemoveContainer" containerID="bd818a20495a5ecb2da95a17f482fc4f6e397906eed52b84edee25606dd0b329" Dec 02 23:00:21 crc kubenswrapper[4696]: I1202 23:00:21.468656 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-742ff" event={"ID":"da62a579-5b70-47cb-8666-2b6a785a2052","Type":"ContainerStarted","Data":"52b0792855a8c17268e6b32dbfa5b26e4de4b01af5822114688f0f979e379798"} Dec 02 23:00:21 crc kubenswrapper[4696]: I1202 23:00:21.469097 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8554648995-742ff" Dec 02 23:00:21 crc kubenswrapper[4696]: I1202 23:00:21.495158 4696 scope.go:117] "RemoveContainer" containerID="d710e642773512bb0d7651c5cd16fbc51d564437bdfe7c38ecba46f1bdebac62" Dec 02 23:00:21 crc kubenswrapper[4696]: I1202 23:00:21.519612 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-6bc7876d45-8gns4"] Dec 02 23:00:21 crc kubenswrapper[4696]: I1202 23:00:21.529499 4696 scope.go:117] "RemoveContainer" containerID="bd818a20495a5ecb2da95a17f482fc4f6e397906eed52b84edee25606dd0b329" Dec 02 23:00:21 crc kubenswrapper[4696]: I1202 23:00:21.531905 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-8gns4"] Dec 02 23:00:21 crc kubenswrapper[4696]: E1202 23:00:21.532311 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd818a20495a5ecb2da95a17f482fc4f6e397906eed52b84edee25606dd0b329\": container with ID starting with bd818a20495a5ecb2da95a17f482fc4f6e397906eed52b84edee25606dd0b329 not found: ID does not exist" containerID="bd818a20495a5ecb2da95a17f482fc4f6e397906eed52b84edee25606dd0b329" Dec 02 23:00:21 crc kubenswrapper[4696]: I1202 23:00:21.532356 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd818a20495a5ecb2da95a17f482fc4f6e397906eed52b84edee25606dd0b329"} err="failed to get container status \"bd818a20495a5ecb2da95a17f482fc4f6e397906eed52b84edee25606dd0b329\": rpc error: code = NotFound desc = could not find container \"bd818a20495a5ecb2da95a17f482fc4f6e397906eed52b84edee25606dd0b329\": container with ID starting with bd818a20495a5ecb2da95a17f482fc4f6e397906eed52b84edee25606dd0b329 not found: ID does not exist" Dec 02 23:00:21 crc kubenswrapper[4696]: I1202 23:00:21.532390 4696 scope.go:117] "RemoveContainer" containerID="d710e642773512bb0d7651c5cd16fbc51d564437bdfe7c38ecba46f1bdebac62" Dec 02 23:00:21 crc kubenswrapper[4696]: E1202 23:00:21.532895 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d710e642773512bb0d7651c5cd16fbc51d564437bdfe7c38ecba46f1bdebac62\": container with ID starting with d710e642773512bb0d7651c5cd16fbc51d564437bdfe7c38ecba46f1bdebac62 not found: ID 
does not exist" containerID="d710e642773512bb0d7651c5cd16fbc51d564437bdfe7c38ecba46f1bdebac62" Dec 02 23:00:21 crc kubenswrapper[4696]: I1202 23:00:21.532918 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d710e642773512bb0d7651c5cd16fbc51d564437bdfe7c38ecba46f1bdebac62"} err="failed to get container status \"d710e642773512bb0d7651c5cd16fbc51d564437bdfe7c38ecba46f1bdebac62\": rpc error: code = NotFound desc = could not find container \"d710e642773512bb0d7651c5cd16fbc51d564437bdfe7c38ecba46f1bdebac62\": container with ID starting with d710e642773512bb0d7651c5cd16fbc51d564437bdfe7c38ecba46f1bdebac62 not found: ID does not exist" Dec 02 23:00:21 crc kubenswrapper[4696]: I1202 23:00:21.539073 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8554648995-742ff" podStartSLOduration=5.539042261 podStartE2EDuration="5.539042261s" podCreationTimestamp="2025-12-02 23:00:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:00:21.529855932 +0000 UTC m=+1084.410535933" watchObservedRunningTime="2025-12-02 23:00:21.539042261 +0000 UTC m=+1084.419722262" Dec 02 23:00:21 crc kubenswrapper[4696]: I1202 23:00:21.560607 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=3.520218695 podStartE2EDuration="5.560547829s" podCreationTimestamp="2025-12-02 23:00:16 +0000 UTC" firstStartedPulling="2025-12-02 23:00:18.417926944 +0000 UTC m=+1081.298606935" lastFinishedPulling="2025-12-02 23:00:20.458256068 +0000 UTC m=+1083.338936069" observedRunningTime="2025-12-02 23:00:21.553204121 +0000 UTC m=+1084.433884122" watchObservedRunningTime="2025-12-02 23:00:21.560547829 +0000 UTC m=+1084.441227830" Dec 02 23:00:21 crc kubenswrapper[4696]: I1202 23:00:21.819842 4696 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d5594f21-8f1d-4105-ad47-c065a9fc468b-etc-swift\") pod \"swift-storage-0\" (UID: \"d5594f21-8f1d-4105-ad47-c065a9fc468b\") " pod="openstack/swift-storage-0" Dec 02 23:00:21 crc kubenswrapper[4696]: E1202 23:00:21.820088 4696 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 02 23:00:21 crc kubenswrapper[4696]: E1202 23:00:21.820107 4696 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 02 23:00:21 crc kubenswrapper[4696]: E1202 23:00:21.820164 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d5594f21-8f1d-4105-ad47-c065a9fc468b-etc-swift podName:d5594f21-8f1d-4105-ad47-c065a9fc468b nodeName:}" failed. No retries permitted until 2025-12-02 23:00:23.82014655 +0000 UTC m=+1086.700826551 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/d5594f21-8f1d-4105-ad47-c065a9fc468b-etc-swift") pod "swift-storage-0" (UID: "d5594f21-8f1d-4105-ad47-c065a9fc468b") : configmap "swift-ring-files" not found Dec 02 23:00:22 crc kubenswrapper[4696]: I1202 23:00:22.486152 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-btlkn" event={"ID":"0f2069fd-8da0-44f4-8e80-38f2ed4f5afb","Type":"ContainerStarted","Data":"2a868d1938f300e9a97dcd24fbcbe189967427c40cfe01e1fd66f5ba231411f1"} Dec 02 23:00:22 crc kubenswrapper[4696]: I1202 23:00:22.487984 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b8fbc5445-btlkn" Dec 02 23:00:22 crc kubenswrapper[4696]: I1202 23:00:22.536958 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-b8fbc5445-btlkn" podStartSLOduration=3.536938224 podStartE2EDuration="3.536938224s" 
podCreationTimestamp="2025-12-02 23:00:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:00:22.523968518 +0000 UTC m=+1085.404648529" watchObservedRunningTime="2025-12-02 23:00:22.536938224 +0000 UTC m=+1085.417618225" Dec 02 23:00:23 crc kubenswrapper[4696]: I1202 23:00:23.445415 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="989c3427-3369-49e2-b8a9-d9706298c06a" path="/var/lib/kubelet/pods/989c3427-3369-49e2-b8a9-d9706298c06a/volumes" Dec 02 23:00:23 crc kubenswrapper[4696]: I1202 23:00:23.705961 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Dec 02 23:00:23 crc kubenswrapper[4696]: I1202 23:00:23.807085 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Dec 02 23:00:23 crc kubenswrapper[4696]: I1202 23:00:23.864589 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d5594f21-8f1d-4105-ad47-c065a9fc468b-etc-swift\") pod \"swift-storage-0\" (UID: \"d5594f21-8f1d-4105-ad47-c065a9fc468b\") " pod="openstack/swift-storage-0" Dec 02 23:00:23 crc kubenswrapper[4696]: E1202 23:00:23.864827 4696 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 02 23:00:23 crc kubenswrapper[4696]: E1202 23:00:23.864861 4696 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 02 23:00:23 crc kubenswrapper[4696]: E1202 23:00:23.864950 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d5594f21-8f1d-4105-ad47-c065a9fc468b-etc-swift podName:d5594f21-8f1d-4105-ad47-c065a9fc468b nodeName:}" failed. 
No retries permitted until 2025-12-02 23:00:27.86492312 +0000 UTC m=+1090.745603121 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/d5594f21-8f1d-4105-ad47-c065a9fc468b-etc-swift") pod "swift-storage-0" (UID: "d5594f21-8f1d-4105-ad47-c065a9fc468b") : configmap "swift-ring-files" not found Dec 02 23:00:26 crc kubenswrapper[4696]: I1202 23:00:26.598562 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-ca12-account-create-update-l9pzc"] Dec 02 23:00:26 crc kubenswrapper[4696]: E1202 23:00:26.599435 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="989c3427-3369-49e2-b8a9-d9706298c06a" containerName="init" Dec 02 23:00:26 crc kubenswrapper[4696]: I1202 23:00:26.599451 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="989c3427-3369-49e2-b8a9-d9706298c06a" containerName="init" Dec 02 23:00:26 crc kubenswrapper[4696]: E1202 23:00:26.599482 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="989c3427-3369-49e2-b8a9-d9706298c06a" containerName="dnsmasq-dns" Dec 02 23:00:26 crc kubenswrapper[4696]: I1202 23:00:26.599489 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="989c3427-3369-49e2-b8a9-d9706298c06a" containerName="dnsmasq-dns" Dec 02 23:00:26 crc kubenswrapper[4696]: I1202 23:00:26.599689 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="989c3427-3369-49e2-b8a9-d9706298c06a" containerName="dnsmasq-dns" Dec 02 23:00:26 crc kubenswrapper[4696]: I1202 23:00:26.600433 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-ca12-account-create-update-l9pzc" Dec 02 23:00:26 crc kubenswrapper[4696]: I1202 23:00:26.602788 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Dec 02 23:00:26 crc kubenswrapper[4696]: I1202 23:00:26.612110 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-ca12-account-create-update-l9pzc"] Dec 02 23:00:26 crc kubenswrapper[4696]: I1202 23:00:26.637701 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/99ced705-305f-4c70-80e4-4854e66dabe0-operator-scripts\") pod \"keystone-ca12-account-create-update-l9pzc\" (UID: \"99ced705-305f-4c70-80e4-4854e66dabe0\") " pod="openstack/keystone-ca12-account-create-update-l9pzc" Dec 02 23:00:26 crc kubenswrapper[4696]: I1202 23:00:26.637808 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2c2qs\" (UniqueName: \"kubernetes.io/projected/99ced705-305f-4c70-80e4-4854e66dabe0-kube-api-access-2c2qs\") pod \"keystone-ca12-account-create-update-l9pzc\" (UID: \"99ced705-305f-4c70-80e4-4854e66dabe0\") " pod="openstack/keystone-ca12-account-create-update-l9pzc" Dec 02 23:00:26 crc kubenswrapper[4696]: I1202 23:00:26.655131 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-vfwzg"] Dec 02 23:00:26 crc kubenswrapper[4696]: I1202 23:00:26.656831 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-vfwzg" Dec 02 23:00:26 crc kubenswrapper[4696]: I1202 23:00:26.670120 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-vfwzg"] Dec 02 23:00:26 crc kubenswrapper[4696]: I1202 23:00:26.740427 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/99ced705-305f-4c70-80e4-4854e66dabe0-operator-scripts\") pod \"keystone-ca12-account-create-update-l9pzc\" (UID: \"99ced705-305f-4c70-80e4-4854e66dabe0\") " pod="openstack/keystone-ca12-account-create-update-l9pzc" Dec 02 23:00:26 crc kubenswrapper[4696]: I1202 23:00:26.740877 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2c2qs\" (UniqueName: \"kubernetes.io/projected/99ced705-305f-4c70-80e4-4854e66dabe0-kube-api-access-2c2qs\") pod \"keystone-ca12-account-create-update-l9pzc\" (UID: \"99ced705-305f-4c70-80e4-4854e66dabe0\") " pod="openstack/keystone-ca12-account-create-update-l9pzc" Dec 02 23:00:26 crc kubenswrapper[4696]: I1202 23:00:26.741070 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hj742\" (UniqueName: \"kubernetes.io/projected/a2016daf-7a4a-4f02-b75e-af8116362fe6-kube-api-access-hj742\") pod \"keystone-db-create-vfwzg\" (UID: \"a2016daf-7a4a-4f02-b75e-af8116362fe6\") " pod="openstack/keystone-db-create-vfwzg" Dec 02 23:00:26 crc kubenswrapper[4696]: I1202 23:00:26.741199 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a2016daf-7a4a-4f02-b75e-af8116362fe6-operator-scripts\") pod \"keystone-db-create-vfwzg\" (UID: \"a2016daf-7a4a-4f02-b75e-af8116362fe6\") " pod="openstack/keystone-db-create-vfwzg" Dec 02 23:00:26 crc kubenswrapper[4696]: I1202 23:00:26.742130 4696 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/99ced705-305f-4c70-80e4-4854e66dabe0-operator-scripts\") pod \"keystone-ca12-account-create-update-l9pzc\" (UID: \"99ced705-305f-4c70-80e4-4854e66dabe0\") " pod="openstack/keystone-ca12-account-create-update-l9pzc" Dec 02 23:00:26 crc kubenswrapper[4696]: I1202 23:00:26.766940 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2c2qs\" (UniqueName: \"kubernetes.io/projected/99ced705-305f-4c70-80e4-4854e66dabe0-kube-api-access-2c2qs\") pod \"keystone-ca12-account-create-update-l9pzc\" (UID: \"99ced705-305f-4c70-80e4-4854e66dabe0\") " pod="openstack/keystone-ca12-account-create-update-l9pzc" Dec 02 23:00:26 crc kubenswrapper[4696]: I1202 23:00:26.842235 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a2016daf-7a4a-4f02-b75e-af8116362fe6-operator-scripts\") pod \"keystone-db-create-vfwzg\" (UID: \"a2016daf-7a4a-4f02-b75e-af8116362fe6\") " pod="openstack/keystone-db-create-vfwzg" Dec 02 23:00:26 crc kubenswrapper[4696]: I1202 23:00:26.842394 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hj742\" (UniqueName: \"kubernetes.io/projected/a2016daf-7a4a-4f02-b75e-af8116362fe6-kube-api-access-hj742\") pod \"keystone-db-create-vfwzg\" (UID: \"a2016daf-7a4a-4f02-b75e-af8116362fe6\") " pod="openstack/keystone-db-create-vfwzg" Dec 02 23:00:26 crc kubenswrapper[4696]: I1202 23:00:26.843147 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a2016daf-7a4a-4f02-b75e-af8116362fe6-operator-scripts\") pod \"keystone-db-create-vfwzg\" (UID: \"a2016daf-7a4a-4f02-b75e-af8116362fe6\") " pod="openstack/keystone-db-create-vfwzg" Dec 02 23:00:26 crc kubenswrapper[4696]: I1202 23:00:26.862676 4696 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-hj742\" (UniqueName: \"kubernetes.io/projected/a2016daf-7a4a-4f02-b75e-af8116362fe6-kube-api-access-hj742\") pod \"keystone-db-create-vfwzg\" (UID: \"a2016daf-7a4a-4f02-b75e-af8116362fe6\") " pod="openstack/keystone-db-create-vfwzg" Dec 02 23:00:26 crc kubenswrapper[4696]: I1202 23:00:26.882346 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-v8bm6"] Dec 02 23:00:26 crc kubenswrapper[4696]: I1202 23:00:26.883572 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-v8bm6" Dec 02 23:00:26 crc kubenswrapper[4696]: I1202 23:00:26.896636 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-v8bm6"] Dec 02 23:00:26 crc kubenswrapper[4696]: I1202 23:00:26.922590 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-ca12-account-create-update-l9pzc" Dec 02 23:00:26 crc kubenswrapper[4696]: I1202 23:00:26.943790 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfpkw\" (UniqueName: \"kubernetes.io/projected/01fdd915-7b63-4733-85bb-06547a93c18a-kube-api-access-lfpkw\") pod \"placement-db-create-v8bm6\" (UID: \"01fdd915-7b63-4733-85bb-06547a93c18a\") " pod="openstack/placement-db-create-v8bm6" Dec 02 23:00:26 crc kubenswrapper[4696]: I1202 23:00:26.943869 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/01fdd915-7b63-4733-85bb-06547a93c18a-operator-scripts\") pod \"placement-db-create-v8bm6\" (UID: \"01fdd915-7b63-4733-85bb-06547a93c18a\") " pod="openstack/placement-db-create-v8bm6" Dec 02 23:00:26 crc kubenswrapper[4696]: I1202 23:00:26.951077 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-37ae-account-create-update-28whg"] Dec 02 
23:00:26 crc kubenswrapper[4696]: I1202 23:00:26.952657 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-37ae-account-create-update-28whg" Dec 02 23:00:26 crc kubenswrapper[4696]: I1202 23:00:26.956848 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Dec 02 23:00:26 crc kubenswrapper[4696]: I1202 23:00:26.960242 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-37ae-account-create-update-28whg"] Dec 02 23:00:26 crc kubenswrapper[4696]: I1202 23:00:26.996154 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-vfwzg" Dec 02 23:00:27 crc kubenswrapper[4696]: I1202 23:00:27.045693 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmdg7\" (UniqueName: \"kubernetes.io/projected/8b25ddeb-9306-4ed4-8bd8-b83f9e500985-kube-api-access-tmdg7\") pod \"placement-37ae-account-create-update-28whg\" (UID: \"8b25ddeb-9306-4ed4-8bd8-b83f9e500985\") " pod="openstack/placement-37ae-account-create-update-28whg" Dec 02 23:00:27 crc kubenswrapper[4696]: I1202 23:00:27.046172 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b25ddeb-9306-4ed4-8bd8-b83f9e500985-operator-scripts\") pod \"placement-37ae-account-create-update-28whg\" (UID: \"8b25ddeb-9306-4ed4-8bd8-b83f9e500985\") " pod="openstack/placement-37ae-account-create-update-28whg" Dec 02 23:00:27 crc kubenswrapper[4696]: I1202 23:00:27.046313 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfpkw\" (UniqueName: \"kubernetes.io/projected/01fdd915-7b63-4733-85bb-06547a93c18a-kube-api-access-lfpkw\") pod \"placement-db-create-v8bm6\" (UID: \"01fdd915-7b63-4733-85bb-06547a93c18a\") " pod="openstack/placement-db-create-v8bm6" 
Dec 02 23:00:27 crc kubenswrapper[4696]: I1202 23:00:27.046432 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/01fdd915-7b63-4733-85bb-06547a93c18a-operator-scripts\") pod \"placement-db-create-v8bm6\" (UID: \"01fdd915-7b63-4733-85bb-06547a93c18a\") " pod="openstack/placement-db-create-v8bm6" Dec 02 23:00:27 crc kubenswrapper[4696]: I1202 23:00:27.047285 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/01fdd915-7b63-4733-85bb-06547a93c18a-operator-scripts\") pod \"placement-db-create-v8bm6\" (UID: \"01fdd915-7b63-4733-85bb-06547a93c18a\") " pod="openstack/placement-db-create-v8bm6" Dec 02 23:00:27 crc kubenswrapper[4696]: I1202 23:00:27.065536 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfpkw\" (UniqueName: \"kubernetes.io/projected/01fdd915-7b63-4733-85bb-06547a93c18a-kube-api-access-lfpkw\") pod \"placement-db-create-v8bm6\" (UID: \"01fdd915-7b63-4733-85bb-06547a93c18a\") " pod="openstack/placement-db-create-v8bm6" Dec 02 23:00:27 crc kubenswrapper[4696]: I1202 23:00:27.147376 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmdg7\" (UniqueName: \"kubernetes.io/projected/8b25ddeb-9306-4ed4-8bd8-b83f9e500985-kube-api-access-tmdg7\") pod \"placement-37ae-account-create-update-28whg\" (UID: \"8b25ddeb-9306-4ed4-8bd8-b83f9e500985\") " pod="openstack/placement-37ae-account-create-update-28whg" Dec 02 23:00:27 crc kubenswrapper[4696]: I1202 23:00:27.147477 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b25ddeb-9306-4ed4-8bd8-b83f9e500985-operator-scripts\") pod \"placement-37ae-account-create-update-28whg\" (UID: \"8b25ddeb-9306-4ed4-8bd8-b83f9e500985\") " 
pod="openstack/placement-37ae-account-create-update-28whg" Dec 02 23:00:27 crc kubenswrapper[4696]: I1202 23:00:27.148495 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b25ddeb-9306-4ed4-8bd8-b83f9e500985-operator-scripts\") pod \"placement-37ae-account-create-update-28whg\" (UID: \"8b25ddeb-9306-4ed4-8bd8-b83f9e500985\") " pod="openstack/placement-37ae-account-create-update-28whg" Dec 02 23:00:27 crc kubenswrapper[4696]: I1202 23:00:27.175392 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmdg7\" (UniqueName: \"kubernetes.io/projected/8b25ddeb-9306-4ed4-8bd8-b83f9e500985-kube-api-access-tmdg7\") pod \"placement-37ae-account-create-update-28whg\" (UID: \"8b25ddeb-9306-4ed4-8bd8-b83f9e500985\") " pod="openstack/placement-37ae-account-create-update-28whg" Dec 02 23:00:27 crc kubenswrapper[4696]: I1202 23:00:27.224865 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-v8bm6" Dec 02 23:00:27 crc kubenswrapper[4696]: I1202 23:00:27.272790 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-37ae-account-create-update-28whg" Dec 02 23:00:27 crc kubenswrapper[4696]: I1202 23:00:27.303656 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-shqwd"] Dec 02 23:00:27 crc kubenswrapper[4696]: I1202 23:00:27.305665 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-shqwd" Dec 02 23:00:27 crc kubenswrapper[4696]: I1202 23:00:27.314535 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-shqwd"] Dec 02 23:00:27 crc kubenswrapper[4696]: I1202 23:00:27.350751 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e92140c3-85a2-4b5e-9f2a-604c46b8763f-operator-scripts\") pod \"glance-db-create-shqwd\" (UID: \"e92140c3-85a2-4b5e-9f2a-604c46b8763f\") " pod="openstack/glance-db-create-shqwd" Dec 02 23:00:27 crc kubenswrapper[4696]: I1202 23:00:27.350883 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9jxv\" (UniqueName: \"kubernetes.io/projected/e92140c3-85a2-4b5e-9f2a-604c46b8763f-kube-api-access-z9jxv\") pod \"glance-db-create-shqwd\" (UID: \"e92140c3-85a2-4b5e-9f2a-604c46b8763f\") " pod="openstack/glance-db-create-shqwd" Dec 02 23:00:27 crc kubenswrapper[4696]: I1202 23:00:27.368967 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8554648995-742ff" Dec 02 23:00:27 crc kubenswrapper[4696]: I1202 23:00:27.462123 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e92140c3-85a2-4b5e-9f2a-604c46b8763f-operator-scripts\") pod \"glance-db-create-shqwd\" (UID: \"e92140c3-85a2-4b5e-9f2a-604c46b8763f\") " pod="openstack/glance-db-create-shqwd" Dec 02 23:00:27 crc kubenswrapper[4696]: I1202 23:00:27.464232 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9jxv\" (UniqueName: \"kubernetes.io/projected/e92140c3-85a2-4b5e-9f2a-604c46b8763f-kube-api-access-z9jxv\") pod \"glance-db-create-shqwd\" (UID: \"e92140c3-85a2-4b5e-9f2a-604c46b8763f\") " pod="openstack/glance-db-create-shqwd" Dec 02 
23:00:27 crc kubenswrapper[4696]: I1202 23:00:27.464991 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e92140c3-85a2-4b5e-9f2a-604c46b8763f-operator-scripts\") pod \"glance-db-create-shqwd\" (UID: \"e92140c3-85a2-4b5e-9f2a-604c46b8763f\") " pod="openstack/glance-db-create-shqwd" Dec 02 23:00:27 crc kubenswrapper[4696]: I1202 23:00:27.477150 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-bcff-account-create-update-zdbzv"] Dec 02 23:00:27 crc kubenswrapper[4696]: I1202 23:00:27.480499 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-bcff-account-create-update-zdbzv" Dec 02 23:00:27 crc kubenswrapper[4696]: I1202 23:00:27.495554 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Dec 02 23:00:27 crc kubenswrapper[4696]: I1202 23:00:27.496976 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-bcff-account-create-update-zdbzv"] Dec 02 23:00:27 crc kubenswrapper[4696]: I1202 23:00:27.514871 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9jxv\" (UniqueName: \"kubernetes.io/projected/e92140c3-85a2-4b5e-9f2a-604c46b8763f-kube-api-access-z9jxv\") pod \"glance-db-create-shqwd\" (UID: \"e92140c3-85a2-4b5e-9f2a-604c46b8763f\") " pod="openstack/glance-db-create-shqwd" Dec 02 23:00:27 crc kubenswrapper[4696]: I1202 23:00:27.566202 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljhg6\" (UniqueName: \"kubernetes.io/projected/219c506b-9aa0-4926-9085-ff99b291382b-kube-api-access-ljhg6\") pod \"glance-bcff-account-create-update-zdbzv\" (UID: \"219c506b-9aa0-4926-9085-ff99b291382b\") " pod="openstack/glance-bcff-account-create-update-zdbzv" Dec 02 23:00:27 crc kubenswrapper[4696]: I1202 23:00:27.566247 4696 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/219c506b-9aa0-4926-9085-ff99b291382b-operator-scripts\") pod \"glance-bcff-account-create-update-zdbzv\" (UID: \"219c506b-9aa0-4926-9085-ff99b291382b\") " pod="openstack/glance-bcff-account-create-update-zdbzv" Dec 02 23:00:27 crc kubenswrapper[4696]: I1202 23:00:27.632252 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-shqwd" Dec 02 23:00:27 crc kubenswrapper[4696]: I1202 23:00:27.667876 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljhg6\" (UniqueName: \"kubernetes.io/projected/219c506b-9aa0-4926-9085-ff99b291382b-kube-api-access-ljhg6\") pod \"glance-bcff-account-create-update-zdbzv\" (UID: \"219c506b-9aa0-4926-9085-ff99b291382b\") " pod="openstack/glance-bcff-account-create-update-zdbzv" Dec 02 23:00:27 crc kubenswrapper[4696]: I1202 23:00:27.667925 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/219c506b-9aa0-4926-9085-ff99b291382b-operator-scripts\") pod \"glance-bcff-account-create-update-zdbzv\" (UID: \"219c506b-9aa0-4926-9085-ff99b291382b\") " pod="openstack/glance-bcff-account-create-update-zdbzv" Dec 02 23:00:27 crc kubenswrapper[4696]: I1202 23:00:27.670987 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/219c506b-9aa0-4926-9085-ff99b291382b-operator-scripts\") pod \"glance-bcff-account-create-update-zdbzv\" (UID: \"219c506b-9aa0-4926-9085-ff99b291382b\") " pod="openstack/glance-bcff-account-create-update-zdbzv" Dec 02 23:00:27 crc kubenswrapper[4696]: I1202 23:00:27.701630 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljhg6\" (UniqueName: 
\"kubernetes.io/projected/219c506b-9aa0-4926-9085-ff99b291382b-kube-api-access-ljhg6\") pod \"glance-bcff-account-create-update-zdbzv\" (UID: \"219c506b-9aa0-4926-9085-ff99b291382b\") " pod="openstack/glance-bcff-account-create-update-zdbzv" Dec 02 23:00:27 crc kubenswrapper[4696]: I1202 23:00:27.867483 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-bcff-account-create-update-zdbzv" Dec 02 23:00:27 crc kubenswrapper[4696]: I1202 23:00:27.871085 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d5594f21-8f1d-4105-ad47-c065a9fc468b-etc-swift\") pod \"swift-storage-0\" (UID: \"d5594f21-8f1d-4105-ad47-c065a9fc468b\") " pod="openstack/swift-storage-0" Dec 02 23:00:27 crc kubenswrapper[4696]: E1202 23:00:27.871284 4696 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 02 23:00:27 crc kubenswrapper[4696]: E1202 23:00:27.871315 4696 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 02 23:00:27 crc kubenswrapper[4696]: E1202 23:00:27.871376 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d5594f21-8f1d-4105-ad47-c065a9fc468b-etc-swift podName:d5594f21-8f1d-4105-ad47-c065a9fc468b nodeName:}" failed. No retries permitted until 2025-12-02 23:00:35.871353971 +0000 UTC m=+1098.752033972 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/d5594f21-8f1d-4105-ad47-c065a9fc468b-etc-swift") pod "swift-storage-0" (UID: "d5594f21-8f1d-4105-ad47-c065a9fc468b") : configmap "swift-ring-files" not found Dec 02 23:00:29 crc kubenswrapper[4696]: I1202 23:00:29.018937 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-db-create-24sjs"] Dec 02 23:00:29 crc kubenswrapper[4696]: I1202 23:00:29.020205 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-create-24sjs" Dec 02 23:00:29 crc kubenswrapper[4696]: I1202 23:00:29.067807 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-create-24sjs"] Dec 02 23:00:29 crc kubenswrapper[4696]: I1202 23:00:29.104299 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-c4e5-account-create-update-j2qmq"] Dec 02 23:00:29 crc kubenswrapper[4696]: I1202 23:00:29.105564 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-c4e5-account-create-update-j2qmq" Dec 02 23:00:29 crc kubenswrapper[4696]: I1202 23:00:29.108270 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-db-secret" Dec 02 23:00:29 crc kubenswrapper[4696]: I1202 23:00:29.116610 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-c4e5-account-create-update-j2qmq"] Dec 02 23:00:29 crc kubenswrapper[4696]: I1202 23:00:29.199831 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-slxl9\" (UniqueName: \"kubernetes.io/projected/769b3c83-63f8-4a20-b62a-6404415cb7de-kube-api-access-slxl9\") pod \"watcher-c4e5-account-create-update-j2qmq\" (UID: \"769b3c83-63f8-4a20-b62a-6404415cb7de\") " pod="openstack/watcher-c4e5-account-create-update-j2qmq" Dec 02 23:00:29 crc kubenswrapper[4696]: I1202 23:00:29.199913 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/769b3c83-63f8-4a20-b62a-6404415cb7de-operator-scripts\") pod \"watcher-c4e5-account-create-update-j2qmq\" (UID: \"769b3c83-63f8-4a20-b62a-6404415cb7de\") " pod="openstack/watcher-c4e5-account-create-update-j2qmq" Dec 02 23:00:29 crc kubenswrapper[4696]: I1202 23:00:29.200150 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/613fe2a1-c5b2-460d-8715-040e5c6f4a4a-operator-scripts\") pod \"watcher-db-create-24sjs\" (UID: \"613fe2a1-c5b2-460d-8715-040e5c6f4a4a\") " pod="openstack/watcher-db-create-24sjs" Dec 02 23:00:29 crc kubenswrapper[4696]: I1202 23:00:29.200199 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jrgh\" (UniqueName: \"kubernetes.io/projected/613fe2a1-c5b2-460d-8715-040e5c6f4a4a-kube-api-access-8jrgh\") pod 
\"watcher-db-create-24sjs\" (UID: \"613fe2a1-c5b2-460d-8715-040e5c6f4a4a\") " pod="openstack/watcher-db-create-24sjs" Dec 02 23:00:29 crc kubenswrapper[4696]: I1202 23:00:29.301614 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-slxl9\" (UniqueName: \"kubernetes.io/projected/769b3c83-63f8-4a20-b62a-6404415cb7de-kube-api-access-slxl9\") pod \"watcher-c4e5-account-create-update-j2qmq\" (UID: \"769b3c83-63f8-4a20-b62a-6404415cb7de\") " pod="openstack/watcher-c4e5-account-create-update-j2qmq" Dec 02 23:00:29 crc kubenswrapper[4696]: I1202 23:00:29.301770 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/769b3c83-63f8-4a20-b62a-6404415cb7de-operator-scripts\") pod \"watcher-c4e5-account-create-update-j2qmq\" (UID: \"769b3c83-63f8-4a20-b62a-6404415cb7de\") " pod="openstack/watcher-c4e5-account-create-update-j2qmq" Dec 02 23:00:29 crc kubenswrapper[4696]: I1202 23:00:29.301918 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/613fe2a1-c5b2-460d-8715-040e5c6f4a4a-operator-scripts\") pod \"watcher-db-create-24sjs\" (UID: \"613fe2a1-c5b2-460d-8715-040e5c6f4a4a\") " pod="openstack/watcher-db-create-24sjs" Dec 02 23:00:29 crc kubenswrapper[4696]: I1202 23:00:29.301948 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8jrgh\" (UniqueName: \"kubernetes.io/projected/613fe2a1-c5b2-460d-8715-040e5c6f4a4a-kube-api-access-8jrgh\") pod \"watcher-db-create-24sjs\" (UID: \"613fe2a1-c5b2-460d-8715-040e5c6f4a4a\") " pod="openstack/watcher-db-create-24sjs" Dec 02 23:00:29 crc kubenswrapper[4696]: I1202 23:00:29.302708 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/769b3c83-63f8-4a20-b62a-6404415cb7de-operator-scripts\") pod 
\"watcher-c4e5-account-create-update-j2qmq\" (UID: \"769b3c83-63f8-4a20-b62a-6404415cb7de\") " pod="openstack/watcher-c4e5-account-create-update-j2qmq" Dec 02 23:00:29 crc kubenswrapper[4696]: I1202 23:00:29.302893 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/613fe2a1-c5b2-460d-8715-040e5c6f4a4a-operator-scripts\") pod \"watcher-db-create-24sjs\" (UID: \"613fe2a1-c5b2-460d-8715-040e5c6f4a4a\") " pod="openstack/watcher-db-create-24sjs" Dec 02 23:00:29 crc kubenswrapper[4696]: I1202 23:00:29.322591 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-slxl9\" (UniqueName: \"kubernetes.io/projected/769b3c83-63f8-4a20-b62a-6404415cb7de-kube-api-access-slxl9\") pod \"watcher-c4e5-account-create-update-j2qmq\" (UID: \"769b3c83-63f8-4a20-b62a-6404415cb7de\") " pod="openstack/watcher-c4e5-account-create-update-j2qmq" Dec 02 23:00:29 crc kubenswrapper[4696]: I1202 23:00:29.322855 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jrgh\" (UniqueName: \"kubernetes.io/projected/613fe2a1-c5b2-460d-8715-040e5c6f4a4a-kube-api-access-8jrgh\") pod \"watcher-db-create-24sjs\" (UID: \"613fe2a1-c5b2-460d-8715-040e5c6f4a4a\") " pod="openstack/watcher-db-create-24sjs" Dec 02 23:00:29 crc kubenswrapper[4696]: I1202 23:00:29.354321 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-create-24sjs" Dec 02 23:00:29 crc kubenswrapper[4696]: I1202 23:00:29.429045 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-c4e5-account-create-update-j2qmq" Dec 02 23:00:29 crc kubenswrapper[4696]: I1202 23:00:29.537057 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-b8fbc5445-btlkn" Dec 02 23:00:29 crc kubenswrapper[4696]: I1202 23:00:29.590239 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-742ff"] Dec 02 23:00:29 crc kubenswrapper[4696]: I1202 23:00:29.591529 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8554648995-742ff" podUID="da62a579-5b70-47cb-8666-2b6a785a2052" containerName="dnsmasq-dns" containerID="cri-o://52b0792855a8c17268e6b32dbfa5b26e4de4b01af5822114688f0f979e379798" gracePeriod=10 Dec 02 23:00:30 crc kubenswrapper[4696]: I1202 23:00:30.586252 4696 generic.go:334] "Generic (PLEG): container finished" podID="da62a579-5b70-47cb-8666-2b6a785a2052" containerID="52b0792855a8c17268e6b32dbfa5b26e4de4b01af5822114688f0f979e379798" exitCode=0 Dec 02 23:00:30 crc kubenswrapper[4696]: I1202 23:00:30.587886 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-742ff" event={"ID":"da62a579-5b70-47cb-8666-2b6a785a2052","Type":"ContainerDied","Data":"52b0792855a8c17268e6b32dbfa5b26e4de4b01af5822114688f0f979e379798"} Dec 02 23:00:32 crc kubenswrapper[4696]: I1202 23:00:32.319946 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Dec 02 23:00:32 crc kubenswrapper[4696]: I1202 23:00:32.361490 4696 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-8554648995-742ff" podUID="da62a579-5b70-47cb-8666-2b6a785a2052" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.119:5353: connect: connection refused" Dec 02 23:00:34 crc kubenswrapper[4696]: I1202 23:00:34.630355 4696 generic.go:334] "Generic (PLEG): container finished" 
podID="aa29c81c-0a87-47f5-be45-8a0e5b083758" containerID="72dadaa892b741f7a71d45cdf9cad76ee8227c1f1237e87e17a38b7792ae3aa3" exitCode=0 Dec 02 23:00:34 crc kubenswrapper[4696]: I1202 23:00:34.630460 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"aa29c81c-0a87-47f5-be45-8a0e5b083758","Type":"ContainerDied","Data":"72dadaa892b741f7a71d45cdf9cad76ee8227c1f1237e87e17a38b7792ae3aa3"} Dec 02 23:00:35 crc kubenswrapper[4696]: I1202 23:00:35.936372 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d5594f21-8f1d-4105-ad47-c065a9fc468b-etc-swift\") pod \"swift-storage-0\" (UID: \"d5594f21-8f1d-4105-ad47-c065a9fc468b\") " pod="openstack/swift-storage-0" Dec 02 23:00:35 crc kubenswrapper[4696]: E1202 23:00:35.936642 4696 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 02 23:00:35 crc kubenswrapper[4696]: E1202 23:00:35.937352 4696 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 02 23:00:35 crc kubenswrapper[4696]: E1202 23:00:35.937423 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d5594f21-8f1d-4105-ad47-c065a9fc468b-etc-swift podName:d5594f21-8f1d-4105-ad47-c065a9fc468b nodeName:}" failed. No retries permitted until 2025-12-02 23:00:51.937401344 +0000 UTC m=+1114.818081345 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/d5594f21-8f1d-4105-ad47-c065a9fc468b-etc-swift") pod "swift-storage-0" (UID: "d5594f21-8f1d-4105-ad47-c065a9fc468b") : configmap "swift-ring-files" not found Dec 02 23:00:40 crc kubenswrapper[4696]: E1202 23:00:40.930927 4696 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/cluster-observability-operator/prometheus-rhel9@sha256:17ea20be390a94ab39f5cdd7f0cbc2498046eebcf77fe3dec9aa288d5c2cf46b" Dec 02 23:00:40 crc kubenswrapper[4696]: E1202 23:00:40.932074 4696 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:prometheus,Image:registry.redhat.io/cluster-observability-operator/prometheus-rhel9@sha256:17ea20be390a94ab39f5cdd7f0cbc2498046eebcf77fe3dec9aa288d5c2cf46b,Command:[],Args:[--config.file=/etc/prometheus/config_out/prometheus.env.yaml --web.enable-lifecycle --web.enable-remote-write-receiver --web.route-prefix=/ --storage.tsdb.retention.time=24h --storage.tsdb.path=/prometheus --web.config.file=/etc/prometheus/web_config/web-config.yaml],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:web,HostPort:0,ContainerPort:9090,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{100 -3} {} 100m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-out,ReadOnly:true,MountPath:/etc/prometheus/config_out,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tls-assets,ReadOnly:true,MountPath:/etc/prometheus/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:prometheus-metric-storage-db,ReadOnly:false,MountPath:/prometheus,SubPath:prometheus-db,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:prometheus-metric-storage-rulefiles-0,ReadOnly:false,MountPath:/etc/prometheus/rules/prometheus-metric-storage-rulefiles-0,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:web-config,ReadOnly:true,MountPath:/etc/prometheus/web_config/web-config.yaml,SubPath:web-config.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-w4f75,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/-/healthy,Port:{1 0 web},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:3,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/-/ready,Port:{1 0 
web},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:3,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/-/ready,Port:{1 0 web},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:3,PeriodSeconds:15,SuccessThreshold:1,FailureThreshold:60,TerminationGracePeriodSeconds:nil,},ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod prometheus-metric-storage-0_openstack(f5b9aee9-4e9a-4d60-be32-f25d230622bc): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 02 23:00:40 crc kubenswrapper[4696]: I1202 23:00:40.978366 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-742ff" Dec 02 23:00:40 crc kubenswrapper[4696]: I1202 23:00:40.987765 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/da62a579-5b70-47cb-8666-2b6a785a2052-dns-svc\") pod \"da62a579-5b70-47cb-8666-2b6a785a2052\" (UID: \"da62a579-5b70-47cb-8666-2b6a785a2052\") " Dec 02 23:00:40 crc kubenswrapper[4696]: I1202 23:00:40.987894 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/da62a579-5b70-47cb-8666-2b6a785a2052-ovsdbserver-sb\") pod \"da62a579-5b70-47cb-8666-2b6a785a2052\" (UID: \"da62a579-5b70-47cb-8666-2b6a785a2052\") " Dec 02 23:00:40 crc kubenswrapper[4696]: I1202 23:00:40.987969 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da62a579-5b70-47cb-8666-2b6a785a2052-config\") pod \"da62a579-5b70-47cb-8666-2b6a785a2052\" (UID: \"da62a579-5b70-47cb-8666-2b6a785a2052\") " Dec 02 23:00:40 crc kubenswrapper[4696]: I1202 23:00:40.988045 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/da62a579-5b70-47cb-8666-2b6a785a2052-ovsdbserver-nb\") pod \"da62a579-5b70-47cb-8666-2b6a785a2052\" (UID: \"da62a579-5b70-47cb-8666-2b6a785a2052\") " Dec 02 23:00:40 crc kubenswrapper[4696]: I1202 23:00:40.988125 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dgtn9\" (UniqueName: \"kubernetes.io/projected/da62a579-5b70-47cb-8666-2b6a785a2052-kube-api-access-dgtn9\") pod \"da62a579-5b70-47cb-8666-2b6a785a2052\" (UID: \"da62a579-5b70-47cb-8666-2b6a785a2052\") " Dec 02 23:00:40 crc kubenswrapper[4696]: I1202 23:00:40.995161 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/da62a579-5b70-47cb-8666-2b6a785a2052-kube-api-access-dgtn9" (OuterVolumeSpecName: "kube-api-access-dgtn9") pod "da62a579-5b70-47cb-8666-2b6a785a2052" (UID: "da62a579-5b70-47cb-8666-2b6a785a2052"). InnerVolumeSpecName "kube-api-access-dgtn9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:00:41 crc kubenswrapper[4696]: I1202 23:00:41.059516 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da62a579-5b70-47cb-8666-2b6a785a2052-config" (OuterVolumeSpecName: "config") pod "da62a579-5b70-47cb-8666-2b6a785a2052" (UID: "da62a579-5b70-47cb-8666-2b6a785a2052"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:00:41 crc kubenswrapper[4696]: I1202 23:00:41.066173 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da62a579-5b70-47cb-8666-2b6a785a2052-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "da62a579-5b70-47cb-8666-2b6a785a2052" (UID: "da62a579-5b70-47cb-8666-2b6a785a2052"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:00:41 crc kubenswrapper[4696]: I1202 23:00:41.072975 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da62a579-5b70-47cb-8666-2b6a785a2052-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "da62a579-5b70-47cb-8666-2b6a785a2052" (UID: "da62a579-5b70-47cb-8666-2b6a785a2052"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:00:41 crc kubenswrapper[4696]: I1202 23:00:41.090492 4696 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/da62a579-5b70-47cb-8666-2b6a785a2052-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 02 23:00:41 crc kubenswrapper[4696]: I1202 23:00:41.090542 4696 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da62a579-5b70-47cb-8666-2b6a785a2052-config\") on node \"crc\" DevicePath \"\"" Dec 02 23:00:41 crc kubenswrapper[4696]: I1202 23:00:41.090556 4696 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/da62a579-5b70-47cb-8666-2b6a785a2052-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 02 23:00:41 crc kubenswrapper[4696]: I1202 23:00:41.090569 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dgtn9\" (UniqueName: \"kubernetes.io/projected/da62a579-5b70-47cb-8666-2b6a785a2052-kube-api-access-dgtn9\") on node \"crc\" DevicePath \"\"" Dec 02 23:00:41 crc kubenswrapper[4696]: I1202 23:00:41.091189 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da62a579-5b70-47cb-8666-2b6a785a2052-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "da62a579-5b70-47cb-8666-2b6a785a2052" (UID: "da62a579-5b70-47cb-8666-2b6a785a2052"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:00:41 crc kubenswrapper[4696]: I1202 23:00:41.192160 4696 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/da62a579-5b70-47cb-8666-2b6a785a2052-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 23:00:41 crc kubenswrapper[4696]: I1202 23:00:41.706703 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-kq468" event={"ID":"955c99b3-ad42-4e65-a391-47eda1c4130a","Type":"ContainerStarted","Data":"9d72e7d1ff35077292b75af1a734d96ae83a3e6a635d80ee42c9f231e430fc4c"} Dec 02 23:00:41 crc kubenswrapper[4696]: I1202 23:00:41.710149 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"aa29c81c-0a87-47f5-be45-8a0e5b083758","Type":"ContainerStarted","Data":"0134045e0742110b4505830045b5e3aaf129f3b5a4702f5058b74a7956add855"} Dec 02 23:00:41 crc kubenswrapper[4696]: I1202 23:00:41.710692 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 02 23:00:41 crc kubenswrapper[4696]: I1202 23:00:41.715826 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-742ff" event={"ID":"da62a579-5b70-47cb-8666-2b6a785a2052","Type":"ContainerDied","Data":"acbb3ea80be119b5bc913a193b4a0b6109ea9649cc7cafd859e6d4d3a4770798"} Dec 02 23:00:41 crc kubenswrapper[4696]: I1202 23:00:41.715887 4696 scope.go:117] "RemoveContainer" containerID="52b0792855a8c17268e6b32dbfa5b26e4de4b01af5822114688f0f979e379798" Dec 02 23:00:41 crc kubenswrapper[4696]: I1202 23:00:41.716088 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-742ff" Dec 02 23:00:41 crc kubenswrapper[4696]: I1202 23:00:41.737058 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-kq468" podStartSLOduration=1.882022739 podStartE2EDuration="21.73703672s" podCreationTimestamp="2025-12-02 23:00:20 +0000 UTC" firstStartedPulling="2025-12-02 23:00:21.169223517 +0000 UTC m=+1084.049903518" lastFinishedPulling="2025-12-02 23:00:41.024237498 +0000 UTC m=+1103.904917499" observedRunningTime="2025-12-02 23:00:41.735423174 +0000 UTC m=+1104.616103175" watchObservedRunningTime="2025-12-02 23:00:41.73703672 +0000 UTC m=+1104.617716731" Dec 02 23:00:41 crc kubenswrapper[4696]: I1202 23:00:41.750836 4696 scope.go:117] "RemoveContainer" containerID="082cb5c7fb9fbddee7810c5efd7e7bc1062252c769e8bd1d203f6f3a714a151b" Dec 02 23:00:41 crc kubenswrapper[4696]: I1202 23:00:41.780378 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=45.504816708999996 podStartE2EDuration="1m9.780347583s" podCreationTimestamp="2025-12-02 22:59:32 +0000 UTC" firstStartedPulling="2025-12-02 22:59:34.726159692 +0000 UTC m=+1037.606839693" lastFinishedPulling="2025-12-02 22:59:59.001690536 +0000 UTC m=+1061.882370567" observedRunningTime="2025-12-02 23:00:41.771921995 +0000 UTC m=+1104.652601986" watchObservedRunningTime="2025-12-02 23:00:41.780347583 +0000 UTC m=+1104.661027584" Dec 02 23:00:41 crc kubenswrapper[4696]: I1202 23:00:41.804429 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-742ff"] Dec 02 23:00:41 crc kubenswrapper[4696]: I1202 23:00:41.812178 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8554648995-742ff"] Dec 02 23:00:41 crc kubenswrapper[4696]: I1202 23:00:41.944297 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-v8bm6"] Dec 02 
23:00:41 crc kubenswrapper[4696]: I1202 23:00:41.955947 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-create-24sjs"] Dec 02 23:00:41 crc kubenswrapper[4696]: W1202 23:00:41.971891 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod219c506b_9aa0_4926_9085_ff99b291382b.slice/crio-746111bf989ebe045bbf154a9bf8e813b935945ec039ca9866285e3fb8a21680 WatchSource:0}: Error finding container 746111bf989ebe045bbf154a9bf8e813b935945ec039ca9866285e3fb8a21680: Status 404 returned error can't find the container with id 746111bf989ebe045bbf154a9bf8e813b935945ec039ca9866285e3fb8a21680 Dec 02 23:00:41 crc kubenswrapper[4696]: I1202 23:00:41.974941 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-shqwd"] Dec 02 23:00:42 crc kubenswrapper[4696]: I1202 23:00:42.002414 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-bcff-account-create-update-zdbzv"] Dec 02 23:00:42 crc kubenswrapper[4696]: I1202 23:00:42.019338 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-c4e5-account-create-update-j2qmq"] Dec 02 23:00:42 crc kubenswrapper[4696]: I1202 23:00:42.026788 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-ca12-account-create-update-l9pzc"] Dec 02 23:00:42 crc kubenswrapper[4696]: I1202 23:00:42.039489 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-vfwzg"] Dec 02 23:00:42 crc kubenswrapper[4696]: I1202 23:00:42.050821 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-37ae-account-create-update-28whg"] Dec 02 23:00:42 crc kubenswrapper[4696]: I1202 23:00:42.263802 4696 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-btsm6" podUID="b13b6998-c04a-4ac8-9615-5078f1169ecb" containerName="ovn-controller" probeResult="failure" output=< Dec 02 23:00:42 
crc kubenswrapper[4696]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Dec 02 23:00:42 crc kubenswrapper[4696]: > Dec 02 23:00:42 crc kubenswrapper[4696]: I1202 23:00:42.290923 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-h54st" Dec 02 23:00:42 crc kubenswrapper[4696]: I1202 23:00:42.361873 4696 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-8554648995-742ff" podUID="da62a579-5b70-47cb-8666-2b6a785a2052" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.119:5353: i/o timeout" Dec 02 23:00:42 crc kubenswrapper[4696]: I1202 23:00:42.421615 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-h54st" Dec 02 23:00:42 crc kubenswrapper[4696]: I1202 23:00:42.664644 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-btsm6-config-nj9sf"] Dec 02 23:00:42 crc kubenswrapper[4696]: E1202 23:00:42.665192 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da62a579-5b70-47cb-8666-2b6a785a2052" containerName="init" Dec 02 23:00:42 crc kubenswrapper[4696]: I1202 23:00:42.665222 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="da62a579-5b70-47cb-8666-2b6a785a2052" containerName="init" Dec 02 23:00:42 crc kubenswrapper[4696]: E1202 23:00:42.665282 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da62a579-5b70-47cb-8666-2b6a785a2052" containerName="dnsmasq-dns" Dec 02 23:00:42 crc kubenswrapper[4696]: I1202 23:00:42.665292 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="da62a579-5b70-47cb-8666-2b6a785a2052" containerName="dnsmasq-dns" Dec 02 23:00:42 crc kubenswrapper[4696]: I1202 23:00:42.665546 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="da62a579-5b70-47cb-8666-2b6a785a2052" containerName="dnsmasq-dns" Dec 02 23:00:42 crc kubenswrapper[4696]: I1202 
23:00:42.666309 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-btsm6-config-nj9sf" Dec 02 23:00:42 crc kubenswrapper[4696]: I1202 23:00:42.670345 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Dec 02 23:00:42 crc kubenswrapper[4696]: I1202 23:00:42.688962 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-btsm6-config-nj9sf"] Dec 02 23:00:42 crc kubenswrapper[4696]: I1202 23:00:42.757143 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-vfwzg" event={"ID":"a2016daf-7a4a-4f02-b75e-af8116362fe6","Type":"ContainerStarted","Data":"f55bc4e52aefdaa21fc0f2e2c2ef3eb4bab6add99c1f315980e31ef8f2579758"} Dec 02 23:00:42 crc kubenswrapper[4696]: I1202 23:00:42.757228 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-vfwzg" event={"ID":"a2016daf-7a4a-4f02-b75e-af8116362fe6","Type":"ContainerStarted","Data":"b3e719f0787da53f7e6096f54a57f8443e9b4461bfac13bbd716f76ae83ac85b"} Dec 02 23:00:42 crc kubenswrapper[4696]: I1202 23:00:42.759657 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-ca12-account-create-update-l9pzc" event={"ID":"99ced705-305f-4c70-80e4-4854e66dabe0","Type":"ContainerStarted","Data":"ea9374afdfb098bdf1d68509a2314eb08d0a57a55c6cb0fc98f5d0e9fc9481a4"} Dec 02 23:00:42 crc kubenswrapper[4696]: I1202 23:00:42.759692 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-ca12-account-create-update-l9pzc" event={"ID":"99ced705-305f-4c70-80e4-4854e66dabe0","Type":"ContainerStarted","Data":"15df8686307634bad70c284c9d40e84bd08d5a6deabb6621674341f1ddc68806"} Dec 02 23:00:42 crc kubenswrapper[4696]: I1202 23:00:42.762349 4696 generic.go:334] "Generic (PLEG): container finished" podID="01fdd915-7b63-4733-85bb-06547a93c18a" 
containerID="002bf9c227b9a1484115b2d2342e45d25be7786dbfdec1c37bb3b0f260de7e2d" exitCode=0 Dec 02 23:00:42 crc kubenswrapper[4696]: I1202 23:00:42.762445 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-v8bm6" event={"ID":"01fdd915-7b63-4733-85bb-06547a93c18a","Type":"ContainerDied","Data":"002bf9c227b9a1484115b2d2342e45d25be7786dbfdec1c37bb3b0f260de7e2d"} Dec 02 23:00:42 crc kubenswrapper[4696]: I1202 23:00:42.762479 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-v8bm6" event={"ID":"01fdd915-7b63-4733-85bb-06547a93c18a","Type":"ContainerStarted","Data":"4d8b1f201e09e46997841036cbfa8e176fc17366b23da6c154e05a416e08ff30"} Dec 02 23:00:42 crc kubenswrapper[4696]: I1202 23:00:42.766263 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-c4e5-account-create-update-j2qmq" event={"ID":"769b3c83-63f8-4a20-b62a-6404415cb7de","Type":"ContainerStarted","Data":"0af465ddc5fbe6eb9804f2509398ddec0355946a8e8d985ee9e5354e98d5b45c"} Dec 02 23:00:42 crc kubenswrapper[4696]: I1202 23:00:42.766348 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-c4e5-account-create-update-j2qmq" event={"ID":"769b3c83-63f8-4a20-b62a-6404415cb7de","Type":"ContainerStarted","Data":"4d1ee4ff3bd757753157e554a6edc28b6d61b517696ec46d30494b595e0552dc"} Dec 02 23:00:42 crc kubenswrapper[4696]: I1202 23:00:42.768030 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-shqwd" event={"ID":"e92140c3-85a2-4b5e-9f2a-604c46b8763f","Type":"ContainerStarted","Data":"30188643bb3f44c82ffd7efc35c1a01bcabd1efb5b658fe3c26e9cf176a7bd65"} Dec 02 23:00:42 crc kubenswrapper[4696]: I1202 23:00:42.768060 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-shqwd" event={"ID":"e92140c3-85a2-4b5e-9f2a-604c46b8763f","Type":"ContainerStarted","Data":"a2824de7cb3e46bd8982e17dfda2dc72a757d935c9421b1951860709b874606e"} Dec 02 23:00:42 
crc kubenswrapper[4696]: I1202 23:00:42.783374 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-bcff-account-create-update-zdbzv" event={"ID":"219c506b-9aa0-4926-9085-ff99b291382b","Type":"ContainerStarted","Data":"b93a710a188056df2d6ab5a27117dab4d4cc535bd5a6b7f6f4f539ebc679d19d"} Dec 02 23:00:42 crc kubenswrapper[4696]: I1202 23:00:42.784036 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-bcff-account-create-update-zdbzv" event={"ID":"219c506b-9aa0-4926-9085-ff99b291382b","Type":"ContainerStarted","Data":"746111bf989ebe045bbf154a9bf8e813b935945ec039ca9866285e3fb8a21680"} Dec 02 23:00:42 crc kubenswrapper[4696]: I1202 23:00:42.804627 4696 generic.go:334] "Generic (PLEG): container finished" podID="613fe2a1-c5b2-460d-8715-040e5c6f4a4a" containerID="5d968d24d4c94f5fc8459b593c9d34e74c7800aaae5cf2dc69e5276a23a2bf02" exitCode=0 Dec 02 23:00:42 crc kubenswrapper[4696]: I1202 23:00:42.804902 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-24sjs" event={"ID":"613fe2a1-c5b2-460d-8715-040e5c6f4a4a","Type":"ContainerDied","Data":"5d968d24d4c94f5fc8459b593c9d34e74c7800aaae5cf2dc69e5276a23a2bf02"} Dec 02 23:00:42 crc kubenswrapper[4696]: I1202 23:00:42.804964 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-24sjs" event={"ID":"613fe2a1-c5b2-460d-8715-040e5c6f4a4a","Type":"ContainerStarted","Data":"da8336ee8a82d46577da94566669780794d288344f5613ab13adcbdec4a9bbcc"} Dec 02 23:00:42 crc kubenswrapper[4696]: I1202 23:00:42.806454 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-create-vfwzg" podStartSLOduration=16.806428121 podStartE2EDuration="16.806428121s" podCreationTimestamp="2025-12-02 23:00:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:00:42.786012714 +0000 UTC m=+1105.666692715" 
watchObservedRunningTime="2025-12-02 23:00:42.806428121 +0000 UTC m=+1105.687108122" Dec 02 23:00:42 crc kubenswrapper[4696]: I1202 23:00:42.823961 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-37ae-account-create-update-28whg" event={"ID":"8b25ddeb-9306-4ed4-8bd8-b83f9e500985","Type":"ContainerStarted","Data":"be738cddc4863865924a2c01bb3ac8808315f23c435e60c56a08f046aa456f6b"} Dec 02 23:00:42 crc kubenswrapper[4696]: I1202 23:00:42.824009 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-37ae-account-create-update-28whg" event={"ID":"8b25ddeb-9306-4ed4-8bd8-b83f9e500985","Type":"ContainerStarted","Data":"ae838d750565b0667baa2ddbda31cab7dae3c5a115b944bbc9cf651e03899c29"} Dec 02 23:00:42 crc kubenswrapper[4696]: I1202 23:00:42.827908 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-c4e5-account-create-update-j2qmq" podStartSLOduration=13.827887457 podStartE2EDuration="13.827887457s" podCreationTimestamp="2025-12-02 23:00:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:00:42.818052589 +0000 UTC m=+1105.698732590" watchObservedRunningTime="2025-12-02 23:00:42.827887457 +0000 UTC m=+1105.708567458" Dec 02 23:00:42 crc kubenswrapper[4696]: I1202 23:00:42.833011 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlmtn\" (UniqueName: \"kubernetes.io/projected/3643d5cc-74ac-41b0-8f0e-a15e6ebbd87b-kube-api-access-jlmtn\") pod \"ovn-controller-btsm6-config-nj9sf\" (UID: \"3643d5cc-74ac-41b0-8f0e-a15e6ebbd87b\") " pod="openstack/ovn-controller-btsm6-config-nj9sf" Dec 02 23:00:42 crc kubenswrapper[4696]: I1202 23:00:42.833172 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: 
\"kubernetes.io/host-path/3643d5cc-74ac-41b0-8f0e-a15e6ebbd87b-var-log-ovn\") pod \"ovn-controller-btsm6-config-nj9sf\" (UID: \"3643d5cc-74ac-41b0-8f0e-a15e6ebbd87b\") " pod="openstack/ovn-controller-btsm6-config-nj9sf" Dec 02 23:00:42 crc kubenswrapper[4696]: I1202 23:00:42.833200 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3643d5cc-74ac-41b0-8f0e-a15e6ebbd87b-scripts\") pod \"ovn-controller-btsm6-config-nj9sf\" (UID: \"3643d5cc-74ac-41b0-8f0e-a15e6ebbd87b\") " pod="openstack/ovn-controller-btsm6-config-nj9sf" Dec 02 23:00:42 crc kubenswrapper[4696]: I1202 23:00:42.833238 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3643d5cc-74ac-41b0-8f0e-a15e6ebbd87b-var-run-ovn\") pod \"ovn-controller-btsm6-config-nj9sf\" (UID: \"3643d5cc-74ac-41b0-8f0e-a15e6ebbd87b\") " pod="openstack/ovn-controller-btsm6-config-nj9sf" Dec 02 23:00:42 crc kubenswrapper[4696]: I1202 23:00:42.833585 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3643d5cc-74ac-41b0-8f0e-a15e6ebbd87b-var-run\") pod \"ovn-controller-btsm6-config-nj9sf\" (UID: \"3643d5cc-74ac-41b0-8f0e-a15e6ebbd87b\") " pod="openstack/ovn-controller-btsm6-config-nj9sf" Dec 02 23:00:42 crc kubenswrapper[4696]: I1202 23:00:42.833624 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/3643d5cc-74ac-41b0-8f0e-a15e6ebbd87b-additional-scripts\") pod \"ovn-controller-btsm6-config-nj9sf\" (UID: \"3643d5cc-74ac-41b0-8f0e-a15e6ebbd87b\") " pod="openstack/ovn-controller-btsm6-config-nj9sf" Dec 02 23:00:42 crc kubenswrapper[4696]: I1202 23:00:42.865738 4696 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/glance-db-create-shqwd" podStartSLOduration=15.865719505 podStartE2EDuration="15.865719505s" podCreationTimestamp="2025-12-02 23:00:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:00:42.865014005 +0000 UTC m=+1105.745694006" watchObservedRunningTime="2025-12-02 23:00:42.865719505 +0000 UTC m=+1105.746399506" Dec 02 23:00:42 crc kubenswrapper[4696]: I1202 23:00:42.867039 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-bcff-account-create-update-zdbzv" podStartSLOduration=15.867030352 podStartE2EDuration="15.867030352s" podCreationTimestamp="2025-12-02 23:00:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:00:42.842666804 +0000 UTC m=+1105.723346805" watchObservedRunningTime="2025-12-02 23:00:42.867030352 +0000 UTC m=+1105.747710353" Dec 02 23:00:42 crc kubenswrapper[4696]: I1202 23:00:42.915115 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-ca12-account-create-update-l9pzc" podStartSLOduration=16.91508938 podStartE2EDuration="16.91508938s" podCreationTimestamp="2025-12-02 23:00:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:00:42.902131524 +0000 UTC m=+1105.782811515" watchObservedRunningTime="2025-12-02 23:00:42.91508938 +0000 UTC m=+1105.795769381" Dec 02 23:00:42 crc kubenswrapper[4696]: I1202 23:00:42.927266 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-37ae-account-create-update-28whg" podStartSLOduration=16.927245933000002 podStartE2EDuration="16.927245933s" podCreationTimestamp="2025-12-02 23:00:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:00:42.926933944 +0000 UTC m=+1105.807613935" watchObservedRunningTime="2025-12-02 23:00:42.927245933 +0000 UTC m=+1105.807925934" Dec 02 23:00:42 crc kubenswrapper[4696]: I1202 23:00:42.939247 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jlmtn\" (UniqueName: \"kubernetes.io/projected/3643d5cc-74ac-41b0-8f0e-a15e6ebbd87b-kube-api-access-jlmtn\") pod \"ovn-controller-btsm6-config-nj9sf\" (UID: \"3643d5cc-74ac-41b0-8f0e-a15e6ebbd87b\") " pod="openstack/ovn-controller-btsm6-config-nj9sf" Dec 02 23:00:42 crc kubenswrapper[4696]: I1202 23:00:42.939636 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3643d5cc-74ac-41b0-8f0e-a15e6ebbd87b-scripts\") pod \"ovn-controller-btsm6-config-nj9sf\" (UID: \"3643d5cc-74ac-41b0-8f0e-a15e6ebbd87b\") " pod="openstack/ovn-controller-btsm6-config-nj9sf" Dec 02 23:00:42 crc kubenswrapper[4696]: I1202 23:00:42.939758 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3643d5cc-74ac-41b0-8f0e-a15e6ebbd87b-var-log-ovn\") pod \"ovn-controller-btsm6-config-nj9sf\" (UID: \"3643d5cc-74ac-41b0-8f0e-a15e6ebbd87b\") " pod="openstack/ovn-controller-btsm6-config-nj9sf" Dec 02 23:00:42 crc kubenswrapper[4696]: I1202 23:00:42.939862 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3643d5cc-74ac-41b0-8f0e-a15e6ebbd87b-var-run-ovn\") pod \"ovn-controller-btsm6-config-nj9sf\" (UID: \"3643d5cc-74ac-41b0-8f0e-a15e6ebbd87b\") " pod="openstack/ovn-controller-btsm6-config-nj9sf" Dec 02 23:00:42 crc kubenswrapper[4696]: I1202 23:00:42.940200 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: 
\"kubernetes.io/host-path/3643d5cc-74ac-41b0-8f0e-a15e6ebbd87b-var-run\") pod \"ovn-controller-btsm6-config-nj9sf\" (UID: \"3643d5cc-74ac-41b0-8f0e-a15e6ebbd87b\") " pod="openstack/ovn-controller-btsm6-config-nj9sf" Dec 02 23:00:42 crc kubenswrapper[4696]: I1202 23:00:42.940303 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/3643d5cc-74ac-41b0-8f0e-a15e6ebbd87b-additional-scripts\") pod \"ovn-controller-btsm6-config-nj9sf\" (UID: \"3643d5cc-74ac-41b0-8f0e-a15e6ebbd87b\") " pod="openstack/ovn-controller-btsm6-config-nj9sf" Dec 02 23:00:42 crc kubenswrapper[4696]: I1202 23:00:42.941606 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3643d5cc-74ac-41b0-8f0e-a15e6ebbd87b-var-run\") pod \"ovn-controller-btsm6-config-nj9sf\" (UID: \"3643d5cc-74ac-41b0-8f0e-a15e6ebbd87b\") " pod="openstack/ovn-controller-btsm6-config-nj9sf" Dec 02 23:00:42 crc kubenswrapper[4696]: I1202 23:00:42.942288 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3643d5cc-74ac-41b0-8f0e-a15e6ebbd87b-var-run-ovn\") pod \"ovn-controller-btsm6-config-nj9sf\" (UID: \"3643d5cc-74ac-41b0-8f0e-a15e6ebbd87b\") " pod="openstack/ovn-controller-btsm6-config-nj9sf" Dec 02 23:00:42 crc kubenswrapper[4696]: I1202 23:00:42.942451 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/3643d5cc-74ac-41b0-8f0e-a15e6ebbd87b-additional-scripts\") pod \"ovn-controller-btsm6-config-nj9sf\" (UID: \"3643d5cc-74ac-41b0-8f0e-a15e6ebbd87b\") " pod="openstack/ovn-controller-btsm6-config-nj9sf" Dec 02 23:00:42 crc kubenswrapper[4696]: I1202 23:00:42.943504 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: 
\"kubernetes.io/host-path/3643d5cc-74ac-41b0-8f0e-a15e6ebbd87b-var-log-ovn\") pod \"ovn-controller-btsm6-config-nj9sf\" (UID: \"3643d5cc-74ac-41b0-8f0e-a15e6ebbd87b\") " pod="openstack/ovn-controller-btsm6-config-nj9sf" Dec 02 23:00:42 crc kubenswrapper[4696]: I1202 23:00:42.945445 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3643d5cc-74ac-41b0-8f0e-a15e6ebbd87b-scripts\") pod \"ovn-controller-btsm6-config-nj9sf\" (UID: \"3643d5cc-74ac-41b0-8f0e-a15e6ebbd87b\") " pod="openstack/ovn-controller-btsm6-config-nj9sf" Dec 02 23:00:42 crc kubenswrapper[4696]: I1202 23:00:42.972658 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlmtn\" (UniqueName: \"kubernetes.io/projected/3643d5cc-74ac-41b0-8f0e-a15e6ebbd87b-kube-api-access-jlmtn\") pod \"ovn-controller-btsm6-config-nj9sf\" (UID: \"3643d5cc-74ac-41b0-8f0e-a15e6ebbd87b\") " pod="openstack/ovn-controller-btsm6-config-nj9sf" Dec 02 23:00:43 crc kubenswrapper[4696]: I1202 23:00:43.093022 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-btsm6-config-nj9sf" Dec 02 23:00:43 crc kubenswrapper[4696]: I1202 23:00:43.441619 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da62a579-5b70-47cb-8666-2b6a785a2052" path="/var/lib/kubelet/pods/da62a579-5b70-47cb-8666-2b6a785a2052/volumes" Dec 02 23:00:43 crc kubenswrapper[4696]: I1202 23:00:43.613132 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-btsm6-config-nj9sf"] Dec 02 23:00:43 crc kubenswrapper[4696]: W1202 23:00:43.634882 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3643d5cc_74ac_41b0_8f0e_a15e6ebbd87b.slice/crio-b33ea391f8a4a8503f5c73d1a2ec56612505d40cb80234afb228b1bbdbf868e2 WatchSource:0}: Error finding container b33ea391f8a4a8503f5c73d1a2ec56612505d40cb80234afb228b1bbdbf868e2: Status 404 returned error can't find the container with id b33ea391f8a4a8503f5c73d1a2ec56612505d40cb80234afb228b1bbdbf868e2 Dec 02 23:00:43 crc kubenswrapper[4696]: I1202 23:00:43.840244 4696 generic.go:334] "Generic (PLEG): container finished" podID="769b3c83-63f8-4a20-b62a-6404415cb7de" containerID="0af465ddc5fbe6eb9804f2509398ddec0355946a8e8d985ee9e5354e98d5b45c" exitCode=0 Dec 02 23:00:43 crc kubenswrapper[4696]: I1202 23:00:43.840354 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-c4e5-account-create-update-j2qmq" event={"ID":"769b3c83-63f8-4a20-b62a-6404415cb7de","Type":"ContainerDied","Data":"0af465ddc5fbe6eb9804f2509398ddec0355946a8e8d985ee9e5354e98d5b45c"} Dec 02 23:00:43 crc kubenswrapper[4696]: I1202 23:00:43.847156 4696 generic.go:334] "Generic (PLEG): container finished" podID="e92140c3-85a2-4b5e-9f2a-604c46b8763f" containerID="30188643bb3f44c82ffd7efc35c1a01bcabd1efb5b658fe3c26e9cf176a7bd65" exitCode=0 Dec 02 23:00:43 crc kubenswrapper[4696]: I1202 23:00:43.847229 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-db-create-shqwd" event={"ID":"e92140c3-85a2-4b5e-9f2a-604c46b8763f","Type":"ContainerDied","Data":"30188643bb3f44c82ffd7efc35c1a01bcabd1efb5b658fe3c26e9cf176a7bd65"} Dec 02 23:00:43 crc kubenswrapper[4696]: I1202 23:00:43.851395 4696 generic.go:334] "Generic (PLEG): container finished" podID="99ced705-305f-4c70-80e4-4854e66dabe0" containerID="ea9374afdfb098bdf1d68509a2314eb08d0a57a55c6cb0fc98f5d0e9fc9481a4" exitCode=0 Dec 02 23:00:43 crc kubenswrapper[4696]: I1202 23:00:43.851475 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-ca12-account-create-update-l9pzc" event={"ID":"99ced705-305f-4c70-80e4-4854e66dabe0","Type":"ContainerDied","Data":"ea9374afdfb098bdf1d68509a2314eb08d0a57a55c6cb0fc98f5d0e9fc9481a4"} Dec 02 23:00:43 crc kubenswrapper[4696]: I1202 23:00:43.854359 4696 generic.go:334] "Generic (PLEG): container finished" podID="219c506b-9aa0-4926-9085-ff99b291382b" containerID="b93a710a188056df2d6ab5a27117dab4d4cc535bd5a6b7f6f4f539ebc679d19d" exitCode=0 Dec 02 23:00:43 crc kubenswrapper[4696]: I1202 23:00:43.854476 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-bcff-account-create-update-zdbzv" event={"ID":"219c506b-9aa0-4926-9085-ff99b291382b","Type":"ContainerDied","Data":"b93a710a188056df2d6ab5a27117dab4d4cc535bd5a6b7f6f4f539ebc679d19d"} Dec 02 23:00:43 crc kubenswrapper[4696]: I1202 23:00:43.856587 4696 generic.go:334] "Generic (PLEG): container finished" podID="a2016daf-7a4a-4f02-b75e-af8116362fe6" containerID="f55bc4e52aefdaa21fc0f2e2c2ef3eb4bab6add99c1f315980e31ef8f2579758" exitCode=0 Dec 02 23:00:43 crc kubenswrapper[4696]: I1202 23:00:43.856653 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-vfwzg" event={"ID":"a2016daf-7a4a-4f02-b75e-af8116362fe6","Type":"ContainerDied","Data":"f55bc4e52aefdaa21fc0f2e2c2ef3eb4bab6add99c1f315980e31ef8f2579758"} Dec 02 23:00:43 crc kubenswrapper[4696]: I1202 23:00:43.868338 4696 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-btsm6-config-nj9sf" event={"ID":"3643d5cc-74ac-41b0-8f0e-a15e6ebbd87b","Type":"ContainerStarted","Data":"b33ea391f8a4a8503f5c73d1a2ec56612505d40cb80234afb228b1bbdbf868e2"} Dec 02 23:00:43 crc kubenswrapper[4696]: I1202 23:00:43.872246 4696 generic.go:334] "Generic (PLEG): container finished" podID="8b25ddeb-9306-4ed4-8bd8-b83f9e500985" containerID="be738cddc4863865924a2c01bb3ac8808315f23c435e60c56a08f046aa456f6b" exitCode=0 Dec 02 23:00:43 crc kubenswrapper[4696]: I1202 23:00:43.872561 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-37ae-account-create-update-28whg" event={"ID":"8b25ddeb-9306-4ed4-8bd8-b83f9e500985","Type":"ContainerDied","Data":"be738cddc4863865924a2c01bb3ac8808315f23c435e60c56a08f046aa456f6b"} Dec 02 23:00:44 crc kubenswrapper[4696]: I1202 23:00:44.372520 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-v8bm6" Dec 02 23:00:44 crc kubenswrapper[4696]: I1202 23:00:44.377469 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-db-create-24sjs" Dec 02 23:00:44 crc kubenswrapper[4696]: I1202 23:00:44.483392 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8jrgh\" (UniqueName: \"kubernetes.io/projected/613fe2a1-c5b2-460d-8715-040e5c6f4a4a-kube-api-access-8jrgh\") pod \"613fe2a1-c5b2-460d-8715-040e5c6f4a4a\" (UID: \"613fe2a1-c5b2-460d-8715-040e5c6f4a4a\") " Dec 02 23:00:44 crc kubenswrapper[4696]: I1202 23:00:44.483471 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lfpkw\" (UniqueName: \"kubernetes.io/projected/01fdd915-7b63-4733-85bb-06547a93c18a-kube-api-access-lfpkw\") pod \"01fdd915-7b63-4733-85bb-06547a93c18a\" (UID: \"01fdd915-7b63-4733-85bb-06547a93c18a\") " Dec 02 23:00:44 crc kubenswrapper[4696]: I1202 23:00:44.483590 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/613fe2a1-c5b2-460d-8715-040e5c6f4a4a-operator-scripts\") pod \"613fe2a1-c5b2-460d-8715-040e5c6f4a4a\" (UID: \"613fe2a1-c5b2-460d-8715-040e5c6f4a4a\") " Dec 02 23:00:44 crc kubenswrapper[4696]: I1202 23:00:44.483778 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/01fdd915-7b63-4733-85bb-06547a93c18a-operator-scripts\") pod \"01fdd915-7b63-4733-85bb-06547a93c18a\" (UID: \"01fdd915-7b63-4733-85bb-06547a93c18a\") " Dec 02 23:00:44 crc kubenswrapper[4696]: I1202 23:00:44.486364 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/613fe2a1-c5b2-460d-8715-040e5c6f4a4a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "613fe2a1-c5b2-460d-8715-040e5c6f4a4a" (UID: "613fe2a1-c5b2-460d-8715-040e5c6f4a4a"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:00:44 crc kubenswrapper[4696]: I1202 23:00:44.488257 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01fdd915-7b63-4733-85bb-06547a93c18a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "01fdd915-7b63-4733-85bb-06547a93c18a" (UID: "01fdd915-7b63-4733-85bb-06547a93c18a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:00:44 crc kubenswrapper[4696]: I1202 23:00:44.500293 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01fdd915-7b63-4733-85bb-06547a93c18a-kube-api-access-lfpkw" (OuterVolumeSpecName: "kube-api-access-lfpkw") pod "01fdd915-7b63-4733-85bb-06547a93c18a" (UID: "01fdd915-7b63-4733-85bb-06547a93c18a"). InnerVolumeSpecName "kube-api-access-lfpkw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:00:44 crc kubenswrapper[4696]: I1202 23:00:44.501488 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/613fe2a1-c5b2-460d-8715-040e5c6f4a4a-kube-api-access-8jrgh" (OuterVolumeSpecName: "kube-api-access-8jrgh") pod "613fe2a1-c5b2-460d-8715-040e5c6f4a4a" (UID: "613fe2a1-c5b2-460d-8715-040e5c6f4a4a"). InnerVolumeSpecName "kube-api-access-8jrgh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:00:44 crc kubenswrapper[4696]: I1202 23:00:44.586311 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8jrgh\" (UniqueName: \"kubernetes.io/projected/613fe2a1-c5b2-460d-8715-040e5c6f4a4a-kube-api-access-8jrgh\") on node \"crc\" DevicePath \"\"" Dec 02 23:00:44 crc kubenswrapper[4696]: I1202 23:00:44.586359 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lfpkw\" (UniqueName: \"kubernetes.io/projected/01fdd915-7b63-4733-85bb-06547a93c18a-kube-api-access-lfpkw\") on node \"crc\" DevicePath \"\"" Dec 02 23:00:44 crc kubenswrapper[4696]: I1202 23:00:44.586371 4696 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/613fe2a1-c5b2-460d-8715-040e5c6f4a4a-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 23:00:44 crc kubenswrapper[4696]: I1202 23:00:44.586381 4696 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/01fdd915-7b63-4733-85bb-06547a93c18a-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 23:00:44 crc kubenswrapper[4696]: I1202 23:00:44.884132 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-v8bm6" Dec 02 23:00:44 crc kubenswrapper[4696]: I1202 23:00:44.884124 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-v8bm6" event={"ID":"01fdd915-7b63-4733-85bb-06547a93c18a","Type":"ContainerDied","Data":"4d8b1f201e09e46997841036cbfa8e176fc17366b23da6c154e05a416e08ff30"} Dec 02 23:00:44 crc kubenswrapper[4696]: I1202 23:00:44.884608 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4d8b1f201e09e46997841036cbfa8e176fc17366b23da6c154e05a416e08ff30" Dec 02 23:00:44 crc kubenswrapper[4696]: I1202 23:00:44.887447 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"f5b9aee9-4e9a-4d60-be32-f25d230622bc","Type":"ContainerStarted","Data":"32033a2cdc8344e3cd2044fa00f0ba27b75e3fdbdb681cf0180638376133846b"} Dec 02 23:00:44 crc kubenswrapper[4696]: I1202 23:00:44.890141 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-24sjs" event={"ID":"613fe2a1-c5b2-460d-8715-040e5c6f4a4a","Type":"ContainerDied","Data":"da8336ee8a82d46577da94566669780794d288344f5613ab13adcbdec4a9bbcc"} Dec 02 23:00:44 crc kubenswrapper[4696]: I1202 23:00:44.890230 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="da8336ee8a82d46577da94566669780794d288344f5613ab13adcbdec4a9bbcc" Dec 02 23:00:44 crc kubenswrapper[4696]: I1202 23:00:44.890378 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-db-create-24sjs" Dec 02 23:00:44 crc kubenswrapper[4696]: I1202 23:00:44.892405 4696 generic.go:334] "Generic (PLEG): container finished" podID="3643d5cc-74ac-41b0-8f0e-a15e6ebbd87b" containerID="c21c71b6337a1ce2a71c7b36736be88b11a1b7649d078b9bc5a9a24cd00c8afa" exitCode=0 Dec 02 23:00:44 crc kubenswrapper[4696]: I1202 23:00:44.892963 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-btsm6-config-nj9sf" event={"ID":"3643d5cc-74ac-41b0-8f0e-a15e6ebbd87b","Type":"ContainerDied","Data":"c21c71b6337a1ce2a71c7b36736be88b11a1b7649d078b9bc5a9a24cd00c8afa"} Dec 02 23:00:45 crc kubenswrapper[4696]: I1202 23:00:45.339612 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-37ae-account-create-update-28whg" Dec 02 23:00:45 crc kubenswrapper[4696]: I1202 23:00:45.519233 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b25ddeb-9306-4ed4-8bd8-b83f9e500985-operator-scripts\") pod \"8b25ddeb-9306-4ed4-8bd8-b83f9e500985\" (UID: \"8b25ddeb-9306-4ed4-8bd8-b83f9e500985\") " Dec 02 23:00:45 crc kubenswrapper[4696]: I1202 23:00:45.519288 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tmdg7\" (UniqueName: \"kubernetes.io/projected/8b25ddeb-9306-4ed4-8bd8-b83f9e500985-kube-api-access-tmdg7\") pod \"8b25ddeb-9306-4ed4-8bd8-b83f9e500985\" (UID: \"8b25ddeb-9306-4ed4-8bd8-b83f9e500985\") " Dec 02 23:00:45 crc kubenswrapper[4696]: I1202 23:00:45.520385 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b25ddeb-9306-4ed4-8bd8-b83f9e500985-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8b25ddeb-9306-4ed4-8bd8-b83f9e500985" (UID: "8b25ddeb-9306-4ed4-8bd8-b83f9e500985"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:00:45 crc kubenswrapper[4696]: I1202 23:00:45.526048 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b25ddeb-9306-4ed4-8bd8-b83f9e500985-kube-api-access-tmdg7" (OuterVolumeSpecName: "kube-api-access-tmdg7") pod "8b25ddeb-9306-4ed4-8bd8-b83f9e500985" (UID: "8b25ddeb-9306-4ed4-8bd8-b83f9e500985"). InnerVolumeSpecName "kube-api-access-tmdg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:00:45 crc kubenswrapper[4696]: I1202 23:00:45.534386 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-c4e5-account-create-update-j2qmq" Dec 02 23:00:45 crc kubenswrapper[4696]: I1202 23:00:45.542679 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-bcff-account-create-update-zdbzv" Dec 02 23:00:45 crc kubenswrapper[4696]: I1202 23:00:45.549291 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-shqwd" Dec 02 23:00:45 crc kubenswrapper[4696]: I1202 23:00:45.563030 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-ca12-account-create-update-l9pzc" Dec 02 23:00:45 crc kubenswrapper[4696]: I1202 23:00:45.567454 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-vfwzg" Dec 02 23:00:45 crc kubenswrapper[4696]: I1202 23:00:45.621364 4696 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b25ddeb-9306-4ed4-8bd8-b83f9e500985-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 23:00:45 crc kubenswrapper[4696]: I1202 23:00:45.621406 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tmdg7\" (UniqueName: \"kubernetes.io/projected/8b25ddeb-9306-4ed4-8bd8-b83f9e500985-kube-api-access-tmdg7\") on node \"crc\" DevicePath \"\"" Dec 02 23:00:45 crc kubenswrapper[4696]: I1202 23:00:45.722611 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e92140c3-85a2-4b5e-9f2a-604c46b8763f-operator-scripts\") pod \"e92140c3-85a2-4b5e-9f2a-604c46b8763f\" (UID: \"e92140c3-85a2-4b5e-9f2a-604c46b8763f\") " Dec 02 23:00:45 crc kubenswrapper[4696]: I1202 23:00:45.722856 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z9jxv\" (UniqueName: \"kubernetes.io/projected/e92140c3-85a2-4b5e-9f2a-604c46b8763f-kube-api-access-z9jxv\") pod \"e92140c3-85a2-4b5e-9f2a-604c46b8763f\" (UID: \"e92140c3-85a2-4b5e-9f2a-604c46b8763f\") " Dec 02 23:00:45 crc kubenswrapper[4696]: I1202 23:00:45.722889 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/769b3c83-63f8-4a20-b62a-6404415cb7de-operator-scripts\") pod \"769b3c83-63f8-4a20-b62a-6404415cb7de\" (UID: \"769b3c83-63f8-4a20-b62a-6404415cb7de\") " Dec 02 23:00:45 crc kubenswrapper[4696]: I1202 23:00:45.722938 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ljhg6\" (UniqueName: \"kubernetes.io/projected/219c506b-9aa0-4926-9085-ff99b291382b-kube-api-access-ljhg6\") pod 
\"219c506b-9aa0-4926-9085-ff99b291382b\" (UID: \"219c506b-9aa0-4926-9085-ff99b291382b\") " Dec 02 23:00:45 crc kubenswrapper[4696]: I1202 23:00:45.722957 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/219c506b-9aa0-4926-9085-ff99b291382b-operator-scripts\") pod \"219c506b-9aa0-4926-9085-ff99b291382b\" (UID: \"219c506b-9aa0-4926-9085-ff99b291382b\") " Dec 02 23:00:45 crc kubenswrapper[4696]: I1202 23:00:45.723005 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2c2qs\" (UniqueName: \"kubernetes.io/projected/99ced705-305f-4c70-80e4-4854e66dabe0-kube-api-access-2c2qs\") pod \"99ced705-305f-4c70-80e4-4854e66dabe0\" (UID: \"99ced705-305f-4c70-80e4-4854e66dabe0\") " Dec 02 23:00:45 crc kubenswrapper[4696]: I1202 23:00:45.723025 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/99ced705-305f-4c70-80e4-4854e66dabe0-operator-scripts\") pod \"99ced705-305f-4c70-80e4-4854e66dabe0\" (UID: \"99ced705-305f-4c70-80e4-4854e66dabe0\") " Dec 02 23:00:45 crc kubenswrapper[4696]: I1202 23:00:45.723115 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a2016daf-7a4a-4f02-b75e-af8116362fe6-operator-scripts\") pod \"a2016daf-7a4a-4f02-b75e-af8116362fe6\" (UID: \"a2016daf-7a4a-4f02-b75e-af8116362fe6\") " Dec 02 23:00:45 crc kubenswrapper[4696]: I1202 23:00:45.723145 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-slxl9\" (UniqueName: \"kubernetes.io/projected/769b3c83-63f8-4a20-b62a-6404415cb7de-kube-api-access-slxl9\") pod \"769b3c83-63f8-4a20-b62a-6404415cb7de\" (UID: \"769b3c83-63f8-4a20-b62a-6404415cb7de\") " Dec 02 23:00:45 crc kubenswrapper[4696]: I1202 23:00:45.723179 4696 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-hj742\" (UniqueName: \"kubernetes.io/projected/a2016daf-7a4a-4f02-b75e-af8116362fe6-kube-api-access-hj742\") pod \"a2016daf-7a4a-4f02-b75e-af8116362fe6\" (UID: \"a2016daf-7a4a-4f02-b75e-af8116362fe6\") " Dec 02 23:00:45 crc kubenswrapper[4696]: I1202 23:00:45.723195 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e92140c3-85a2-4b5e-9f2a-604c46b8763f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e92140c3-85a2-4b5e-9f2a-604c46b8763f" (UID: "e92140c3-85a2-4b5e-9f2a-604c46b8763f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:00:45 crc kubenswrapper[4696]: I1202 23:00:45.723599 4696 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e92140c3-85a2-4b5e-9f2a-604c46b8763f-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 23:00:45 crc kubenswrapper[4696]: I1202 23:00:45.723605 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/769b3c83-63f8-4a20-b62a-6404415cb7de-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "769b3c83-63f8-4a20-b62a-6404415cb7de" (UID: "769b3c83-63f8-4a20-b62a-6404415cb7de"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:00:45 crc kubenswrapper[4696]: I1202 23:00:45.723659 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99ced705-305f-4c70-80e4-4854e66dabe0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "99ced705-305f-4c70-80e4-4854e66dabe0" (UID: "99ced705-305f-4c70-80e4-4854e66dabe0"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:00:45 crc kubenswrapper[4696]: I1202 23:00:45.723600 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/219c506b-9aa0-4926-9085-ff99b291382b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "219c506b-9aa0-4926-9085-ff99b291382b" (UID: "219c506b-9aa0-4926-9085-ff99b291382b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:00:45 crc kubenswrapper[4696]: I1202 23:00:45.724082 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2016daf-7a4a-4f02-b75e-af8116362fe6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a2016daf-7a4a-4f02-b75e-af8116362fe6" (UID: "a2016daf-7a4a-4f02-b75e-af8116362fe6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:00:45 crc kubenswrapper[4696]: I1202 23:00:45.727490 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/769b3c83-63f8-4a20-b62a-6404415cb7de-kube-api-access-slxl9" (OuterVolumeSpecName: "kube-api-access-slxl9") pod "769b3c83-63f8-4a20-b62a-6404415cb7de" (UID: "769b3c83-63f8-4a20-b62a-6404415cb7de"). InnerVolumeSpecName "kube-api-access-slxl9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:00:45 crc kubenswrapper[4696]: I1202 23:00:45.727548 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99ced705-305f-4c70-80e4-4854e66dabe0-kube-api-access-2c2qs" (OuterVolumeSpecName: "kube-api-access-2c2qs") pod "99ced705-305f-4c70-80e4-4854e66dabe0" (UID: "99ced705-305f-4c70-80e4-4854e66dabe0"). InnerVolumeSpecName "kube-api-access-2c2qs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:00:45 crc kubenswrapper[4696]: I1202 23:00:45.728058 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/219c506b-9aa0-4926-9085-ff99b291382b-kube-api-access-ljhg6" (OuterVolumeSpecName: "kube-api-access-ljhg6") pod "219c506b-9aa0-4926-9085-ff99b291382b" (UID: "219c506b-9aa0-4926-9085-ff99b291382b"). InnerVolumeSpecName "kube-api-access-ljhg6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:00:45 crc kubenswrapper[4696]: I1202 23:00:45.728353 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2016daf-7a4a-4f02-b75e-af8116362fe6-kube-api-access-hj742" (OuterVolumeSpecName: "kube-api-access-hj742") pod "a2016daf-7a4a-4f02-b75e-af8116362fe6" (UID: "a2016daf-7a4a-4f02-b75e-af8116362fe6"). InnerVolumeSpecName "kube-api-access-hj742". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:00:45 crc kubenswrapper[4696]: I1202 23:00:45.728513 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e92140c3-85a2-4b5e-9f2a-604c46b8763f-kube-api-access-z9jxv" (OuterVolumeSpecName: "kube-api-access-z9jxv") pod "e92140c3-85a2-4b5e-9f2a-604c46b8763f" (UID: "e92140c3-85a2-4b5e-9f2a-604c46b8763f"). InnerVolumeSpecName "kube-api-access-z9jxv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:00:45 crc kubenswrapper[4696]: I1202 23:00:45.825660 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z9jxv\" (UniqueName: \"kubernetes.io/projected/e92140c3-85a2-4b5e-9f2a-604c46b8763f-kube-api-access-z9jxv\") on node \"crc\" DevicePath \"\"" Dec 02 23:00:45 crc kubenswrapper[4696]: I1202 23:00:45.826020 4696 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/769b3c83-63f8-4a20-b62a-6404415cb7de-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 23:00:45 crc kubenswrapper[4696]: I1202 23:00:45.826091 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ljhg6\" (UniqueName: \"kubernetes.io/projected/219c506b-9aa0-4926-9085-ff99b291382b-kube-api-access-ljhg6\") on node \"crc\" DevicePath \"\"" Dec 02 23:00:45 crc kubenswrapper[4696]: I1202 23:00:45.826149 4696 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/219c506b-9aa0-4926-9085-ff99b291382b-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 23:00:45 crc kubenswrapper[4696]: I1202 23:00:45.826230 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2c2qs\" (UniqueName: \"kubernetes.io/projected/99ced705-305f-4c70-80e4-4854e66dabe0-kube-api-access-2c2qs\") on node \"crc\" DevicePath \"\"" Dec 02 23:00:45 crc kubenswrapper[4696]: I1202 23:00:45.826294 4696 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/99ced705-305f-4c70-80e4-4854e66dabe0-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 23:00:45 crc kubenswrapper[4696]: I1202 23:00:45.826350 4696 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a2016daf-7a4a-4f02-b75e-af8116362fe6-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 
23:00:45 crc kubenswrapper[4696]: I1202 23:00:45.826411 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-slxl9\" (UniqueName: \"kubernetes.io/projected/769b3c83-63f8-4a20-b62a-6404415cb7de-kube-api-access-slxl9\") on node \"crc\" DevicePath \"\"" Dec 02 23:00:45 crc kubenswrapper[4696]: I1202 23:00:45.826469 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hj742\" (UniqueName: \"kubernetes.io/projected/a2016daf-7a4a-4f02-b75e-af8116362fe6-kube-api-access-hj742\") on node \"crc\" DevicePath \"\"" Dec 02 23:00:45 crc kubenswrapper[4696]: I1202 23:00:45.903193 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-ca12-account-create-update-l9pzc" event={"ID":"99ced705-305f-4c70-80e4-4854e66dabe0","Type":"ContainerDied","Data":"15df8686307634bad70c284c9d40e84bd08d5a6deabb6621674341f1ddc68806"} Dec 02 23:00:45 crc kubenswrapper[4696]: I1202 23:00:45.903599 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="15df8686307634bad70c284c9d40e84bd08d5a6deabb6621674341f1ddc68806" Dec 02 23:00:45 crc kubenswrapper[4696]: I1202 23:00:45.903766 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-ca12-account-create-update-l9pzc" Dec 02 23:00:45 crc kubenswrapper[4696]: I1202 23:00:45.914278 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-bcff-account-create-update-zdbzv" event={"ID":"219c506b-9aa0-4926-9085-ff99b291382b","Type":"ContainerDied","Data":"746111bf989ebe045bbf154a9bf8e813b935945ec039ca9866285e3fb8a21680"} Dec 02 23:00:45 crc kubenswrapper[4696]: I1202 23:00:45.914560 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="746111bf989ebe045bbf154a9bf8e813b935945ec039ca9866285e3fb8a21680" Dec 02 23:00:45 crc kubenswrapper[4696]: I1202 23:00:45.914360 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-bcff-account-create-update-zdbzv" Dec 02 23:00:45 crc kubenswrapper[4696]: I1202 23:00:45.916135 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-vfwzg" event={"ID":"a2016daf-7a4a-4f02-b75e-af8116362fe6","Type":"ContainerDied","Data":"b3e719f0787da53f7e6096f54a57f8443e9b4461bfac13bbd716f76ae83ac85b"} Dec 02 23:00:45 crc kubenswrapper[4696]: I1202 23:00:45.916159 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b3e719f0787da53f7e6096f54a57f8443e9b4461bfac13bbd716f76ae83ac85b" Dec 02 23:00:45 crc kubenswrapper[4696]: I1202 23:00:45.916201 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-vfwzg" Dec 02 23:00:45 crc kubenswrapper[4696]: I1202 23:00:45.920124 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-37ae-account-create-update-28whg" event={"ID":"8b25ddeb-9306-4ed4-8bd8-b83f9e500985","Type":"ContainerDied","Data":"ae838d750565b0667baa2ddbda31cab7dae3c5a115b944bbc9cf651e03899c29"} Dec 02 23:00:45 crc kubenswrapper[4696]: I1202 23:00:45.920155 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ae838d750565b0667baa2ddbda31cab7dae3c5a115b944bbc9cf651e03899c29" Dec 02 23:00:45 crc kubenswrapper[4696]: I1202 23:00:45.920186 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-37ae-account-create-update-28whg" Dec 02 23:00:45 crc kubenswrapper[4696]: I1202 23:00:45.922006 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-c4e5-account-create-update-j2qmq" Dec 02 23:00:45 crc kubenswrapper[4696]: I1202 23:00:45.922020 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-c4e5-account-create-update-j2qmq" event={"ID":"769b3c83-63f8-4a20-b62a-6404415cb7de","Type":"ContainerDied","Data":"4d1ee4ff3bd757753157e554a6edc28b6d61b517696ec46d30494b595e0552dc"} Dec 02 23:00:45 crc kubenswrapper[4696]: I1202 23:00:45.922235 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4d1ee4ff3bd757753157e554a6edc28b6d61b517696ec46d30494b595e0552dc" Dec 02 23:00:45 crc kubenswrapper[4696]: I1202 23:00:45.924124 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-shqwd" event={"ID":"e92140c3-85a2-4b5e-9f2a-604c46b8763f","Type":"ContainerDied","Data":"a2824de7cb3e46bd8982e17dfda2dc72a757d935c9421b1951860709b874606e"} Dec 02 23:00:45 crc kubenswrapper[4696]: I1202 23:00:45.924199 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-shqwd" Dec 02 23:00:45 crc kubenswrapper[4696]: I1202 23:00:45.924219 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a2824de7cb3e46bd8982e17dfda2dc72a757d935c9421b1951860709b874606e" Dec 02 23:00:46 crc kubenswrapper[4696]: I1202 23:00:46.212561 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-btsm6-config-nj9sf" Dec 02 23:00:46 crc kubenswrapper[4696]: I1202 23:00:46.335275 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3643d5cc-74ac-41b0-8f0e-a15e6ebbd87b-var-run-ovn\") pod \"3643d5cc-74ac-41b0-8f0e-a15e6ebbd87b\" (UID: \"3643d5cc-74ac-41b0-8f0e-a15e6ebbd87b\") " Dec 02 23:00:46 crc kubenswrapper[4696]: I1202 23:00:46.335341 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3643d5cc-74ac-41b0-8f0e-a15e6ebbd87b-var-run\") pod \"3643d5cc-74ac-41b0-8f0e-a15e6ebbd87b\" (UID: \"3643d5cc-74ac-41b0-8f0e-a15e6ebbd87b\") " Dec 02 23:00:46 crc kubenswrapper[4696]: I1202 23:00:46.335405 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3643d5cc-74ac-41b0-8f0e-a15e6ebbd87b-scripts\") pod \"3643d5cc-74ac-41b0-8f0e-a15e6ebbd87b\" (UID: \"3643d5cc-74ac-41b0-8f0e-a15e6ebbd87b\") " Dec 02 23:00:46 crc kubenswrapper[4696]: I1202 23:00:46.335436 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jlmtn\" (UniqueName: \"kubernetes.io/projected/3643d5cc-74ac-41b0-8f0e-a15e6ebbd87b-kube-api-access-jlmtn\") pod \"3643d5cc-74ac-41b0-8f0e-a15e6ebbd87b\" (UID: \"3643d5cc-74ac-41b0-8f0e-a15e6ebbd87b\") " Dec 02 23:00:46 crc kubenswrapper[4696]: I1202 23:00:46.335459 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/3643d5cc-74ac-41b0-8f0e-a15e6ebbd87b-additional-scripts\") pod \"3643d5cc-74ac-41b0-8f0e-a15e6ebbd87b\" (UID: \"3643d5cc-74ac-41b0-8f0e-a15e6ebbd87b\") " Dec 02 23:00:46 crc kubenswrapper[4696]: I1202 23:00:46.335454 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/3643d5cc-74ac-41b0-8f0e-a15e6ebbd87b-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "3643d5cc-74ac-41b0-8f0e-a15e6ebbd87b" (UID: "3643d5cc-74ac-41b0-8f0e-a15e6ebbd87b"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 23:00:46 crc kubenswrapper[4696]: I1202 23:00:46.335565 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3643d5cc-74ac-41b0-8f0e-a15e6ebbd87b-var-log-ovn\") pod \"3643d5cc-74ac-41b0-8f0e-a15e6ebbd87b\" (UID: \"3643d5cc-74ac-41b0-8f0e-a15e6ebbd87b\") " Dec 02 23:00:46 crc kubenswrapper[4696]: I1202 23:00:46.335543 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3643d5cc-74ac-41b0-8f0e-a15e6ebbd87b-var-run" (OuterVolumeSpecName: "var-run") pod "3643d5cc-74ac-41b0-8f0e-a15e6ebbd87b" (UID: "3643d5cc-74ac-41b0-8f0e-a15e6ebbd87b"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 23:00:46 crc kubenswrapper[4696]: I1202 23:00:46.335622 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3643d5cc-74ac-41b0-8f0e-a15e6ebbd87b-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "3643d5cc-74ac-41b0-8f0e-a15e6ebbd87b" (UID: "3643d5cc-74ac-41b0-8f0e-a15e6ebbd87b"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 23:00:46 crc kubenswrapper[4696]: I1202 23:00:46.335969 4696 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3643d5cc-74ac-41b0-8f0e-a15e6ebbd87b-var-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 02 23:00:46 crc kubenswrapper[4696]: I1202 23:00:46.335987 4696 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3643d5cc-74ac-41b0-8f0e-a15e6ebbd87b-var-run\") on node \"crc\" DevicePath \"\"" Dec 02 23:00:46 crc kubenswrapper[4696]: I1202 23:00:46.335995 4696 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3643d5cc-74ac-41b0-8f0e-a15e6ebbd87b-var-log-ovn\") on node \"crc\" DevicePath \"\"" Dec 02 23:00:46 crc kubenswrapper[4696]: I1202 23:00:46.336488 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3643d5cc-74ac-41b0-8f0e-a15e6ebbd87b-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "3643d5cc-74ac-41b0-8f0e-a15e6ebbd87b" (UID: "3643d5cc-74ac-41b0-8f0e-a15e6ebbd87b"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:00:46 crc kubenswrapper[4696]: I1202 23:00:46.336645 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3643d5cc-74ac-41b0-8f0e-a15e6ebbd87b-scripts" (OuterVolumeSpecName: "scripts") pod "3643d5cc-74ac-41b0-8f0e-a15e6ebbd87b" (UID: "3643d5cc-74ac-41b0-8f0e-a15e6ebbd87b"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:00:46 crc kubenswrapper[4696]: I1202 23:00:46.354640 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3643d5cc-74ac-41b0-8f0e-a15e6ebbd87b-kube-api-access-jlmtn" (OuterVolumeSpecName: "kube-api-access-jlmtn") pod "3643d5cc-74ac-41b0-8f0e-a15e6ebbd87b" (UID: "3643d5cc-74ac-41b0-8f0e-a15e6ebbd87b"). InnerVolumeSpecName "kube-api-access-jlmtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:00:46 crc kubenswrapper[4696]: I1202 23:00:46.437794 4696 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3643d5cc-74ac-41b0-8f0e-a15e6ebbd87b-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 23:00:46 crc kubenswrapper[4696]: I1202 23:00:46.437835 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jlmtn\" (UniqueName: \"kubernetes.io/projected/3643d5cc-74ac-41b0-8f0e-a15e6ebbd87b-kube-api-access-jlmtn\") on node \"crc\" DevicePath \"\"" Dec 02 23:00:46 crc kubenswrapper[4696]: I1202 23:00:46.437850 4696 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/3643d5cc-74ac-41b0-8f0e-a15e6ebbd87b-additional-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 23:00:46 crc kubenswrapper[4696]: I1202 23:00:46.950013 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-btsm6-config-nj9sf" event={"ID":"3643d5cc-74ac-41b0-8f0e-a15e6ebbd87b","Type":"ContainerDied","Data":"b33ea391f8a4a8503f5c73d1a2ec56612505d40cb80234afb228b1bbdbf868e2"} Dec 02 23:00:46 crc kubenswrapper[4696]: I1202 23:00:46.950078 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b33ea391f8a4a8503f5c73d1a2ec56612505d40cb80234afb228b1bbdbf868e2" Dec 02 23:00:46 crc kubenswrapper[4696]: I1202 23:00:46.950189 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-btsm6-config-nj9sf" Dec 02 23:00:47 crc kubenswrapper[4696]: I1202 23:00:47.207359 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-btsm6" Dec 02 23:00:47 crc kubenswrapper[4696]: I1202 23:00:47.340258 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-btsm6-config-nj9sf"] Dec 02 23:00:47 crc kubenswrapper[4696]: I1202 23:00:47.351103 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-btsm6-config-nj9sf"] Dec 02 23:00:47 crc kubenswrapper[4696]: I1202 23:00:47.449175 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3643d5cc-74ac-41b0-8f0e-a15e6ebbd87b" path="/var/lib/kubelet/pods/3643d5cc-74ac-41b0-8f0e-a15e6ebbd87b/volumes" Dec 02 23:00:47 crc kubenswrapper[4696]: I1202 23:00:47.516158 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-btsm6-config-g4scg"] Dec 02 23:00:47 crc kubenswrapper[4696]: E1202 23:00:47.541920 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3643d5cc-74ac-41b0-8f0e-a15e6ebbd87b" containerName="ovn-config" Dec 02 23:00:47 crc kubenswrapper[4696]: I1202 23:00:47.541965 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="3643d5cc-74ac-41b0-8f0e-a15e6ebbd87b" containerName="ovn-config" Dec 02 23:00:47 crc kubenswrapper[4696]: E1202 23:00:47.541987 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2016daf-7a4a-4f02-b75e-af8116362fe6" containerName="mariadb-database-create" Dec 02 23:00:47 crc kubenswrapper[4696]: I1202 23:00:47.541997 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2016daf-7a4a-4f02-b75e-af8116362fe6" containerName="mariadb-database-create" Dec 02 23:00:47 crc kubenswrapper[4696]: E1202 23:00:47.542025 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b25ddeb-9306-4ed4-8bd8-b83f9e500985" 
containerName="mariadb-account-create-update" Dec 02 23:00:47 crc kubenswrapper[4696]: I1202 23:00:47.542033 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b25ddeb-9306-4ed4-8bd8-b83f9e500985" containerName="mariadb-account-create-update" Dec 02 23:00:47 crc kubenswrapper[4696]: E1202 23:00:47.542058 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="769b3c83-63f8-4a20-b62a-6404415cb7de" containerName="mariadb-account-create-update" Dec 02 23:00:47 crc kubenswrapper[4696]: I1202 23:00:47.542068 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="769b3c83-63f8-4a20-b62a-6404415cb7de" containerName="mariadb-account-create-update" Dec 02 23:00:47 crc kubenswrapper[4696]: E1202 23:00:47.542102 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99ced705-305f-4c70-80e4-4854e66dabe0" containerName="mariadb-account-create-update" Dec 02 23:00:47 crc kubenswrapper[4696]: I1202 23:00:47.542111 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="99ced705-305f-4c70-80e4-4854e66dabe0" containerName="mariadb-account-create-update" Dec 02 23:00:47 crc kubenswrapper[4696]: E1202 23:00:47.542125 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01fdd915-7b63-4733-85bb-06547a93c18a" containerName="mariadb-database-create" Dec 02 23:00:47 crc kubenswrapper[4696]: I1202 23:00:47.542147 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="01fdd915-7b63-4733-85bb-06547a93c18a" containerName="mariadb-database-create" Dec 02 23:00:47 crc kubenswrapper[4696]: E1202 23:00:47.542158 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e92140c3-85a2-4b5e-9f2a-604c46b8763f" containerName="mariadb-database-create" Dec 02 23:00:47 crc kubenswrapper[4696]: I1202 23:00:47.542166 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="e92140c3-85a2-4b5e-9f2a-604c46b8763f" containerName="mariadb-database-create" Dec 02 23:00:47 crc kubenswrapper[4696]: E1202 23:00:47.542180 4696 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="613fe2a1-c5b2-460d-8715-040e5c6f4a4a" containerName="mariadb-database-create" Dec 02 23:00:47 crc kubenswrapper[4696]: I1202 23:00:47.542187 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="613fe2a1-c5b2-460d-8715-040e5c6f4a4a" containerName="mariadb-database-create" Dec 02 23:00:47 crc kubenswrapper[4696]: E1202 23:00:47.542200 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="219c506b-9aa0-4926-9085-ff99b291382b" containerName="mariadb-account-create-update" Dec 02 23:00:47 crc kubenswrapper[4696]: I1202 23:00:47.542209 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="219c506b-9aa0-4926-9085-ff99b291382b" containerName="mariadb-account-create-update" Dec 02 23:00:47 crc kubenswrapper[4696]: I1202 23:00:47.542562 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2016daf-7a4a-4f02-b75e-af8116362fe6" containerName="mariadb-database-create" Dec 02 23:00:47 crc kubenswrapper[4696]: I1202 23:00:47.542593 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b25ddeb-9306-4ed4-8bd8-b83f9e500985" containerName="mariadb-account-create-update" Dec 02 23:00:47 crc kubenswrapper[4696]: I1202 23:00:47.542607 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="769b3c83-63f8-4a20-b62a-6404415cb7de" containerName="mariadb-account-create-update" Dec 02 23:00:47 crc kubenswrapper[4696]: I1202 23:00:47.542624 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="613fe2a1-c5b2-460d-8715-040e5c6f4a4a" containerName="mariadb-database-create" Dec 02 23:00:47 crc kubenswrapper[4696]: I1202 23:00:47.542642 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="99ced705-305f-4c70-80e4-4854e66dabe0" containerName="mariadb-account-create-update" Dec 02 23:00:47 crc kubenswrapper[4696]: I1202 23:00:47.542659 4696 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="3643d5cc-74ac-41b0-8f0e-a15e6ebbd87b" containerName="ovn-config" Dec 02 23:00:47 crc kubenswrapper[4696]: I1202 23:00:47.542669 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="e92140c3-85a2-4b5e-9f2a-604c46b8763f" containerName="mariadb-database-create" Dec 02 23:00:47 crc kubenswrapper[4696]: I1202 23:00:47.542678 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="01fdd915-7b63-4733-85bb-06547a93c18a" containerName="mariadb-database-create" Dec 02 23:00:47 crc kubenswrapper[4696]: I1202 23:00:47.542696 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="219c506b-9aa0-4926-9085-ff99b291382b" containerName="mariadb-account-create-update" Dec 02 23:00:47 crc kubenswrapper[4696]: I1202 23:00:47.543450 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-btsm6-config-g4scg"] Dec 02 23:00:47 crc kubenswrapper[4696]: I1202 23:00:47.543563 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-btsm6-config-g4scg" Dec 02 23:00:47 crc kubenswrapper[4696]: I1202 23:00:47.550057 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Dec 02 23:00:47 crc kubenswrapper[4696]: I1202 23:00:47.666623 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6d1852a5-f61c-4091-a932-8a9f6da96318-var-run-ovn\") pod \"ovn-controller-btsm6-config-g4scg\" (UID: \"6d1852a5-f61c-4091-a932-8a9f6da96318\") " pod="openstack/ovn-controller-btsm6-config-g4scg" Dec 02 23:00:47 crc kubenswrapper[4696]: I1202 23:00:47.666819 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6d1852a5-f61c-4091-a932-8a9f6da96318-var-run\") pod \"ovn-controller-btsm6-config-g4scg\" (UID: 
\"6d1852a5-f61c-4091-a932-8a9f6da96318\") " pod="openstack/ovn-controller-btsm6-config-g4scg" Dec 02 23:00:47 crc kubenswrapper[4696]: I1202 23:00:47.666885 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6d1852a5-f61c-4091-a932-8a9f6da96318-scripts\") pod \"ovn-controller-btsm6-config-g4scg\" (UID: \"6d1852a5-f61c-4091-a932-8a9f6da96318\") " pod="openstack/ovn-controller-btsm6-config-g4scg" Dec 02 23:00:47 crc kubenswrapper[4696]: I1202 23:00:47.666912 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/6d1852a5-f61c-4091-a932-8a9f6da96318-additional-scripts\") pod \"ovn-controller-btsm6-config-g4scg\" (UID: \"6d1852a5-f61c-4091-a932-8a9f6da96318\") " pod="openstack/ovn-controller-btsm6-config-g4scg" Dec 02 23:00:47 crc kubenswrapper[4696]: I1202 23:00:47.666971 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k72zm\" (UniqueName: \"kubernetes.io/projected/6d1852a5-f61c-4091-a932-8a9f6da96318-kube-api-access-k72zm\") pod \"ovn-controller-btsm6-config-g4scg\" (UID: \"6d1852a5-f61c-4091-a932-8a9f6da96318\") " pod="openstack/ovn-controller-btsm6-config-g4scg" Dec 02 23:00:47 crc kubenswrapper[4696]: I1202 23:00:47.667014 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6d1852a5-f61c-4091-a932-8a9f6da96318-var-log-ovn\") pod \"ovn-controller-btsm6-config-g4scg\" (UID: \"6d1852a5-f61c-4091-a932-8a9f6da96318\") " pod="openstack/ovn-controller-btsm6-config-g4scg" Dec 02 23:00:47 crc kubenswrapper[4696]: I1202 23:00:47.762844 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-lgs5l"] Dec 02 23:00:47 crc kubenswrapper[4696]: I1202 23:00:47.765271 4696 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-lgs5l" Dec 02 23:00:47 crc kubenswrapper[4696]: I1202 23:00:47.768468 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6d1852a5-f61c-4091-a932-8a9f6da96318-var-run\") pod \"ovn-controller-btsm6-config-g4scg\" (UID: \"6d1852a5-f61c-4091-a932-8a9f6da96318\") " pod="openstack/ovn-controller-btsm6-config-g4scg" Dec 02 23:00:47 crc kubenswrapper[4696]: I1202 23:00:47.768654 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6d1852a5-f61c-4091-a932-8a9f6da96318-scripts\") pod \"ovn-controller-btsm6-config-g4scg\" (UID: \"6d1852a5-f61c-4091-a932-8a9f6da96318\") " pod="openstack/ovn-controller-btsm6-config-g4scg" Dec 02 23:00:47 crc kubenswrapper[4696]: I1202 23:00:47.768670 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Dec 02 23:00:47 crc kubenswrapper[4696]: I1202 23:00:47.769075 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6d1852a5-f61c-4091-a932-8a9f6da96318-var-run\") pod \"ovn-controller-btsm6-config-g4scg\" (UID: \"6d1852a5-f61c-4091-a932-8a9f6da96318\") " pod="openstack/ovn-controller-btsm6-config-g4scg" Dec 02 23:00:47 crc kubenswrapper[4696]: I1202 23:00:47.768676 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/6d1852a5-f61c-4091-a932-8a9f6da96318-additional-scripts\") pod \"ovn-controller-btsm6-config-g4scg\" (UID: \"6d1852a5-f61c-4091-a932-8a9f6da96318\") " pod="openstack/ovn-controller-btsm6-config-g4scg" Dec 02 23:00:47 crc kubenswrapper[4696]: I1202 23:00:47.769501 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: 
\"kubernetes.io/configmap/6d1852a5-f61c-4091-a932-8a9f6da96318-additional-scripts\") pod \"ovn-controller-btsm6-config-g4scg\" (UID: \"6d1852a5-f61c-4091-a932-8a9f6da96318\") " pod="openstack/ovn-controller-btsm6-config-g4scg" Dec 02 23:00:47 crc kubenswrapper[4696]: I1202 23:00:47.769689 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k72zm\" (UniqueName: \"kubernetes.io/projected/6d1852a5-f61c-4091-a932-8a9f6da96318-kube-api-access-k72zm\") pod \"ovn-controller-btsm6-config-g4scg\" (UID: \"6d1852a5-f61c-4091-a932-8a9f6da96318\") " pod="openstack/ovn-controller-btsm6-config-g4scg" Dec 02 23:00:47 crc kubenswrapper[4696]: I1202 23:00:47.769763 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6d1852a5-f61c-4091-a932-8a9f6da96318-var-log-ovn\") pod \"ovn-controller-btsm6-config-g4scg\" (UID: \"6d1852a5-f61c-4091-a932-8a9f6da96318\") " pod="openstack/ovn-controller-btsm6-config-g4scg" Dec 02 23:00:47 crc kubenswrapper[4696]: I1202 23:00:47.769829 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6d1852a5-f61c-4091-a932-8a9f6da96318-var-run-ovn\") pod \"ovn-controller-btsm6-config-g4scg\" (UID: \"6d1852a5-f61c-4091-a932-8a9f6da96318\") " pod="openstack/ovn-controller-btsm6-config-g4scg" Dec 02 23:00:47 crc kubenswrapper[4696]: I1202 23:00:47.770085 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6d1852a5-f61c-4091-a932-8a9f6da96318-var-run-ovn\") pod \"ovn-controller-btsm6-config-g4scg\" (UID: \"6d1852a5-f61c-4091-a932-8a9f6da96318\") " pod="openstack/ovn-controller-btsm6-config-g4scg" Dec 02 23:00:47 crc kubenswrapper[4696]: I1202 23:00:47.770301 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: 
\"kubernetes.io/host-path/6d1852a5-f61c-4091-a932-8a9f6da96318-var-log-ovn\") pod \"ovn-controller-btsm6-config-g4scg\" (UID: \"6d1852a5-f61c-4091-a932-8a9f6da96318\") " pod="openstack/ovn-controller-btsm6-config-g4scg" Dec 02 23:00:47 crc kubenswrapper[4696]: I1202 23:00:47.770654 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-bvbjt" Dec 02 23:00:47 crc kubenswrapper[4696]: I1202 23:00:47.775928 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6d1852a5-f61c-4091-a932-8a9f6da96318-scripts\") pod \"ovn-controller-btsm6-config-g4scg\" (UID: \"6d1852a5-f61c-4091-a932-8a9f6da96318\") " pod="openstack/ovn-controller-btsm6-config-g4scg" Dec 02 23:00:47 crc kubenswrapper[4696]: I1202 23:00:47.789534 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-lgs5l"] Dec 02 23:00:47 crc kubenswrapper[4696]: I1202 23:00:47.857254 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k72zm\" (UniqueName: \"kubernetes.io/projected/6d1852a5-f61c-4091-a932-8a9f6da96318-kube-api-access-k72zm\") pod \"ovn-controller-btsm6-config-g4scg\" (UID: \"6d1852a5-f61c-4091-a932-8a9f6da96318\") " pod="openstack/ovn-controller-btsm6-config-g4scg" Dec 02 23:00:47 crc kubenswrapper[4696]: I1202 23:00:47.871187 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a8842e89-ec2d-4601-9e82-b12c1982a910-db-sync-config-data\") pod \"glance-db-sync-lgs5l\" (UID: \"a8842e89-ec2d-4601-9e82-b12c1982a910\") " pod="openstack/glance-db-sync-lgs5l" Dec 02 23:00:47 crc kubenswrapper[4696]: I1202 23:00:47.871280 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8kqh\" (UniqueName: 
\"kubernetes.io/projected/a8842e89-ec2d-4601-9e82-b12c1982a910-kube-api-access-x8kqh\") pod \"glance-db-sync-lgs5l\" (UID: \"a8842e89-ec2d-4601-9e82-b12c1982a910\") " pod="openstack/glance-db-sync-lgs5l" Dec 02 23:00:47 crc kubenswrapper[4696]: I1202 23:00:47.871484 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8842e89-ec2d-4601-9e82-b12c1982a910-combined-ca-bundle\") pod \"glance-db-sync-lgs5l\" (UID: \"a8842e89-ec2d-4601-9e82-b12c1982a910\") " pod="openstack/glance-db-sync-lgs5l" Dec 02 23:00:47 crc kubenswrapper[4696]: I1202 23:00:47.871563 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8842e89-ec2d-4601-9e82-b12c1982a910-config-data\") pod \"glance-db-sync-lgs5l\" (UID: \"a8842e89-ec2d-4601-9e82-b12c1982a910\") " pod="openstack/glance-db-sync-lgs5l" Dec 02 23:00:47 crc kubenswrapper[4696]: I1202 23:00:47.879239 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-btsm6-config-g4scg" Dec 02 23:00:47 crc kubenswrapper[4696]: I1202 23:00:47.973628 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8842e89-ec2d-4601-9e82-b12c1982a910-combined-ca-bundle\") pod \"glance-db-sync-lgs5l\" (UID: \"a8842e89-ec2d-4601-9e82-b12c1982a910\") " pod="openstack/glance-db-sync-lgs5l" Dec 02 23:00:47 crc kubenswrapper[4696]: I1202 23:00:47.978238 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8842e89-ec2d-4601-9e82-b12c1982a910-config-data\") pod \"glance-db-sync-lgs5l\" (UID: \"a8842e89-ec2d-4601-9e82-b12c1982a910\") " pod="openstack/glance-db-sync-lgs5l" Dec 02 23:00:47 crc kubenswrapper[4696]: I1202 23:00:47.978996 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a8842e89-ec2d-4601-9e82-b12c1982a910-db-sync-config-data\") pod \"glance-db-sync-lgs5l\" (UID: \"a8842e89-ec2d-4601-9e82-b12c1982a910\") " pod="openstack/glance-db-sync-lgs5l" Dec 02 23:00:47 crc kubenswrapper[4696]: I1202 23:00:47.979446 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8kqh\" (UniqueName: \"kubernetes.io/projected/a8842e89-ec2d-4601-9e82-b12c1982a910-kube-api-access-x8kqh\") pod \"glance-db-sync-lgs5l\" (UID: \"a8842e89-ec2d-4601-9e82-b12c1982a910\") " pod="openstack/glance-db-sync-lgs5l" Dec 02 23:00:47 crc kubenswrapper[4696]: I1202 23:00:47.978669 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8842e89-ec2d-4601-9e82-b12c1982a910-combined-ca-bundle\") pod \"glance-db-sync-lgs5l\" (UID: \"a8842e89-ec2d-4601-9e82-b12c1982a910\") " pod="openstack/glance-db-sync-lgs5l" Dec 02 23:00:47 crc kubenswrapper[4696]: 
I1202 23:00:47.983327 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8842e89-ec2d-4601-9e82-b12c1982a910-config-data\") pod \"glance-db-sync-lgs5l\" (UID: \"a8842e89-ec2d-4601-9e82-b12c1982a910\") " pod="openstack/glance-db-sync-lgs5l" Dec 02 23:00:47 crc kubenswrapper[4696]: I1202 23:00:47.983400 4696 generic.go:334] "Generic (PLEG): container finished" podID="aa9b93b0-4131-4a4b-a1a8-27ccf68716c0" containerID="b1b6c8104bd2c4548eaf0e047605a944629556d000273f410634d938caa54ca8" exitCode=0 Dec 02 23:00:47 crc kubenswrapper[4696]: I1202 23:00:47.983477 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"aa9b93b0-4131-4a4b-a1a8-27ccf68716c0","Type":"ContainerDied","Data":"b1b6c8104bd2c4548eaf0e047605a944629556d000273f410634d938caa54ca8"} Dec 02 23:00:47 crc kubenswrapper[4696]: I1202 23:00:47.983815 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a8842e89-ec2d-4601-9e82-b12c1982a910-db-sync-config-data\") pod \"glance-db-sync-lgs5l\" (UID: \"a8842e89-ec2d-4601-9e82-b12c1982a910\") " pod="openstack/glance-db-sync-lgs5l" Dec 02 23:00:47 crc kubenswrapper[4696]: I1202 23:00:47.999071 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8kqh\" (UniqueName: \"kubernetes.io/projected/a8842e89-ec2d-4601-9e82-b12c1982a910-kube-api-access-x8kqh\") pod \"glance-db-sync-lgs5l\" (UID: \"a8842e89-ec2d-4601-9e82-b12c1982a910\") " pod="openstack/glance-db-sync-lgs5l" Dec 02 23:00:48 crc kubenswrapper[4696]: I1202 23:00:48.089595 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-lgs5l" Dec 02 23:00:48 crc kubenswrapper[4696]: I1202 23:00:48.467103 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-btsm6-config-g4scg"] Dec 02 23:00:48 crc kubenswrapper[4696]: I1202 23:00:48.751877 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-lgs5l"] Dec 02 23:00:49 crc kubenswrapper[4696]: I1202 23:00:49.001925 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"aa9b93b0-4131-4a4b-a1a8-27ccf68716c0","Type":"ContainerStarted","Data":"3fe2d4b6bb592a2ba71ca2794fe2992b29c9d86d77463744428a5f27988b2ede"} Dec 02 23:00:49 crc kubenswrapper[4696]: I1202 23:00:49.002980 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 02 23:00:49 crc kubenswrapper[4696]: I1202 23:00:49.036847 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=-9223371959.817959 podStartE2EDuration="1m17.036817142s" podCreationTimestamp="2025-12-02 22:59:32 +0000 UTC" firstStartedPulling="2025-12-02 22:59:34.298904698 +0000 UTC m=+1037.179584699" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:00:49.034582189 +0000 UTC m=+1111.915262210" watchObservedRunningTime="2025-12-02 23:00:49.036817142 +0000 UTC m=+1111.917497143" Dec 02 23:00:49 crc kubenswrapper[4696]: W1202 23:00:49.542821 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6d1852a5_f61c_4091_a932_8a9f6da96318.slice/crio-1fd14a4d7905c2d41924c7a2c3db17c1384bbf7889693a45083790c4e41be795 WatchSource:0}: Error finding container 1fd14a4d7905c2d41924c7a2c3db17c1384bbf7889693a45083790c4e41be795: Status 404 returned error can't find the container with id 1fd14a4d7905c2d41924c7a2c3db17c1384bbf7889693a45083790c4e41be795 Dec 
02 23:00:49 crc kubenswrapper[4696]: W1202 23:00:49.545164 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda8842e89_ec2d_4601_9e82_b12c1982a910.slice/crio-6c23d3b47daf7ec795b54f6a5e5bbec03c41d52527da25281baf26696a1b07c0 WatchSource:0}: Error finding container 6c23d3b47daf7ec795b54f6a5e5bbec03c41d52527da25281baf26696a1b07c0: Status 404 returned error can't find the container with id 6c23d3b47daf7ec795b54f6a5e5bbec03c41d52527da25281baf26696a1b07c0 Dec 02 23:00:49 crc kubenswrapper[4696]: E1202 23:00:49.888806 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openstack/prometheus-metric-storage-0" podUID="f5b9aee9-4e9a-4d60-be32-f25d230622bc" Dec 02 23:00:50 crc kubenswrapper[4696]: I1202 23:00:50.012081 4696 generic.go:334] "Generic (PLEG): container finished" podID="955c99b3-ad42-4e65-a391-47eda1c4130a" containerID="9d72e7d1ff35077292b75af1a734d96ae83a3e6a635d80ee42c9f231e430fc4c" exitCode=0 Dec 02 23:00:50 crc kubenswrapper[4696]: I1202 23:00:50.012169 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-kq468" event={"ID":"955c99b3-ad42-4e65-a391-47eda1c4130a","Type":"ContainerDied","Data":"9d72e7d1ff35077292b75af1a734d96ae83a3e6a635d80ee42c9f231e430fc4c"} Dec 02 23:00:50 crc kubenswrapper[4696]: I1202 23:00:50.014481 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"f5b9aee9-4e9a-4d60-be32-f25d230622bc","Type":"ContainerStarted","Data":"2b2b519ef6641bf42ed0ee745a4bd559ee5d2b95ebc0d16e2c978cf423dfa65e"} Dec 02 23:00:50 crc kubenswrapper[4696]: E1202 23:00:50.016416 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus\" with 
ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/prometheus-rhel9@sha256:17ea20be390a94ab39f5cdd7f0cbc2498046eebcf77fe3dec9aa288d5c2cf46b\\\"\"" pod="openstack/prometheus-metric-storage-0" podUID="f5b9aee9-4e9a-4d60-be32-f25d230622bc" Dec 02 23:00:50 crc kubenswrapper[4696]: I1202 23:00:50.017786 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-lgs5l" event={"ID":"a8842e89-ec2d-4601-9e82-b12c1982a910","Type":"ContainerStarted","Data":"6c23d3b47daf7ec795b54f6a5e5bbec03c41d52527da25281baf26696a1b07c0"} Dec 02 23:00:50 crc kubenswrapper[4696]: I1202 23:00:50.019314 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-btsm6-config-g4scg" event={"ID":"6d1852a5-f61c-4091-a932-8a9f6da96318","Type":"ContainerStarted","Data":"65ca14fe80d54e327a3edabf25ddf17d55ca0f28a98fbc5f635f51e6c8cd1ed9"} Dec 02 23:00:50 crc kubenswrapper[4696]: I1202 23:00:50.019387 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-btsm6-config-g4scg" event={"ID":"6d1852a5-f61c-4091-a932-8a9f6da96318","Type":"ContainerStarted","Data":"1fd14a4d7905c2d41924c7a2c3db17c1384bbf7889693a45083790c4e41be795"} Dec 02 23:00:50 crc kubenswrapper[4696]: I1202 23:00:50.092249 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-btsm6-config-g4scg" podStartSLOduration=3.092226338 podStartE2EDuration="3.092226338s" podCreationTimestamp="2025-12-02 23:00:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:00:50.087918076 +0000 UTC m=+1112.968598087" watchObservedRunningTime="2025-12-02 23:00:50.092226338 +0000 UTC m=+1112.972906339" Dec 02 23:00:51 crc kubenswrapper[4696]: I1202 23:00:51.030246 4696 generic.go:334] "Generic (PLEG): container finished" podID="6d1852a5-f61c-4091-a932-8a9f6da96318" 
containerID="65ca14fe80d54e327a3edabf25ddf17d55ca0f28a98fbc5f635f51e6c8cd1ed9" exitCode=0 Dec 02 23:00:51 crc kubenswrapper[4696]: I1202 23:00:51.030349 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-btsm6-config-g4scg" event={"ID":"6d1852a5-f61c-4091-a932-8a9f6da96318","Type":"ContainerDied","Data":"65ca14fe80d54e327a3edabf25ddf17d55ca0f28a98fbc5f635f51e6c8cd1ed9"} Dec 02 23:00:51 crc kubenswrapper[4696]: I1202 23:00:51.377639 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-kq468" Dec 02 23:00:51 crc kubenswrapper[4696]: I1202 23:00:51.452533 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/955c99b3-ad42-4e65-a391-47eda1c4130a-dispersionconf\") pod \"955c99b3-ad42-4e65-a391-47eda1c4130a\" (UID: \"955c99b3-ad42-4e65-a391-47eda1c4130a\") " Dec 02 23:00:51 crc kubenswrapper[4696]: I1202 23:00:51.452646 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/955c99b3-ad42-4e65-a391-47eda1c4130a-scripts\") pod \"955c99b3-ad42-4e65-a391-47eda1c4130a\" (UID: \"955c99b3-ad42-4e65-a391-47eda1c4130a\") " Dec 02 23:00:51 crc kubenswrapper[4696]: I1202 23:00:51.452688 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/955c99b3-ad42-4e65-a391-47eda1c4130a-etc-swift\") pod \"955c99b3-ad42-4e65-a391-47eda1c4130a\" (UID: \"955c99b3-ad42-4e65-a391-47eda1c4130a\") " Dec 02 23:00:51 crc kubenswrapper[4696]: I1202 23:00:51.452724 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/955c99b3-ad42-4e65-a391-47eda1c4130a-swiftconf\") pod \"955c99b3-ad42-4e65-a391-47eda1c4130a\" (UID: \"955c99b3-ad42-4e65-a391-47eda1c4130a\") " Dec 02 23:00:51 crc 
kubenswrapper[4696]: I1202 23:00:51.452860 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/955c99b3-ad42-4e65-a391-47eda1c4130a-combined-ca-bundle\") pod \"955c99b3-ad42-4e65-a391-47eda1c4130a\" (UID: \"955c99b3-ad42-4e65-a391-47eda1c4130a\") " Dec 02 23:00:51 crc kubenswrapper[4696]: I1202 23:00:51.452918 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/955c99b3-ad42-4e65-a391-47eda1c4130a-ring-data-devices\") pod \"955c99b3-ad42-4e65-a391-47eda1c4130a\" (UID: \"955c99b3-ad42-4e65-a391-47eda1c4130a\") " Dec 02 23:00:51 crc kubenswrapper[4696]: I1202 23:00:51.452944 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-znpgw\" (UniqueName: \"kubernetes.io/projected/955c99b3-ad42-4e65-a391-47eda1c4130a-kube-api-access-znpgw\") pod \"955c99b3-ad42-4e65-a391-47eda1c4130a\" (UID: \"955c99b3-ad42-4e65-a391-47eda1c4130a\") " Dec 02 23:00:51 crc kubenswrapper[4696]: I1202 23:00:51.454614 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/955c99b3-ad42-4e65-a391-47eda1c4130a-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "955c99b3-ad42-4e65-a391-47eda1c4130a" (UID: "955c99b3-ad42-4e65-a391-47eda1c4130a"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:00:51 crc kubenswrapper[4696]: I1202 23:00:51.456311 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/955c99b3-ad42-4e65-a391-47eda1c4130a-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "955c99b3-ad42-4e65-a391-47eda1c4130a" (UID: "955c99b3-ad42-4e65-a391-47eda1c4130a"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:00:51 crc kubenswrapper[4696]: I1202 23:00:51.475447 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/955c99b3-ad42-4e65-a391-47eda1c4130a-kube-api-access-znpgw" (OuterVolumeSpecName: "kube-api-access-znpgw") pod "955c99b3-ad42-4e65-a391-47eda1c4130a" (UID: "955c99b3-ad42-4e65-a391-47eda1c4130a"). InnerVolumeSpecName "kube-api-access-znpgw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:00:51 crc kubenswrapper[4696]: I1202 23:00:51.481478 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/955c99b3-ad42-4e65-a391-47eda1c4130a-scripts" (OuterVolumeSpecName: "scripts") pod "955c99b3-ad42-4e65-a391-47eda1c4130a" (UID: "955c99b3-ad42-4e65-a391-47eda1c4130a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:00:51 crc kubenswrapper[4696]: I1202 23:00:51.483463 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/955c99b3-ad42-4e65-a391-47eda1c4130a-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "955c99b3-ad42-4e65-a391-47eda1c4130a" (UID: "955c99b3-ad42-4e65-a391-47eda1c4130a"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:00:51 crc kubenswrapper[4696]: I1202 23:00:51.499708 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/955c99b3-ad42-4e65-a391-47eda1c4130a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "955c99b3-ad42-4e65-a391-47eda1c4130a" (UID: "955c99b3-ad42-4e65-a391-47eda1c4130a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:00:51 crc kubenswrapper[4696]: I1202 23:00:51.503042 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/955c99b3-ad42-4e65-a391-47eda1c4130a-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "955c99b3-ad42-4e65-a391-47eda1c4130a" (UID: "955c99b3-ad42-4e65-a391-47eda1c4130a"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:00:51 crc kubenswrapper[4696]: I1202 23:00:51.555342 4696 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/955c99b3-ad42-4e65-a391-47eda1c4130a-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 23:00:51 crc kubenswrapper[4696]: I1202 23:00:51.555385 4696 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/955c99b3-ad42-4e65-a391-47eda1c4130a-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 02 23:00:51 crc kubenswrapper[4696]: I1202 23:00:51.555399 4696 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/955c99b3-ad42-4e65-a391-47eda1c4130a-swiftconf\") on node \"crc\" DevicePath \"\"" Dec 02 23:00:51 crc kubenswrapper[4696]: I1202 23:00:51.555409 4696 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/955c99b3-ad42-4e65-a391-47eda1c4130a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 23:00:51 crc kubenswrapper[4696]: I1202 23:00:51.555424 4696 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/955c99b3-ad42-4e65-a391-47eda1c4130a-ring-data-devices\") on node \"crc\" DevicePath \"\"" Dec 02 23:00:51 crc kubenswrapper[4696]: I1202 23:00:51.555434 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-znpgw\" (UniqueName: 
\"kubernetes.io/projected/955c99b3-ad42-4e65-a391-47eda1c4130a-kube-api-access-znpgw\") on node \"crc\" DevicePath \"\"" Dec 02 23:00:51 crc kubenswrapper[4696]: I1202 23:00:51.555442 4696 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/955c99b3-ad42-4e65-a391-47eda1c4130a-dispersionconf\") on node \"crc\" DevicePath \"\"" Dec 02 23:00:51 crc kubenswrapper[4696]: I1202 23:00:51.962551 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d5594f21-8f1d-4105-ad47-c065a9fc468b-etc-swift\") pod \"swift-storage-0\" (UID: \"d5594f21-8f1d-4105-ad47-c065a9fc468b\") " pod="openstack/swift-storage-0" Dec 02 23:00:51 crc kubenswrapper[4696]: I1202 23:00:51.969168 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d5594f21-8f1d-4105-ad47-c065a9fc468b-etc-swift\") pod \"swift-storage-0\" (UID: \"d5594f21-8f1d-4105-ad47-c065a9fc468b\") " pod="openstack/swift-storage-0" Dec 02 23:00:52 crc kubenswrapper[4696]: I1202 23:00:52.048890 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-kq468" Dec 02 23:00:52 crc kubenswrapper[4696]: I1202 23:00:52.048902 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-kq468" event={"ID":"955c99b3-ad42-4e65-a391-47eda1c4130a","Type":"ContainerDied","Data":"51419262f9369e9884f9a9edbe5b90656f63d6adda7ef722035f7f3c839dd24c"} Dec 02 23:00:52 crc kubenswrapper[4696]: I1202 23:00:52.048968 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="51419262f9369e9884f9a9edbe5b90656f63d6adda7ef722035f7f3c839dd24c" Dec 02 23:00:52 crc kubenswrapper[4696]: I1202 23:00:52.249934 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Dec 02 23:00:52 crc kubenswrapper[4696]: I1202 23:00:52.497770 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-btsm6-config-g4scg" Dec 02 23:00:52 crc kubenswrapper[4696]: I1202 23:00:52.574833 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/6d1852a5-f61c-4091-a932-8a9f6da96318-additional-scripts\") pod \"6d1852a5-f61c-4091-a932-8a9f6da96318\" (UID: \"6d1852a5-f61c-4091-a932-8a9f6da96318\") " Dec 02 23:00:52 crc kubenswrapper[4696]: I1202 23:00:52.575012 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k72zm\" (UniqueName: \"kubernetes.io/projected/6d1852a5-f61c-4091-a932-8a9f6da96318-kube-api-access-k72zm\") pod \"6d1852a5-f61c-4091-a932-8a9f6da96318\" (UID: \"6d1852a5-f61c-4091-a932-8a9f6da96318\") " Dec 02 23:00:52 crc kubenswrapper[4696]: I1202 23:00:52.575049 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6d1852a5-f61c-4091-a932-8a9f6da96318-scripts\") pod \"6d1852a5-f61c-4091-a932-8a9f6da96318\" (UID: \"6d1852a5-f61c-4091-a932-8a9f6da96318\") " Dec 02 23:00:52 crc kubenswrapper[4696]: I1202 23:00:52.575104 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6d1852a5-f61c-4091-a932-8a9f6da96318-var-run\") pod \"6d1852a5-f61c-4091-a932-8a9f6da96318\" (UID: \"6d1852a5-f61c-4091-a932-8a9f6da96318\") " Dec 02 23:00:52 crc kubenswrapper[4696]: I1202 23:00:52.575138 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6d1852a5-f61c-4091-a932-8a9f6da96318-var-log-ovn\") pod \"6d1852a5-f61c-4091-a932-8a9f6da96318\" (UID: 
\"6d1852a5-f61c-4091-a932-8a9f6da96318\") " Dec 02 23:00:52 crc kubenswrapper[4696]: I1202 23:00:52.575169 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6d1852a5-f61c-4091-a932-8a9f6da96318-var-run-ovn\") pod \"6d1852a5-f61c-4091-a932-8a9f6da96318\" (UID: \"6d1852a5-f61c-4091-a932-8a9f6da96318\") " Dec 02 23:00:52 crc kubenswrapper[4696]: I1202 23:00:52.575973 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6d1852a5-f61c-4091-a932-8a9f6da96318-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "6d1852a5-f61c-4091-a932-8a9f6da96318" (UID: "6d1852a5-f61c-4091-a932-8a9f6da96318"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 23:00:52 crc kubenswrapper[4696]: I1202 23:00:52.576011 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6d1852a5-f61c-4091-a932-8a9f6da96318-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "6d1852a5-f61c-4091-a932-8a9f6da96318" (UID: "6d1852a5-f61c-4091-a932-8a9f6da96318"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 23:00:52 crc kubenswrapper[4696]: I1202 23:00:52.576021 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6d1852a5-f61c-4091-a932-8a9f6da96318-var-run" (OuterVolumeSpecName: "var-run") pod "6d1852a5-f61c-4091-a932-8a9f6da96318" (UID: "6d1852a5-f61c-4091-a932-8a9f6da96318"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 23:00:52 crc kubenswrapper[4696]: I1202 23:00:52.576945 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d1852a5-f61c-4091-a932-8a9f6da96318-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "6d1852a5-f61c-4091-a932-8a9f6da96318" (UID: "6d1852a5-f61c-4091-a932-8a9f6da96318"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:00:52 crc kubenswrapper[4696]: I1202 23:00:52.578093 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d1852a5-f61c-4091-a932-8a9f6da96318-scripts" (OuterVolumeSpecName: "scripts") pod "6d1852a5-f61c-4091-a932-8a9f6da96318" (UID: "6d1852a5-f61c-4091-a932-8a9f6da96318"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:00:52 crc kubenswrapper[4696]: I1202 23:00:52.583209 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d1852a5-f61c-4091-a932-8a9f6da96318-kube-api-access-k72zm" (OuterVolumeSpecName: "kube-api-access-k72zm") pod "6d1852a5-f61c-4091-a932-8a9f6da96318" (UID: "6d1852a5-f61c-4091-a932-8a9f6da96318"). InnerVolumeSpecName "kube-api-access-k72zm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:00:52 crc kubenswrapper[4696]: I1202 23:00:52.677553 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k72zm\" (UniqueName: \"kubernetes.io/projected/6d1852a5-f61c-4091-a932-8a9f6da96318-kube-api-access-k72zm\") on node \"crc\" DevicePath \"\"" Dec 02 23:00:52 crc kubenswrapper[4696]: I1202 23:00:52.677604 4696 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6d1852a5-f61c-4091-a932-8a9f6da96318-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 23:00:52 crc kubenswrapper[4696]: I1202 23:00:52.677619 4696 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6d1852a5-f61c-4091-a932-8a9f6da96318-var-run\") on node \"crc\" DevicePath \"\"" Dec 02 23:00:52 crc kubenswrapper[4696]: I1202 23:00:52.677628 4696 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6d1852a5-f61c-4091-a932-8a9f6da96318-var-log-ovn\") on node \"crc\" DevicePath \"\"" Dec 02 23:00:52 crc kubenswrapper[4696]: I1202 23:00:52.677641 4696 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6d1852a5-f61c-4091-a932-8a9f6da96318-var-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 02 23:00:52 crc kubenswrapper[4696]: I1202 23:00:52.677653 4696 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/6d1852a5-f61c-4091-a932-8a9f6da96318-additional-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 23:00:52 crc kubenswrapper[4696]: I1202 23:00:52.935470 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Dec 02 23:00:53 crc kubenswrapper[4696]: I1202 23:00:53.061670 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"d5594f21-8f1d-4105-ad47-c065a9fc468b","Type":"ContainerStarted","Data":"6b30b3ef0a1c9a55c456b76d183e7f9d894ce0d0d7e38dd57581e0012ade4818"} Dec 02 23:00:53 crc kubenswrapper[4696]: I1202 23:00:53.063533 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-btsm6-config-g4scg" event={"ID":"6d1852a5-f61c-4091-a932-8a9f6da96318","Type":"ContainerDied","Data":"1fd14a4d7905c2d41924c7a2c3db17c1384bbf7889693a45083790c4e41be795"} Dec 02 23:00:53 crc kubenswrapper[4696]: I1202 23:00:53.063563 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1fd14a4d7905c2d41924c7a2c3db17c1384bbf7889693a45083790c4e41be795" Dec 02 23:00:53 crc kubenswrapper[4696]: I1202 23:00:53.063615 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-btsm6-config-g4scg" Dec 02 23:00:53 crc kubenswrapper[4696]: I1202 23:00:53.605424 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-btsm6-config-g4scg"] Dec 02 23:00:53 crc kubenswrapper[4696]: I1202 23:00:53.612234 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-btsm6-config-g4scg"] Dec 02 23:00:54 crc kubenswrapper[4696]: I1202 23:00:54.077533 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"f5b9aee9-4e9a-4d60-be32-f25d230622bc","Type":"ContainerStarted","Data":"3091c57e10ca01574704dc4fa3fc645cf3fcfec3d4ac7ce3dac91506c74d67cb"} Dec 02 23:00:54 crc kubenswrapper[4696]: I1202 23:00:54.100830 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 02 23:00:54 crc kubenswrapper[4696]: I1202 23:00:54.115875 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=22.182304589 podStartE2EDuration="1m16.115854545s" podCreationTimestamp="2025-12-02 
22:59:38 +0000 UTC" firstStartedPulling="2025-12-02 22:59:59.775548151 +0000 UTC m=+1062.656228152" lastFinishedPulling="2025-12-02 23:00:53.709098117 +0000 UTC m=+1116.589778108" observedRunningTime="2025-12-02 23:00:54.107047856 +0000 UTC m=+1116.987727857" watchObservedRunningTime="2025-12-02 23:00:54.115854545 +0000 UTC m=+1116.996534546" Dec 02 23:00:55 crc kubenswrapper[4696]: I1202 23:00:55.096973 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d5594f21-8f1d-4105-ad47-c065a9fc468b","Type":"ContainerStarted","Data":"0243addc97e5b0a3cf896e572c78ad5bd70f9b7ca299dcc163062774f707e85d"} Dec 02 23:00:55 crc kubenswrapper[4696]: I1202 23:00:55.097444 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d5594f21-8f1d-4105-ad47-c065a9fc468b","Type":"ContainerStarted","Data":"c9a8accf3136ebacb4a9e0fa3c2884ad0cea5db5bb794ed10c540b15db0382b9"} Dec 02 23:00:55 crc kubenswrapper[4696]: I1202 23:00:55.443895 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d1852a5-f61c-4091-a932-8a9f6da96318" path="/var/lib/kubelet/pods/6d1852a5-f61c-4091-a932-8a9f6da96318/volumes" Dec 02 23:00:55 crc kubenswrapper[4696]: I1202 23:00:55.682189 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Dec 02 23:00:55 crc kubenswrapper[4696]: I1202 23:00:55.682249 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Dec 02 23:00:55 crc kubenswrapper[4696]: I1202 23:00:55.685566 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Dec 02 23:00:56 crc kubenswrapper[4696]: I1202 23:00:56.107955 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"d5594f21-8f1d-4105-ad47-c065a9fc468b","Type":"ContainerStarted","Data":"5e8075767e22a1d6f3a336a14f287df53af4d10a379f60070376d5eb0ab0324c"} Dec 02 23:00:56 crc kubenswrapper[4696]: I1202 23:00:56.108026 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d5594f21-8f1d-4105-ad47-c065a9fc468b","Type":"ContainerStarted","Data":"20e21b26912714d8344164467168f9d652158d1d493a749cafaa76f9dc16936a"} Dec 02 23:00:56 crc kubenswrapper[4696]: I1202 23:00:56.110632 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Dec 02 23:00:58 crc kubenswrapper[4696]: I1202 23:00:58.761448 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 02 23:00:58 crc kubenswrapper[4696]: I1202 23:00:58.762602 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="f5b9aee9-4e9a-4d60-be32-f25d230622bc" containerName="config-reloader" containerID="cri-o://32033a2cdc8344e3cd2044fa00f0ba27b75e3fdbdb681cf0180638376133846b" gracePeriod=600 Dec 02 23:00:58 crc kubenswrapper[4696]: I1202 23:00:58.762751 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="f5b9aee9-4e9a-4d60-be32-f25d230622bc" containerName="prometheus" containerID="cri-o://3091c57e10ca01574704dc4fa3fc645cf3fcfec3d4ac7ce3dac91506c74d67cb" gracePeriod=600 Dec 02 23:00:58 crc kubenswrapper[4696]: I1202 23:00:58.762795 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="f5b9aee9-4e9a-4d60-be32-f25d230622bc" containerName="thanos-sidecar" containerID="cri-o://2b2b519ef6641bf42ed0ee745a4bd559ee5d2b95ebc0d16e2c978cf423dfa65e" gracePeriod=600 Dec 02 23:01:00 crc kubenswrapper[4696]: I1202 23:01:00.683919 4696 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="f5b9aee9-4e9a-4d60-be32-f25d230622bc" containerName="prometheus" probeResult="failure" output="Get \"http://10.217.0.110:9090/-/ready\": dial tcp 10.217.0.110:9090: connect: connection refused" Dec 02 23:01:03 crc kubenswrapper[4696]: I1202 23:01:03.667944 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 02 23:01:04 crc kubenswrapper[4696]: I1202 23:01:04.058385 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-db-sync-vfql6"] Dec 02 23:01:04 crc kubenswrapper[4696]: E1202 23:01:04.058852 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="955c99b3-ad42-4e65-a391-47eda1c4130a" containerName="swift-ring-rebalance" Dec 02 23:01:04 crc kubenswrapper[4696]: I1202 23:01:04.058871 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="955c99b3-ad42-4e65-a391-47eda1c4130a" containerName="swift-ring-rebalance" Dec 02 23:01:04 crc kubenswrapper[4696]: E1202 23:01:04.058891 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d1852a5-f61c-4091-a932-8a9f6da96318" containerName="ovn-config" Dec 02 23:01:04 crc kubenswrapper[4696]: I1202 23:01:04.058898 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d1852a5-f61c-4091-a932-8a9f6da96318" containerName="ovn-config" Dec 02 23:01:04 crc kubenswrapper[4696]: I1202 23:01:04.059049 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d1852a5-f61c-4091-a932-8a9f6da96318" containerName="ovn-config" Dec 02 23:01:04 crc kubenswrapper[4696]: I1202 23:01:04.059065 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="955c99b3-ad42-4e65-a391-47eda1c4130a" containerName="swift-ring-rebalance" Dec 02 23:01:04 crc kubenswrapper[4696]: I1202 23:01:04.059704 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-db-sync-vfql6" Dec 02 23:01:04 crc kubenswrapper[4696]: I1202 23:01:04.070310 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-config-data" Dec 02 23:01:04 crc kubenswrapper[4696]: I1202 23:01:04.070470 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-watcher-dockercfg-qkzf2" Dec 02 23:01:04 crc kubenswrapper[4696]: I1202 23:01:04.085344 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-sync-vfql6"] Dec 02 23:01:04 crc kubenswrapper[4696]: I1202 23:01:04.181711 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-b4df-account-create-update-hxxpx"] Dec 02 23:01:04 crc kubenswrapper[4696]: I1202 23:01:04.183140 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-b4df-account-create-update-hxxpx" Dec 02 23:01:04 crc kubenswrapper[4696]: I1202 23:01:04.186409 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Dec 02 23:01:04 crc kubenswrapper[4696]: I1202 23:01:04.196095 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-vjwbb"] Dec 02 23:01:04 crc kubenswrapper[4696]: I1202 23:01:04.199497 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-vjwbb" Dec 02 23:01:04 crc kubenswrapper[4696]: I1202 23:01:04.215403 4696 generic.go:334] "Generic (PLEG): container finished" podID="f5b9aee9-4e9a-4d60-be32-f25d230622bc" containerID="3091c57e10ca01574704dc4fa3fc645cf3fcfec3d4ac7ce3dac91506c74d67cb" exitCode=0 Dec 02 23:01:04 crc kubenswrapper[4696]: I1202 23:01:04.215459 4696 generic.go:334] "Generic (PLEG): container finished" podID="f5b9aee9-4e9a-4d60-be32-f25d230622bc" containerID="2b2b519ef6641bf42ed0ee745a4bd559ee5d2b95ebc0d16e2c978cf423dfa65e" exitCode=0 Dec 02 23:01:04 crc kubenswrapper[4696]: I1202 23:01:04.215467 4696 generic.go:334] "Generic (PLEG): container finished" podID="f5b9aee9-4e9a-4d60-be32-f25d230622bc" containerID="32033a2cdc8344e3cd2044fa00f0ba27b75e3fdbdb681cf0180638376133846b" exitCode=0 Dec 02 23:01:04 crc kubenswrapper[4696]: I1202 23:01:04.215495 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"f5b9aee9-4e9a-4d60-be32-f25d230622bc","Type":"ContainerDied","Data":"3091c57e10ca01574704dc4fa3fc645cf3fcfec3d4ac7ce3dac91506c74d67cb"} Dec 02 23:01:04 crc kubenswrapper[4696]: I1202 23:01:04.215554 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"f5b9aee9-4e9a-4d60-be32-f25d230622bc","Type":"ContainerDied","Data":"2b2b519ef6641bf42ed0ee745a4bd559ee5d2b95ebc0d16e2c978cf423dfa65e"} Dec 02 23:01:04 crc kubenswrapper[4696]: I1202 23:01:04.215565 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"f5b9aee9-4e9a-4d60-be32-f25d230622bc","Type":"ContainerDied","Data":"32033a2cdc8344e3cd2044fa00f0ba27b75e3fdbdb681cf0180638376133846b"} Dec 02 23:01:04 crc kubenswrapper[4696]: I1202 23:01:04.225981 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-b4df-account-create-update-hxxpx"] Dec 02 23:01:04 crc kubenswrapper[4696]: I1202 
23:01:04.249582 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-vjwbb"] Dec 02 23:01:04 crc kubenswrapper[4696]: I1202 23:01:04.269517 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2scdc\" (UniqueName: \"kubernetes.io/projected/4707bfe2-6206-4fac-b146-f95317884325-kube-api-access-2scdc\") pod \"cinder-db-create-vjwbb\" (UID: \"4707bfe2-6206-4fac-b146-f95317884325\") " pod="openstack/cinder-db-create-vjwbb" Dec 02 23:01:04 crc kubenswrapper[4696]: I1202 23:01:04.269681 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4707bfe2-6206-4fac-b146-f95317884325-operator-scripts\") pod \"cinder-db-create-vjwbb\" (UID: \"4707bfe2-6206-4fac-b146-f95317884325\") " pod="openstack/cinder-db-create-vjwbb" Dec 02 23:01:04 crc kubenswrapper[4696]: I1202 23:01:04.269731 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5fa922b0-7645-420b-bd91-ac4f5040d61b-db-sync-config-data\") pod \"watcher-db-sync-vfql6\" (UID: \"5fa922b0-7645-420b-bd91-ac4f5040d61b\") " pod="openstack/watcher-db-sync-vfql6" Dec 02 23:01:04 crc kubenswrapper[4696]: I1202 23:01:04.269828 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9k4nv\" (UniqueName: \"kubernetes.io/projected/aa1578e8-97dc-4536-bfda-39825192b676-kube-api-access-9k4nv\") pod \"cinder-b4df-account-create-update-hxxpx\" (UID: \"aa1578e8-97dc-4536-bfda-39825192b676\") " pod="openstack/cinder-b4df-account-create-update-hxxpx" Dec 02 23:01:04 crc kubenswrapper[4696]: I1202 23:01:04.269848 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5fa922b0-7645-420b-bd91-ac4f5040d61b-combined-ca-bundle\") pod \"watcher-db-sync-vfql6\" (UID: \"5fa922b0-7645-420b-bd91-ac4f5040d61b\") " pod="openstack/watcher-db-sync-vfql6" Dec 02 23:01:04 crc kubenswrapper[4696]: I1202 23:01:04.269896 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aa1578e8-97dc-4536-bfda-39825192b676-operator-scripts\") pod \"cinder-b4df-account-create-update-hxxpx\" (UID: \"aa1578e8-97dc-4536-bfda-39825192b676\") " pod="openstack/cinder-b4df-account-create-update-hxxpx" Dec 02 23:01:04 crc kubenswrapper[4696]: I1202 23:01:04.269938 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5fa922b0-7645-420b-bd91-ac4f5040d61b-config-data\") pod \"watcher-db-sync-vfql6\" (UID: \"5fa922b0-7645-420b-bd91-ac4f5040d61b\") " pod="openstack/watcher-db-sync-vfql6" Dec 02 23:01:04 crc kubenswrapper[4696]: I1202 23:01:04.269960 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8c79\" (UniqueName: \"kubernetes.io/projected/5fa922b0-7645-420b-bd91-ac4f5040d61b-kube-api-access-r8c79\") pod \"watcher-db-sync-vfql6\" (UID: \"5fa922b0-7645-420b-bd91-ac4f5040d61b\") " pod="openstack/watcher-db-sync-vfql6" Dec 02 23:01:04 crc kubenswrapper[4696]: I1202 23:01:04.304847 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-bdhm9"] Dec 02 23:01:04 crc kubenswrapper[4696]: I1202 23:01:04.306396 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-bdhm9" Dec 02 23:01:04 crc kubenswrapper[4696]: I1202 23:01:04.338902 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-bdhm9"] Dec 02 23:01:04 crc kubenswrapper[4696]: I1202 23:01:04.360156 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-5eb9-account-create-update-4prpl"] Dec 02 23:01:04 crc kubenswrapper[4696]: I1202 23:01:04.361708 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-5eb9-account-create-update-4prpl" Dec 02 23:01:04 crc kubenswrapper[4696]: I1202 23:01:04.366223 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Dec 02 23:01:04 crc kubenswrapper[4696]: I1202 23:01:04.366624 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-5eb9-account-create-update-4prpl"] Dec 02 23:01:04 crc kubenswrapper[4696]: I1202 23:01:04.375904 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aa1578e8-97dc-4536-bfda-39825192b676-operator-scripts\") pod \"cinder-b4df-account-create-update-hxxpx\" (UID: \"aa1578e8-97dc-4536-bfda-39825192b676\") " pod="openstack/cinder-b4df-account-create-update-hxxpx" Dec 02 23:01:04 crc kubenswrapper[4696]: I1202 23:01:04.375959 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5fa922b0-7645-420b-bd91-ac4f5040d61b-config-data\") pod \"watcher-db-sync-vfql6\" (UID: \"5fa922b0-7645-420b-bd91-ac4f5040d61b\") " pod="openstack/watcher-db-sync-vfql6" Dec 02 23:01:04 crc kubenswrapper[4696]: I1202 23:01:04.375991 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8c79\" (UniqueName: \"kubernetes.io/projected/5fa922b0-7645-420b-bd91-ac4f5040d61b-kube-api-access-r8c79\") pod 
\"watcher-db-sync-vfql6\" (UID: \"5fa922b0-7645-420b-bd91-ac4f5040d61b\") " pod="openstack/watcher-db-sync-vfql6" Dec 02 23:01:04 crc kubenswrapper[4696]: I1202 23:01:04.376018 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2scdc\" (UniqueName: \"kubernetes.io/projected/4707bfe2-6206-4fac-b146-f95317884325-kube-api-access-2scdc\") pod \"cinder-db-create-vjwbb\" (UID: \"4707bfe2-6206-4fac-b146-f95317884325\") " pod="openstack/cinder-db-create-vjwbb" Dec 02 23:01:04 crc kubenswrapper[4696]: I1202 23:01:04.376057 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4707bfe2-6206-4fac-b146-f95317884325-operator-scripts\") pod \"cinder-db-create-vjwbb\" (UID: \"4707bfe2-6206-4fac-b146-f95317884325\") " pod="openstack/cinder-db-create-vjwbb" Dec 02 23:01:04 crc kubenswrapper[4696]: I1202 23:01:04.376101 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5fa922b0-7645-420b-bd91-ac4f5040d61b-db-sync-config-data\") pod \"watcher-db-sync-vfql6\" (UID: \"5fa922b0-7645-420b-bd91-ac4f5040d61b\") " pod="openstack/watcher-db-sync-vfql6" Dec 02 23:01:04 crc kubenswrapper[4696]: I1202 23:01:04.376120 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9k4nv\" (UniqueName: \"kubernetes.io/projected/aa1578e8-97dc-4536-bfda-39825192b676-kube-api-access-9k4nv\") pod \"cinder-b4df-account-create-update-hxxpx\" (UID: \"aa1578e8-97dc-4536-bfda-39825192b676\") " pod="openstack/cinder-b4df-account-create-update-hxxpx" Dec 02 23:01:04 crc kubenswrapper[4696]: I1202 23:01:04.376142 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fa922b0-7645-420b-bd91-ac4f5040d61b-combined-ca-bundle\") pod \"watcher-db-sync-vfql6\" (UID: 
\"5fa922b0-7645-420b-bd91-ac4f5040d61b\") " pod="openstack/watcher-db-sync-vfql6" Dec 02 23:01:04 crc kubenswrapper[4696]: I1202 23:01:04.377224 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aa1578e8-97dc-4536-bfda-39825192b676-operator-scripts\") pod \"cinder-b4df-account-create-update-hxxpx\" (UID: \"aa1578e8-97dc-4536-bfda-39825192b676\") " pod="openstack/cinder-b4df-account-create-update-hxxpx" Dec 02 23:01:04 crc kubenswrapper[4696]: I1202 23:01:04.379183 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4707bfe2-6206-4fac-b146-f95317884325-operator-scripts\") pod \"cinder-db-create-vjwbb\" (UID: \"4707bfe2-6206-4fac-b146-f95317884325\") " pod="openstack/cinder-db-create-vjwbb" Dec 02 23:01:04 crc kubenswrapper[4696]: I1202 23:01:04.387212 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5fa922b0-7645-420b-bd91-ac4f5040d61b-config-data\") pod \"watcher-db-sync-vfql6\" (UID: \"5fa922b0-7645-420b-bd91-ac4f5040d61b\") " pod="openstack/watcher-db-sync-vfql6" Dec 02 23:01:04 crc kubenswrapper[4696]: I1202 23:01:04.407386 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2scdc\" (UniqueName: \"kubernetes.io/projected/4707bfe2-6206-4fac-b146-f95317884325-kube-api-access-2scdc\") pod \"cinder-db-create-vjwbb\" (UID: \"4707bfe2-6206-4fac-b146-f95317884325\") " pod="openstack/cinder-db-create-vjwbb" Dec 02 23:01:04 crc kubenswrapper[4696]: I1202 23:01:04.421314 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5fa922b0-7645-420b-bd91-ac4f5040d61b-db-sync-config-data\") pod \"watcher-db-sync-vfql6\" (UID: \"5fa922b0-7645-420b-bd91-ac4f5040d61b\") " pod="openstack/watcher-db-sync-vfql6" Dec 02 23:01:04 crc 
kubenswrapper[4696]: I1202 23:01:04.426024 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fa922b0-7645-420b-bd91-ac4f5040d61b-combined-ca-bundle\") pod \"watcher-db-sync-vfql6\" (UID: \"5fa922b0-7645-420b-bd91-ac4f5040d61b\") " pod="openstack/watcher-db-sync-vfql6" Dec 02 23:01:04 crc kubenswrapper[4696]: I1202 23:01:04.446223 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9k4nv\" (UniqueName: \"kubernetes.io/projected/aa1578e8-97dc-4536-bfda-39825192b676-kube-api-access-9k4nv\") pod \"cinder-b4df-account-create-update-hxxpx\" (UID: \"aa1578e8-97dc-4536-bfda-39825192b676\") " pod="openstack/cinder-b4df-account-create-update-hxxpx" Dec 02 23:01:04 crc kubenswrapper[4696]: I1202 23:01:04.452141 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-hf7vx"] Dec 02 23:01:04 crc kubenswrapper[4696]: I1202 23:01:04.455082 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-hf7vx" Dec 02 23:01:04 crc kubenswrapper[4696]: I1202 23:01:04.456338 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8c79\" (UniqueName: \"kubernetes.io/projected/5fa922b0-7645-420b-bd91-ac4f5040d61b-kube-api-access-r8c79\") pod \"watcher-db-sync-vfql6\" (UID: \"5fa922b0-7645-420b-bd91-ac4f5040d61b\") " pod="openstack/watcher-db-sync-vfql6" Dec 02 23:01:04 crc kubenswrapper[4696]: I1202 23:01:04.458492 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 02 23:01:04 crc kubenswrapper[4696]: I1202 23:01:04.460600 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 02 23:01:04 crc kubenswrapper[4696]: I1202 23:01:04.461220 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 02 23:01:04 crc kubenswrapper[4696]: I1202 23:01:04.462011 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-gwxg6" Dec 02 23:01:04 crc kubenswrapper[4696]: I1202 23:01:04.466018 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-hf7vx"] Dec 02 23:01:04 crc kubenswrapper[4696]: I1202 23:01:04.479386 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3149957b-27a4-43d6-af45-b585270f4d47-operator-scripts\") pod \"barbican-5eb9-account-create-update-4prpl\" (UID: \"3149957b-27a4-43d6-af45-b585270f4d47\") " pod="openstack/barbican-5eb9-account-create-update-4prpl" Dec 02 23:01:04 crc kubenswrapper[4696]: I1202 23:01:04.479443 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxhsr\" (UniqueName: \"kubernetes.io/projected/bf1ead2b-f60b-444f-96cd-a9d6cbf89919-kube-api-access-dxhsr\") pod 
\"barbican-db-create-bdhm9\" (UID: \"bf1ead2b-f60b-444f-96cd-a9d6cbf89919\") " pod="openstack/barbican-db-create-bdhm9" Dec 02 23:01:04 crc kubenswrapper[4696]: I1202 23:01:04.479518 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf1ead2b-f60b-444f-96cd-a9d6cbf89919-operator-scripts\") pod \"barbican-db-create-bdhm9\" (UID: \"bf1ead2b-f60b-444f-96cd-a9d6cbf89919\") " pod="openstack/barbican-db-create-bdhm9" Dec 02 23:01:04 crc kubenswrapper[4696]: I1202 23:01:04.479955 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkfv9\" (UniqueName: \"kubernetes.io/projected/3149957b-27a4-43d6-af45-b585270f4d47-kube-api-access-dkfv9\") pod \"barbican-5eb9-account-create-update-4prpl\" (UID: \"3149957b-27a4-43d6-af45-b585270f4d47\") " pod="openstack/barbican-5eb9-account-create-update-4prpl" Dec 02 23:01:04 crc kubenswrapper[4696]: I1202 23:01:04.511208 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-b4df-account-create-update-hxxpx" Dec 02 23:01:04 crc kubenswrapper[4696]: I1202 23:01:04.552617 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-d886-account-create-update-6dn5p"] Dec 02 23:01:04 crc kubenswrapper[4696]: I1202 23:01:04.554618 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-d886-account-create-update-6dn5p" Dec 02 23:01:04 crc kubenswrapper[4696]: I1202 23:01:04.560319 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Dec 02 23:01:04 crc kubenswrapper[4696]: I1202 23:01:04.563344 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-vjwbb" Dec 02 23:01:04 crc kubenswrapper[4696]: I1202 23:01:04.569246 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-47n98"] Dec 02 23:01:04 crc kubenswrapper[4696]: I1202 23:01:04.575707 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-47n98" Dec 02 23:01:04 crc kubenswrapper[4696]: I1202 23:01:04.581323 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxhsr\" (UniqueName: \"kubernetes.io/projected/bf1ead2b-f60b-444f-96cd-a9d6cbf89919-kube-api-access-dxhsr\") pod \"barbican-db-create-bdhm9\" (UID: \"bf1ead2b-f60b-444f-96cd-a9d6cbf89919\") " pod="openstack/barbican-db-create-bdhm9" Dec 02 23:01:04 crc kubenswrapper[4696]: I1202 23:01:04.581433 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/709610e7-e445-4b43-9941-7c08653d3278-combined-ca-bundle\") pod \"keystone-db-sync-hf7vx\" (UID: \"709610e7-e445-4b43-9941-7c08653d3278\") " pod="openstack/keystone-db-sync-hf7vx" Dec 02 23:01:04 crc kubenswrapper[4696]: I1202 23:01:04.581464 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvnz7\" (UniqueName: \"kubernetes.io/projected/709610e7-e445-4b43-9941-7c08653d3278-kube-api-access-hvnz7\") pod \"keystone-db-sync-hf7vx\" (UID: \"709610e7-e445-4b43-9941-7c08653d3278\") " pod="openstack/keystone-db-sync-hf7vx" Dec 02 23:01:04 crc kubenswrapper[4696]: I1202 23:01:04.581519 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf1ead2b-f60b-444f-96cd-a9d6cbf89919-operator-scripts\") pod \"barbican-db-create-bdhm9\" (UID: \"bf1ead2b-f60b-444f-96cd-a9d6cbf89919\") " pod="openstack/barbican-db-create-bdhm9" Dec 02 
23:01:04 crc kubenswrapper[4696]: I1202 23:01:04.581568 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/709610e7-e445-4b43-9941-7c08653d3278-config-data\") pod \"keystone-db-sync-hf7vx\" (UID: \"709610e7-e445-4b43-9941-7c08653d3278\") " pod="openstack/keystone-db-sync-hf7vx" Dec 02 23:01:04 crc kubenswrapper[4696]: I1202 23:01:04.581614 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkfv9\" (UniqueName: \"kubernetes.io/projected/3149957b-27a4-43d6-af45-b585270f4d47-kube-api-access-dkfv9\") pod \"barbican-5eb9-account-create-update-4prpl\" (UID: \"3149957b-27a4-43d6-af45-b585270f4d47\") " pod="openstack/barbican-5eb9-account-create-update-4prpl" Dec 02 23:01:04 crc kubenswrapper[4696]: I1202 23:01:04.581642 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3149957b-27a4-43d6-af45-b585270f4d47-operator-scripts\") pod \"barbican-5eb9-account-create-update-4prpl\" (UID: \"3149957b-27a4-43d6-af45-b585270f4d47\") " pod="openstack/barbican-5eb9-account-create-update-4prpl" Dec 02 23:01:04 crc kubenswrapper[4696]: I1202 23:01:04.582531 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3149957b-27a4-43d6-af45-b585270f4d47-operator-scripts\") pod \"barbican-5eb9-account-create-update-4prpl\" (UID: \"3149957b-27a4-43d6-af45-b585270f4d47\") " pod="openstack/barbican-5eb9-account-create-update-4prpl" Dec 02 23:01:04 crc kubenswrapper[4696]: I1202 23:01:04.583886 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-d886-account-create-update-6dn5p"] Dec 02 23:01:04 crc kubenswrapper[4696]: I1202 23:01:04.584227 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/bf1ead2b-f60b-444f-96cd-a9d6cbf89919-operator-scripts\") pod \"barbican-db-create-bdhm9\" (UID: \"bf1ead2b-f60b-444f-96cd-a9d6cbf89919\") " pod="openstack/barbican-db-create-bdhm9" Dec 02 23:01:04 crc kubenswrapper[4696]: I1202 23:01:04.600293 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-47n98"] Dec 02 23:01:04 crc kubenswrapper[4696]: I1202 23:01:04.627305 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkfv9\" (UniqueName: \"kubernetes.io/projected/3149957b-27a4-43d6-af45-b585270f4d47-kube-api-access-dkfv9\") pod \"barbican-5eb9-account-create-update-4prpl\" (UID: \"3149957b-27a4-43d6-af45-b585270f4d47\") " pod="openstack/barbican-5eb9-account-create-update-4prpl" Dec 02 23:01:04 crc kubenswrapper[4696]: I1202 23:01:04.643686 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxhsr\" (UniqueName: \"kubernetes.io/projected/bf1ead2b-f60b-444f-96cd-a9d6cbf89919-kube-api-access-dxhsr\") pod \"barbican-db-create-bdhm9\" (UID: \"bf1ead2b-f60b-444f-96cd-a9d6cbf89919\") " pod="openstack/barbican-db-create-bdhm9" Dec 02 23:01:04 crc kubenswrapper[4696]: I1202 23:01:04.683934 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/709610e7-e445-4b43-9941-7c08653d3278-config-data\") pod \"keystone-db-sync-hf7vx\" (UID: \"709610e7-e445-4b43-9941-7c08653d3278\") " pod="openstack/keystone-db-sync-hf7vx" Dec 02 23:01:04 crc kubenswrapper[4696]: I1202 23:01:04.684022 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7wwv\" (UniqueName: \"kubernetes.io/projected/f5e60ece-976f-4c14-a4cc-321e29bd5826-kube-api-access-v7wwv\") pod \"neutron-d886-account-create-update-6dn5p\" (UID: \"f5e60ece-976f-4c14-a4cc-321e29bd5826\") " pod="openstack/neutron-d886-account-create-update-6dn5p" Dec 02 
23:01:04 crc kubenswrapper[4696]: I1202 23:01:04.684060 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxpl2\" (UniqueName: \"kubernetes.io/projected/a69316d7-55c1-4852-b509-2b3b995fae3a-kube-api-access-cxpl2\") pod \"neutron-db-create-47n98\" (UID: \"a69316d7-55c1-4852-b509-2b3b995fae3a\") " pod="openstack/neutron-db-create-47n98" Dec 02 23:01:04 crc kubenswrapper[4696]: I1202 23:01:04.684118 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f5e60ece-976f-4c14-a4cc-321e29bd5826-operator-scripts\") pod \"neutron-d886-account-create-update-6dn5p\" (UID: \"f5e60ece-976f-4c14-a4cc-321e29bd5826\") " pod="openstack/neutron-d886-account-create-update-6dn5p" Dec 02 23:01:04 crc kubenswrapper[4696]: I1202 23:01:04.684151 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/709610e7-e445-4b43-9941-7c08653d3278-combined-ca-bundle\") pod \"keystone-db-sync-hf7vx\" (UID: \"709610e7-e445-4b43-9941-7c08653d3278\") " pod="openstack/keystone-db-sync-hf7vx" Dec 02 23:01:04 crc kubenswrapper[4696]: I1202 23:01:04.684175 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a69316d7-55c1-4852-b509-2b3b995fae3a-operator-scripts\") pod \"neutron-db-create-47n98\" (UID: \"a69316d7-55c1-4852-b509-2b3b995fae3a\") " pod="openstack/neutron-db-create-47n98" Dec 02 23:01:04 crc kubenswrapper[4696]: I1202 23:01:04.684193 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hvnz7\" (UniqueName: \"kubernetes.io/projected/709610e7-e445-4b43-9941-7c08653d3278-kube-api-access-hvnz7\") pod \"keystone-db-sync-hf7vx\" (UID: \"709610e7-e445-4b43-9941-7c08653d3278\") " 
pod="openstack/keystone-db-sync-hf7vx" Dec 02 23:01:04 crc kubenswrapper[4696]: I1202 23:01:04.687913 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-sync-vfql6" Dec 02 23:01:04 crc kubenswrapper[4696]: I1202 23:01:04.689419 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/709610e7-e445-4b43-9941-7c08653d3278-config-data\") pod \"keystone-db-sync-hf7vx\" (UID: \"709610e7-e445-4b43-9941-7c08653d3278\") " pod="openstack/keystone-db-sync-hf7vx" Dec 02 23:01:04 crc kubenswrapper[4696]: I1202 23:01:04.689957 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/709610e7-e445-4b43-9941-7c08653d3278-combined-ca-bundle\") pod \"keystone-db-sync-hf7vx\" (UID: \"709610e7-e445-4b43-9941-7c08653d3278\") " pod="openstack/keystone-db-sync-hf7vx" Dec 02 23:01:04 crc kubenswrapper[4696]: I1202 23:01:04.705033 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvnz7\" (UniqueName: \"kubernetes.io/projected/709610e7-e445-4b43-9941-7c08653d3278-kube-api-access-hvnz7\") pod \"keystone-db-sync-hf7vx\" (UID: \"709610e7-e445-4b43-9941-7c08653d3278\") " pod="openstack/keystone-db-sync-hf7vx" Dec 02 23:01:04 crc kubenswrapper[4696]: I1202 23:01:04.785761 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7wwv\" (UniqueName: \"kubernetes.io/projected/f5e60ece-976f-4c14-a4cc-321e29bd5826-kube-api-access-v7wwv\") pod \"neutron-d886-account-create-update-6dn5p\" (UID: \"f5e60ece-976f-4c14-a4cc-321e29bd5826\") " pod="openstack/neutron-d886-account-create-update-6dn5p" Dec 02 23:01:04 crc kubenswrapper[4696]: I1202 23:01:04.786384 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxpl2\" (UniqueName: 
\"kubernetes.io/projected/a69316d7-55c1-4852-b509-2b3b995fae3a-kube-api-access-cxpl2\") pod \"neutron-db-create-47n98\" (UID: \"a69316d7-55c1-4852-b509-2b3b995fae3a\") " pod="openstack/neutron-db-create-47n98" Dec 02 23:01:04 crc kubenswrapper[4696]: I1202 23:01:04.786520 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f5e60ece-976f-4c14-a4cc-321e29bd5826-operator-scripts\") pod \"neutron-d886-account-create-update-6dn5p\" (UID: \"f5e60ece-976f-4c14-a4cc-321e29bd5826\") " pod="openstack/neutron-d886-account-create-update-6dn5p" Dec 02 23:01:04 crc kubenswrapper[4696]: I1202 23:01:04.786594 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a69316d7-55c1-4852-b509-2b3b995fae3a-operator-scripts\") pod \"neutron-db-create-47n98\" (UID: \"a69316d7-55c1-4852-b509-2b3b995fae3a\") " pod="openstack/neutron-db-create-47n98" Dec 02 23:01:04 crc kubenswrapper[4696]: I1202 23:01:04.787502 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a69316d7-55c1-4852-b509-2b3b995fae3a-operator-scripts\") pod \"neutron-db-create-47n98\" (UID: \"a69316d7-55c1-4852-b509-2b3b995fae3a\") " pod="openstack/neutron-db-create-47n98" Dec 02 23:01:04 crc kubenswrapper[4696]: I1202 23:01:04.787803 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f5e60ece-976f-4c14-a4cc-321e29bd5826-operator-scripts\") pod \"neutron-d886-account-create-update-6dn5p\" (UID: \"f5e60ece-976f-4c14-a4cc-321e29bd5826\") " pod="openstack/neutron-d886-account-create-update-6dn5p" Dec 02 23:01:04 crc kubenswrapper[4696]: I1202 23:01:04.805193 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7wwv\" (UniqueName: 
\"kubernetes.io/projected/f5e60ece-976f-4c14-a4cc-321e29bd5826-kube-api-access-v7wwv\") pod \"neutron-d886-account-create-update-6dn5p\" (UID: \"f5e60ece-976f-4c14-a4cc-321e29bd5826\") " pod="openstack/neutron-d886-account-create-update-6dn5p" Dec 02 23:01:04 crc kubenswrapper[4696]: I1202 23:01:04.808564 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxpl2\" (UniqueName: \"kubernetes.io/projected/a69316d7-55c1-4852-b509-2b3b995fae3a-kube-api-access-cxpl2\") pod \"neutron-db-create-47n98\" (UID: \"a69316d7-55c1-4852-b509-2b3b995fae3a\") " pod="openstack/neutron-db-create-47n98" Dec 02 23:01:04 crc kubenswrapper[4696]: I1202 23:01:04.824777 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-5eb9-account-create-update-4prpl" Dec 02 23:01:04 crc kubenswrapper[4696]: I1202 23:01:04.850028 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-hf7vx" Dec 02 23:01:04 crc kubenswrapper[4696]: I1202 23:01:04.880172 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-d886-account-create-update-6dn5p" Dec 02 23:01:04 crc kubenswrapper[4696]: I1202 23:01:04.936942 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-bdhm9" Dec 02 23:01:04 crc kubenswrapper[4696]: I1202 23:01:04.987983 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-47n98" Dec 02 23:01:05 crc kubenswrapper[4696]: I1202 23:01:05.682470 4696 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="f5b9aee9-4e9a-4d60-be32-f25d230622bc" containerName="prometheus" probeResult="failure" output="Get \"http://10.217.0.110:9090/-/ready\": dial tcp 10.217.0.110:9090: connect: connection refused" Dec 02 23:01:07 crc kubenswrapper[4696]: E1202 23:01:07.851650 4696 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-glance-api:current-podified" Dec 02 23:01:07 crc kubenswrapper[4696]: E1202 23:01:07.852428 4696 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:glance-db-sync,Image:quay.io/podified-antelope-centos9/openstack-glance-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/glance/glance.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-
x8kqh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42415,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42415,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-db-sync-lgs5l_openstack(a8842e89-ec2d-4601-9e82-b12c1982a910): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 23:01:07 crc kubenswrapper[4696]: E1202 23:01:07.853981 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/glance-db-sync-lgs5l" podUID="a8842e89-ec2d-4601-9e82-b12c1982a910" Dec 02 23:01:08 crc kubenswrapper[4696]: E1202 23:01:08.270324 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-glance-api:current-podified\\\"\"" pod="openstack/glance-db-sync-lgs5l" podUID="a8842e89-ec2d-4601-9e82-b12c1982a910" Dec 02 23:01:08 crc kubenswrapper[4696]: I1202 23:01:08.396159 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 02 23:01:08 crc kubenswrapper[4696]: I1202 23:01:08.468584 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ab2c851f-258a-4469-a351-d04930617bdc\") pod \"f5b9aee9-4e9a-4d60-be32-f25d230622bc\" (UID: \"f5b9aee9-4e9a-4d60-be32-f25d230622bc\") " Dec 02 23:01:08 crc kubenswrapper[4696]: I1202 23:01:08.468633 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f5b9aee9-4e9a-4d60-be32-f25d230622bc-config-out\") pod \"f5b9aee9-4e9a-4d60-be32-f25d230622bc\" (UID: \"f5b9aee9-4e9a-4d60-be32-f25d230622bc\") " Dec 02 23:01:08 crc kubenswrapper[4696]: I1202 23:01:08.468660 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f5b9aee9-4e9a-4d60-be32-f25d230622bc-prometheus-metric-storage-rulefiles-0\") pod \"f5b9aee9-4e9a-4d60-be32-f25d230622bc\" (UID: \"f5b9aee9-4e9a-4d60-be32-f25d230622bc\") " Dec 02 23:01:08 crc kubenswrapper[4696]: I1202 23:01:08.468694 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f5b9aee9-4e9a-4d60-be32-f25d230622bc-web-config\") pod \"f5b9aee9-4e9a-4d60-be32-f25d230622bc\" (UID: \"f5b9aee9-4e9a-4d60-be32-f25d230622bc\") " Dec 02 23:01:08 crc kubenswrapper[4696]: I1202 23:01:08.468757 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f5b9aee9-4e9a-4d60-be32-f25d230622bc-tls-assets\") pod \"f5b9aee9-4e9a-4d60-be32-f25d230622bc\" (UID: \"f5b9aee9-4e9a-4d60-be32-f25d230622bc\") " Dec 02 23:01:08 crc kubenswrapper[4696]: I1202 23:01:08.468790 4696 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f5b9aee9-4e9a-4d60-be32-f25d230622bc-thanos-prometheus-http-client-file\") pod \"f5b9aee9-4e9a-4d60-be32-f25d230622bc\" (UID: \"f5b9aee9-4e9a-4d60-be32-f25d230622bc\") " Dec 02 23:01:08 crc kubenswrapper[4696]: I1202 23:01:08.468827 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4f75\" (UniqueName: \"kubernetes.io/projected/f5b9aee9-4e9a-4d60-be32-f25d230622bc-kube-api-access-w4f75\") pod \"f5b9aee9-4e9a-4d60-be32-f25d230622bc\" (UID: \"f5b9aee9-4e9a-4d60-be32-f25d230622bc\") " Dec 02 23:01:08 crc kubenswrapper[4696]: I1202 23:01:08.468886 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f5b9aee9-4e9a-4d60-be32-f25d230622bc-config\") pod \"f5b9aee9-4e9a-4d60-be32-f25d230622bc\" (UID: \"f5b9aee9-4e9a-4d60-be32-f25d230622bc\") " Dec 02 23:01:08 crc kubenswrapper[4696]: I1202 23:01:08.470875 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5b9aee9-4e9a-4d60-be32-f25d230622bc-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "f5b9aee9-4e9a-4d60-be32-f25d230622bc" (UID: "f5b9aee9-4e9a-4d60-be32-f25d230622bc"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:01:08 crc kubenswrapper[4696]: I1202 23:01:08.483695 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5b9aee9-4e9a-4d60-be32-f25d230622bc-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "f5b9aee9-4e9a-4d60-be32-f25d230622bc" (UID: "f5b9aee9-4e9a-4d60-be32-f25d230622bc"). InnerVolumeSpecName "tls-assets". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:01:08 crc kubenswrapper[4696]: I1202 23:01:08.491374 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5b9aee9-4e9a-4d60-be32-f25d230622bc-kube-api-access-w4f75" (OuterVolumeSpecName: "kube-api-access-w4f75") pod "f5b9aee9-4e9a-4d60-be32-f25d230622bc" (UID: "f5b9aee9-4e9a-4d60-be32-f25d230622bc"). InnerVolumeSpecName "kube-api-access-w4f75". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:01:08 crc kubenswrapper[4696]: I1202 23:01:08.493964 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5b9aee9-4e9a-4d60-be32-f25d230622bc-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "f5b9aee9-4e9a-4d60-be32-f25d230622bc" (UID: "f5b9aee9-4e9a-4d60-be32-f25d230622bc"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:01:08 crc kubenswrapper[4696]: I1202 23:01:08.498409 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5b9aee9-4e9a-4d60-be32-f25d230622bc-config" (OuterVolumeSpecName: "config") pod "f5b9aee9-4e9a-4d60-be32-f25d230622bc" (UID: "f5b9aee9-4e9a-4d60-be32-f25d230622bc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:01:08 crc kubenswrapper[4696]: I1202 23:01:08.507867 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f5b9aee9-4e9a-4d60-be32-f25d230622bc-config-out" (OuterVolumeSpecName: "config-out") pod "f5b9aee9-4e9a-4d60-be32-f25d230622bc" (UID: "f5b9aee9-4e9a-4d60-be32-f25d230622bc"). InnerVolumeSpecName "config-out". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:01:08 crc kubenswrapper[4696]: I1202 23:01:08.515176 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ab2c851f-258a-4469-a351-d04930617bdc" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "f5b9aee9-4e9a-4d60-be32-f25d230622bc" (UID: "f5b9aee9-4e9a-4d60-be32-f25d230622bc"). InnerVolumeSpecName "pvc-ab2c851f-258a-4469-a351-d04930617bdc". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 02 23:01:08 crc kubenswrapper[4696]: I1202 23:01:08.540791 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5b9aee9-4e9a-4d60-be32-f25d230622bc-web-config" (OuterVolumeSpecName: "web-config") pod "f5b9aee9-4e9a-4d60-be32-f25d230622bc" (UID: "f5b9aee9-4e9a-4d60-be32-f25d230622bc"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:01:08 crc kubenswrapper[4696]: I1202 23:01:08.571770 4696 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-ab2c851f-258a-4469-a351-d04930617bdc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ab2c851f-258a-4469-a351-d04930617bdc\") on node \"crc\" " Dec 02 23:01:08 crc kubenswrapper[4696]: I1202 23:01:08.571819 4696 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f5b9aee9-4e9a-4d60-be32-f25d230622bc-config-out\") on node \"crc\" DevicePath \"\"" Dec 02 23:01:08 crc kubenswrapper[4696]: I1202 23:01:08.571848 4696 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f5b9aee9-4e9a-4d60-be32-f25d230622bc-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Dec 02 23:01:08 crc kubenswrapper[4696]: I1202 23:01:08.571861 4696 reconciler_common.go:293] "Volume detached for volume 
\"web-config\" (UniqueName: \"kubernetes.io/secret/f5b9aee9-4e9a-4d60-be32-f25d230622bc-web-config\") on node \"crc\" DevicePath \"\"" Dec 02 23:01:08 crc kubenswrapper[4696]: I1202 23:01:08.571872 4696 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f5b9aee9-4e9a-4d60-be32-f25d230622bc-tls-assets\") on node \"crc\" DevicePath \"\"" Dec 02 23:01:08 crc kubenswrapper[4696]: I1202 23:01:08.571882 4696 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f5b9aee9-4e9a-4d60-be32-f25d230622bc-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Dec 02 23:01:08 crc kubenswrapper[4696]: I1202 23:01:08.571895 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4f75\" (UniqueName: \"kubernetes.io/projected/f5b9aee9-4e9a-4d60-be32-f25d230622bc-kube-api-access-w4f75\") on node \"crc\" DevicePath \"\"" Dec 02 23:01:08 crc kubenswrapper[4696]: I1202 23:01:08.571906 4696 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/f5b9aee9-4e9a-4d60-be32-f25d230622bc-config\") on node \"crc\" DevicePath \"\"" Dec 02 23:01:08 crc kubenswrapper[4696]: I1202 23:01:08.595297 4696 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Dec 02 23:01:08 crc kubenswrapper[4696]: I1202 23:01:08.596152 4696 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-ab2c851f-258a-4469-a351-d04930617bdc" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ab2c851f-258a-4469-a351-d04930617bdc") on node "crc" Dec 02 23:01:08 crc kubenswrapper[4696]: I1202 23:01:08.672527 4696 reconciler_common.go:293] "Volume detached for volume \"pvc-ab2c851f-258a-4469-a351-d04930617bdc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ab2c851f-258a-4469-a351-d04930617bdc\") on node \"crc\" DevicePath \"\"" Dec 02 23:01:08 crc kubenswrapper[4696]: I1202 23:01:08.694865 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-vjwbb"] Dec 02 23:01:08 crc kubenswrapper[4696]: W1202 23:01:08.710357 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4707bfe2_6206_4fac_b146_f95317884325.slice/crio-cf77c5dbbb86bb4339e10ba25e9821158f51c50cbf00d5cf95f805e6cc2a55fa WatchSource:0}: Error finding container cf77c5dbbb86bb4339e10ba25e9821158f51c50cbf00d5cf95f805e6cc2a55fa: Status 404 returned error can't find the container with id cf77c5dbbb86bb4339e10ba25e9821158f51c50cbf00d5cf95f805e6cc2a55fa Dec 02 23:01:08 crc kubenswrapper[4696]: I1202 23:01:08.771930 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-5eb9-account-create-update-4prpl"] Dec 02 23:01:08 crc kubenswrapper[4696]: I1202 23:01:08.783169 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-sync-vfql6"] Dec 02 23:01:08 crc kubenswrapper[4696]: I1202 23:01:08.789929 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-hf7vx"] Dec 02 23:01:08 crc kubenswrapper[4696]: I1202 23:01:08.915290 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-b4df-account-create-update-hxxpx"] Dec 02 23:01:08 crc 
kubenswrapper[4696]: I1202 23:01:08.939108 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-47n98"] Dec 02 23:01:08 crc kubenswrapper[4696]: I1202 23:01:08.955503 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-bdhm9"] Dec 02 23:01:08 crc kubenswrapper[4696]: I1202 23:01:08.961733 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-d886-account-create-update-6dn5p"] Dec 02 23:01:09 crc kubenswrapper[4696]: W1202 23:01:09.041461 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod709610e7_e445_4b43_9941_7c08653d3278.slice/crio-028c9375f5cc92ab86a14972edc81c81f838e00f7f21ca3cd30d5cf2b14df27c WatchSource:0}: Error finding container 028c9375f5cc92ab86a14972edc81c81f838e00f7f21ca3cd30d5cf2b14df27c: Status 404 returned error can't find the container with id 028c9375f5cc92ab86a14972edc81c81f838e00f7f21ca3cd30d5cf2b14df27c Dec 02 23:01:09 crc kubenswrapper[4696]: I1202 23:01:09.278401 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-47n98" event={"ID":"a69316d7-55c1-4852-b509-2b3b995fae3a","Type":"ContainerStarted","Data":"0e04ced7d68e2423b55dfd38cd2b2f0fce7fe411e0f435246ae362efd644781e"} Dec 02 23:01:09 crc kubenswrapper[4696]: I1202 23:01:09.286557 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-b4df-account-create-update-hxxpx" event={"ID":"aa1578e8-97dc-4536-bfda-39825192b676","Type":"ContainerStarted","Data":"0fcb6381d46567171a70697d1ff90cfb661df0b2a41a26adf10105fc72b475b1"} Dec 02 23:01:09 crc kubenswrapper[4696]: I1202 23:01:09.287874 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-vjwbb" event={"ID":"4707bfe2-6206-4fac-b146-f95317884325","Type":"ContainerStarted","Data":"cf77c5dbbb86bb4339e10ba25e9821158f51c50cbf00d5cf95f805e6cc2a55fa"} Dec 02 23:01:09 crc 
kubenswrapper[4696]: I1202 23:01:09.289183 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-bdhm9" event={"ID":"bf1ead2b-f60b-444f-96cd-a9d6cbf89919","Type":"ContainerStarted","Data":"32b4dd3d042796d9b21188ef3ea9ba1a742e83adbb23d68a4ebd6b216ecd4671"} Dec 02 23:01:09 crc kubenswrapper[4696]: I1202 23:01:09.305936 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"f5b9aee9-4e9a-4d60-be32-f25d230622bc","Type":"ContainerDied","Data":"ed87a9917924ecfc5f3b8f8bbb6a8494314470b03a6c2a43f1bc5b1b189531e8"} Dec 02 23:01:09 crc kubenswrapper[4696]: I1202 23:01:09.306010 4696 scope.go:117] "RemoveContainer" containerID="3091c57e10ca01574704dc4fa3fc645cf3fcfec3d4ac7ce3dac91506c74d67cb" Dec 02 23:01:09 crc kubenswrapper[4696]: I1202 23:01:09.306189 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 02 23:01:09 crc kubenswrapper[4696]: I1202 23:01:09.309995 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-hf7vx" event={"ID":"709610e7-e445-4b43-9941-7c08653d3278","Type":"ContainerStarted","Data":"028c9375f5cc92ab86a14972edc81c81f838e00f7f21ca3cd30d5cf2b14df27c"} Dec 02 23:01:09 crc kubenswrapper[4696]: I1202 23:01:09.311769 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-d886-account-create-update-6dn5p" event={"ID":"f5e60ece-976f-4c14-a4cc-321e29bd5826","Type":"ContainerStarted","Data":"d49761fc603fd5bea9d809d64eba7605f48003107a6937049cc53fd5dd76659d"} Dec 02 23:01:09 crc kubenswrapper[4696]: I1202 23:01:09.314090 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-vfql6" event={"ID":"5fa922b0-7645-420b-bd91-ac4f5040d61b","Type":"ContainerStarted","Data":"7e57001effe048ee15efd4e3ad44b5665ce7fdbdc35c46d6e38bef06c6581daa"} Dec 02 23:01:09 crc kubenswrapper[4696]: I1202 23:01:09.315407 4696 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/barbican-5eb9-account-create-update-4prpl" event={"ID":"3149957b-27a4-43d6-af45-b585270f4d47","Type":"ContainerStarted","Data":"9ea76ad7310dd3f892f96a43e4c0372843cc7b08d72b2d1f6ec5fc99e3746fde"} Dec 02 23:01:09 crc kubenswrapper[4696]: I1202 23:01:09.353781 4696 scope.go:117] "RemoveContainer" containerID="2b2b519ef6641bf42ed0ee745a4bd559ee5d2b95ebc0d16e2c978cf423dfa65e" Dec 02 23:01:09 crc kubenswrapper[4696]: I1202 23:01:09.377982 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 02 23:01:09 crc kubenswrapper[4696]: I1202 23:01:09.388560 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 02 23:01:09 crc kubenswrapper[4696]: I1202 23:01:09.414898 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 02 23:01:09 crc kubenswrapper[4696]: E1202 23:01:09.415345 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5b9aee9-4e9a-4d60-be32-f25d230622bc" containerName="config-reloader" Dec 02 23:01:09 crc kubenswrapper[4696]: I1202 23:01:09.415358 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5b9aee9-4e9a-4d60-be32-f25d230622bc" containerName="config-reloader" Dec 02 23:01:09 crc kubenswrapper[4696]: E1202 23:01:09.415373 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5b9aee9-4e9a-4d60-be32-f25d230622bc" containerName="thanos-sidecar" Dec 02 23:01:09 crc kubenswrapper[4696]: I1202 23:01:09.415381 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5b9aee9-4e9a-4d60-be32-f25d230622bc" containerName="thanos-sidecar" Dec 02 23:01:09 crc kubenswrapper[4696]: E1202 23:01:09.415417 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5b9aee9-4e9a-4d60-be32-f25d230622bc" containerName="prometheus" Dec 02 23:01:09 crc kubenswrapper[4696]: I1202 23:01:09.415424 4696 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="f5b9aee9-4e9a-4d60-be32-f25d230622bc" containerName="prometheus" Dec 02 23:01:09 crc kubenswrapper[4696]: E1202 23:01:09.415437 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5b9aee9-4e9a-4d60-be32-f25d230622bc" containerName="init-config-reloader" Dec 02 23:01:09 crc kubenswrapper[4696]: I1202 23:01:09.415444 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5b9aee9-4e9a-4d60-be32-f25d230622bc" containerName="init-config-reloader" Dec 02 23:01:09 crc kubenswrapper[4696]: I1202 23:01:09.415648 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5b9aee9-4e9a-4d60-be32-f25d230622bc" containerName="prometheus" Dec 02 23:01:09 crc kubenswrapper[4696]: I1202 23:01:09.415669 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5b9aee9-4e9a-4d60-be32-f25d230622bc" containerName="thanos-sidecar" Dec 02 23:01:09 crc kubenswrapper[4696]: I1202 23:01:09.415685 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5b9aee9-4e9a-4d60-be32-f25d230622bc" containerName="config-reloader" Dec 02 23:01:09 crc kubenswrapper[4696]: I1202 23:01:09.423831 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 02 23:01:09 crc kubenswrapper[4696]: I1202 23:01:09.426925 4696 scope.go:117] "RemoveContainer" containerID="32033a2cdc8344e3cd2044fa00f0ba27b75e3fdbdb681cf0180638376133846b" Dec 02 23:01:09 crc kubenswrapper[4696]: I1202 23:01:09.428240 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Dec 02 23:01:09 crc kubenswrapper[4696]: I1202 23:01:09.428291 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-2dcqn" Dec 02 23:01:09 crc kubenswrapper[4696]: I1202 23:01:09.428532 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Dec 02 23:01:09 crc kubenswrapper[4696]: I1202 23:01:09.429649 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Dec 02 23:01:09 crc kubenswrapper[4696]: I1202 23:01:09.429906 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Dec 02 23:01:09 crc kubenswrapper[4696]: I1202 23:01:09.430065 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-metric-storage-prometheus-svc" Dec 02 23:01:09 crc kubenswrapper[4696]: I1202 23:01:09.468176 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Dec 02 23:01:09 crc kubenswrapper[4696]: I1202 23:01:09.498452 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5b9aee9-4e9a-4d60-be32-f25d230622bc" path="/var/lib/kubelet/pods/f5b9aee9-4e9a-4d60-be32-f25d230622bc/volumes" Dec 02 23:01:09 crc kubenswrapper[4696]: I1202 23:01:09.500173 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 02 23:01:09 crc kubenswrapper[4696]: I1202 
23:01:09.568112 4696 scope.go:117] "RemoveContainer" containerID="61faf3799cade00d76f6f556cecd8a6d1e003ea4b965e0f6b5ec2c6dd2210bee" Dec 02 23:01:09 crc kubenswrapper[4696]: I1202 23:01:09.589956 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-ab2c851f-258a-4469-a351-d04930617bdc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ab2c851f-258a-4469-a351-d04930617bdc\") pod \"prometheus-metric-storage-0\" (UID: \"f684370c-9731-4837-9e30-675a1f07992d\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:01:09 crc kubenswrapper[4696]: I1202 23:01:09.590033 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/f684370c-9731-4837-9e30-675a1f07992d-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"f684370c-9731-4837-9e30-675a1f07992d\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:01:09 crc kubenswrapper[4696]: I1202 23:01:09.590131 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tnnv\" (UniqueName: \"kubernetes.io/projected/f684370c-9731-4837-9e30-675a1f07992d-kube-api-access-2tnnv\") pod \"prometheus-metric-storage-0\" (UID: \"f684370c-9731-4837-9e30-675a1f07992d\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:01:09 crc kubenswrapper[4696]: I1202 23:01:09.590200 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/f684370c-9731-4837-9e30-675a1f07992d-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"f684370c-9731-4837-9e30-675a1f07992d\") " 
pod="openstack/prometheus-metric-storage-0" Dec 02 23:01:09 crc kubenswrapper[4696]: I1202 23:01:09.590223 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f684370c-9731-4837-9e30-675a1f07992d-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"f684370c-9731-4837-9e30-675a1f07992d\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:01:09 crc kubenswrapper[4696]: I1202 23:01:09.590275 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f684370c-9731-4837-9e30-675a1f07992d-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"f684370c-9731-4837-9e30-675a1f07992d\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:01:09 crc kubenswrapper[4696]: I1202 23:01:09.590303 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f684370c-9731-4837-9e30-675a1f07992d-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"f684370c-9731-4837-9e30-675a1f07992d\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:01:09 crc kubenswrapper[4696]: I1202 23:01:09.590406 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f684370c-9731-4837-9e30-675a1f07992d-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"f684370c-9731-4837-9e30-675a1f07992d\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:01:09 crc kubenswrapper[4696]: I1202 23:01:09.590457 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f684370c-9731-4837-9e30-675a1f07992d-tls-assets\") pod 
\"prometheus-metric-storage-0\" (UID: \"f684370c-9731-4837-9e30-675a1f07992d\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:01:09 crc kubenswrapper[4696]: I1202 23:01:09.590509 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f684370c-9731-4837-9e30-675a1f07992d-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"f684370c-9731-4837-9e30-675a1f07992d\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:01:09 crc kubenswrapper[4696]: I1202 23:01:09.590675 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f684370c-9731-4837-9e30-675a1f07992d-config\") pod \"prometheus-metric-storage-0\" (UID: \"f684370c-9731-4837-9e30-675a1f07992d\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:01:09 crc kubenswrapper[4696]: I1202 23:01:09.692999 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f684370c-9731-4837-9e30-675a1f07992d-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"f684370c-9731-4837-9e30-675a1f07992d\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:01:09 crc kubenswrapper[4696]: I1202 23:01:09.693108 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f684370c-9731-4837-9e30-675a1f07992d-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"f684370c-9731-4837-9e30-675a1f07992d\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:01:09 crc kubenswrapper[4696]: I1202 23:01:09.693174 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/f684370c-9731-4837-9e30-675a1f07992d-config\") pod \"prometheus-metric-storage-0\" (UID: \"f684370c-9731-4837-9e30-675a1f07992d\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:01:09 crc kubenswrapper[4696]: I1202 23:01:09.693216 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-ab2c851f-258a-4469-a351-d04930617bdc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ab2c851f-258a-4469-a351-d04930617bdc\") pod \"prometheus-metric-storage-0\" (UID: \"f684370c-9731-4837-9e30-675a1f07992d\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:01:09 crc kubenswrapper[4696]: I1202 23:01:09.693247 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/f684370c-9731-4837-9e30-675a1f07992d-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"f684370c-9731-4837-9e30-675a1f07992d\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:01:09 crc kubenswrapper[4696]: I1202 23:01:09.693292 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2tnnv\" (UniqueName: \"kubernetes.io/projected/f684370c-9731-4837-9e30-675a1f07992d-kube-api-access-2tnnv\") pod \"prometheus-metric-storage-0\" (UID: \"f684370c-9731-4837-9e30-675a1f07992d\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:01:09 crc kubenswrapper[4696]: I1202 23:01:09.693324 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/f684370c-9731-4837-9e30-675a1f07992d-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"f684370c-9731-4837-9e30-675a1f07992d\") " pod="openstack/prometheus-metric-storage-0" Dec 
02 23:01:09 crc kubenswrapper[4696]: I1202 23:01:09.693345 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f684370c-9731-4837-9e30-675a1f07992d-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"f684370c-9731-4837-9e30-675a1f07992d\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:01:09 crc kubenswrapper[4696]: I1202 23:01:09.693371 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f684370c-9731-4837-9e30-675a1f07992d-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"f684370c-9731-4837-9e30-675a1f07992d\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:01:09 crc kubenswrapper[4696]: I1202 23:01:09.693393 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f684370c-9731-4837-9e30-675a1f07992d-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"f684370c-9731-4837-9e30-675a1f07992d\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:01:09 crc kubenswrapper[4696]: I1202 23:01:09.693428 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f684370c-9731-4837-9e30-675a1f07992d-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"f684370c-9731-4837-9e30-675a1f07992d\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:01:09 crc kubenswrapper[4696]: I1202 23:01:09.694825 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f684370c-9731-4837-9e30-675a1f07992d-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"f684370c-9731-4837-9e30-675a1f07992d\") " 
pod="openstack/prometheus-metric-storage-0" Dec 02 23:01:09 crc kubenswrapper[4696]: I1202 23:01:09.703341 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f684370c-9731-4837-9e30-675a1f07992d-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"f684370c-9731-4837-9e30-675a1f07992d\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:01:09 crc kubenswrapper[4696]: I1202 23:01:09.704621 4696 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 02 23:01:09 crc kubenswrapper[4696]: I1202 23:01:09.704931 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f684370c-9731-4837-9e30-675a1f07992d-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"f684370c-9731-4837-9e30-675a1f07992d\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:01:09 crc kubenswrapper[4696]: I1202 23:01:09.705522 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/f684370c-9731-4837-9e30-675a1f07992d-config\") pod \"prometheus-metric-storage-0\" (UID: \"f684370c-9731-4837-9e30-675a1f07992d\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:01:09 crc kubenswrapper[4696]: I1202 23:01:09.705564 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/f684370c-9731-4837-9e30-675a1f07992d-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"f684370c-9731-4837-9e30-675a1f07992d\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:01:09 crc kubenswrapper[4696]: I1202 23:01:09.705777 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f684370c-9731-4837-9e30-675a1f07992d-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"f684370c-9731-4837-9e30-675a1f07992d\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:01:09 crc kubenswrapper[4696]: I1202 23:01:09.705841 4696 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-ab2c851f-258a-4469-a351-d04930617bdc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ab2c851f-258a-4469-a351-d04930617bdc\") pod \"prometheus-metric-storage-0\" (UID: \"f684370c-9731-4837-9e30-675a1f07992d\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/5fca820ef489b1541daddc3ea9aef396303f957b46bb94f2636f2ae9edc8d588/globalmount\"" pod="openstack/prometheus-metric-storage-0" Dec 02 23:01:09 crc kubenswrapper[4696]: I1202 23:01:09.706802 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/f684370c-9731-4837-9e30-675a1f07992d-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"f684370c-9731-4837-9e30-675a1f07992d\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:01:09 crc kubenswrapper[4696]: I1202 23:01:09.707536 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f684370c-9731-4837-9e30-675a1f07992d-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"f684370c-9731-4837-9e30-675a1f07992d\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:01:09 crc kubenswrapper[4696]: I1202 23:01:09.713864 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f684370c-9731-4837-9e30-675a1f07992d-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: 
\"f684370c-9731-4837-9e30-675a1f07992d\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:01:09 crc kubenswrapper[4696]: I1202 23:01:09.718025 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tnnv\" (UniqueName: \"kubernetes.io/projected/f684370c-9731-4837-9e30-675a1f07992d-kube-api-access-2tnnv\") pod \"prometheus-metric-storage-0\" (UID: \"f684370c-9731-4837-9e30-675a1f07992d\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:01:09 crc kubenswrapper[4696]: I1202 23:01:09.784456 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-ab2c851f-258a-4469-a351-d04930617bdc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ab2c851f-258a-4469-a351-d04930617bdc\") pod \"prometheus-metric-storage-0\" (UID: \"f684370c-9731-4837-9e30-675a1f07992d\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:01:10 crc kubenswrapper[4696]: I1202 23:01:10.085805 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 02 23:01:10 crc kubenswrapper[4696]: I1202 23:01:10.342028 4696 generic.go:334] "Generic (PLEG): container finished" podID="f5e60ece-976f-4c14-a4cc-321e29bd5826" containerID="5c987e2e45ceb6a881b3f8609fee075f72d0b13986a6ae7e0939be0ea398335e" exitCode=0 Dec 02 23:01:10 crc kubenswrapper[4696]: I1202 23:01:10.342145 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-d886-account-create-update-6dn5p" event={"ID":"f5e60ece-976f-4c14-a4cc-321e29bd5826","Type":"ContainerDied","Data":"5c987e2e45ceb6a881b3f8609fee075f72d0b13986a6ae7e0939be0ea398335e"} Dec 02 23:01:10 crc kubenswrapper[4696]: I1202 23:01:10.351917 4696 generic.go:334] "Generic (PLEG): container finished" podID="4707bfe2-6206-4fac-b146-f95317884325" containerID="3cd406fc581ece9308df691447996117d1226112d3f2f59cc61fee0f48b02e4c" exitCode=0 Dec 02 23:01:10 crc kubenswrapper[4696]: I1202 23:01:10.352028 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-vjwbb" event={"ID":"4707bfe2-6206-4fac-b146-f95317884325","Type":"ContainerDied","Data":"3cd406fc581ece9308df691447996117d1226112d3f2f59cc61fee0f48b02e4c"} Dec 02 23:01:10 crc kubenswrapper[4696]: I1202 23:01:10.354308 4696 generic.go:334] "Generic (PLEG): container finished" podID="3149957b-27a4-43d6-af45-b585270f4d47" containerID="ebf5af6ff39872020da9e7c193e0c2efaf73efcd61c553070fe0bc08a934d16b" exitCode=0 Dec 02 23:01:10 crc kubenswrapper[4696]: I1202 23:01:10.354364 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-5eb9-account-create-update-4prpl" event={"ID":"3149957b-27a4-43d6-af45-b585270f4d47","Type":"ContainerDied","Data":"ebf5af6ff39872020da9e7c193e0c2efaf73efcd61c553070fe0bc08a934d16b"} Dec 02 23:01:10 crc kubenswrapper[4696]: I1202 23:01:10.358693 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"d5594f21-8f1d-4105-ad47-c065a9fc468b","Type":"ContainerStarted","Data":"a99468e1b31e853ee56d4cf47607bfcde20a0ff8c79fe7c780e01c5a60e1a151"} Dec 02 23:01:10 crc kubenswrapper[4696]: I1202 23:01:10.358768 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d5594f21-8f1d-4105-ad47-c065a9fc468b","Type":"ContainerStarted","Data":"56a320e2aca846b7a22c99806af0a8b9f6a4070a058dc96af1e16aae08f7d4c9"} Dec 02 23:01:10 crc kubenswrapper[4696]: I1202 23:01:10.366213 4696 generic.go:334] "Generic (PLEG): container finished" podID="a69316d7-55c1-4852-b509-2b3b995fae3a" containerID="14e2ff1d587d5dc37844edb204090172d212f3c8e987fb3512207512ba097cec" exitCode=0 Dec 02 23:01:10 crc kubenswrapper[4696]: I1202 23:01:10.366466 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-47n98" event={"ID":"a69316d7-55c1-4852-b509-2b3b995fae3a","Type":"ContainerDied","Data":"14e2ff1d587d5dc37844edb204090172d212f3c8e987fb3512207512ba097cec"} Dec 02 23:01:10 crc kubenswrapper[4696]: I1202 23:01:10.373344 4696 generic.go:334] "Generic (PLEG): container finished" podID="aa1578e8-97dc-4536-bfda-39825192b676" containerID="81299c52c8dd2af04d1d83d48a41374885f052d75ba47c70ff000165f3667faa" exitCode=0 Dec 02 23:01:10 crc kubenswrapper[4696]: I1202 23:01:10.373455 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-b4df-account-create-update-hxxpx" event={"ID":"aa1578e8-97dc-4536-bfda-39825192b676","Type":"ContainerDied","Data":"81299c52c8dd2af04d1d83d48a41374885f052d75ba47c70ff000165f3667faa"} Dec 02 23:01:10 crc kubenswrapper[4696]: I1202 23:01:10.375377 4696 generic.go:334] "Generic (PLEG): container finished" podID="bf1ead2b-f60b-444f-96cd-a9d6cbf89919" containerID="baf5cc6e3eed3ed647e0c95eb45b61ad1d29ec0d54fd4a87984a96e398a000b2" exitCode=0 Dec 02 23:01:10 crc kubenswrapper[4696]: I1202 23:01:10.375407 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-db-create-bdhm9" event={"ID":"bf1ead2b-f60b-444f-96cd-a9d6cbf89919","Type":"ContainerDied","Data":"baf5cc6e3eed3ed647e0c95eb45b61ad1d29ec0d54fd4a87984a96e398a000b2"} Dec 02 23:01:10 crc kubenswrapper[4696]: I1202 23:01:10.715842 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 02 23:01:11 crc kubenswrapper[4696]: I1202 23:01:11.390877 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"f684370c-9731-4837-9e30-675a1f07992d","Type":"ContainerStarted","Data":"099099debcd5a303a0f4db0ae5c135cd96e22da621e12a050538b8ee97ebaa9a"} Dec 02 23:01:11 crc kubenswrapper[4696]: I1202 23:01:11.394758 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d5594f21-8f1d-4105-ad47-c065a9fc468b","Type":"ContainerStarted","Data":"8f020237c27e3de00c5e5be071fb18c87515adfa3ae37530b2a4d188155f1c4a"} Dec 02 23:01:11 crc kubenswrapper[4696]: I1202 23:01:11.394878 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d5594f21-8f1d-4105-ad47-c065a9fc468b","Type":"ContainerStarted","Data":"064d785203b60959ab7ad1d37d6574d2b4ad93a329ae41e8bfda5f959a8920af"} Dec 02 23:01:12 crc kubenswrapper[4696]: I1202 23:01:12.415440 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-47n98" event={"ID":"a69316d7-55c1-4852-b509-2b3b995fae3a","Type":"ContainerDied","Data":"0e04ced7d68e2423b55dfd38cd2b2f0fce7fe411e0f435246ae362efd644781e"} Dec 02 23:01:12 crc kubenswrapper[4696]: I1202 23:01:12.415850 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0e04ced7d68e2423b55dfd38cd2b2f0fce7fe411e0f435246ae362efd644781e" Dec 02 23:01:12 crc kubenswrapper[4696]: I1202 23:01:12.418204 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-b4df-account-create-update-hxxpx" 
event={"ID":"aa1578e8-97dc-4536-bfda-39825192b676","Type":"ContainerDied","Data":"0fcb6381d46567171a70697d1ff90cfb661df0b2a41a26adf10105fc72b475b1"} Dec 02 23:01:12 crc kubenswrapper[4696]: I1202 23:01:12.418259 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0fcb6381d46567171a70697d1ff90cfb661df0b2a41a26adf10105fc72b475b1" Dec 02 23:01:12 crc kubenswrapper[4696]: I1202 23:01:12.420087 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-bdhm9" event={"ID":"bf1ead2b-f60b-444f-96cd-a9d6cbf89919","Type":"ContainerDied","Data":"32b4dd3d042796d9b21188ef3ea9ba1a742e83adbb23d68a4ebd6b216ecd4671"} Dec 02 23:01:12 crc kubenswrapper[4696]: I1202 23:01:12.420108 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="32b4dd3d042796d9b21188ef3ea9ba1a742e83adbb23d68a4ebd6b216ecd4671" Dec 02 23:01:12 crc kubenswrapper[4696]: I1202 23:01:12.421987 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-d886-account-create-update-6dn5p" event={"ID":"f5e60ece-976f-4c14-a4cc-321e29bd5826","Type":"ContainerDied","Data":"d49761fc603fd5bea9d809d64eba7605f48003107a6937049cc53fd5dd76659d"} Dec 02 23:01:12 crc kubenswrapper[4696]: I1202 23:01:12.422012 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d49761fc603fd5bea9d809d64eba7605f48003107a6937049cc53fd5dd76659d" Dec 02 23:01:12 crc kubenswrapper[4696]: I1202 23:01:12.423635 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-vjwbb" event={"ID":"4707bfe2-6206-4fac-b146-f95317884325","Type":"ContainerDied","Data":"cf77c5dbbb86bb4339e10ba25e9821158f51c50cbf00d5cf95f805e6cc2a55fa"} Dec 02 23:01:12 crc kubenswrapper[4696]: I1202 23:01:12.423656 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cf77c5dbbb86bb4339e10ba25e9821158f51c50cbf00d5cf95f805e6cc2a55fa" Dec 02 23:01:12 crc kubenswrapper[4696]: 
I1202 23:01:12.424918 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-5eb9-account-create-update-4prpl" event={"ID":"3149957b-27a4-43d6-af45-b585270f4d47","Type":"ContainerDied","Data":"9ea76ad7310dd3f892f96a43e4c0372843cc7b08d72b2d1f6ec5fc99e3746fde"} Dec 02 23:01:12 crc kubenswrapper[4696]: I1202 23:01:12.424935 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9ea76ad7310dd3f892f96a43e4c0372843cc7b08d72b2d1f6ec5fc99e3746fde" Dec 02 23:01:12 crc kubenswrapper[4696]: I1202 23:01:12.782895 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-5eb9-account-create-update-4prpl" Dec 02 23:01:12 crc kubenswrapper[4696]: I1202 23:01:12.791360 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-vjwbb" Dec 02 23:01:12 crc kubenswrapper[4696]: I1202 23:01:12.797708 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-b4df-account-create-update-hxxpx" Dec 02 23:01:12 crc kubenswrapper[4696]: I1202 23:01:12.808101 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-47n98" Dec 02 23:01:12 crc kubenswrapper[4696]: I1202 23:01:12.815033 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-d886-account-create-update-6dn5p" Dec 02 23:01:12 crc kubenswrapper[4696]: I1202 23:01:12.820670 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-bdhm9" Dec 02 23:01:12 crc kubenswrapper[4696]: I1202 23:01:12.886302 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3149957b-27a4-43d6-af45-b585270f4d47-operator-scripts\") pod \"3149957b-27a4-43d6-af45-b585270f4d47\" (UID: \"3149957b-27a4-43d6-af45-b585270f4d47\") " Dec 02 23:01:12 crc kubenswrapper[4696]: I1202 23:01:12.886441 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dkfv9\" (UniqueName: \"kubernetes.io/projected/3149957b-27a4-43d6-af45-b585270f4d47-kube-api-access-dkfv9\") pod \"3149957b-27a4-43d6-af45-b585270f4d47\" (UID: \"3149957b-27a4-43d6-af45-b585270f4d47\") " Dec 02 23:01:12 crc kubenswrapper[4696]: I1202 23:01:12.888505 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3149957b-27a4-43d6-af45-b585270f4d47-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3149957b-27a4-43d6-af45-b585270f4d47" (UID: "3149957b-27a4-43d6-af45-b585270f4d47"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:01:12 crc kubenswrapper[4696]: I1202 23:01:12.895566 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3149957b-27a4-43d6-af45-b585270f4d47-kube-api-access-dkfv9" (OuterVolumeSpecName: "kube-api-access-dkfv9") pod "3149957b-27a4-43d6-af45-b585270f4d47" (UID: "3149957b-27a4-43d6-af45-b585270f4d47"). InnerVolumeSpecName "kube-api-access-dkfv9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:01:12 crc kubenswrapper[4696]: I1202 23:01:12.988344 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aa1578e8-97dc-4536-bfda-39825192b676-operator-scripts\") pod \"aa1578e8-97dc-4536-bfda-39825192b676\" (UID: \"aa1578e8-97dc-4536-bfda-39825192b676\") " Dec 02 23:01:12 crc kubenswrapper[4696]: I1202 23:01:12.988453 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2scdc\" (UniqueName: \"kubernetes.io/projected/4707bfe2-6206-4fac-b146-f95317884325-kube-api-access-2scdc\") pod \"4707bfe2-6206-4fac-b146-f95317884325\" (UID: \"4707bfe2-6206-4fac-b146-f95317884325\") " Dec 02 23:01:12 crc kubenswrapper[4696]: I1202 23:01:12.989007 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa1578e8-97dc-4536-bfda-39825192b676-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "aa1578e8-97dc-4536-bfda-39825192b676" (UID: "aa1578e8-97dc-4536-bfda-39825192b676"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:01:12 crc kubenswrapper[4696]: I1202 23:01:12.989102 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dxhsr\" (UniqueName: \"kubernetes.io/projected/bf1ead2b-f60b-444f-96cd-a9d6cbf89919-kube-api-access-dxhsr\") pod \"bf1ead2b-f60b-444f-96cd-a9d6cbf89919\" (UID: \"bf1ead2b-f60b-444f-96cd-a9d6cbf89919\") " Dec 02 23:01:12 crc kubenswrapper[4696]: I1202 23:01:12.989390 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v7wwv\" (UniqueName: \"kubernetes.io/projected/f5e60ece-976f-4c14-a4cc-321e29bd5826-kube-api-access-v7wwv\") pod \"f5e60ece-976f-4c14-a4cc-321e29bd5826\" (UID: \"f5e60ece-976f-4c14-a4cc-321e29bd5826\") " Dec 02 23:01:12 crc kubenswrapper[4696]: I1202 23:01:12.989470 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf1ead2b-f60b-444f-96cd-a9d6cbf89919-operator-scripts\") pod \"bf1ead2b-f60b-444f-96cd-a9d6cbf89919\" (UID: \"bf1ead2b-f60b-444f-96cd-a9d6cbf89919\") " Dec 02 23:01:12 crc kubenswrapper[4696]: I1202 23:01:12.989662 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9k4nv\" (UniqueName: \"kubernetes.io/projected/aa1578e8-97dc-4536-bfda-39825192b676-kube-api-access-9k4nv\") pod \"aa1578e8-97dc-4536-bfda-39825192b676\" (UID: \"aa1578e8-97dc-4536-bfda-39825192b676\") " Dec 02 23:01:12 crc kubenswrapper[4696]: I1202 23:01:12.989733 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f5e60ece-976f-4c14-a4cc-321e29bd5826-operator-scripts\") pod \"f5e60ece-976f-4c14-a4cc-321e29bd5826\" (UID: \"f5e60ece-976f-4c14-a4cc-321e29bd5826\") " Dec 02 23:01:12 crc kubenswrapper[4696]: I1202 23:01:12.989798 4696 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-cxpl2\" (UniqueName: \"kubernetes.io/projected/a69316d7-55c1-4852-b509-2b3b995fae3a-kube-api-access-cxpl2\") pod \"a69316d7-55c1-4852-b509-2b3b995fae3a\" (UID: \"a69316d7-55c1-4852-b509-2b3b995fae3a\") " Dec 02 23:01:12 crc kubenswrapper[4696]: I1202 23:01:12.989904 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a69316d7-55c1-4852-b509-2b3b995fae3a-operator-scripts\") pod \"a69316d7-55c1-4852-b509-2b3b995fae3a\" (UID: \"a69316d7-55c1-4852-b509-2b3b995fae3a\") " Dec 02 23:01:12 crc kubenswrapper[4696]: I1202 23:01:12.989962 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4707bfe2-6206-4fac-b146-f95317884325-operator-scripts\") pod \"4707bfe2-6206-4fac-b146-f95317884325\" (UID: \"4707bfe2-6206-4fac-b146-f95317884325\") " Dec 02 23:01:12 crc kubenswrapper[4696]: I1202 23:01:12.990490 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5e60ece-976f-4c14-a4cc-321e29bd5826-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f5e60ece-976f-4c14-a4cc-321e29bd5826" (UID: "f5e60ece-976f-4c14-a4cc-321e29bd5826"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:01:12 crc kubenswrapper[4696]: I1202 23:01:12.990778 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf1ead2b-f60b-444f-96cd-a9d6cbf89919-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bf1ead2b-f60b-444f-96cd-a9d6cbf89919" (UID: "bf1ead2b-f60b-444f-96cd-a9d6cbf89919"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:01:12 crc kubenswrapper[4696]: I1202 23:01:12.990774 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a69316d7-55c1-4852-b509-2b3b995fae3a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a69316d7-55c1-4852-b509-2b3b995fae3a" (UID: "a69316d7-55c1-4852-b509-2b3b995fae3a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:01:12 crc kubenswrapper[4696]: I1202 23:01:12.990919 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4707bfe2-6206-4fac-b146-f95317884325-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4707bfe2-6206-4fac-b146-f95317884325" (UID: "4707bfe2-6206-4fac-b146-f95317884325"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:01:12 crc kubenswrapper[4696]: I1202 23:01:12.991509 4696 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf1ead2b-f60b-444f-96cd-a9d6cbf89919-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 23:01:12 crc kubenswrapper[4696]: I1202 23:01:12.991621 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dkfv9\" (UniqueName: \"kubernetes.io/projected/3149957b-27a4-43d6-af45-b585270f4d47-kube-api-access-dkfv9\") on node \"crc\" DevicePath \"\"" Dec 02 23:01:12 crc kubenswrapper[4696]: I1202 23:01:12.991638 4696 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f5e60ece-976f-4c14-a4cc-321e29bd5826-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 23:01:12 crc kubenswrapper[4696]: I1202 23:01:12.991669 4696 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/a69316d7-55c1-4852-b509-2b3b995fae3a-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 23:01:12 crc kubenswrapper[4696]: I1202 23:01:12.991679 4696 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4707bfe2-6206-4fac-b146-f95317884325-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 23:01:12 crc kubenswrapper[4696]: I1202 23:01:12.991688 4696 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aa1578e8-97dc-4536-bfda-39825192b676-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 23:01:12 crc kubenswrapper[4696]: I1202 23:01:12.991701 4696 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3149957b-27a4-43d6-af45-b585270f4d47-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 23:01:13 crc kubenswrapper[4696]: I1202 23:01:13.002624 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4707bfe2-6206-4fac-b146-f95317884325-kube-api-access-2scdc" (OuterVolumeSpecName: "kube-api-access-2scdc") pod "4707bfe2-6206-4fac-b146-f95317884325" (UID: "4707bfe2-6206-4fac-b146-f95317884325"). InnerVolumeSpecName "kube-api-access-2scdc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:01:13 crc kubenswrapper[4696]: I1202 23:01:13.003612 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf1ead2b-f60b-444f-96cd-a9d6cbf89919-kube-api-access-dxhsr" (OuterVolumeSpecName: "kube-api-access-dxhsr") pod "bf1ead2b-f60b-444f-96cd-a9d6cbf89919" (UID: "bf1ead2b-f60b-444f-96cd-a9d6cbf89919"). InnerVolumeSpecName "kube-api-access-dxhsr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:01:13 crc kubenswrapper[4696]: I1202 23:01:13.003691 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a69316d7-55c1-4852-b509-2b3b995fae3a-kube-api-access-cxpl2" (OuterVolumeSpecName: "kube-api-access-cxpl2") pod "a69316d7-55c1-4852-b509-2b3b995fae3a" (UID: "a69316d7-55c1-4852-b509-2b3b995fae3a"). InnerVolumeSpecName "kube-api-access-cxpl2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:01:13 crc kubenswrapper[4696]: I1202 23:01:13.003966 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa1578e8-97dc-4536-bfda-39825192b676-kube-api-access-9k4nv" (OuterVolumeSpecName: "kube-api-access-9k4nv") pod "aa1578e8-97dc-4536-bfda-39825192b676" (UID: "aa1578e8-97dc-4536-bfda-39825192b676"). InnerVolumeSpecName "kube-api-access-9k4nv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:01:13 crc kubenswrapper[4696]: I1202 23:01:13.004103 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5e60ece-976f-4c14-a4cc-321e29bd5826-kube-api-access-v7wwv" (OuterVolumeSpecName: "kube-api-access-v7wwv") pod "f5e60ece-976f-4c14-a4cc-321e29bd5826" (UID: "f5e60ece-976f-4c14-a4cc-321e29bd5826"). InnerVolumeSpecName "kube-api-access-v7wwv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:01:13 crc kubenswrapper[4696]: I1202 23:01:13.094662 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9k4nv\" (UniqueName: \"kubernetes.io/projected/aa1578e8-97dc-4536-bfda-39825192b676-kube-api-access-9k4nv\") on node \"crc\" DevicePath \"\"" Dec 02 23:01:13 crc kubenswrapper[4696]: I1202 23:01:13.094728 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cxpl2\" (UniqueName: \"kubernetes.io/projected/a69316d7-55c1-4852-b509-2b3b995fae3a-kube-api-access-cxpl2\") on node \"crc\" DevicePath \"\"" Dec 02 23:01:13 crc kubenswrapper[4696]: I1202 23:01:13.094774 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2scdc\" (UniqueName: \"kubernetes.io/projected/4707bfe2-6206-4fac-b146-f95317884325-kube-api-access-2scdc\") on node \"crc\" DevicePath \"\"" Dec 02 23:01:13 crc kubenswrapper[4696]: I1202 23:01:13.094794 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dxhsr\" (UniqueName: \"kubernetes.io/projected/bf1ead2b-f60b-444f-96cd-a9d6cbf89919-kube-api-access-dxhsr\") on node \"crc\" DevicePath \"\"" Dec 02 23:01:13 crc kubenswrapper[4696]: I1202 23:01:13.094813 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v7wwv\" (UniqueName: \"kubernetes.io/projected/f5e60ece-976f-4c14-a4cc-321e29bd5826-kube-api-access-v7wwv\") on node \"crc\" DevicePath \"\"" Dec 02 23:01:13 crc kubenswrapper[4696]: I1202 23:01:13.439313 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-b4df-account-create-update-hxxpx" Dec 02 23:01:13 crc kubenswrapper[4696]: I1202 23:01:13.439359 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-vjwbb" Dec 02 23:01:13 crc kubenswrapper[4696]: I1202 23:01:13.439406 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-5eb9-account-create-update-4prpl" Dec 02 23:01:13 crc kubenswrapper[4696]: I1202 23:01:13.439402 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-d886-account-create-update-6dn5p" Dec 02 23:01:13 crc kubenswrapper[4696]: I1202 23:01:13.439313 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-bdhm9" Dec 02 23:01:13 crc kubenswrapper[4696]: I1202 23:01:13.439514 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-47n98" Dec 02 23:01:14 crc kubenswrapper[4696]: I1202 23:01:14.450487 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"f684370c-9731-4837-9e30-675a1f07992d","Type":"ContainerStarted","Data":"5904a25317bb6f8f674fde8baca90ba8678adb0286936a3a753d35d329879616"} Dec 02 23:01:23 crc kubenswrapper[4696]: I1202 23:01:23.568909 4696 generic.go:334] "Generic (PLEG): container finished" podID="f684370c-9731-4837-9e30-675a1f07992d" containerID="5904a25317bb6f8f674fde8baca90ba8678adb0286936a3a753d35d329879616" exitCode=0 Dec 02 23:01:23 crc kubenswrapper[4696]: I1202 23:01:23.568993 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"f684370c-9731-4837-9e30-675a1f07992d","Type":"ContainerDied","Data":"5904a25317bb6f8f674fde8baca90ba8678adb0286936a3a753d35d329879616"} Dec 02 23:01:29 crc kubenswrapper[4696]: E1202 23:01:29.415190 4696 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-keystone:current-podified" Dec 02 23:01:29 crc kubenswrapper[4696]: E1202 23:01:29.416086 4696 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:keystone-db-sync,Image:quay.io/podified-antelope-centos9/openstack-keystone:current-podified,Command:[/bin/bash],Args:[-c keystone-manage db_sync],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/keystone/keystone.conf,SubPath:keystone.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hvnz7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42425,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42425,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-db-sync-hf7vx_openstack(709610e7-e445-4b43-9941-7c08653d3278): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 23:01:29 crc 
kubenswrapper[4696]: E1202 23:01:29.417217 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"keystone-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/keystone-db-sync-hf7vx" podUID="709610e7-e445-4b43-9941-7c08653d3278" Dec 02 23:01:29 crc kubenswrapper[4696]: E1202 23:01:29.644089 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"keystone-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-keystone:current-podified\\\"\"" pod="openstack/keystone-db-sync-hf7vx" podUID="709610e7-e445-4b43-9941-7c08653d3278" Dec 02 23:01:29 crc kubenswrapper[4696]: E1202 23:01:29.988433 4696 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.45:5001/podified-epoxy-centos9/openstack-watcher-api:watcher_latest" Dec 02 23:01:29 crc kubenswrapper[4696]: E1202 23:01:29.988524 4696 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.45:5001/podified-epoxy-centos9/openstack-watcher-api:watcher_latest" Dec 02 23:01:29 crc kubenswrapper[4696]: E1202 23:01:29.988791 4696 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:watcher-db-sync,Image:38.102.83.45:5001/podified-epoxy-centos9/openstack-watcher-api:watcher_latest,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/watcher/watcher.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:watcher-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-r8c79,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
watcher-db-sync-vfql6_openstack(5fa922b0-7645-420b-bd91-ac4f5040d61b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 23:01:29 crc kubenswrapper[4696]: E1202 23:01:29.990067 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/watcher-db-sync-vfql6" podUID="5fa922b0-7645-420b-bd91-ac4f5040d61b" Dec 02 23:01:30 crc kubenswrapper[4696]: E1202 23:01:30.659887 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.45:5001/podified-epoxy-centos9/openstack-watcher-api:watcher_latest\\\"\"" pod="openstack/watcher-db-sync-vfql6" podUID="5fa922b0-7645-420b-bd91-ac4f5040d61b" Dec 02 23:01:31 crc kubenswrapper[4696]: I1202 23:01:31.673290 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-lgs5l" event={"ID":"a8842e89-ec2d-4601-9e82-b12c1982a910","Type":"ContainerStarted","Data":"8d2f68f1b9bdf446cd938c5fe7b0e9ec244490eec567e2ea2327b48515be07ba"} Dec 02 23:01:31 crc kubenswrapper[4696]: I1202 23:01:31.685532 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d5594f21-8f1d-4105-ad47-c065a9fc468b","Type":"ContainerStarted","Data":"6ecb5442d16a641b881789853643bfc80d550577446cec80d5a2149344b2593d"} Dec 02 23:01:31 crc kubenswrapper[4696]: I1202 23:01:31.685584 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d5594f21-8f1d-4105-ad47-c065a9fc468b","Type":"ContainerStarted","Data":"0c3f84a1bf008015c11c66fe17e5d0fda3aae01c940d342f0684c45ce80b011a"} Dec 02 23:01:31 crc kubenswrapper[4696]: I1202 23:01:31.686442 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"d5594f21-8f1d-4105-ad47-c065a9fc468b","Type":"ContainerStarted","Data":"49665679a7e5fd281affe388fe17444ef524bb16b88ab493050ae85f1681fc7a"} Dec 02 23:01:31 crc kubenswrapper[4696]: I1202 23:01:31.686459 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d5594f21-8f1d-4105-ad47-c065a9fc468b","Type":"ContainerStarted","Data":"1ca435907b76923bf3701f4c19575c434929f4bc4e9c00331c2f1c9284b83dbd"} Dec 02 23:01:31 crc kubenswrapper[4696]: I1202 23:01:31.686702 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d5594f21-8f1d-4105-ad47-c065a9fc468b","Type":"ContainerStarted","Data":"8fbf04901bdf288c0a66d347d92bd76334a828de967effa9b3e4e06a10615fbd"} Dec 02 23:01:31 crc kubenswrapper[4696]: I1202 23:01:31.697205 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"f684370c-9731-4837-9e30-675a1f07992d","Type":"ContainerStarted","Data":"e9339fa31bcb4e462050530a6dd43d8e802377053cce141770f0f2fbcfb627e3"} Dec 02 23:01:31 crc kubenswrapper[4696]: I1202 23:01:31.697664 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-lgs5l" podStartSLOduration=4.201745254 podStartE2EDuration="44.69765458s" podCreationTimestamp="2025-12-02 23:00:47 +0000 UTC" firstStartedPulling="2025-12-02 23:00:49.547943286 +0000 UTC m=+1112.428623287" lastFinishedPulling="2025-12-02 23:01:30.043852612 +0000 UTC m=+1152.924532613" observedRunningTime="2025-12-02 23:01:31.695057066 +0000 UTC m=+1154.575737067" watchObservedRunningTime="2025-12-02 23:01:31.69765458 +0000 UTC m=+1154.578334581" Dec 02 23:01:32 crc kubenswrapper[4696]: I1202 23:01:32.717345 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d5594f21-8f1d-4105-ad47-c065a9fc468b","Type":"ContainerStarted","Data":"4a0d0427267cb017ddaf69612ecbf6558ab6bff4ae66d82205aab3b2e89bd7ee"} Dec 02 
23:01:32 crc kubenswrapper[4696]: I1202 23:01:32.717925 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d5594f21-8f1d-4105-ad47-c065a9fc468b","Type":"ContainerStarted","Data":"dcdfb42aa69684c024f694bfd3638969de312b33f517d7913c073855a1d2baa6"} Dec 02 23:01:32 crc kubenswrapper[4696]: I1202 23:01:32.769380 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=36.74836586 podStartE2EDuration="1m13.769355566s" podCreationTimestamp="2025-12-02 23:00:19 +0000 UTC" firstStartedPulling="2025-12-02 23:00:52.945764909 +0000 UTC m=+1115.826444910" lastFinishedPulling="2025-12-02 23:01:29.966754615 +0000 UTC m=+1152.847434616" observedRunningTime="2025-12-02 23:01:32.76628713 +0000 UTC m=+1155.646967131" watchObservedRunningTime="2025-12-02 23:01:32.769355566 +0000 UTC m=+1155.650035577" Dec 02 23:01:33 crc kubenswrapper[4696]: I1202 23:01:33.105702 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-fvvbx"] Dec 02 23:01:33 crc kubenswrapper[4696]: E1202 23:01:33.106557 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4707bfe2-6206-4fac-b146-f95317884325" containerName="mariadb-database-create" Dec 02 23:01:33 crc kubenswrapper[4696]: I1202 23:01:33.106573 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="4707bfe2-6206-4fac-b146-f95317884325" containerName="mariadb-database-create" Dec 02 23:01:33 crc kubenswrapper[4696]: E1202 23:01:33.106596 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa1578e8-97dc-4536-bfda-39825192b676" containerName="mariadb-account-create-update" Dec 02 23:01:33 crc kubenswrapper[4696]: I1202 23:01:33.106610 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa1578e8-97dc-4536-bfda-39825192b676" containerName="mariadb-account-create-update" Dec 02 23:01:33 crc kubenswrapper[4696]: E1202 23:01:33.106632 4696 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="f5e60ece-976f-4c14-a4cc-321e29bd5826" containerName="mariadb-account-create-update" Dec 02 23:01:33 crc kubenswrapper[4696]: I1202 23:01:33.106639 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5e60ece-976f-4c14-a4cc-321e29bd5826" containerName="mariadb-account-create-update" Dec 02 23:01:33 crc kubenswrapper[4696]: E1202 23:01:33.106828 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a69316d7-55c1-4852-b509-2b3b995fae3a" containerName="mariadb-database-create" Dec 02 23:01:33 crc kubenswrapper[4696]: I1202 23:01:33.106919 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="a69316d7-55c1-4852-b509-2b3b995fae3a" containerName="mariadb-database-create" Dec 02 23:01:33 crc kubenswrapper[4696]: E1202 23:01:33.107009 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf1ead2b-f60b-444f-96cd-a9d6cbf89919" containerName="mariadb-database-create" Dec 02 23:01:33 crc kubenswrapper[4696]: I1202 23:01:33.107020 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf1ead2b-f60b-444f-96cd-a9d6cbf89919" containerName="mariadb-database-create" Dec 02 23:01:33 crc kubenswrapper[4696]: E1202 23:01:33.107049 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3149957b-27a4-43d6-af45-b585270f4d47" containerName="mariadb-account-create-update" Dec 02 23:01:33 crc kubenswrapper[4696]: I1202 23:01:33.107056 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="3149957b-27a4-43d6-af45-b585270f4d47" containerName="mariadb-account-create-update" Dec 02 23:01:33 crc kubenswrapper[4696]: I1202 23:01:33.107962 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa1578e8-97dc-4536-bfda-39825192b676" containerName="mariadb-account-create-update" Dec 02 23:01:33 crc kubenswrapper[4696]: I1202 23:01:33.107996 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="3149957b-27a4-43d6-af45-b585270f4d47" 
containerName="mariadb-account-create-update" Dec 02 23:01:33 crc kubenswrapper[4696]: I1202 23:01:33.108006 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5e60ece-976f-4c14-a4cc-321e29bd5826" containerName="mariadb-account-create-update" Dec 02 23:01:33 crc kubenswrapper[4696]: I1202 23:01:33.108029 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf1ead2b-f60b-444f-96cd-a9d6cbf89919" containerName="mariadb-database-create" Dec 02 23:01:33 crc kubenswrapper[4696]: I1202 23:01:33.108052 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="4707bfe2-6206-4fac-b146-f95317884325" containerName="mariadb-database-create" Dec 02 23:01:33 crc kubenswrapper[4696]: I1202 23:01:33.108072 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="a69316d7-55c1-4852-b509-2b3b995fae3a" containerName="mariadb-database-create" Dec 02 23:01:33 crc kubenswrapper[4696]: I1202 23:01:33.111691 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-fvvbx" Dec 02 23:01:33 crc kubenswrapper[4696]: I1202 23:01:33.120983 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Dec 02 23:01:33 crc kubenswrapper[4696]: I1202 23:01:33.121619 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-fvvbx"] Dec 02 23:01:33 crc kubenswrapper[4696]: I1202 23:01:33.244913 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/236e2c20-fc39-4643-904a-ab015e8c73ec-dns-swift-storage-0\") pod \"dnsmasq-dns-5c79d794d7-fvvbx\" (UID: \"236e2c20-fc39-4643-904a-ab015e8c73ec\") " pod="openstack/dnsmasq-dns-5c79d794d7-fvvbx" Dec 02 23:01:33 crc kubenswrapper[4696]: I1202 23:01:33.244967 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ns2hc\" (UniqueName: \"kubernetes.io/projected/236e2c20-fc39-4643-904a-ab015e8c73ec-kube-api-access-ns2hc\") pod \"dnsmasq-dns-5c79d794d7-fvvbx\" (UID: \"236e2c20-fc39-4643-904a-ab015e8c73ec\") " pod="openstack/dnsmasq-dns-5c79d794d7-fvvbx" Dec 02 23:01:33 crc kubenswrapper[4696]: I1202 23:01:33.245070 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/236e2c20-fc39-4643-904a-ab015e8c73ec-ovsdbserver-nb\") pod \"dnsmasq-dns-5c79d794d7-fvvbx\" (UID: \"236e2c20-fc39-4643-904a-ab015e8c73ec\") " pod="openstack/dnsmasq-dns-5c79d794d7-fvvbx" Dec 02 23:01:33 crc kubenswrapper[4696]: I1202 23:01:33.245234 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/236e2c20-fc39-4643-904a-ab015e8c73ec-config\") pod \"dnsmasq-dns-5c79d794d7-fvvbx\" (UID: \"236e2c20-fc39-4643-904a-ab015e8c73ec\") " 
pod="openstack/dnsmasq-dns-5c79d794d7-fvvbx" Dec 02 23:01:33 crc kubenswrapper[4696]: I1202 23:01:33.245358 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/236e2c20-fc39-4643-904a-ab015e8c73ec-dns-svc\") pod \"dnsmasq-dns-5c79d794d7-fvvbx\" (UID: \"236e2c20-fc39-4643-904a-ab015e8c73ec\") " pod="openstack/dnsmasq-dns-5c79d794d7-fvvbx" Dec 02 23:01:33 crc kubenswrapper[4696]: I1202 23:01:33.245646 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/236e2c20-fc39-4643-904a-ab015e8c73ec-ovsdbserver-sb\") pod \"dnsmasq-dns-5c79d794d7-fvvbx\" (UID: \"236e2c20-fc39-4643-904a-ab015e8c73ec\") " pod="openstack/dnsmasq-dns-5c79d794d7-fvvbx" Dec 02 23:01:33 crc kubenswrapper[4696]: I1202 23:01:33.347702 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/236e2c20-fc39-4643-904a-ab015e8c73ec-dns-swift-storage-0\") pod \"dnsmasq-dns-5c79d794d7-fvvbx\" (UID: \"236e2c20-fc39-4643-904a-ab015e8c73ec\") " pod="openstack/dnsmasq-dns-5c79d794d7-fvvbx" Dec 02 23:01:33 crc kubenswrapper[4696]: I1202 23:01:33.347819 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ns2hc\" (UniqueName: \"kubernetes.io/projected/236e2c20-fc39-4643-904a-ab015e8c73ec-kube-api-access-ns2hc\") pod \"dnsmasq-dns-5c79d794d7-fvvbx\" (UID: \"236e2c20-fc39-4643-904a-ab015e8c73ec\") " pod="openstack/dnsmasq-dns-5c79d794d7-fvvbx" Dec 02 23:01:33 crc kubenswrapper[4696]: I1202 23:01:33.347876 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/236e2c20-fc39-4643-904a-ab015e8c73ec-ovsdbserver-nb\") pod \"dnsmasq-dns-5c79d794d7-fvvbx\" (UID: \"236e2c20-fc39-4643-904a-ab015e8c73ec\") " 
pod="openstack/dnsmasq-dns-5c79d794d7-fvvbx" Dec 02 23:01:33 crc kubenswrapper[4696]: I1202 23:01:33.347909 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/236e2c20-fc39-4643-904a-ab015e8c73ec-config\") pod \"dnsmasq-dns-5c79d794d7-fvvbx\" (UID: \"236e2c20-fc39-4643-904a-ab015e8c73ec\") " pod="openstack/dnsmasq-dns-5c79d794d7-fvvbx" Dec 02 23:01:33 crc kubenswrapper[4696]: I1202 23:01:33.347973 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/236e2c20-fc39-4643-904a-ab015e8c73ec-dns-svc\") pod \"dnsmasq-dns-5c79d794d7-fvvbx\" (UID: \"236e2c20-fc39-4643-904a-ab015e8c73ec\") " pod="openstack/dnsmasq-dns-5c79d794d7-fvvbx" Dec 02 23:01:33 crc kubenswrapper[4696]: I1202 23:01:33.348974 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/236e2c20-fc39-4643-904a-ab015e8c73ec-ovsdbserver-sb\") pod \"dnsmasq-dns-5c79d794d7-fvvbx\" (UID: \"236e2c20-fc39-4643-904a-ab015e8c73ec\") " pod="openstack/dnsmasq-dns-5c79d794d7-fvvbx" Dec 02 23:01:33 crc kubenswrapper[4696]: I1202 23:01:33.349538 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/236e2c20-fc39-4643-904a-ab015e8c73ec-ovsdbserver-nb\") pod \"dnsmasq-dns-5c79d794d7-fvvbx\" (UID: \"236e2c20-fc39-4643-904a-ab015e8c73ec\") " pod="openstack/dnsmasq-dns-5c79d794d7-fvvbx" Dec 02 23:01:33 crc kubenswrapper[4696]: I1202 23:01:33.349596 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/236e2c20-fc39-4643-904a-ab015e8c73ec-dns-swift-storage-0\") pod \"dnsmasq-dns-5c79d794d7-fvvbx\" (UID: \"236e2c20-fc39-4643-904a-ab015e8c73ec\") " pod="openstack/dnsmasq-dns-5c79d794d7-fvvbx" Dec 02 23:01:33 crc kubenswrapper[4696]: 
I1202 23:01:33.349824 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/236e2c20-fc39-4643-904a-ab015e8c73ec-dns-svc\") pod \"dnsmasq-dns-5c79d794d7-fvvbx\" (UID: \"236e2c20-fc39-4643-904a-ab015e8c73ec\") " pod="openstack/dnsmasq-dns-5c79d794d7-fvvbx" Dec 02 23:01:33 crc kubenswrapper[4696]: I1202 23:01:33.350003 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/236e2c20-fc39-4643-904a-ab015e8c73ec-config\") pod \"dnsmasq-dns-5c79d794d7-fvvbx\" (UID: \"236e2c20-fc39-4643-904a-ab015e8c73ec\") " pod="openstack/dnsmasq-dns-5c79d794d7-fvvbx" Dec 02 23:01:33 crc kubenswrapper[4696]: I1202 23:01:33.350090 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/236e2c20-fc39-4643-904a-ab015e8c73ec-ovsdbserver-sb\") pod \"dnsmasq-dns-5c79d794d7-fvvbx\" (UID: \"236e2c20-fc39-4643-904a-ab015e8c73ec\") " pod="openstack/dnsmasq-dns-5c79d794d7-fvvbx" Dec 02 23:01:33 crc kubenswrapper[4696]: I1202 23:01:33.374099 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ns2hc\" (UniqueName: \"kubernetes.io/projected/236e2c20-fc39-4643-904a-ab015e8c73ec-kube-api-access-ns2hc\") pod \"dnsmasq-dns-5c79d794d7-fvvbx\" (UID: \"236e2c20-fc39-4643-904a-ab015e8c73ec\") " pod="openstack/dnsmasq-dns-5c79d794d7-fvvbx" Dec 02 23:01:33 crc kubenswrapper[4696]: I1202 23:01:33.440374 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-fvvbx" Dec 02 23:01:34 crc kubenswrapper[4696]: I1202 23:01:34.019443 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-fvvbx"] Dec 02 23:01:34 crc kubenswrapper[4696]: I1202 23:01:34.763347 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"f684370c-9731-4837-9e30-675a1f07992d","Type":"ContainerStarted","Data":"ddfc235b5bb0772d57e16495a8462d5cf42ca4182a8f3779f03d2787ed733fac"} Dec 02 23:01:34 crc kubenswrapper[4696]: I1202 23:01:34.766406 4696 generic.go:334] "Generic (PLEG): container finished" podID="236e2c20-fc39-4643-904a-ab015e8c73ec" containerID="76cee049afce7aac085cabe9831acad5a5772a29908ae685c8c6ad1a4c0da687" exitCode=0 Dec 02 23:01:34 crc kubenswrapper[4696]: I1202 23:01:34.766478 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-fvvbx" event={"ID":"236e2c20-fc39-4643-904a-ab015e8c73ec","Type":"ContainerDied","Data":"76cee049afce7aac085cabe9831acad5a5772a29908ae685c8c6ad1a4c0da687"} Dec 02 23:01:34 crc kubenswrapper[4696]: I1202 23:01:34.766521 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-fvvbx" event={"ID":"236e2c20-fc39-4643-904a-ab015e8c73ec","Type":"ContainerStarted","Data":"39dcfa4dc767f0c9bfe664fdcfbd394ebddaf516375eff4eb2cee161340d9ba1"} Dec 02 23:01:35 crc kubenswrapper[4696]: I1202 23:01:35.781552 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-fvvbx" event={"ID":"236e2c20-fc39-4643-904a-ab015e8c73ec","Type":"ContainerStarted","Data":"531befe7b94f2cfd73bc2f342146e69aa7c85e6670fb90f57a19ffae844ed69e"} Dec 02 23:01:35 crc kubenswrapper[4696]: I1202 23:01:35.782688 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c79d794d7-fvvbx" Dec 02 23:01:35 crc kubenswrapper[4696]: I1202 23:01:35.785624 4696 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"f684370c-9731-4837-9e30-675a1f07992d","Type":"ContainerStarted","Data":"75db3f0dd5722131cb38a869bf56e7046b5407824c01119b9a156cd34d3b0781"} Dec 02 23:01:35 crc kubenswrapper[4696]: I1202 23:01:35.835926 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c79d794d7-fvvbx" podStartSLOduration=2.8358856230000002 podStartE2EDuration="2.835885623s" podCreationTimestamp="2025-12-02 23:01:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:01:35.826361134 +0000 UTC m=+1158.707041165" watchObservedRunningTime="2025-12-02 23:01:35.835885623 +0000 UTC m=+1158.716565714" Dec 02 23:01:35 crc kubenswrapper[4696]: I1202 23:01:35.892858 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=26.892827761 podStartE2EDuration="26.892827761s" podCreationTimestamp="2025-12-02 23:01:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:01:35.882951982 +0000 UTC m=+1158.763631993" watchObservedRunningTime="2025-12-02 23:01:35.892827761 +0000 UTC m=+1158.773507812" Dec 02 23:01:40 crc kubenswrapper[4696]: I1202 23:01:40.086388 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Dec 02 23:01:40 crc kubenswrapper[4696]: I1202 23:01:40.087425 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Dec 02 23:01:40 crc kubenswrapper[4696]: I1202 23:01:40.096798 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Dec 02 23:01:40 crc kubenswrapper[4696]: I1202 23:01:40.853724 
4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Dec 02 23:01:42 crc kubenswrapper[4696]: I1202 23:01:42.872707 4696 generic.go:334] "Generic (PLEG): container finished" podID="a8842e89-ec2d-4601-9e82-b12c1982a910" containerID="8d2f68f1b9bdf446cd938c5fe7b0e9ec244490eec567e2ea2327b48515be07ba" exitCode=0 Dec 02 23:01:42 crc kubenswrapper[4696]: I1202 23:01:42.872792 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-lgs5l" event={"ID":"a8842e89-ec2d-4601-9e82-b12c1982a910","Type":"ContainerDied","Data":"8d2f68f1b9bdf446cd938c5fe7b0e9ec244490eec567e2ea2327b48515be07ba"} Dec 02 23:01:43 crc kubenswrapper[4696]: I1202 23:01:43.462392 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c79d794d7-fvvbx" Dec 02 23:01:43 crc kubenswrapper[4696]: I1202 23:01:43.619461 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-btlkn"] Dec 02 23:01:43 crc kubenswrapper[4696]: I1202 23:01:43.620191 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-b8fbc5445-btlkn" podUID="0f2069fd-8da0-44f4-8e80-38f2ed4f5afb" containerName="dnsmasq-dns" containerID="cri-o://2a868d1938f300e9a97dcd24fbcbe189967427c40cfe01e1fd66f5ba231411f1" gracePeriod=10 Dec 02 23:01:43 crc kubenswrapper[4696]: I1202 23:01:43.890194 4696 generic.go:334] "Generic (PLEG): container finished" podID="0f2069fd-8da0-44f4-8e80-38f2ed4f5afb" containerID="2a868d1938f300e9a97dcd24fbcbe189967427c40cfe01e1fd66f5ba231411f1" exitCode=0 Dec 02 23:01:43 crc kubenswrapper[4696]: I1202 23:01:43.890372 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-btlkn" event={"ID":"0f2069fd-8da0-44f4-8e80-38f2ed4f5afb","Type":"ContainerDied","Data":"2a868d1938f300e9a97dcd24fbcbe189967427c40cfe01e1fd66f5ba231411f1"} Dec 02 23:01:44 crc kubenswrapper[4696]: I1202 
23:01:44.032959 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-btlkn" Dec 02 23:01:44 crc kubenswrapper[4696]: I1202 23:01:44.224975 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0f2069fd-8da0-44f4-8e80-38f2ed4f5afb-ovsdbserver-sb\") pod \"0f2069fd-8da0-44f4-8e80-38f2ed4f5afb\" (UID: \"0f2069fd-8da0-44f4-8e80-38f2ed4f5afb\") " Dec 02 23:01:44 crc kubenswrapper[4696]: I1202 23:01:44.225105 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f2069fd-8da0-44f4-8e80-38f2ed4f5afb-config\") pod \"0f2069fd-8da0-44f4-8e80-38f2ed4f5afb\" (UID: \"0f2069fd-8da0-44f4-8e80-38f2ed4f5afb\") " Dec 02 23:01:44 crc kubenswrapper[4696]: I1202 23:01:44.225125 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0f2069fd-8da0-44f4-8e80-38f2ed4f5afb-dns-svc\") pod \"0f2069fd-8da0-44f4-8e80-38f2ed4f5afb\" (UID: \"0f2069fd-8da0-44f4-8e80-38f2ed4f5afb\") " Dec 02 23:01:44 crc kubenswrapper[4696]: I1202 23:01:44.225224 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0f2069fd-8da0-44f4-8e80-38f2ed4f5afb-ovsdbserver-nb\") pod \"0f2069fd-8da0-44f4-8e80-38f2ed4f5afb\" (UID: \"0f2069fd-8da0-44f4-8e80-38f2ed4f5afb\") " Dec 02 23:01:44 crc kubenswrapper[4696]: I1202 23:01:44.225295 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-89rcf\" (UniqueName: \"kubernetes.io/projected/0f2069fd-8da0-44f4-8e80-38f2ed4f5afb-kube-api-access-89rcf\") pod \"0f2069fd-8da0-44f4-8e80-38f2ed4f5afb\" (UID: \"0f2069fd-8da0-44f4-8e80-38f2ed4f5afb\") " Dec 02 23:01:44 crc kubenswrapper[4696]: I1202 23:01:44.229280 4696 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f2069fd-8da0-44f4-8e80-38f2ed4f5afb-kube-api-access-89rcf" (OuterVolumeSpecName: "kube-api-access-89rcf") pod "0f2069fd-8da0-44f4-8e80-38f2ed4f5afb" (UID: "0f2069fd-8da0-44f4-8e80-38f2ed4f5afb"). InnerVolumeSpecName "kube-api-access-89rcf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:01:44 crc kubenswrapper[4696]: I1202 23:01:44.254242 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-lgs5l" Dec 02 23:01:44 crc kubenswrapper[4696]: I1202 23:01:44.274733 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f2069fd-8da0-44f4-8e80-38f2ed4f5afb-config" (OuterVolumeSpecName: "config") pod "0f2069fd-8da0-44f4-8e80-38f2ed4f5afb" (UID: "0f2069fd-8da0-44f4-8e80-38f2ed4f5afb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:01:44 crc kubenswrapper[4696]: I1202 23:01:44.298361 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f2069fd-8da0-44f4-8e80-38f2ed4f5afb-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0f2069fd-8da0-44f4-8e80-38f2ed4f5afb" (UID: "0f2069fd-8da0-44f4-8e80-38f2ed4f5afb"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:01:44 crc kubenswrapper[4696]: I1202 23:01:44.303830 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f2069fd-8da0-44f4-8e80-38f2ed4f5afb-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0f2069fd-8da0-44f4-8e80-38f2ed4f5afb" (UID: "0f2069fd-8da0-44f4-8e80-38f2ed4f5afb"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:01:44 crc kubenswrapper[4696]: I1202 23:01:44.309697 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f2069fd-8da0-44f4-8e80-38f2ed4f5afb-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0f2069fd-8da0-44f4-8e80-38f2ed4f5afb" (UID: "0f2069fd-8da0-44f4-8e80-38f2ed4f5afb"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:01:44 crc kubenswrapper[4696]: I1202 23:01:44.327627 4696 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0f2069fd-8da0-44f4-8e80-38f2ed4f5afb-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 02 23:01:44 crc kubenswrapper[4696]: I1202 23:01:44.327672 4696 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f2069fd-8da0-44f4-8e80-38f2ed4f5afb-config\") on node \"crc\" DevicePath \"\"" Dec 02 23:01:44 crc kubenswrapper[4696]: I1202 23:01:44.327682 4696 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0f2069fd-8da0-44f4-8e80-38f2ed4f5afb-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 23:01:44 crc kubenswrapper[4696]: I1202 23:01:44.327698 4696 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0f2069fd-8da0-44f4-8e80-38f2ed4f5afb-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 02 23:01:44 crc kubenswrapper[4696]: I1202 23:01:44.327709 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-89rcf\" (UniqueName: \"kubernetes.io/projected/0f2069fd-8da0-44f4-8e80-38f2ed4f5afb-kube-api-access-89rcf\") on node \"crc\" DevicePath \"\"" Dec 02 23:01:44 crc kubenswrapper[4696]: I1202 23:01:44.428878 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a8842e89-ec2d-4601-9e82-b12c1982a910-combined-ca-bundle\") pod \"a8842e89-ec2d-4601-9e82-b12c1982a910\" (UID: \"a8842e89-ec2d-4601-9e82-b12c1982a910\") " Dec 02 23:01:44 crc kubenswrapper[4696]: I1202 23:01:44.429012 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a8842e89-ec2d-4601-9e82-b12c1982a910-db-sync-config-data\") pod \"a8842e89-ec2d-4601-9e82-b12c1982a910\" (UID: \"a8842e89-ec2d-4601-9e82-b12c1982a910\") " Dec 02 23:01:44 crc kubenswrapper[4696]: I1202 23:01:44.429128 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x8kqh\" (UniqueName: \"kubernetes.io/projected/a8842e89-ec2d-4601-9e82-b12c1982a910-kube-api-access-x8kqh\") pod \"a8842e89-ec2d-4601-9e82-b12c1982a910\" (UID: \"a8842e89-ec2d-4601-9e82-b12c1982a910\") " Dec 02 23:01:44 crc kubenswrapper[4696]: I1202 23:01:44.429180 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8842e89-ec2d-4601-9e82-b12c1982a910-config-data\") pod \"a8842e89-ec2d-4601-9e82-b12c1982a910\" (UID: \"a8842e89-ec2d-4601-9e82-b12c1982a910\") " Dec 02 23:01:44 crc kubenswrapper[4696]: I1202 23:01:44.433927 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8842e89-ec2d-4601-9e82-b12c1982a910-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "a8842e89-ec2d-4601-9e82-b12c1982a910" (UID: "a8842e89-ec2d-4601-9e82-b12c1982a910"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:01:44 crc kubenswrapper[4696]: I1202 23:01:44.436534 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8842e89-ec2d-4601-9e82-b12c1982a910-kube-api-access-x8kqh" (OuterVolumeSpecName: "kube-api-access-x8kqh") pod "a8842e89-ec2d-4601-9e82-b12c1982a910" (UID: "a8842e89-ec2d-4601-9e82-b12c1982a910"). InnerVolumeSpecName "kube-api-access-x8kqh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:01:44 crc kubenswrapper[4696]: I1202 23:01:44.452009 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8842e89-ec2d-4601-9e82-b12c1982a910-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a8842e89-ec2d-4601-9e82-b12c1982a910" (UID: "a8842e89-ec2d-4601-9e82-b12c1982a910"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:01:44 crc kubenswrapper[4696]: I1202 23:01:44.495263 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8842e89-ec2d-4601-9e82-b12c1982a910-config-data" (OuterVolumeSpecName: "config-data") pod "a8842e89-ec2d-4601-9e82-b12c1982a910" (UID: "a8842e89-ec2d-4601-9e82-b12c1982a910"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:01:44 crc kubenswrapper[4696]: I1202 23:01:44.531940 4696 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a8842e89-ec2d-4601-9e82-b12c1982a910-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 23:01:44 crc kubenswrapper[4696]: I1202 23:01:44.532275 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x8kqh\" (UniqueName: \"kubernetes.io/projected/a8842e89-ec2d-4601-9e82-b12c1982a910-kube-api-access-x8kqh\") on node \"crc\" DevicePath \"\"" Dec 02 23:01:44 crc kubenswrapper[4696]: I1202 23:01:44.532340 4696 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8842e89-ec2d-4601-9e82-b12c1982a910-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 23:01:44 crc kubenswrapper[4696]: I1202 23:01:44.532397 4696 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8842e89-ec2d-4601-9e82-b12c1982a910-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 23:01:44 crc kubenswrapper[4696]: I1202 23:01:44.905699 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-btlkn" event={"ID":"0f2069fd-8da0-44f4-8e80-38f2ed4f5afb","Type":"ContainerDied","Data":"9cc015df0fbd428e2c75bee1066f29908ef74641fdf8b33fc3a469a50346e175"} Dec 02 23:01:44 crc kubenswrapper[4696]: I1202 23:01:44.905806 4696 scope.go:117] "RemoveContainer" containerID="2a868d1938f300e9a97dcd24fbcbe189967427c40cfe01e1fd66f5ba231411f1" Dec 02 23:01:44 crc kubenswrapper[4696]: I1202 23:01:44.906245 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-btlkn" Dec 02 23:01:44 crc kubenswrapper[4696]: I1202 23:01:44.918956 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-lgs5l" event={"ID":"a8842e89-ec2d-4601-9e82-b12c1982a910","Type":"ContainerDied","Data":"6c23d3b47daf7ec795b54f6a5e5bbec03c41d52527da25281baf26696a1b07c0"} Dec 02 23:01:44 crc kubenswrapper[4696]: I1202 23:01:44.919305 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6c23d3b47daf7ec795b54f6a5e5bbec03c41d52527da25281baf26696a1b07c0" Dec 02 23:01:44 crc kubenswrapper[4696]: I1202 23:01:44.919559 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-lgs5l" Dec 02 23:01:44 crc kubenswrapper[4696]: I1202 23:01:44.927308 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-hf7vx" event={"ID":"709610e7-e445-4b43-9941-7c08653d3278","Type":"ContainerStarted","Data":"e1686d19d55aa5c80b38161546b19b9bf89beb34c52c6a76f3fec22e31517d9c"} Dec 02 23:01:44 crc kubenswrapper[4696]: I1202 23:01:44.931065 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-vfql6" event={"ID":"5fa922b0-7645-420b-bd91-ac4f5040d61b","Type":"ContainerStarted","Data":"2d7b553213b5397bc12195eef6f7f892e57a46158e9a38414a93fcc7866cb617"} Dec 02 23:01:44 crc kubenswrapper[4696]: I1202 23:01:44.943008 4696 scope.go:117] "RemoveContainer" containerID="78dc9e001fa155c0aeda1f4103ed0335cc2c9c14afdd03437d05ffe6b31edbb3" Dec 02 23:01:44 crc kubenswrapper[4696]: I1202 23:01:44.990797 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-hf7vx" podStartSLOduration=6.182289528 podStartE2EDuration="40.990772636s" podCreationTimestamp="2025-12-02 23:01:04 +0000 UTC" firstStartedPulling="2025-12-02 23:01:09.087905089 +0000 UTC m=+1131.968585090" lastFinishedPulling="2025-12-02 23:01:43.896388197 
+0000 UTC m=+1166.777068198" observedRunningTime="2025-12-02 23:01:44.955684265 +0000 UTC m=+1167.836364296" watchObservedRunningTime="2025-12-02 23:01:44.990772636 +0000 UTC m=+1167.871452647" Dec 02 23:01:44 crc kubenswrapper[4696]: I1202 23:01:44.999411 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-db-sync-vfql6" podStartSLOduration=6.503579952 podStartE2EDuration="40.999389839s" podCreationTimestamp="2025-12-02 23:01:04 +0000 UTC" firstStartedPulling="2025-12-02 23:01:09.088836036 +0000 UTC m=+1131.969516037" lastFinishedPulling="2025-12-02 23:01:43.584645913 +0000 UTC m=+1166.465325924" observedRunningTime="2025-12-02 23:01:44.980352901 +0000 UTC m=+1167.861032922" watchObservedRunningTime="2025-12-02 23:01:44.999389839 +0000 UTC m=+1167.880069860" Dec 02 23:01:45 crc kubenswrapper[4696]: I1202 23:01:45.012109 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-btlkn"] Dec 02 23:01:45 crc kubenswrapper[4696]: I1202 23:01:45.019637 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-btlkn"] Dec 02 23:01:45 crc kubenswrapper[4696]: I1202 23:01:45.261516 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-ztpxs"] Dec 02 23:01:45 crc kubenswrapper[4696]: E1202 23:01:45.261928 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f2069fd-8da0-44f4-8e80-38f2ed4f5afb" containerName="init" Dec 02 23:01:45 crc kubenswrapper[4696]: I1202 23:01:45.261945 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f2069fd-8da0-44f4-8e80-38f2ed4f5afb" containerName="init" Dec 02 23:01:45 crc kubenswrapper[4696]: E1202 23:01:45.261964 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f2069fd-8da0-44f4-8e80-38f2ed4f5afb" containerName="dnsmasq-dns" Dec 02 23:01:45 crc kubenswrapper[4696]: I1202 23:01:45.261971 4696 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="0f2069fd-8da0-44f4-8e80-38f2ed4f5afb" containerName="dnsmasq-dns" Dec 02 23:01:45 crc kubenswrapper[4696]: E1202 23:01:45.262004 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8842e89-ec2d-4601-9e82-b12c1982a910" containerName="glance-db-sync" Dec 02 23:01:45 crc kubenswrapper[4696]: I1202 23:01:45.262010 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8842e89-ec2d-4601-9e82-b12c1982a910" containerName="glance-db-sync" Dec 02 23:01:45 crc kubenswrapper[4696]: I1202 23:01:45.262173 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8842e89-ec2d-4601-9e82-b12c1982a910" containerName="glance-db-sync" Dec 02 23:01:45 crc kubenswrapper[4696]: I1202 23:01:45.262192 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f2069fd-8da0-44f4-8e80-38f2ed4f5afb" containerName="dnsmasq-dns" Dec 02 23:01:45 crc kubenswrapper[4696]: I1202 23:01:45.268086 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-ztpxs" Dec 02 23:01:45 crc kubenswrapper[4696]: I1202 23:01:45.298812 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-ztpxs"] Dec 02 23:01:45 crc kubenswrapper[4696]: I1202 23:01:45.443538 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f2069fd-8da0-44f4-8e80-38f2ed4f5afb" path="/var/lib/kubelet/pods/0f2069fd-8da0-44f4-8e80-38f2ed4f5afb/volumes" Dec 02 23:01:45 crc kubenswrapper[4696]: I1202 23:01:45.457423 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e066271b-ef9d-4c8d-8c06-d9f8a8afbac6-config\") pod \"dnsmasq-dns-5f59b8f679-ztpxs\" (UID: \"e066271b-ef9d-4c8d-8c06-d9f8a8afbac6\") " pod="openstack/dnsmasq-dns-5f59b8f679-ztpxs" Dec 02 23:01:45 crc kubenswrapper[4696]: I1202 23:01:45.457938 4696 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e066271b-ef9d-4c8d-8c06-d9f8a8afbac6-ovsdbserver-nb\") pod \"dnsmasq-dns-5f59b8f679-ztpxs\" (UID: \"e066271b-ef9d-4c8d-8c06-d9f8a8afbac6\") " pod="openstack/dnsmasq-dns-5f59b8f679-ztpxs" Dec 02 23:01:45 crc kubenswrapper[4696]: I1202 23:01:45.458005 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e066271b-ef9d-4c8d-8c06-d9f8a8afbac6-ovsdbserver-sb\") pod \"dnsmasq-dns-5f59b8f679-ztpxs\" (UID: \"e066271b-ef9d-4c8d-8c06-d9f8a8afbac6\") " pod="openstack/dnsmasq-dns-5f59b8f679-ztpxs" Dec 02 23:01:45 crc kubenswrapper[4696]: I1202 23:01:45.458034 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e066271b-ef9d-4c8d-8c06-d9f8a8afbac6-dns-swift-storage-0\") pod \"dnsmasq-dns-5f59b8f679-ztpxs\" (UID: \"e066271b-ef9d-4c8d-8c06-d9f8a8afbac6\") " pod="openstack/dnsmasq-dns-5f59b8f679-ztpxs" Dec 02 23:01:45 crc kubenswrapper[4696]: I1202 23:01:45.458064 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vrb6\" (UniqueName: \"kubernetes.io/projected/e066271b-ef9d-4c8d-8c06-d9f8a8afbac6-kube-api-access-9vrb6\") pod \"dnsmasq-dns-5f59b8f679-ztpxs\" (UID: \"e066271b-ef9d-4c8d-8c06-d9f8a8afbac6\") " pod="openstack/dnsmasq-dns-5f59b8f679-ztpxs" Dec 02 23:01:45 crc kubenswrapper[4696]: I1202 23:01:45.458141 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e066271b-ef9d-4c8d-8c06-d9f8a8afbac6-dns-svc\") pod \"dnsmasq-dns-5f59b8f679-ztpxs\" (UID: \"e066271b-ef9d-4c8d-8c06-d9f8a8afbac6\") " pod="openstack/dnsmasq-dns-5f59b8f679-ztpxs" Dec 02 23:01:45 crc kubenswrapper[4696]: 
I1202 23:01:45.560168 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e066271b-ef9d-4c8d-8c06-d9f8a8afbac6-config\") pod \"dnsmasq-dns-5f59b8f679-ztpxs\" (UID: \"e066271b-ef9d-4c8d-8c06-d9f8a8afbac6\") " pod="openstack/dnsmasq-dns-5f59b8f679-ztpxs" Dec 02 23:01:45 crc kubenswrapper[4696]: I1202 23:01:45.560266 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e066271b-ef9d-4c8d-8c06-d9f8a8afbac6-ovsdbserver-nb\") pod \"dnsmasq-dns-5f59b8f679-ztpxs\" (UID: \"e066271b-ef9d-4c8d-8c06-d9f8a8afbac6\") " pod="openstack/dnsmasq-dns-5f59b8f679-ztpxs" Dec 02 23:01:45 crc kubenswrapper[4696]: I1202 23:01:45.560300 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e066271b-ef9d-4c8d-8c06-d9f8a8afbac6-ovsdbserver-sb\") pod \"dnsmasq-dns-5f59b8f679-ztpxs\" (UID: \"e066271b-ef9d-4c8d-8c06-d9f8a8afbac6\") " pod="openstack/dnsmasq-dns-5f59b8f679-ztpxs" Dec 02 23:01:45 crc kubenswrapper[4696]: I1202 23:01:45.560328 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e066271b-ef9d-4c8d-8c06-d9f8a8afbac6-dns-swift-storage-0\") pod \"dnsmasq-dns-5f59b8f679-ztpxs\" (UID: \"e066271b-ef9d-4c8d-8c06-d9f8a8afbac6\") " pod="openstack/dnsmasq-dns-5f59b8f679-ztpxs" Dec 02 23:01:45 crc kubenswrapper[4696]: I1202 23:01:45.560354 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vrb6\" (UniqueName: \"kubernetes.io/projected/e066271b-ef9d-4c8d-8c06-d9f8a8afbac6-kube-api-access-9vrb6\") pod \"dnsmasq-dns-5f59b8f679-ztpxs\" (UID: \"e066271b-ef9d-4c8d-8c06-d9f8a8afbac6\") " pod="openstack/dnsmasq-dns-5f59b8f679-ztpxs" Dec 02 23:01:45 crc kubenswrapper[4696]: I1202 23:01:45.560395 4696 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e066271b-ef9d-4c8d-8c06-d9f8a8afbac6-dns-svc\") pod \"dnsmasq-dns-5f59b8f679-ztpxs\" (UID: \"e066271b-ef9d-4c8d-8c06-d9f8a8afbac6\") " pod="openstack/dnsmasq-dns-5f59b8f679-ztpxs" Dec 02 23:01:45 crc kubenswrapper[4696]: I1202 23:01:45.561604 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e066271b-ef9d-4c8d-8c06-d9f8a8afbac6-dns-svc\") pod \"dnsmasq-dns-5f59b8f679-ztpxs\" (UID: \"e066271b-ef9d-4c8d-8c06-d9f8a8afbac6\") " pod="openstack/dnsmasq-dns-5f59b8f679-ztpxs" Dec 02 23:01:45 crc kubenswrapper[4696]: I1202 23:01:45.562349 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e066271b-ef9d-4c8d-8c06-d9f8a8afbac6-config\") pod \"dnsmasq-dns-5f59b8f679-ztpxs\" (UID: \"e066271b-ef9d-4c8d-8c06-d9f8a8afbac6\") " pod="openstack/dnsmasq-dns-5f59b8f679-ztpxs" Dec 02 23:01:45 crc kubenswrapper[4696]: I1202 23:01:45.563450 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e066271b-ef9d-4c8d-8c06-d9f8a8afbac6-dns-swift-storage-0\") pod \"dnsmasq-dns-5f59b8f679-ztpxs\" (UID: \"e066271b-ef9d-4c8d-8c06-d9f8a8afbac6\") " pod="openstack/dnsmasq-dns-5f59b8f679-ztpxs" Dec 02 23:01:45 crc kubenswrapper[4696]: I1202 23:01:45.563905 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e066271b-ef9d-4c8d-8c06-d9f8a8afbac6-ovsdbserver-sb\") pod \"dnsmasq-dns-5f59b8f679-ztpxs\" (UID: \"e066271b-ef9d-4c8d-8c06-d9f8a8afbac6\") " pod="openstack/dnsmasq-dns-5f59b8f679-ztpxs" Dec 02 23:01:45 crc kubenswrapper[4696]: I1202 23:01:45.565176 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/e066271b-ef9d-4c8d-8c06-d9f8a8afbac6-ovsdbserver-nb\") pod \"dnsmasq-dns-5f59b8f679-ztpxs\" (UID: \"e066271b-ef9d-4c8d-8c06-d9f8a8afbac6\") " pod="openstack/dnsmasq-dns-5f59b8f679-ztpxs" Dec 02 23:01:45 crc kubenswrapper[4696]: I1202 23:01:45.592579 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vrb6\" (UniqueName: \"kubernetes.io/projected/e066271b-ef9d-4c8d-8c06-d9f8a8afbac6-kube-api-access-9vrb6\") pod \"dnsmasq-dns-5f59b8f679-ztpxs\" (UID: \"e066271b-ef9d-4c8d-8c06-d9f8a8afbac6\") " pod="openstack/dnsmasq-dns-5f59b8f679-ztpxs" Dec 02 23:01:45 crc kubenswrapper[4696]: I1202 23:01:45.634088 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-ztpxs" Dec 02 23:01:46 crc kubenswrapper[4696]: W1202 23:01:46.149621 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode066271b_ef9d_4c8d_8c06_d9f8a8afbac6.slice/crio-8a007fac2ab82ce825a348489de9241178ab577bb6cede98111c9c9c973e470a WatchSource:0}: Error finding container 8a007fac2ab82ce825a348489de9241178ab577bb6cede98111c9c9c973e470a: Status 404 returned error can't find the container with id 8a007fac2ab82ce825a348489de9241178ab577bb6cede98111c9c9c973e470a Dec 02 23:01:46 crc kubenswrapper[4696]: I1202 23:01:46.149889 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-ztpxs"] Dec 02 23:01:46 crc kubenswrapper[4696]: I1202 23:01:46.960290 4696 generic.go:334] "Generic (PLEG): container finished" podID="e066271b-ef9d-4c8d-8c06-d9f8a8afbac6" containerID="2a1870453968f8b336f76db5eaf0a922811514c5f8000078118bec78caedfcfb" exitCode=0 Dec 02 23:01:46 crc kubenswrapper[4696]: I1202 23:01:46.960407 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-ztpxs" 
event={"ID":"e066271b-ef9d-4c8d-8c06-d9f8a8afbac6","Type":"ContainerDied","Data":"2a1870453968f8b336f76db5eaf0a922811514c5f8000078118bec78caedfcfb"} Dec 02 23:01:46 crc kubenswrapper[4696]: I1202 23:01:46.961553 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-ztpxs" event={"ID":"e066271b-ef9d-4c8d-8c06-d9f8a8afbac6","Type":"ContainerStarted","Data":"8a007fac2ab82ce825a348489de9241178ab577bb6cede98111c9c9c973e470a"} Dec 02 23:01:47 crc kubenswrapper[4696]: I1202 23:01:47.972505 4696 generic.go:334] "Generic (PLEG): container finished" podID="5fa922b0-7645-420b-bd91-ac4f5040d61b" containerID="2d7b553213b5397bc12195eef6f7f892e57a46158e9a38414a93fcc7866cb617" exitCode=0 Dec 02 23:01:47 crc kubenswrapper[4696]: I1202 23:01:47.972593 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-vfql6" event={"ID":"5fa922b0-7645-420b-bd91-ac4f5040d61b","Type":"ContainerDied","Data":"2d7b553213b5397bc12195eef6f7f892e57a46158e9a38414a93fcc7866cb617"} Dec 02 23:01:47 crc kubenswrapper[4696]: I1202 23:01:47.975454 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-ztpxs" event={"ID":"e066271b-ef9d-4c8d-8c06-d9f8a8afbac6","Type":"ContainerStarted","Data":"f452f94505068f4f8a9f7dbc41f090285aadfd4184999dbbcea356149c2ed410"} Dec 02 23:01:47 crc kubenswrapper[4696]: I1202 23:01:47.975621 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5f59b8f679-ztpxs" Dec 02 23:01:48 crc kubenswrapper[4696]: I1202 23:01:48.024902 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5f59b8f679-ztpxs" podStartSLOduration=3.024875168 podStartE2EDuration="3.024875168s" podCreationTimestamp="2025-12-02 23:01:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:01:48.013852475 +0000 UTC 
m=+1170.894532476" watchObservedRunningTime="2025-12-02 23:01:48.024875168 +0000 UTC m=+1170.905555159" Dec 02 23:01:48 crc kubenswrapper[4696]: I1202 23:01:48.986350 4696 generic.go:334] "Generic (PLEG): container finished" podID="709610e7-e445-4b43-9941-7c08653d3278" containerID="e1686d19d55aa5c80b38161546b19b9bf89beb34c52c6a76f3fec22e31517d9c" exitCode=0 Dec 02 23:01:48 crc kubenswrapper[4696]: I1202 23:01:48.986684 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-hf7vx" event={"ID":"709610e7-e445-4b43-9941-7c08653d3278","Type":"ContainerDied","Data":"e1686d19d55aa5c80b38161546b19b9bf89beb34c52c6a76f3fec22e31517d9c"} Dec 02 23:01:49 crc kubenswrapper[4696]: I1202 23:01:49.363945 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-sync-vfql6" Dec 02 23:01:49 crc kubenswrapper[4696]: I1202 23:01:49.547821 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r8c79\" (UniqueName: \"kubernetes.io/projected/5fa922b0-7645-420b-bd91-ac4f5040d61b-kube-api-access-r8c79\") pod \"5fa922b0-7645-420b-bd91-ac4f5040d61b\" (UID: \"5fa922b0-7645-420b-bd91-ac4f5040d61b\") " Dec 02 23:01:49 crc kubenswrapper[4696]: I1202 23:01:49.547968 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5fa922b0-7645-420b-bd91-ac4f5040d61b-config-data\") pod \"5fa922b0-7645-420b-bd91-ac4f5040d61b\" (UID: \"5fa922b0-7645-420b-bd91-ac4f5040d61b\") " Dec 02 23:01:49 crc kubenswrapper[4696]: I1202 23:01:49.548349 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5fa922b0-7645-420b-bd91-ac4f5040d61b-db-sync-config-data\") pod \"5fa922b0-7645-420b-bd91-ac4f5040d61b\" (UID: \"5fa922b0-7645-420b-bd91-ac4f5040d61b\") " Dec 02 23:01:49 crc kubenswrapper[4696]: I1202 23:01:49.548561 4696 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fa922b0-7645-420b-bd91-ac4f5040d61b-combined-ca-bundle\") pod \"5fa922b0-7645-420b-bd91-ac4f5040d61b\" (UID: \"5fa922b0-7645-420b-bd91-ac4f5040d61b\") " Dec 02 23:01:49 crc kubenswrapper[4696]: I1202 23:01:49.555798 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fa922b0-7645-420b-bd91-ac4f5040d61b-kube-api-access-r8c79" (OuterVolumeSpecName: "kube-api-access-r8c79") pod "5fa922b0-7645-420b-bd91-ac4f5040d61b" (UID: "5fa922b0-7645-420b-bd91-ac4f5040d61b"). InnerVolumeSpecName "kube-api-access-r8c79". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:01:49 crc kubenswrapper[4696]: I1202 23:01:49.556369 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fa922b0-7645-420b-bd91-ac4f5040d61b-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "5fa922b0-7645-420b-bd91-ac4f5040d61b" (UID: "5fa922b0-7645-420b-bd91-ac4f5040d61b"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:01:49 crc kubenswrapper[4696]: I1202 23:01:49.594327 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fa922b0-7645-420b-bd91-ac4f5040d61b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5fa922b0-7645-420b-bd91-ac4f5040d61b" (UID: "5fa922b0-7645-420b-bd91-ac4f5040d61b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:01:49 crc kubenswrapper[4696]: I1202 23:01:49.627127 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fa922b0-7645-420b-bd91-ac4f5040d61b-config-data" (OuterVolumeSpecName: "config-data") pod "5fa922b0-7645-420b-bd91-ac4f5040d61b" (UID: "5fa922b0-7645-420b-bd91-ac4f5040d61b"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:01:49 crc kubenswrapper[4696]: I1202 23:01:49.651425 4696 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fa922b0-7645-420b-bd91-ac4f5040d61b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 23:01:49 crc kubenswrapper[4696]: I1202 23:01:49.651459 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r8c79\" (UniqueName: \"kubernetes.io/projected/5fa922b0-7645-420b-bd91-ac4f5040d61b-kube-api-access-r8c79\") on node \"crc\" DevicePath \"\"" Dec 02 23:01:49 crc kubenswrapper[4696]: I1202 23:01:49.651476 4696 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5fa922b0-7645-420b-bd91-ac4f5040d61b-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 23:01:49 crc kubenswrapper[4696]: I1202 23:01:49.651486 4696 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5fa922b0-7645-420b-bd91-ac4f5040d61b-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 23:01:49 crc kubenswrapper[4696]: I1202 23:01:49.997249 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-vfql6" event={"ID":"5fa922b0-7645-420b-bd91-ac4f5040d61b","Type":"ContainerDied","Data":"7e57001effe048ee15efd4e3ad44b5665ce7fdbdc35c46d6e38bef06c6581daa"} Dec 02 23:01:49 crc kubenswrapper[4696]: I1202 23:01:49.997845 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7e57001effe048ee15efd4e3ad44b5665ce7fdbdc35c46d6e38bef06c6581daa" Dec 02 23:01:49 crc kubenswrapper[4696]: I1202 23:01:49.997363 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-db-sync-vfql6" Dec 02 23:01:50 crc kubenswrapper[4696]: I1202 23:01:50.367961 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-hf7vx" Dec 02 23:01:50 crc kubenswrapper[4696]: I1202 23:01:50.566413 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/709610e7-e445-4b43-9941-7c08653d3278-config-data\") pod \"709610e7-e445-4b43-9941-7c08653d3278\" (UID: \"709610e7-e445-4b43-9941-7c08653d3278\") " Dec 02 23:01:50 crc kubenswrapper[4696]: I1202 23:01:50.566481 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/709610e7-e445-4b43-9941-7c08653d3278-combined-ca-bundle\") pod \"709610e7-e445-4b43-9941-7c08653d3278\" (UID: \"709610e7-e445-4b43-9941-7c08653d3278\") " Dec 02 23:01:50 crc kubenswrapper[4696]: I1202 23:01:50.566519 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hvnz7\" (UniqueName: \"kubernetes.io/projected/709610e7-e445-4b43-9941-7c08653d3278-kube-api-access-hvnz7\") pod \"709610e7-e445-4b43-9941-7c08653d3278\" (UID: \"709610e7-e445-4b43-9941-7c08653d3278\") " Dec 02 23:01:50 crc kubenswrapper[4696]: I1202 23:01:50.572919 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/709610e7-e445-4b43-9941-7c08653d3278-kube-api-access-hvnz7" (OuterVolumeSpecName: "kube-api-access-hvnz7") pod "709610e7-e445-4b43-9941-7c08653d3278" (UID: "709610e7-e445-4b43-9941-7c08653d3278"). InnerVolumeSpecName "kube-api-access-hvnz7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:01:50 crc kubenswrapper[4696]: I1202 23:01:50.595162 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/709610e7-e445-4b43-9941-7c08653d3278-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "709610e7-e445-4b43-9941-7c08653d3278" (UID: "709610e7-e445-4b43-9941-7c08653d3278"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:01:50 crc kubenswrapper[4696]: I1202 23:01:50.622954 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/709610e7-e445-4b43-9941-7c08653d3278-config-data" (OuterVolumeSpecName: "config-data") pod "709610e7-e445-4b43-9941-7c08653d3278" (UID: "709610e7-e445-4b43-9941-7c08653d3278"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:01:50 crc kubenswrapper[4696]: I1202 23:01:50.669575 4696 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/709610e7-e445-4b43-9941-7c08653d3278-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 23:01:50 crc kubenswrapper[4696]: I1202 23:01:50.669621 4696 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/709610e7-e445-4b43-9941-7c08653d3278-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 23:01:50 crc kubenswrapper[4696]: I1202 23:01:50.669636 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hvnz7\" (UniqueName: \"kubernetes.io/projected/709610e7-e445-4b43-9941-7c08653d3278-kube-api-access-hvnz7\") on node \"crc\" DevicePath \"\"" Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.008474 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-hf7vx" 
event={"ID":"709610e7-e445-4b43-9941-7c08653d3278","Type":"ContainerDied","Data":"028c9375f5cc92ab86a14972edc81c81f838e00f7f21ca3cd30d5cf2b14df27c"} Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.008515 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="028c9375f5cc92ab86a14972edc81c81f838e00f7f21ca3cd30d5cf2b14df27c" Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.008564 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-hf7vx" Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.294724 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-ztpxs"] Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.295076 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5f59b8f679-ztpxs" podUID="e066271b-ef9d-4c8d-8c06-d9f8a8afbac6" containerName="dnsmasq-dns" containerID="cri-o://f452f94505068f4f8a9f7dbc41f090285aadfd4184999dbbcea356149c2ed410" gracePeriod=10 Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.328312 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-l6c4c"] Dec 02 23:01:51 crc kubenswrapper[4696]: E1202 23:01:51.328835 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fa922b0-7645-420b-bd91-ac4f5040d61b" containerName="watcher-db-sync" Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.328855 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fa922b0-7645-420b-bd91-ac4f5040d61b" containerName="watcher-db-sync" Dec 02 23:01:51 crc kubenswrapper[4696]: E1202 23:01:51.328903 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="709610e7-e445-4b43-9941-7c08653d3278" containerName="keystone-db-sync" Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.328912 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="709610e7-e445-4b43-9941-7c08653d3278" 
containerName="keystone-db-sync" Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.329086 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="5fa922b0-7645-420b-bd91-ac4f5040d61b" containerName="watcher-db-sync" Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.329121 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="709610e7-e445-4b43-9941-7c08653d3278" containerName="keystone-db-sync" Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.329873 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-l6c4c" Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.335423 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-gwxg6" Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.335685 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.336125 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.336303 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.343722 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.362993 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-982dr"] Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.364670 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-982dr" Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.383613 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4c640324-4b47-498d-ae9c-8b66a5de9618-ovsdbserver-nb\") pod \"dnsmasq-dns-bbf5cc879-982dr\" (UID: \"4c640324-4b47-498d-ae9c-8b66a5de9618\") " pod="openstack/dnsmasq-dns-bbf5cc879-982dr" Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.383669 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4c640324-4b47-498d-ae9c-8b66a5de9618-dns-swift-storage-0\") pod \"dnsmasq-dns-bbf5cc879-982dr\" (UID: \"4c640324-4b47-498d-ae9c-8b66a5de9618\") " pod="openstack/dnsmasq-dns-bbf5cc879-982dr" Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.383696 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvkfc\" (UniqueName: \"kubernetes.io/projected/7f93d253-e832-436c-8839-1dbb94d58f86-kube-api-access-dvkfc\") pod \"keystone-bootstrap-l6c4c\" (UID: \"7f93d253-e832-436c-8839-1dbb94d58f86\") " pod="openstack/keystone-bootstrap-l6c4c" Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.383733 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f93d253-e832-436c-8839-1dbb94d58f86-scripts\") pod \"keystone-bootstrap-l6c4c\" (UID: \"7f93d253-e832-436c-8839-1dbb94d58f86\") " pod="openstack/keystone-bootstrap-l6c4c" Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.383804 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4c640324-4b47-498d-ae9c-8b66a5de9618-ovsdbserver-sb\") pod \"dnsmasq-dns-bbf5cc879-982dr\" 
(UID: \"4c640324-4b47-498d-ae9c-8b66a5de9618\") " pod="openstack/dnsmasq-dns-bbf5cc879-982dr" Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.383831 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7f93d253-e832-436c-8839-1dbb94d58f86-fernet-keys\") pod \"keystone-bootstrap-l6c4c\" (UID: \"7f93d253-e832-436c-8839-1dbb94d58f86\") " pod="openstack/keystone-bootstrap-l6c4c" Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.383847 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4c640324-4b47-498d-ae9c-8b66a5de9618-dns-svc\") pod \"dnsmasq-dns-bbf5cc879-982dr\" (UID: \"4c640324-4b47-498d-ae9c-8b66a5de9618\") " pod="openstack/dnsmasq-dns-bbf5cc879-982dr" Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.383873 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c640324-4b47-498d-ae9c-8b66a5de9618-config\") pod \"dnsmasq-dns-bbf5cc879-982dr\" (UID: \"4c640324-4b47-498d-ae9c-8b66a5de9618\") " pod="openstack/dnsmasq-dns-bbf5cc879-982dr" Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.383890 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f93d253-e832-436c-8839-1dbb94d58f86-config-data\") pod \"keystone-bootstrap-l6c4c\" (UID: \"7f93d253-e832-436c-8839-1dbb94d58f86\") " pod="openstack/keystone-bootstrap-l6c4c" Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.383939 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f93d253-e832-436c-8839-1dbb94d58f86-combined-ca-bundle\") pod \"keystone-bootstrap-l6c4c\" (UID: 
\"7f93d253-e832-436c-8839-1dbb94d58f86\") " pod="openstack/keystone-bootstrap-l6c4c" Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.383959 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mxhn\" (UniqueName: \"kubernetes.io/projected/4c640324-4b47-498d-ae9c-8b66a5de9618-kube-api-access-6mxhn\") pod \"dnsmasq-dns-bbf5cc879-982dr\" (UID: \"4c640324-4b47-498d-ae9c-8b66a5de9618\") " pod="openstack/dnsmasq-dns-bbf5cc879-982dr" Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.383983 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7f93d253-e832-436c-8839-1dbb94d58f86-credential-keys\") pod \"keystone-bootstrap-l6c4c\" (UID: \"7f93d253-e832-436c-8839-1dbb94d58f86\") " pod="openstack/keystone-bootstrap-l6c4c" Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.384628 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-l6c4c"] Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.420718 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-982dr"] Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.480415 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-decision-engine-0"] Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.481686 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-decision-engine-0" Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.495186 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e07c8b14-2cf9-4a6c-8fb4-d9f362d85f49-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"e07c8b14-2cf9-4a6c-8fb4-d9f362d85f49\") " pod="openstack/watcher-decision-engine-0" Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.495268 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e07c8b14-2cf9-4a6c-8fb4-d9f362d85f49-config-data\") pod \"watcher-decision-engine-0\" (UID: \"e07c8b14-2cf9-4a6c-8fb4-d9f362d85f49\") " pod="openstack/watcher-decision-engine-0" Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.495359 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/e07c8b14-2cf9-4a6c-8fb4-d9f362d85f49-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"e07c8b14-2cf9-4a6c-8fb4-d9f362d85f49\") " pod="openstack/watcher-decision-engine-0" Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.495484 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4c640324-4b47-498d-ae9c-8b66a5de9618-ovsdbserver-sb\") pod \"dnsmasq-dns-bbf5cc879-982dr\" (UID: \"4c640324-4b47-498d-ae9c-8b66a5de9618\") " pod="openstack/dnsmasq-dns-bbf5cc879-982dr" Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.495534 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7f93d253-e832-436c-8839-1dbb94d58f86-fernet-keys\") pod \"keystone-bootstrap-l6c4c\" (UID: \"7f93d253-e832-436c-8839-1dbb94d58f86\") " 
pod="openstack/keystone-bootstrap-l6c4c" Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.495578 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4c640324-4b47-498d-ae9c-8b66a5de9618-dns-svc\") pod \"dnsmasq-dns-bbf5cc879-982dr\" (UID: \"4c640324-4b47-498d-ae9c-8b66a5de9618\") " pod="openstack/dnsmasq-dns-bbf5cc879-982dr" Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.495639 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c640324-4b47-498d-ae9c-8b66a5de9618-config\") pod \"dnsmasq-dns-bbf5cc879-982dr\" (UID: \"4c640324-4b47-498d-ae9c-8b66a5de9618\") " pod="openstack/dnsmasq-dns-bbf5cc879-982dr" Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.495675 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f93d253-e832-436c-8839-1dbb94d58f86-config-data\") pod \"keystone-bootstrap-l6c4c\" (UID: \"7f93d253-e832-436c-8839-1dbb94d58f86\") " pod="openstack/keystone-bootstrap-l6c4c" Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.495839 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwfzh\" (UniqueName: \"kubernetes.io/projected/e07c8b14-2cf9-4a6c-8fb4-d9f362d85f49-kube-api-access-zwfzh\") pod \"watcher-decision-engine-0\" (UID: \"e07c8b14-2cf9-4a6c-8fb4-d9f362d85f49\") " pod="openstack/watcher-decision-engine-0" Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.495897 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f93d253-e832-436c-8839-1dbb94d58f86-combined-ca-bundle\") pod \"keystone-bootstrap-l6c4c\" (UID: \"7f93d253-e832-436c-8839-1dbb94d58f86\") " pod="openstack/keystone-bootstrap-l6c4c" Dec 02 23:01:51 crc kubenswrapper[4696]: 
I1202 23:01:51.495924 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mxhn\" (UniqueName: \"kubernetes.io/projected/4c640324-4b47-498d-ae9c-8b66a5de9618-kube-api-access-6mxhn\") pod \"dnsmasq-dns-bbf5cc879-982dr\" (UID: \"4c640324-4b47-498d-ae9c-8b66a5de9618\") " pod="openstack/dnsmasq-dns-bbf5cc879-982dr" Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.495966 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e07c8b14-2cf9-4a6c-8fb4-d9f362d85f49-logs\") pod \"watcher-decision-engine-0\" (UID: \"e07c8b14-2cf9-4a6c-8fb4-d9f362d85f49\") " pod="openstack/watcher-decision-engine-0" Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.495998 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7f93d253-e832-436c-8839-1dbb94d58f86-credential-keys\") pod \"keystone-bootstrap-l6c4c\" (UID: \"7f93d253-e832-436c-8839-1dbb94d58f86\") " pod="openstack/keystone-bootstrap-l6c4c" Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.496038 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4c640324-4b47-498d-ae9c-8b66a5de9618-ovsdbserver-nb\") pod \"dnsmasq-dns-bbf5cc879-982dr\" (UID: \"4c640324-4b47-498d-ae9c-8b66a5de9618\") " pod="openstack/dnsmasq-dns-bbf5cc879-982dr" Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.496096 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4c640324-4b47-498d-ae9c-8b66a5de9618-dns-swift-storage-0\") pod \"dnsmasq-dns-bbf5cc879-982dr\" (UID: \"4c640324-4b47-498d-ae9c-8b66a5de9618\") " pod="openstack/dnsmasq-dns-bbf5cc879-982dr" Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.496143 4696 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-dvkfc\" (UniqueName: \"kubernetes.io/projected/7f93d253-e832-436c-8839-1dbb94d58f86-kube-api-access-dvkfc\") pod \"keystone-bootstrap-l6c4c\" (UID: \"7f93d253-e832-436c-8839-1dbb94d58f86\") " pod="openstack/keystone-bootstrap-l6c4c" Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.496232 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f93d253-e832-436c-8839-1dbb94d58f86-scripts\") pod \"keystone-bootstrap-l6c4c\" (UID: \"7f93d253-e832-436c-8839-1dbb94d58f86\") " pod="openstack/keystone-bootstrap-l6c4c" Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.497024 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-watcher-dockercfg-qkzf2" Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.497159 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-decision-engine-config-data" Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.505398 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4c640324-4b47-498d-ae9c-8b66a5de9618-dns-svc\") pod \"dnsmasq-dns-bbf5cc879-982dr\" (UID: \"4c640324-4b47-498d-ae9c-8b66a5de9618\") " pod="openstack/dnsmasq-dns-bbf5cc879-982dr" Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.513728 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c640324-4b47-498d-ae9c-8b66a5de9618-config\") pod \"dnsmasq-dns-bbf5cc879-982dr\" (UID: \"4c640324-4b47-498d-ae9c-8b66a5de9618\") " pod="openstack/dnsmasq-dns-bbf5cc879-982dr" Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.516407 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/4c640324-4b47-498d-ae9c-8b66a5de9618-dns-swift-storage-0\") pod \"dnsmasq-dns-bbf5cc879-982dr\" (UID: \"4c640324-4b47-498d-ae9c-8b66a5de9618\") " pod="openstack/dnsmasq-dns-bbf5cc879-982dr" Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.523921 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4c640324-4b47-498d-ae9c-8b66a5de9618-ovsdbserver-nb\") pod \"dnsmasq-dns-bbf5cc879-982dr\" (UID: \"4c640324-4b47-498d-ae9c-8b66a5de9618\") " pod="openstack/dnsmasq-dns-bbf5cc879-982dr" Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.529326 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7f93d253-e832-436c-8839-1dbb94d58f86-credential-keys\") pod \"keystone-bootstrap-l6c4c\" (UID: \"7f93d253-e832-436c-8839-1dbb94d58f86\") " pod="openstack/keystone-bootstrap-l6c4c" Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.530219 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4c640324-4b47-498d-ae9c-8b66a5de9618-ovsdbserver-sb\") pod \"dnsmasq-dns-bbf5cc879-982dr\" (UID: \"4c640324-4b47-498d-ae9c-8b66a5de9618\") " pod="openstack/dnsmasq-dns-bbf5cc879-982dr" Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.535399 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f93d253-e832-436c-8839-1dbb94d58f86-config-data\") pod \"keystone-bootstrap-l6c4c\" (UID: \"7f93d253-e832-436c-8839-1dbb94d58f86\") " pod="openstack/keystone-bootstrap-l6c4c" Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.568439 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.569128 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f93d253-e832-436c-8839-1dbb94d58f86-scripts\") pod \"keystone-bootstrap-l6c4c\" (UID: \"7f93d253-e832-436c-8839-1dbb94d58f86\") " pod="openstack/keystone-bootstrap-l6c4c"
Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.576168 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f93d253-e832-436c-8839-1dbb94d58f86-combined-ca-bundle\") pod \"keystone-bootstrap-l6c4c\" (UID: \"7f93d253-e832-436c-8839-1dbb94d58f86\") " pod="openstack/keystone-bootstrap-l6c4c"
Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.600311 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-b5c6d7897-zxfvr"]
Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.601935 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e07c8b14-2cf9-4a6c-8fb4-d9f362d85f49-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"e07c8b14-2cf9-4a6c-8fb4-d9f362d85f49\") " pod="openstack/watcher-decision-engine-0"
Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.613063 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e07c8b14-2cf9-4a6c-8fb4-d9f362d85f49-config-data\") pod \"watcher-decision-engine-0\" (UID: \"e07c8b14-2cf9-4a6c-8fb4-d9f362d85f49\") " pod="openstack/watcher-decision-engine-0"
Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.613178 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/e07c8b14-2cf9-4a6c-8fb4-d9f362d85f49-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"e07c8b14-2cf9-4a6c-8fb4-d9f362d85f49\") " pod="openstack/watcher-decision-engine-0"
Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.613437 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwfzh\" (UniqueName: \"kubernetes.io/projected/e07c8b14-2cf9-4a6c-8fb4-d9f362d85f49-kube-api-access-zwfzh\") pod \"watcher-decision-engine-0\" (UID: \"e07c8b14-2cf9-4a6c-8fb4-d9f362d85f49\") " pod="openstack/watcher-decision-engine-0"
Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.613516 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e07c8b14-2cf9-4a6c-8fb4-d9f362d85f49-logs\") pod \"watcher-decision-engine-0\" (UID: \"e07c8b14-2cf9-4a6c-8fb4-d9f362d85f49\") " pod="openstack/watcher-decision-engine-0"
Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.614216 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e07c8b14-2cf9-4a6c-8fb4-d9f362d85f49-logs\") pod \"watcher-decision-engine-0\" (UID: \"e07c8b14-2cf9-4a6c-8fb4-d9f362d85f49\") " pod="openstack/watcher-decision-engine-0"
Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.614712 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e07c8b14-2cf9-4a6c-8fb4-d9f362d85f49-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"e07c8b14-2cf9-4a6c-8fb4-d9f362d85f49\") " pod="openstack/watcher-decision-engine-0"
Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.629678 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e07c8b14-2cf9-4a6c-8fb4-d9f362d85f49-config-data\") pod \"watcher-decision-engine-0\" (UID: \"e07c8b14-2cf9-4a6c-8fb4-d9f362d85f49\") " pod="openstack/watcher-decision-engine-0"
Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.630311 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/e07c8b14-2cf9-4a6c-8fb4-d9f362d85f49-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"e07c8b14-2cf9-4a6c-8fb4-d9f362d85f49\") " pod="openstack/watcher-decision-engine-0"
Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.631782 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mxhn\" (UniqueName: \"kubernetes.io/projected/4c640324-4b47-498d-ae9c-8b66a5de9618-kube-api-access-6mxhn\") pod \"dnsmasq-dns-bbf5cc879-982dr\" (UID: \"4c640324-4b47-498d-ae9c-8b66a5de9618\") " pod="openstack/dnsmasq-dns-bbf5cc879-982dr"
Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.637869 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7f93d253-e832-436c-8839-1dbb94d58f86-fernet-keys\") pod \"keystone-bootstrap-l6c4c\" (UID: \"7f93d253-e832-436c-8839-1dbb94d58f86\") " pod="openstack/keystone-bootstrap-l6c4c"
Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.649195 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-b5c6d7897-zxfvr"
Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.668786 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvkfc\" (UniqueName: \"kubernetes.io/projected/7f93d253-e832-436c-8839-1dbb94d58f86-kube-api-access-dvkfc\") pod \"keystone-bootstrap-l6c4c\" (UID: \"7f93d253-e832-436c-8839-1dbb94d58f86\") " pod="openstack/keystone-bootstrap-l6c4c"
Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.670269 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-l6c4c"
Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.681500 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon"
Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.681928 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-p9gxv"
Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.682784 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data"
Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.731045 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-982dr"
Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.751577 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwfzh\" (UniqueName: \"kubernetes.io/projected/e07c8b14-2cf9-4a6c-8fb4-d9f362d85f49-kube-api-access-zwfzh\") pod \"watcher-decision-engine-0\" (UID: \"e07c8b14-2cf9-4a6c-8fb4-d9f362d85f49\") " pod="openstack/watcher-decision-engine-0"
Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.765782 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts"
Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.778193 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-b5c6d7897-zxfvr"]
Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.779720 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/19ee26f0-388e-41d0-9d56-7ce761896e0b-config-data\") pod \"horizon-b5c6d7897-zxfvr\" (UID: \"19ee26f0-388e-41d0-9d56-7ce761896e0b\") " pod="openstack/horizon-b5c6d7897-zxfvr"
Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.779801 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/19ee26f0-388e-41d0-9d56-7ce761896e0b-logs\") pod \"horizon-b5c6d7897-zxfvr\" (UID: \"19ee26f0-388e-41d0-9d56-7ce761896e0b\") " pod="openstack/horizon-b5c6d7897-zxfvr"
Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.779870 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/19ee26f0-388e-41d0-9d56-7ce761896e0b-horizon-secret-key\") pod \"horizon-b5c6d7897-zxfvr\" (UID: \"19ee26f0-388e-41d0-9d56-7ce761896e0b\") " pod="openstack/horizon-b5c6d7897-zxfvr"
Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.779904 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/19ee26f0-388e-41d0-9d56-7ce761896e0b-scripts\") pod \"horizon-b5c6d7897-zxfvr\" (UID: \"19ee26f0-388e-41d0-9d56-7ce761896e0b\") " pod="openstack/horizon-b5c6d7897-zxfvr"
Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.779934 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdwjp\" (UniqueName: \"kubernetes.io/projected/19ee26f0-388e-41d0-9d56-7ce761896e0b-kube-api-access-pdwjp\") pod \"horizon-b5c6d7897-zxfvr\" (UID: \"19ee26f0-388e-41d0-9d56-7ce761896e0b\") " pod="openstack/horizon-b5c6d7897-zxfvr"
Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.791075 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-applier-0"]
Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.792506 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0"
Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.808323 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-applier-config-data"
Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.832003 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-api-0"]
Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.833915 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0"
Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.861440 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-api-config-data"
Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.863615 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0"
Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.872860 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"]
Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.883123 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/19ee26f0-388e-41d0-9d56-7ce761896e0b-config-data\") pod \"horizon-b5c6d7897-zxfvr\" (UID: \"19ee26f0-388e-41d0-9d56-7ce761896e0b\") " pod="openstack/horizon-b5c6d7897-zxfvr"
Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.883203 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/19ee26f0-388e-41d0-9d56-7ce761896e0b-logs\") pod \"horizon-b5c6d7897-zxfvr\" (UID: \"19ee26f0-388e-41d0-9d56-7ce761896e0b\") " pod="openstack/horizon-b5c6d7897-zxfvr"
Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.883243 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzqlf\" (UniqueName: \"kubernetes.io/projected/dc73e02f-1d29-4919-ac92-dec31ae5a5da-kube-api-access-xzqlf\") pod \"watcher-api-0\" (UID: \"dc73e02f-1d29-4919-ac92-dec31ae5a5da\") " pod="openstack/watcher-api-0"
Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.883287 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/19ee26f0-388e-41d0-9d56-7ce761896e0b-horizon-secret-key\") pod \"horizon-b5c6d7897-zxfvr\" (UID: \"19ee26f0-388e-41d0-9d56-7ce761896e0b\") " pod="openstack/horizon-b5c6d7897-zxfvr"
Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.883326 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc73e02f-1d29-4919-ac92-dec31ae5a5da-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"dc73e02f-1d29-4919-ac92-dec31ae5a5da\") " pod="openstack/watcher-api-0"
Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.883361 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc73e02f-1d29-4919-ac92-dec31ae5a5da-config-data\") pod \"watcher-api-0\" (UID: \"dc73e02f-1d29-4919-ac92-dec31ae5a5da\") " pod="openstack/watcher-api-0"
Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.883380 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc73e02f-1d29-4919-ac92-dec31ae5a5da-logs\") pod \"watcher-api-0\" (UID: \"dc73e02f-1d29-4919-ac92-dec31ae5a5da\") " pod="openstack/watcher-api-0"
Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.883407 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/19ee26f0-388e-41d0-9d56-7ce761896e0b-scripts\") pod \"horizon-b5c6d7897-zxfvr\" (UID: \"19ee26f0-388e-41d0-9d56-7ce761896e0b\") " pod="openstack/horizon-b5c6d7897-zxfvr"
Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.883423 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/dc73e02f-1d29-4919-ac92-dec31ae5a5da-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"dc73e02f-1d29-4919-ac92-dec31ae5a5da\") " pod="openstack/watcher-api-0"
Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.883465 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdwjp\" (UniqueName: \"kubernetes.io/projected/19ee26f0-388e-41d0-9d56-7ce761896e0b-kube-api-access-pdwjp\") pod \"horizon-b5c6d7897-zxfvr\" (UID: \"19ee26f0-388e-41d0-9d56-7ce761896e0b\") " pod="openstack/horizon-b5c6d7897-zxfvr"
Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.889037 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/19ee26f0-388e-41d0-9d56-7ce761896e0b-config-data\") pod \"horizon-b5c6d7897-zxfvr\" (UID: \"19ee26f0-388e-41d0-9d56-7ce761896e0b\") " pod="openstack/horizon-b5c6d7897-zxfvr"
Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.889569 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/19ee26f0-388e-41d0-9d56-7ce761896e0b-logs\") pod \"horizon-b5c6d7897-zxfvr\" (UID: \"19ee26f0-388e-41d0-9d56-7ce761896e0b\") " pod="openstack/horizon-b5c6d7897-zxfvr"
Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.891471 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/19ee26f0-388e-41d0-9d56-7ce761896e0b-scripts\") pod \"horizon-b5c6d7897-zxfvr\" (UID: \"19ee26f0-388e-41d0-9d56-7ce761896e0b\") " pod="openstack/horizon-b5c6d7897-zxfvr"
Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.892046 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"]
Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.914863 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.920584 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.942427 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/19ee26f0-388e-41d0-9d56-7ce761896e0b-horizon-secret-key\") pod \"horizon-b5c6d7897-zxfvr\" (UID: \"19ee26f0-388e-41d0-9d56-7ce761896e0b\") " pod="openstack/horizon-b5c6d7897-zxfvr"
Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.960057 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.960465 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.980633 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdwjp\" (UniqueName: \"kubernetes.io/projected/19ee26f0-388e-41d0-9d56-7ce761896e0b-kube-api-access-pdwjp\") pod \"horizon-b5c6d7897-zxfvr\" (UID: \"19ee26f0-388e-41d0-9d56-7ce761896e0b\") " pod="openstack/horizon-b5c6d7897-zxfvr"
Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.985055 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/73fa8b91-c646-46ae-8d9b-1ec7b93ae933-logs\") pod \"watcher-applier-0\" (UID: \"73fa8b91-c646-46ae-8d9b-1ec7b93ae933\") " pod="openstack/watcher-applier-0"
Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.985106 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc73e02f-1d29-4919-ac92-dec31ae5a5da-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"dc73e02f-1d29-4919-ac92-dec31ae5a5da\") " pod="openstack/watcher-api-0"
Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.985126 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc73e02f-1d29-4919-ac92-dec31ae5a5da-config-data\") pod \"watcher-api-0\" (UID: \"dc73e02f-1d29-4919-ac92-dec31ae5a5da\") " pod="openstack/watcher-api-0"
Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.985146 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73fa8b91-c646-46ae-8d9b-1ec7b93ae933-config-data\") pod \"watcher-applier-0\" (UID: \"73fa8b91-c646-46ae-8d9b-1ec7b93ae933\") " pod="openstack/watcher-applier-0"
Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.985163 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc73e02f-1d29-4919-ac92-dec31ae5a5da-logs\") pod \"watcher-api-0\" (UID: \"dc73e02f-1d29-4919-ac92-dec31ae5a5da\") " pod="openstack/watcher-api-0"
Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.985195 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/dc73e02f-1d29-4919-ac92-dec31ae5a5da-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"dc73e02f-1d29-4919-ac92-dec31ae5a5da\") " pod="openstack/watcher-api-0"
Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.985211 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73fa8b91-c646-46ae-8d9b-1ec7b93ae933-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"73fa8b91-c646-46ae-8d9b-1ec7b93ae933\") " pod="openstack/watcher-applier-0"
Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.985236 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-488km\" (UniqueName: \"kubernetes.io/projected/73fa8b91-c646-46ae-8d9b-1ec7b93ae933-kube-api-access-488km\") pod \"watcher-applier-0\" (UID: \"73fa8b91-c646-46ae-8d9b-1ec7b93ae933\") " pod="openstack/watcher-applier-0"
Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.985372 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzqlf\" (UniqueName: \"kubernetes.io/projected/dc73e02f-1d29-4919-ac92-dec31ae5a5da-kube-api-access-xzqlf\") pod \"watcher-api-0\" (UID: \"dc73e02f-1d29-4919-ac92-dec31ae5a5da\") " pod="openstack/watcher-api-0"
Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.986313 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-b2zfs"]
Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.989061 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc73e02f-1d29-4919-ac92-dec31ae5a5da-logs\") pod \"watcher-api-0\" (UID: \"dc73e02f-1d29-4919-ac92-dec31ae5a5da\") " pod="openstack/watcher-api-0"
Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.993353 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-b2zfs"
Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.997044 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/dc73e02f-1d29-4919-ac92-dec31ae5a5da-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"dc73e02f-1d29-4919-ac92-dec31ae5a5da\") " pod="openstack/watcher-api-0"
Dec 02 23:01:51 crc kubenswrapper[4696]: I1202 23:01:51.997829 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc73e02f-1d29-4919-ac92-dec31ae5a5da-config-data\") pod \"watcher-api-0\" (UID: \"dc73e02f-1d29-4919-ac92-dec31ae5a5da\") " pod="openstack/watcher-api-0"
Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.000811 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc73e02f-1d29-4919-ac92-dec31ae5a5da-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"dc73e02f-1d29-4919-ac92-dec31ae5a5da\") " pod="openstack/watcher-api-0"
Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.014983 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-ml2m5"
Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.018210 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts"
Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.018545 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data"
Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.037824 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.042863 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzqlf\" (UniqueName: \"kubernetes.io/projected/dc73e02f-1d29-4919-ac92-dec31ae5a5da-kube-api-access-xzqlf\") pod \"watcher-api-0\" (UID: \"dc73e02f-1d29-4919-ac92-dec31ae5a5da\") " pod="openstack/watcher-api-0"
Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.053303 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0"
Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.090253 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1780f21-2d95-4fdb-9a7c-c2d5b7a9b143-config-data\") pod \"cinder-db-sync-b2zfs\" (UID: \"b1780f21-2d95-4fdb-9a7c-c2d5b7a9b143\") " pod="openstack/cinder-db-sync-b2zfs"
Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.090284 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1780f21-2d95-4fdb-9a7c-c2d5b7a9b143-combined-ca-bundle\") pod \"cinder-db-sync-b2zfs\" (UID: \"b1780f21-2d95-4fdb-9a7c-c2d5b7a9b143\") " pod="openstack/cinder-db-sync-b2zfs"
Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.090312 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b1780f21-2d95-4fdb-9a7c-c2d5b7a9b143-etc-machine-id\") pod \"cinder-db-sync-b2zfs\" (UID: \"b1780f21-2d95-4fdb-9a7c-c2d5b7a9b143\") " pod="openstack/cinder-db-sync-b2zfs"
Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.090338 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad-scripts\") pod \"ceilometer-0\" (UID: \"c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad\") " pod="openstack/ceilometer-0"
Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.090350 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1780f21-2d95-4fdb-9a7c-c2d5b7a9b143-scripts\") pod \"cinder-db-sync-b2zfs\" (UID: \"b1780f21-2d95-4fdb-9a7c-c2d5b7a9b143\") " pod="openstack/cinder-db-sync-b2zfs"
Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.090366 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad\") " pod="openstack/ceilometer-0"
Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.090395 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad-config-data\") pod \"ceilometer-0\" (UID: \"c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad\") " pod="openstack/ceilometer-0"
Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.090410 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad-run-httpd\") pod \"ceilometer-0\" (UID: \"c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad\") " pod="openstack/ceilometer-0"
Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.090455 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/73fa8b91-c646-46ae-8d9b-1ec7b93ae933-logs\") pod \"watcher-applier-0\" (UID: \"73fa8b91-c646-46ae-8d9b-1ec7b93ae933\") " pod="openstack/watcher-applier-0"
Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.090478 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73fa8b91-c646-46ae-8d9b-1ec7b93ae933-config-data\") pod \"watcher-applier-0\" (UID: \"73fa8b91-c646-46ae-8d9b-1ec7b93ae933\") " pod="openstack/watcher-applier-0"
Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.090495 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b1780f21-2d95-4fdb-9a7c-c2d5b7a9b143-db-sync-config-data\") pod \"cinder-db-sync-b2zfs\" (UID: \"b1780f21-2d95-4fdb-9a7c-c2d5b7a9b143\") " pod="openstack/cinder-db-sync-b2zfs"
Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.090512 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73fa8b91-c646-46ae-8d9b-1ec7b93ae933-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"73fa8b91-c646-46ae-8d9b-1ec7b93ae933\") " pod="openstack/watcher-applier-0"
Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.090535 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-488km\" (UniqueName: \"kubernetes.io/projected/73fa8b91-c646-46ae-8d9b-1ec7b93ae933-kube-api-access-488km\") pod \"watcher-applier-0\" (UID: \"73fa8b91-c646-46ae-8d9b-1ec7b93ae933\") " pod="openstack/watcher-applier-0"
Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.090554 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5nlg\" (UniqueName: \"kubernetes.io/projected/b1780f21-2d95-4fdb-9a7c-c2d5b7a9b143-kube-api-access-z5nlg\") pod \"cinder-db-sync-b2zfs\" (UID: \"b1780f21-2d95-4fdb-9a7c-c2d5b7a9b143\") " pod="openstack/cinder-db-sync-b2zfs"
Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.090598 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad\") " pod="openstack/ceilometer-0"
Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.090617 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rjl5\" (UniqueName: \"kubernetes.io/projected/c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad-kube-api-access-2rjl5\") pod \"ceilometer-0\" (UID: \"c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad\") " pod="openstack/ceilometer-0"
Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.090636 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad-log-httpd\") pod \"ceilometer-0\" (UID: \"c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad\") " pod="openstack/ceilometer-0"
Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.117060 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73fa8b91-c646-46ae-8d9b-1ec7b93ae933-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"73fa8b91-c646-46ae-8d9b-1ec7b93ae933\") " pod="openstack/watcher-applier-0"
Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.117142 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73fa8b91-c646-46ae-8d9b-1ec7b93ae933-config-data\") pod \"watcher-applier-0\" (UID: \"73fa8b91-c646-46ae-8d9b-1ec7b93ae933\") " pod="openstack/watcher-applier-0"
Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.118613 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/73fa8b91-c646-46ae-8d9b-1ec7b93ae933-logs\") pod \"watcher-applier-0\" (UID: \"73fa8b91-c646-46ae-8d9b-1ec7b93ae933\") " pod="openstack/watcher-applier-0"
Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.118678 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-b2zfs"]
Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.163772 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-488km\" (UniqueName: \"kubernetes.io/projected/73fa8b91-c646-46ae-8d9b-1ec7b93ae933-kube-api-access-488km\") pod \"watcher-applier-0\" (UID: \"73fa8b91-c646-46ae-8d9b-1ec7b93ae933\") " pod="openstack/watcher-applier-0"
Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.200123 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5nlg\" (UniqueName: \"kubernetes.io/projected/b1780f21-2d95-4fdb-9a7c-c2d5b7a9b143-kube-api-access-z5nlg\") pod \"cinder-db-sync-b2zfs\" (UID: \"b1780f21-2d95-4fdb-9a7c-c2d5b7a9b143\") " pod="openstack/cinder-db-sync-b2zfs"
Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.201033 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad\") " pod="openstack/ceilometer-0"
Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.201088 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rjl5\" (UniqueName: \"kubernetes.io/projected/c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad-kube-api-access-2rjl5\") pod \"ceilometer-0\" (UID: \"c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad\") " pod="openstack/ceilometer-0"
Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.201111 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad-log-httpd\") pod \"ceilometer-0\" (UID: \"c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad\") " pod="openstack/ceilometer-0"
Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.201137 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1780f21-2d95-4fdb-9a7c-c2d5b7a9b143-config-data\") pod \"cinder-db-sync-b2zfs\" (UID: \"b1780f21-2d95-4fdb-9a7c-c2d5b7a9b143\") " pod="openstack/cinder-db-sync-b2zfs"
Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.204228 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1780f21-2d95-4fdb-9a7c-c2d5b7a9b143-combined-ca-bundle\") pod \"cinder-db-sync-b2zfs\" (UID: \"b1780f21-2d95-4fdb-9a7c-c2d5b7a9b143\") " pod="openstack/cinder-db-sync-b2zfs"
Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.204278 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b1780f21-2d95-4fdb-9a7c-c2d5b7a9b143-etc-machine-id\") pod \"cinder-db-sync-b2zfs\" (UID: \"b1780f21-2d95-4fdb-9a7c-c2d5b7a9b143\") " pod="openstack/cinder-db-sync-b2zfs"
Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.204320 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad-scripts\") pod \"ceilometer-0\" (UID: \"c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad\") " pod="openstack/ceilometer-0"
Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.204338 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1780f21-2d95-4fdb-9a7c-c2d5b7a9b143-scripts\") pod \"cinder-db-sync-b2zfs\" (UID: \"b1780f21-2d95-4fdb-9a7c-c2d5b7a9b143\") " pod="openstack/cinder-db-sync-b2zfs"
Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.204365 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad\") " pod="openstack/ceilometer-0"
Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.204442 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad-config-data\") pod \"ceilometer-0\" (UID: \"c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad\") " pod="openstack/ceilometer-0"
Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.204467 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad-run-httpd\") pod \"ceilometer-0\" (UID: \"c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad\") " pod="openstack/ceilometer-0"
Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.204595 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b1780f21-2d95-4fdb-9a7c-c2d5b7a9b143-db-sync-config-data\") pod \"cinder-db-sync-b2zfs\" (UID: \"b1780f21-2d95-4fdb-9a7c-c2d5b7a9b143\") " pod="openstack/cinder-db-sync-b2zfs"
Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.205423 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b1780f21-2d95-4fdb-9a7c-c2d5b7a9b143-etc-machine-id\") pod \"cinder-db-sync-b2zfs\" (UID: \"b1780f21-2d95-4fdb-9a7c-c2d5b7a9b143\") " pod="openstack/cinder-db-sync-b2zfs"
Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.202518 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad-log-httpd\") pod \"ceilometer-0\" (UID: \"c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad\") " pod="openstack/ceilometer-0"
Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.210908 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1780f21-2d95-4fdb-9a7c-c2d5b7a9b143-config-data\") pod \"cinder-db-sync-b2zfs\" (UID: \"b1780f21-2d95-4fdb-9a7c-c2d5b7a9b143\") " pod="openstack/cinder-db-sync-b2zfs"
Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.213573 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-hnh2k"]
Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.233110 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad\") " pod="openstack/ceilometer-0"
Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.215143 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-b5c6d7897-zxfvr"
Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.222502 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad-config-data\") pod \"ceilometer-0\" (UID: \"c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad\") " pod="openstack/ceilometer-0"
Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.226235 4696 generic.go:334] "Generic (PLEG): container finished" podID="e066271b-ef9d-4c8d-8c06-d9f8a8afbac6" containerID="f452f94505068f4f8a9f7dbc41f090285aadfd4184999dbbcea356149c2ed410" exitCode=0
Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.227086 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1780f21-2d95-4fdb-9a7c-c2d5b7a9b143-combined-ca-bundle\") pod \"cinder-db-sync-b2zfs\" (UID: \"b1780f21-2d95-4fdb-9a7c-c2d5b7a9b143\") " pod="openstack/cinder-db-sync-b2zfs"
Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.230990 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rjl5\" (UniqueName: \"kubernetes.io/projected/c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad-kube-api-access-2rjl5\") pod \"ceilometer-0\" (UID: \"c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad\") " pod="openstack/ceilometer-0"
Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.214228 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad-run-httpd\") pod \"ceilometer-0\" (UID: \"c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad\") " pod="openstack/ceilometer-0"
Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.234955 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b1780f21-2d95-4fdb-9a7c-c2d5b7a9b143-db-sync-config-data\") pod \"cinder-db-sync-b2zfs\" (UID: \"b1780f21-2d95-4fdb-9a7c-c2d5b7a9b143\") " pod="openstack/cinder-db-sync-b2zfs"
Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.238700 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad\") " pod="openstack/ceilometer-0"
Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.239024 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1780f21-2d95-4fdb-9a7c-c2d5b7a9b143-scripts\") pod \"cinder-db-sync-b2zfs\" (UID: \"b1780f21-2d95-4fdb-9a7c-c2d5b7a9b143\") " pod="openstack/cinder-db-sync-b2zfs"
Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.239812 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-ztpxs" event={"ID":"e066271b-ef9d-4c8d-8c06-d9f8a8afbac6","Type":"ContainerDied","Data":"f452f94505068f4f8a9f7dbc41f090285aadfd4184999dbbcea356149c2ed410"}
Dec 02 23:01:52 crc 
kubenswrapper[4696]: I1202 23:01:52.240385 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-9cwtk"] Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.240357 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-hnh2k" Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.242670 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad-scripts\") pod \"ceilometer-0\" (UID: \"c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad\") " pod="openstack/ceilometer-0" Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.244545 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-2w9bc" Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.244876 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.246039 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-9cwtk" Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.249575 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-ft4bp" Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.249842 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.249972 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.253867 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5f7898bd8f-xjlfq"] Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.259213 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5nlg\" (UniqueName: \"kubernetes.io/projected/b1780f21-2d95-4fdb-9a7c-c2d5b7a9b143-kube-api-access-z5nlg\") pod \"cinder-db-sync-b2zfs\" (UID: \"b1780f21-2d95-4fdb-9a7c-c2d5b7a9b143\") " pod="openstack/cinder-db-sync-b2zfs" Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.268142 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-hnh2k"] Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.270103 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5f7898bd8f-xjlfq" Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.273183 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-applier-0" Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.276922 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-9cwtk"] Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.289988 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5f7898bd8f-xjlfq"] Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.324783 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-982dr"] Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.348835 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-qdkb4"] Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.350643 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-qdkb4" Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.369001 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-qdkb4"] Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.388704 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.409377 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.422714 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqx4h\" (UniqueName: \"kubernetes.io/projected/a47e8eff-6d2e-4140-9513-bd2e3ff6a153-kube-api-access-hqx4h\") pod \"horizon-5f7898bd8f-xjlfq\" (UID: \"a47e8eff-6d2e-4140-9513-bd2e3ff6a153\") " pod="openstack/horizon-5f7898bd8f-xjlfq" Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.422842 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26tll\" (UniqueName: \"kubernetes.io/projected/359eca54-19ad-4e8d-b580-29a37d8f38c8-kube-api-access-26tll\") pod \"barbican-db-sync-hnh2k\" (UID: \"359eca54-19ad-4e8d-b580-29a37d8f38c8\") " pod="openstack/barbican-db-sync-hnh2k" Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.422886 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1f5ea7d-03ba-43ab-8863-9547b016bb0a-combined-ca-bundle\") pod \"neutron-db-sync-9cwtk\" (UID: \"d1f5ea7d-03ba-43ab-8863-9547b016bb0a\") " pod="openstack/neutron-db-sync-9cwtk" Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.422911 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d1f5ea7d-03ba-43ab-8863-9547b016bb0a-config\") pod \"neutron-db-sync-9cwtk\" (UID: \"d1f5ea7d-03ba-43ab-8863-9547b016bb0a\") " pod="openstack/neutron-db-sync-9cwtk" Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.422965 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a47e8eff-6d2e-4140-9513-bd2e3ff6a153-scripts\") pod \"horizon-5f7898bd8f-xjlfq\" (UID: \"a47e8eff-6d2e-4140-9513-bd2e3ff6a153\") " 
pod="openstack/horizon-5f7898bd8f-xjlfq" Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.423004 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mvz5\" (UniqueName: \"kubernetes.io/projected/d1f5ea7d-03ba-43ab-8863-9547b016bb0a-kube-api-access-6mvz5\") pod \"neutron-db-sync-9cwtk\" (UID: \"d1f5ea7d-03ba-43ab-8863-9547b016bb0a\") " pod="openstack/neutron-db-sync-9cwtk" Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.423123 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/359eca54-19ad-4e8d-b580-29a37d8f38c8-db-sync-config-data\") pod \"barbican-db-sync-hnh2k\" (UID: \"359eca54-19ad-4e8d-b580-29a37d8f38c8\") " pod="openstack/barbican-db-sync-hnh2k" Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.448008 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a47e8eff-6d2e-4140-9513-bd2e3ff6a153-config-data\") pod \"horizon-5f7898bd8f-xjlfq\" (UID: \"a47e8eff-6d2e-4140-9513-bd2e3ff6a153\") " pod="openstack/horizon-5f7898bd8f-xjlfq" Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.448191 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a47e8eff-6d2e-4140-9513-bd2e3ff6a153-logs\") pod \"horizon-5f7898bd8f-xjlfq\" (UID: \"a47e8eff-6d2e-4140-9513-bd2e3ff6a153\") " pod="openstack/horizon-5f7898bd8f-xjlfq" Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.448227 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/359eca54-19ad-4e8d-b580-29a37d8f38c8-combined-ca-bundle\") pod \"barbican-db-sync-hnh2k\" (UID: \"359eca54-19ad-4e8d-b580-29a37d8f38c8\") " 
pod="openstack/barbican-db-sync-hnh2k" Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.448377 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a47e8eff-6d2e-4140-9513-bd2e3ff6a153-horizon-secret-key\") pod \"horizon-5f7898bd8f-xjlfq\" (UID: \"a47e8eff-6d2e-4140-9513-bd2e3ff6a153\") " pod="openstack/horizon-5f7898bd8f-xjlfq" Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.464075 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-b2zfs" Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.472336 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-fnjtl"] Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.472939 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.533971 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.534249 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.534420 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.535549 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-bvbjt" Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.543909 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-fnjtl" Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.548951 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.549133 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.549312 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-f9524" Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.583065 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cbc9aaae-ff97-4d62-806d-6823b2cc6da8-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-qdkb4\" (UID: \"cbc9aaae-ff97-4d62-806d-6823b2cc6da8\") " pod="openstack/dnsmasq-dns-56df8fb6b7-qdkb4" Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.583133 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a47e8eff-6d2e-4140-9513-bd2e3ff6a153-config-data\") pod \"horizon-5f7898bd8f-xjlfq\" (UID: \"a47e8eff-6d2e-4140-9513-bd2e3ff6a153\") " pod="openstack/horizon-5f7898bd8f-xjlfq" Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.583171 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mknpj\" (UniqueName: \"kubernetes.io/projected/f066d064-95ba-42b3-ba9f-5e859533c93c-kube-api-access-mknpj\") pod \"placement-db-sync-fnjtl\" (UID: \"f066d064-95ba-42b3-ba9f-5e859533c93c\") " pod="openstack/placement-db-sync-fnjtl" Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.583200 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/a47e8eff-6d2e-4140-9513-bd2e3ff6a153-logs\") pod \"horizon-5f7898bd8f-xjlfq\" (UID: \"a47e8eff-6d2e-4140-9513-bd2e3ff6a153\") " pod="openstack/horizon-5f7898bd8f-xjlfq" Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.583227 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/359eca54-19ad-4e8d-b580-29a37d8f38c8-combined-ca-bundle\") pod \"barbican-db-sync-hnh2k\" (UID: \"359eca54-19ad-4e8d-b580-29a37d8f38c8\") " pod="openstack/barbican-db-sync-hnh2k" Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.583275 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95t7t\" (UniqueName: \"kubernetes.io/projected/cbc9aaae-ff97-4d62-806d-6823b2cc6da8-kube-api-access-95t7t\") pod \"dnsmasq-dns-56df8fb6b7-qdkb4\" (UID: \"cbc9aaae-ff97-4d62-806d-6823b2cc6da8\") " pod="openstack/dnsmasq-dns-56df8fb6b7-qdkb4" Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.583302 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f066d064-95ba-42b3-ba9f-5e859533c93c-scripts\") pod \"placement-db-sync-fnjtl\" (UID: \"f066d064-95ba-42b3-ba9f-5e859533c93c\") " pod="openstack/placement-db-sync-fnjtl" Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.583342 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f066d064-95ba-42b3-ba9f-5e859533c93c-config-data\") pod \"placement-db-sync-fnjtl\" (UID: \"f066d064-95ba-42b3-ba9f-5e859533c93c\") " pod="openstack/placement-db-sync-fnjtl" Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.583379 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/a47e8eff-6d2e-4140-9513-bd2e3ff6a153-horizon-secret-key\") pod \"horizon-5f7898bd8f-xjlfq\" (UID: \"a47e8eff-6d2e-4140-9513-bd2e3ff6a153\") " pod="openstack/horizon-5f7898bd8f-xjlfq" Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.583411 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f066d064-95ba-42b3-ba9f-5e859533c93c-logs\") pod \"placement-db-sync-fnjtl\" (UID: \"f066d064-95ba-42b3-ba9f-5e859533c93c\") " pod="openstack/placement-db-sync-fnjtl" Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.583436 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqx4h\" (UniqueName: \"kubernetes.io/projected/a47e8eff-6d2e-4140-9513-bd2e3ff6a153-kube-api-access-hqx4h\") pod \"horizon-5f7898bd8f-xjlfq\" (UID: \"a47e8eff-6d2e-4140-9513-bd2e3ff6a153\") " pod="openstack/horizon-5f7898bd8f-xjlfq" Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.583458 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cbc9aaae-ff97-4d62-806d-6823b2cc6da8-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-qdkb4\" (UID: \"cbc9aaae-ff97-4d62-806d-6823b2cc6da8\") " pod="openstack/dnsmasq-dns-56df8fb6b7-qdkb4" Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.583483 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f066d064-95ba-42b3-ba9f-5e859533c93c-combined-ca-bundle\") pod \"placement-db-sync-fnjtl\" (UID: \"f066d064-95ba-42b3-ba9f-5e859533c93c\") " pod="openstack/placement-db-sync-fnjtl" Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.583512 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/cbc9aaae-ff97-4d62-806d-6823b2cc6da8-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-qdkb4\" (UID: \"cbc9aaae-ff97-4d62-806d-6823b2cc6da8\") " pod="openstack/dnsmasq-dns-56df8fb6b7-qdkb4" Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.583542 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26tll\" (UniqueName: \"kubernetes.io/projected/359eca54-19ad-4e8d-b580-29a37d8f38c8-kube-api-access-26tll\") pod \"barbican-db-sync-hnh2k\" (UID: \"359eca54-19ad-4e8d-b580-29a37d8f38c8\") " pod="openstack/barbican-db-sync-hnh2k" Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.583571 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1f5ea7d-03ba-43ab-8863-9547b016bb0a-combined-ca-bundle\") pod \"neutron-db-sync-9cwtk\" (UID: \"d1f5ea7d-03ba-43ab-8863-9547b016bb0a\") " pod="openstack/neutron-db-sync-9cwtk" Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.583591 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d1f5ea7d-03ba-43ab-8863-9547b016bb0a-config\") pod \"neutron-db-sync-9cwtk\" (UID: \"d1f5ea7d-03ba-43ab-8863-9547b016bb0a\") " pod="openstack/neutron-db-sync-9cwtk" Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.583618 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cbc9aaae-ff97-4d62-806d-6823b2cc6da8-config\") pod \"dnsmasq-dns-56df8fb6b7-qdkb4\" (UID: \"cbc9aaae-ff97-4d62-806d-6823b2cc6da8\") " pod="openstack/dnsmasq-dns-56df8fb6b7-qdkb4" Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.583644 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a47e8eff-6d2e-4140-9513-bd2e3ff6a153-scripts\") pod 
\"horizon-5f7898bd8f-xjlfq\" (UID: \"a47e8eff-6d2e-4140-9513-bd2e3ff6a153\") " pod="openstack/horizon-5f7898bd8f-xjlfq" Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.583675 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mvz5\" (UniqueName: \"kubernetes.io/projected/d1f5ea7d-03ba-43ab-8863-9547b016bb0a-kube-api-access-6mvz5\") pod \"neutron-db-sync-9cwtk\" (UID: \"d1f5ea7d-03ba-43ab-8863-9547b016bb0a\") " pod="openstack/neutron-db-sync-9cwtk" Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.583725 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/359eca54-19ad-4e8d-b580-29a37d8f38c8-db-sync-config-data\") pod \"barbican-db-sync-hnh2k\" (UID: \"359eca54-19ad-4e8d-b580-29a37d8f38c8\") " pod="openstack/barbican-db-sync-hnh2k" Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.583768 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cbc9aaae-ff97-4d62-806d-6823b2cc6da8-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-qdkb4\" (UID: \"cbc9aaae-ff97-4d62-806d-6823b2cc6da8\") " pod="openstack/dnsmasq-dns-56df8fb6b7-qdkb4" Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.678690 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a47e8eff-6d2e-4140-9513-bd2e3ff6a153-logs\") pod \"horizon-5f7898bd8f-xjlfq\" (UID: \"a47e8eff-6d2e-4140-9513-bd2e3ff6a153\") " pod="openstack/horizon-5f7898bd8f-xjlfq" Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.687981 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a47e8eff-6d2e-4140-9513-bd2e3ff6a153-scripts\") pod \"horizon-5f7898bd8f-xjlfq\" (UID: \"a47e8eff-6d2e-4140-9513-bd2e3ff6a153\") " pod="openstack/horizon-5f7898bd8f-xjlfq" 
Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.692507 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/359eca54-19ad-4e8d-b580-29a37d8f38c8-db-sync-config-data\") pod \"barbican-db-sync-hnh2k\" (UID: \"359eca54-19ad-4e8d-b580-29a37d8f38c8\") " pod="openstack/barbican-db-sync-hnh2k" Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.692846 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a47e8eff-6d2e-4140-9513-bd2e3ff6a153-horizon-secret-key\") pod \"horizon-5f7898bd8f-xjlfq\" (UID: \"a47e8eff-6d2e-4140-9513-bd2e3ff6a153\") " pod="openstack/horizon-5f7898bd8f-xjlfq" Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.715318 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a47e8eff-6d2e-4140-9513-bd2e3ff6a153-config-data\") pod \"horizon-5f7898bd8f-xjlfq\" (UID: \"a47e8eff-6d2e-4140-9513-bd2e3ff6a153\") " pod="openstack/horizon-5f7898bd8f-xjlfq" Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.723643 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f066d064-95ba-42b3-ba9f-5e859533c93c-logs\") pod \"placement-db-sync-fnjtl\" (UID: \"f066d064-95ba-42b3-ba9f-5e859533c93c\") " pod="openstack/placement-db-sync-fnjtl" Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.723703 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/084de0c6-eeff-49e4-b5b0-8010d88eb3a9-scripts\") pod \"glance-default-external-api-0\" (UID: \"084de0c6-eeff-49e4-b5b0-8010d88eb3a9\") " pod="openstack/glance-default-external-api-0" Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.723732 4696 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/084de0c6-eeff-49e4-b5b0-8010d88eb3a9-config-data\") pod \"glance-default-external-api-0\" (UID: \"084de0c6-eeff-49e4-b5b0-8010d88eb3a9\") " pod="openstack/glance-default-external-api-0" Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.723791 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cbc9aaae-ff97-4d62-806d-6823b2cc6da8-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-qdkb4\" (UID: \"cbc9aaae-ff97-4d62-806d-6823b2cc6da8\") " pod="openstack/dnsmasq-dns-56df8fb6b7-qdkb4" Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.723845 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f066d064-95ba-42b3-ba9f-5e859533c93c-combined-ca-bundle\") pod \"placement-db-sync-fnjtl\" (UID: \"f066d064-95ba-42b3-ba9f-5e859533c93c\") " pod="openstack/placement-db-sync-fnjtl" Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.723881 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cbc9aaae-ff97-4d62-806d-6823b2cc6da8-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-qdkb4\" (UID: \"cbc9aaae-ff97-4d62-806d-6823b2cc6da8\") " pod="openstack/dnsmasq-dns-56df8fb6b7-qdkb4" Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.723947 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nw9cz\" (UniqueName: \"kubernetes.io/projected/084de0c6-eeff-49e4-b5b0-8010d88eb3a9-kube-api-access-nw9cz\") pod \"glance-default-external-api-0\" (UID: \"084de0c6-eeff-49e4-b5b0-8010d88eb3a9\") " pod="openstack/glance-default-external-api-0" Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.724048 4696 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cbc9aaae-ff97-4d62-806d-6823b2cc6da8-config\") pod \"dnsmasq-dns-56df8fb6b7-qdkb4\" (UID: \"cbc9aaae-ff97-4d62-806d-6823b2cc6da8\") " pod="openstack/dnsmasq-dns-56df8fb6b7-qdkb4" Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.724083 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/084de0c6-eeff-49e4-b5b0-8010d88eb3a9-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"084de0c6-eeff-49e4-b5b0-8010d88eb3a9\") " pod="openstack/glance-default-external-api-0" Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.724154 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cbc9aaae-ff97-4d62-806d-6823b2cc6da8-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-qdkb4\" (UID: \"cbc9aaae-ff97-4d62-806d-6823b2cc6da8\") " pod="openstack/dnsmasq-dns-56df8fb6b7-qdkb4" Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.724177 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cbc9aaae-ff97-4d62-806d-6823b2cc6da8-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-qdkb4\" (UID: \"cbc9aaae-ff97-4d62-806d-6823b2cc6da8\") " pod="openstack/dnsmasq-dns-56df8fb6b7-qdkb4" Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.724224 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/084de0c6-eeff-49e4-b5b0-8010d88eb3a9-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"084de0c6-eeff-49e4-b5b0-8010d88eb3a9\") " pod="openstack/glance-default-external-api-0" Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.724258 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-mknpj\" (UniqueName: \"kubernetes.io/projected/f066d064-95ba-42b3-ba9f-5e859533c93c-kube-api-access-mknpj\") pod \"placement-db-sync-fnjtl\" (UID: \"f066d064-95ba-42b3-ba9f-5e859533c93c\") " pod="openstack/placement-db-sync-fnjtl" Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.724293 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95t7t\" (UniqueName: \"kubernetes.io/projected/cbc9aaae-ff97-4d62-806d-6823b2cc6da8-kube-api-access-95t7t\") pod \"dnsmasq-dns-56df8fb6b7-qdkb4\" (UID: \"cbc9aaae-ff97-4d62-806d-6823b2cc6da8\") " pod="openstack/dnsmasq-dns-56df8fb6b7-qdkb4" Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.724315 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f066d064-95ba-42b3-ba9f-5e859533c93c-scripts\") pod \"placement-db-sync-fnjtl\" (UID: \"f066d064-95ba-42b3-ba9f-5e859533c93c\") " pod="openstack/placement-db-sync-fnjtl" Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.724363 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"084de0c6-eeff-49e4-b5b0-8010d88eb3a9\") " pod="openstack/glance-default-external-api-0" Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.724392 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f066d064-95ba-42b3-ba9f-5e859533c93c-config-data\") pod \"placement-db-sync-fnjtl\" (UID: \"f066d064-95ba-42b3-ba9f-5e859533c93c\") " pod="openstack/placement-db-sync-fnjtl" Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.724410 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/084de0c6-eeff-49e4-b5b0-8010d88eb3a9-logs\") pod \"glance-default-external-api-0\" (UID: \"084de0c6-eeff-49e4-b5b0-8010d88eb3a9\") " pod="openstack/glance-default-external-api-0" Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.724452 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/084de0c6-eeff-49e4-b5b0-8010d88eb3a9-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"084de0c6-eeff-49e4-b5b0-8010d88eb3a9\") " pod="openstack/glance-default-external-api-0" Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.744669 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/d1f5ea7d-03ba-43ab-8863-9547b016bb0a-config\") pod \"neutron-db-sync-9cwtk\" (UID: \"d1f5ea7d-03ba-43ab-8863-9547b016bb0a\") " pod="openstack/neutron-db-sync-9cwtk" Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.748268 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/359eca54-19ad-4e8d-b580-29a37d8f38c8-combined-ca-bundle\") pod \"barbican-db-sync-hnh2k\" (UID: \"359eca54-19ad-4e8d-b580-29a37d8f38c8\") " pod="openstack/barbican-db-sync-hnh2k" Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.748584 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f066d064-95ba-42b3-ba9f-5e859533c93c-logs\") pod \"placement-db-sync-fnjtl\" (UID: \"f066d064-95ba-42b3-ba9f-5e859533c93c\") " pod="openstack/placement-db-sync-fnjtl" Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.758682 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cbc9aaae-ff97-4d62-806d-6823b2cc6da8-config\") pod \"dnsmasq-dns-56df8fb6b7-qdkb4\" (UID: 
\"cbc9aaae-ff97-4d62-806d-6823b2cc6da8\") " pod="openstack/dnsmasq-dns-56df8fb6b7-qdkb4" Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.759498 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cbc9aaae-ff97-4d62-806d-6823b2cc6da8-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-qdkb4\" (UID: \"cbc9aaae-ff97-4d62-806d-6823b2cc6da8\") " pod="openstack/dnsmasq-dns-56df8fb6b7-qdkb4" Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.762220 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cbc9aaae-ff97-4d62-806d-6823b2cc6da8-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-qdkb4\" (UID: \"cbc9aaae-ff97-4d62-806d-6823b2cc6da8\") " pod="openstack/dnsmasq-dns-56df8fb6b7-qdkb4" Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.765169 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cbc9aaae-ff97-4d62-806d-6823b2cc6da8-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-qdkb4\" (UID: \"cbc9aaae-ff97-4d62-806d-6823b2cc6da8\") " pod="openstack/dnsmasq-dns-56df8fb6b7-qdkb4" Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.769504 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-fnjtl"] Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.769770 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cbc9aaae-ff97-4d62-806d-6823b2cc6da8-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-qdkb4\" (UID: \"cbc9aaae-ff97-4d62-806d-6823b2cc6da8\") " pod="openstack/dnsmasq-dns-56df8fb6b7-qdkb4" Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.770021 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqx4h\" (UniqueName: 
\"kubernetes.io/projected/a47e8eff-6d2e-4140-9513-bd2e3ff6a153-kube-api-access-hqx4h\") pod \"horizon-5f7898bd8f-xjlfq\" (UID: \"a47e8eff-6d2e-4140-9513-bd2e3ff6a153\") " pod="openstack/horizon-5f7898bd8f-xjlfq" Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.771225 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1f5ea7d-03ba-43ab-8863-9547b016bb0a-combined-ca-bundle\") pod \"neutron-db-sync-9cwtk\" (UID: \"d1f5ea7d-03ba-43ab-8863-9547b016bb0a\") " pod="openstack/neutron-db-sync-9cwtk" Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.773909 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f066d064-95ba-42b3-ba9f-5e859533c93c-combined-ca-bundle\") pod \"placement-db-sync-fnjtl\" (UID: \"f066d064-95ba-42b3-ba9f-5e859533c93c\") " pod="openstack/placement-db-sync-fnjtl" Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.775060 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f066d064-95ba-42b3-ba9f-5e859533c93c-scripts\") pod \"placement-db-sync-fnjtl\" (UID: \"f066d064-95ba-42b3-ba9f-5e859533c93c\") " pod="openstack/placement-db-sync-fnjtl" Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.775715 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f066d064-95ba-42b3-ba9f-5e859533c93c-config-data\") pod \"placement-db-sync-fnjtl\" (UID: \"f066d064-95ba-42b3-ba9f-5e859533c93c\") " pod="openstack/placement-db-sync-fnjtl" Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.776238 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26tll\" (UniqueName: \"kubernetes.io/projected/359eca54-19ad-4e8d-b580-29a37d8f38c8-kube-api-access-26tll\") pod \"barbican-db-sync-hnh2k\" (UID: 
\"359eca54-19ad-4e8d-b580-29a37d8f38c8\") " pod="openstack/barbican-db-sync-hnh2k" Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.783848 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mvz5\" (UniqueName: \"kubernetes.io/projected/d1f5ea7d-03ba-43ab-8863-9547b016bb0a-kube-api-access-6mvz5\") pod \"neutron-db-sync-9cwtk\" (UID: \"d1f5ea7d-03ba-43ab-8863-9547b016bb0a\") " pod="openstack/neutron-db-sync-9cwtk" Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.792107 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.802407 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95t7t\" (UniqueName: \"kubernetes.io/projected/cbc9aaae-ff97-4d62-806d-6823b2cc6da8-kube-api-access-95t7t\") pod \"dnsmasq-dns-56df8fb6b7-qdkb4\" (UID: \"cbc9aaae-ff97-4d62-806d-6823b2cc6da8\") " pod="openstack/dnsmasq-dns-56df8fb6b7-qdkb4" Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.806525 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mknpj\" (UniqueName: \"kubernetes.io/projected/f066d064-95ba-42b3-ba9f-5e859533c93c-kube-api-access-mknpj\") pod \"placement-db-sync-fnjtl\" (UID: \"f066d064-95ba-42b3-ba9f-5e859533c93c\") " pod="openstack/placement-db-sync-fnjtl" Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.830180 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-hnh2k" Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.833370 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5f7898bd8f-xjlfq" Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.834870 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/084de0c6-eeff-49e4-b5b0-8010d88eb3a9-scripts\") pod \"glance-default-external-api-0\" (UID: \"084de0c6-eeff-49e4-b5b0-8010d88eb3a9\") " pod="openstack/glance-default-external-api-0" Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.834944 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/084de0c6-eeff-49e4-b5b0-8010d88eb3a9-config-data\") pod \"glance-default-external-api-0\" (UID: \"084de0c6-eeff-49e4-b5b0-8010d88eb3a9\") " pod="openstack/glance-default-external-api-0" Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.835031 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nw9cz\" (UniqueName: \"kubernetes.io/projected/084de0c6-eeff-49e4-b5b0-8010d88eb3a9-kube-api-access-nw9cz\") pod \"glance-default-external-api-0\" (UID: \"084de0c6-eeff-49e4-b5b0-8010d88eb3a9\") " pod="openstack/glance-default-external-api-0" Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.835125 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/084de0c6-eeff-49e4-b5b0-8010d88eb3a9-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"084de0c6-eeff-49e4-b5b0-8010d88eb3a9\") " pod="openstack/glance-default-external-api-0" Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.848639 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/084de0c6-eeff-49e4-b5b0-8010d88eb3a9-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"084de0c6-eeff-49e4-b5b0-8010d88eb3a9\") " 
pod="openstack/glance-default-external-api-0" Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.848849 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"084de0c6-eeff-49e4-b5b0-8010d88eb3a9\") " pod="openstack/glance-default-external-api-0" Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.848907 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/084de0c6-eeff-49e4-b5b0-8010d88eb3a9-logs\") pod \"glance-default-external-api-0\" (UID: \"084de0c6-eeff-49e4-b5b0-8010d88eb3a9\") " pod="openstack/glance-default-external-api-0" Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.848975 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/084de0c6-eeff-49e4-b5b0-8010d88eb3a9-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"084de0c6-eeff-49e4-b5b0-8010d88eb3a9\") " pod="openstack/glance-default-external-api-0" Dec 02 23:01:52 crc kubenswrapper[4696]: I1202 23:01:52.843267 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-9cwtk" Dec 02 23:01:54 crc kubenswrapper[4696]: I1202 23:01:52.876842 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/084de0c6-eeff-49e4-b5b0-8010d88eb3a9-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"084de0c6-eeff-49e4-b5b0-8010d88eb3a9\") " pod="openstack/glance-default-external-api-0" Dec 02 23:01:54 crc kubenswrapper[4696]: I1202 23:01:52.877753 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/084de0c6-eeff-49e4-b5b0-8010d88eb3a9-config-data\") pod \"glance-default-external-api-0\" (UID: \"084de0c6-eeff-49e4-b5b0-8010d88eb3a9\") " pod="openstack/glance-default-external-api-0" Dec 02 23:01:54 crc kubenswrapper[4696]: I1202 23:01:52.841721 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/084de0c6-eeff-49e4-b5b0-8010d88eb3a9-scripts\") pod \"glance-default-external-api-0\" (UID: \"084de0c6-eeff-49e4-b5b0-8010d88eb3a9\") " pod="openstack/glance-default-external-api-0" Dec 02 23:01:54 crc kubenswrapper[4696]: I1202 23:01:52.877800 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/084de0c6-eeff-49e4-b5b0-8010d88eb3a9-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"084de0c6-eeff-49e4-b5b0-8010d88eb3a9\") " pod="openstack/glance-default-external-api-0" Dec 02 23:01:54 crc kubenswrapper[4696]: I1202 23:01:52.878359 4696 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"084de0c6-eeff-49e4-b5b0-8010d88eb3a9\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-external-api-0" Dec 02 23:01:54 
crc kubenswrapper[4696]: I1202 23:01:52.879289 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/084de0c6-eeff-49e4-b5b0-8010d88eb3a9-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"084de0c6-eeff-49e4-b5b0-8010d88eb3a9\") " pod="openstack/glance-default-external-api-0" Dec 02 23:01:54 crc kubenswrapper[4696]: I1202 23:01:52.879475 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/084de0c6-eeff-49e4-b5b0-8010d88eb3a9-logs\") pod \"glance-default-external-api-0\" (UID: \"084de0c6-eeff-49e4-b5b0-8010d88eb3a9\") " pod="openstack/glance-default-external-api-0" Dec 02 23:01:54 crc kubenswrapper[4696]: I1202 23:01:52.887532 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-qdkb4" Dec 02 23:01:54 crc kubenswrapper[4696]: I1202 23:01:52.932975 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nw9cz\" (UniqueName: \"kubernetes.io/projected/084de0c6-eeff-49e4-b5b0-8010d88eb3a9-kube-api-access-nw9cz\") pod \"glance-default-external-api-0\" (UID: \"084de0c6-eeff-49e4-b5b0-8010d88eb3a9\") " pod="openstack/glance-default-external-api-0" Dec 02 23:01:54 crc kubenswrapper[4696]: I1202 23:01:52.985393 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-fnjtl" Dec 02 23:01:54 crc kubenswrapper[4696]: I1202 23:01:52.987308 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"084de0c6-eeff-49e4-b5b0-8010d88eb3a9\") " pod="openstack/glance-default-external-api-0" Dec 02 23:01:54 crc kubenswrapper[4696]: I1202 23:01:53.002695 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 23:01:54 crc kubenswrapper[4696]: I1202 23:01:53.008483 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 02 23:01:54 crc kubenswrapper[4696]: I1202 23:01:53.014423 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 02 23:01:54 crc kubenswrapper[4696]: I1202 23:01:53.014529 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 02 23:01:54 crc kubenswrapper[4696]: I1202 23:01:53.015313 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-ztpxs" Dec 02 23:01:54 crc kubenswrapper[4696]: I1202 23:01:53.024409 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 23:01:54 crc kubenswrapper[4696]: I1202 23:01:53.059826 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e066271b-ef9d-4c8d-8c06-d9f8a8afbac6-config\") pod \"e066271b-ef9d-4c8d-8c06-d9f8a8afbac6\" (UID: \"e066271b-ef9d-4c8d-8c06-d9f8a8afbac6\") " Dec 02 23:01:54 crc kubenswrapper[4696]: I1202 23:01:53.059869 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e066271b-ef9d-4c8d-8c06-d9f8a8afbac6-ovsdbserver-nb\") pod \"e066271b-ef9d-4c8d-8c06-d9f8a8afbac6\" (UID: \"e066271b-ef9d-4c8d-8c06-d9f8a8afbac6\") " Dec 02 23:01:54 crc kubenswrapper[4696]: I1202 23:01:53.059903 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e066271b-ef9d-4c8d-8c06-d9f8a8afbac6-dns-swift-storage-0\") pod \"e066271b-ef9d-4c8d-8c06-d9f8a8afbac6\" (UID: \"e066271b-ef9d-4c8d-8c06-d9f8a8afbac6\") " Dec 02 23:01:54 crc kubenswrapper[4696]: I1202 23:01:53.059987 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e066271b-ef9d-4c8d-8c06-d9f8a8afbac6-dns-svc\") pod \"e066271b-ef9d-4c8d-8c06-d9f8a8afbac6\" (UID: \"e066271b-ef9d-4c8d-8c06-d9f8a8afbac6\") " Dec 02 23:01:54 crc kubenswrapper[4696]: I1202 23:01:53.060048 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e066271b-ef9d-4c8d-8c06-d9f8a8afbac6-ovsdbserver-sb\") pod \"e066271b-ef9d-4c8d-8c06-d9f8a8afbac6\" (UID: \"e066271b-ef9d-4c8d-8c06-d9f8a8afbac6\") " Dec 02 
23:01:54 crc kubenswrapper[4696]: I1202 23:01:53.060168 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9vrb6\" (UniqueName: \"kubernetes.io/projected/e066271b-ef9d-4c8d-8c06-d9f8a8afbac6-kube-api-access-9vrb6\") pod \"e066271b-ef9d-4c8d-8c06-d9f8a8afbac6\" (UID: \"e066271b-ef9d-4c8d-8c06-d9f8a8afbac6\") " Dec 02 23:01:54 crc kubenswrapper[4696]: I1202 23:01:53.060452 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/266b9a3e-8e6e-4130-8d6e-b7ad5cb9b604-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"266b9a3e-8e6e-4130-8d6e-b7ad5cb9b604\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:01:54 crc kubenswrapper[4696]: I1202 23:01:53.060505 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/266b9a3e-8e6e-4130-8d6e-b7ad5cb9b604-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"266b9a3e-8e6e-4130-8d6e-b7ad5cb9b604\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:01:54 crc kubenswrapper[4696]: I1202 23:01:53.060526 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/266b9a3e-8e6e-4130-8d6e-b7ad5cb9b604-config-data\") pod \"glance-default-internal-api-0\" (UID: \"266b9a3e-8e6e-4130-8d6e-b7ad5cb9b604\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:01:54 crc kubenswrapper[4696]: I1202 23:01:53.060563 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/266b9a3e-8e6e-4130-8d6e-b7ad5cb9b604-logs\") pod \"glance-default-internal-api-0\" (UID: \"266b9a3e-8e6e-4130-8d6e-b7ad5cb9b604\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:01:54 crc 
kubenswrapper[4696]: I1202 23:01:53.060601 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/266b9a3e-8e6e-4130-8d6e-b7ad5cb9b604-scripts\") pod \"glance-default-internal-api-0\" (UID: \"266b9a3e-8e6e-4130-8d6e-b7ad5cb9b604\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:01:54 crc kubenswrapper[4696]: I1202 23:01:53.060631 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/266b9a3e-8e6e-4130-8d6e-b7ad5cb9b604-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"266b9a3e-8e6e-4130-8d6e-b7ad5cb9b604\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:01:54 crc kubenswrapper[4696]: I1202 23:01:53.060650 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"266b9a3e-8e6e-4130-8d6e-b7ad5cb9b604\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:01:54 crc kubenswrapper[4696]: I1202 23:01:53.060712 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6sqgt\" (UniqueName: \"kubernetes.io/projected/266b9a3e-8e6e-4130-8d6e-b7ad5cb9b604-kube-api-access-6sqgt\") pod \"glance-default-internal-api-0\" (UID: \"266b9a3e-8e6e-4130-8d6e-b7ad5cb9b604\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:01:54 crc kubenswrapper[4696]: I1202 23:01:53.086343 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e066271b-ef9d-4c8d-8c06-d9f8a8afbac6-kube-api-access-9vrb6" (OuterVolumeSpecName: "kube-api-access-9vrb6") pod "e066271b-ef9d-4c8d-8c06-d9f8a8afbac6" (UID: "e066271b-ef9d-4c8d-8c06-d9f8a8afbac6"). 
InnerVolumeSpecName "kube-api-access-9vrb6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:01:54 crc kubenswrapper[4696]: I1202 23:01:53.168865 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6sqgt\" (UniqueName: \"kubernetes.io/projected/266b9a3e-8e6e-4130-8d6e-b7ad5cb9b604-kube-api-access-6sqgt\") pod \"glance-default-internal-api-0\" (UID: \"266b9a3e-8e6e-4130-8d6e-b7ad5cb9b604\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:01:54 crc kubenswrapper[4696]: I1202 23:01:53.172099 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/266b9a3e-8e6e-4130-8d6e-b7ad5cb9b604-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"266b9a3e-8e6e-4130-8d6e-b7ad5cb9b604\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:01:54 crc kubenswrapper[4696]: I1202 23:01:53.172185 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/266b9a3e-8e6e-4130-8d6e-b7ad5cb9b604-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"266b9a3e-8e6e-4130-8d6e-b7ad5cb9b604\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:01:54 crc kubenswrapper[4696]: I1202 23:01:53.172217 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/266b9a3e-8e6e-4130-8d6e-b7ad5cb9b604-config-data\") pod \"glance-default-internal-api-0\" (UID: \"266b9a3e-8e6e-4130-8d6e-b7ad5cb9b604\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:01:54 crc kubenswrapper[4696]: I1202 23:01:53.172290 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/266b9a3e-8e6e-4130-8d6e-b7ad5cb9b604-logs\") pod \"glance-default-internal-api-0\" (UID: \"266b9a3e-8e6e-4130-8d6e-b7ad5cb9b604\") " 
pod="openstack/glance-default-internal-api-0" Dec 02 23:01:54 crc kubenswrapper[4696]: I1202 23:01:53.172354 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/266b9a3e-8e6e-4130-8d6e-b7ad5cb9b604-scripts\") pod \"glance-default-internal-api-0\" (UID: \"266b9a3e-8e6e-4130-8d6e-b7ad5cb9b604\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:01:54 crc kubenswrapper[4696]: I1202 23:01:53.172421 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"266b9a3e-8e6e-4130-8d6e-b7ad5cb9b604\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:01:54 crc kubenswrapper[4696]: I1202 23:01:53.172443 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/266b9a3e-8e6e-4130-8d6e-b7ad5cb9b604-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"266b9a3e-8e6e-4130-8d6e-b7ad5cb9b604\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:01:54 crc kubenswrapper[4696]: I1202 23:01:53.172529 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9vrb6\" (UniqueName: \"kubernetes.io/projected/e066271b-ef9d-4c8d-8c06-d9f8a8afbac6-kube-api-access-9vrb6\") on node \"crc\" DevicePath \"\"" Dec 02 23:01:54 crc kubenswrapper[4696]: I1202 23:01:53.173755 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/266b9a3e-8e6e-4130-8d6e-b7ad5cb9b604-logs\") pod \"glance-default-internal-api-0\" (UID: \"266b9a3e-8e6e-4130-8d6e-b7ad5cb9b604\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:01:54 crc kubenswrapper[4696]: I1202 23:01:53.174092 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/266b9a3e-8e6e-4130-8d6e-b7ad5cb9b604-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"266b9a3e-8e6e-4130-8d6e-b7ad5cb9b604\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:01:54 crc kubenswrapper[4696]: I1202 23:01:53.174795 4696 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"266b9a3e-8e6e-4130-8d6e-b7ad5cb9b604\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/glance-default-internal-api-0" Dec 02 23:01:54 crc kubenswrapper[4696]: I1202 23:01:53.182223 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e066271b-ef9d-4c8d-8c06-d9f8a8afbac6-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e066271b-ef9d-4c8d-8c06-d9f8a8afbac6" (UID: "e066271b-ef9d-4c8d-8c06-d9f8a8afbac6"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:01:54 crc kubenswrapper[4696]: I1202 23:01:53.202511 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/266b9a3e-8e6e-4130-8d6e-b7ad5cb9b604-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"266b9a3e-8e6e-4130-8d6e-b7ad5cb9b604\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:01:54 crc kubenswrapper[4696]: I1202 23:01:53.204572 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e066271b-ef9d-4c8d-8c06-d9f8a8afbac6-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e066271b-ef9d-4c8d-8c06-d9f8a8afbac6" (UID: "e066271b-ef9d-4c8d-8c06-d9f8a8afbac6"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:01:54 crc kubenswrapper[4696]: I1202 23:01:53.211339 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e066271b-ef9d-4c8d-8c06-d9f8a8afbac6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e066271b-ef9d-4c8d-8c06-d9f8a8afbac6" (UID: "e066271b-ef9d-4c8d-8c06-d9f8a8afbac6"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:01:54 crc kubenswrapper[4696]: I1202 23:01:53.211773 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/266b9a3e-8e6e-4130-8d6e-b7ad5cb9b604-config-data\") pod \"glance-default-internal-api-0\" (UID: \"266b9a3e-8e6e-4130-8d6e-b7ad5cb9b604\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:01:54 crc kubenswrapper[4696]: I1202 23:01:53.214807 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 02 23:01:54 crc kubenswrapper[4696]: I1202 23:01:53.215262 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/266b9a3e-8e6e-4130-8d6e-b7ad5cb9b604-scripts\") pod \"glance-default-internal-api-0\" (UID: \"266b9a3e-8e6e-4130-8d6e-b7ad5cb9b604\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:01:54 crc kubenswrapper[4696]: I1202 23:01:53.215589 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/266b9a3e-8e6e-4130-8d6e-b7ad5cb9b604-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"266b9a3e-8e6e-4130-8d6e-b7ad5cb9b604\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:01:54 crc kubenswrapper[4696]: I1202 23:01:53.235854 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6sqgt\" (UniqueName: 
\"kubernetes.io/projected/266b9a3e-8e6e-4130-8d6e-b7ad5cb9b604-kube-api-access-6sqgt\") pod \"glance-default-internal-api-0\" (UID: \"266b9a3e-8e6e-4130-8d6e-b7ad5cb9b604\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:01:54 crc kubenswrapper[4696]: I1202 23:01:53.237047 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e066271b-ef9d-4c8d-8c06-d9f8a8afbac6-config" (OuterVolumeSpecName: "config") pod "e066271b-ef9d-4c8d-8c06-d9f8a8afbac6" (UID: "e066271b-ef9d-4c8d-8c06-d9f8a8afbac6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:01:54 crc kubenswrapper[4696]: I1202 23:01:53.237492 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"266b9a3e-8e6e-4130-8d6e-b7ad5cb9b604\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:01:54 crc kubenswrapper[4696]: I1202 23:01:53.252476 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-ztpxs" event={"ID":"e066271b-ef9d-4c8d-8c06-d9f8a8afbac6","Type":"ContainerDied","Data":"8a007fac2ab82ce825a348489de9241178ab577bb6cede98111c9c9c973e470a"} Dec 02 23:01:54 crc kubenswrapper[4696]: I1202 23:01:53.252536 4696 scope.go:117] "RemoveContainer" containerID="f452f94505068f4f8a9f7dbc41f090285aadfd4184999dbbcea356149c2ed410" Dec 02 23:01:54 crc kubenswrapper[4696]: I1202 23:01:53.252684 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-ztpxs" Dec 02 23:01:54 crc kubenswrapper[4696]: I1202 23:01:53.298389 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e066271b-ef9d-4c8d-8c06-d9f8a8afbac6-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "e066271b-ef9d-4c8d-8c06-d9f8a8afbac6" (UID: "e066271b-ef9d-4c8d-8c06-d9f8a8afbac6"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:01:54 crc kubenswrapper[4696]: I1202 23:01:53.298548 4696 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e066271b-ef9d-4c8d-8c06-d9f8a8afbac6-config\") on node \"crc\" DevicePath \"\"" Dec 02 23:01:54 crc kubenswrapper[4696]: I1202 23:01:53.298580 4696 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e066271b-ef9d-4c8d-8c06-d9f8a8afbac6-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 02 23:01:54 crc kubenswrapper[4696]: I1202 23:01:53.298592 4696 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e066271b-ef9d-4c8d-8c06-d9f8a8afbac6-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 02 23:01:54 crc kubenswrapper[4696]: I1202 23:01:53.298602 4696 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e066271b-ef9d-4c8d-8c06-d9f8a8afbac6-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 23:01:54 crc kubenswrapper[4696]: I1202 23:01:53.298612 4696 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e066271b-ef9d-4c8d-8c06-d9f8a8afbac6-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 02 23:01:54 crc kubenswrapper[4696]: I1202 23:01:53.348530 4696 scope.go:117] "RemoveContainer" 
containerID="2a1870453968f8b336f76db5eaf0a922811514c5f8000078118bec78caedfcfb" Dec 02 23:01:54 crc kubenswrapper[4696]: I1202 23:01:53.414974 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 02 23:01:54 crc kubenswrapper[4696]: I1202 23:01:53.660968 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-982dr"] Dec 02 23:01:54 crc kubenswrapper[4696]: I1202 23:01:53.669097 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-ztpxs"] Dec 02 23:01:54 crc kubenswrapper[4696]: I1202 23:01:53.685652 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-ztpxs"] Dec 02 23:01:54 crc kubenswrapper[4696]: I1202 23:01:53.695428 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Dec 02 23:01:54 crc kubenswrapper[4696]: W1202 23:01:53.697912 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode07c8b14_2cf9_4a6c_8fb4_d9f362d85f49.slice/crio-5503afd7a43a7137567621c4682cb5bad013692827c71c08c963230f187b68e2 WatchSource:0}: Error finding container 5503afd7a43a7137567621c4682cb5bad013692827c71c08c963230f187b68e2: Status 404 returned error can't find the container with id 5503afd7a43a7137567621c4682cb5bad013692827c71c08c963230f187b68e2 Dec 02 23:01:54 crc kubenswrapper[4696]: W1202 23:01:53.698132 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7f93d253_e832_436c_8839_1dbb94d58f86.slice/crio-21d78aa881e4da0ff34fd5e9a7b95504b513eb5200bf6e3e30689a438602baf7 WatchSource:0}: Error finding container 21d78aa881e4da0ff34fd5e9a7b95504b513eb5200bf6e3e30689a438602baf7: Status 404 returned error can't find the container with id 21d78aa881e4da0ff34fd5e9a7b95504b513eb5200bf6e3e30689a438602baf7 
Dec 02 23:01:54 crc kubenswrapper[4696]: I1202 23:01:53.704440 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-l6c4c"] Dec 02 23:01:54 crc kubenswrapper[4696]: I1202 23:01:53.713271 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Dec 02 23:01:54 crc kubenswrapper[4696]: I1202 23:01:54.271187 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"e07c8b14-2cf9-4a6c-8fb4-d9f362d85f49","Type":"ContainerStarted","Data":"5503afd7a43a7137567621c4682cb5bad013692827c71c08c963230f187b68e2"} Dec 02 23:01:54 crc kubenswrapper[4696]: I1202 23:01:54.274921 4696 generic.go:334] "Generic (PLEG): container finished" podID="4c640324-4b47-498d-ae9c-8b66a5de9618" containerID="d597224c6c6f4dcf40e9831f9f65e7e5de88deea6dd39deee998be4ba86558f8" exitCode=0 Dec 02 23:01:54 crc kubenswrapper[4696]: I1202 23:01:54.275052 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bbf5cc879-982dr" event={"ID":"4c640324-4b47-498d-ae9c-8b66a5de9618","Type":"ContainerDied","Data":"d597224c6c6f4dcf40e9831f9f65e7e5de88deea6dd39deee998be4ba86558f8"} Dec 02 23:01:54 crc kubenswrapper[4696]: I1202 23:01:54.275103 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bbf5cc879-982dr" event={"ID":"4c640324-4b47-498d-ae9c-8b66a5de9618","Type":"ContainerStarted","Data":"4efbb34bb4fc1c901eefe33dbd71317c3d16beae6c435940c2cd4dbf56e69f84"} Dec 02 23:01:54 crc kubenswrapper[4696]: I1202 23:01:54.280130 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-l6c4c" event={"ID":"7f93d253-e832-436c-8839-1dbb94d58f86","Type":"ContainerStarted","Data":"abaf4949223eb7ae80e1cea1e57969b340c4095d0c83f03d0effb1be18e79890"} Dec 02 23:01:54 crc kubenswrapper[4696]: I1202 23:01:54.280189 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-l6c4c" 
event={"ID":"7f93d253-e832-436c-8839-1dbb94d58f86","Type":"ContainerStarted","Data":"21d78aa881e4da0ff34fd5e9a7b95504b513eb5200bf6e3e30689a438602baf7"} Dec 02 23:01:54 crc kubenswrapper[4696]: I1202 23:01:54.286792 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"dc73e02f-1d29-4919-ac92-dec31ae5a5da","Type":"ContainerStarted","Data":"ec475b8931aca6fad9d336a1a17848f524e7b307de88e88447bb912d47dae6a6"} Dec 02 23:01:54 crc kubenswrapper[4696]: I1202 23:01:54.286857 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"dc73e02f-1d29-4919-ac92-dec31ae5a5da","Type":"ContainerStarted","Data":"cb99dc7d4eee6487e752956c40ce80efcccce3b44ae74fe98c67ceb24d4314cb"} Dec 02 23:01:54 crc kubenswrapper[4696]: I1202 23:01:54.376297 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-l6c4c" podStartSLOduration=3.376268308 podStartE2EDuration="3.376268308s" podCreationTimestamp="2025-12-02 23:01:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:01:54.3245337 +0000 UTC m=+1177.205213711" watchObservedRunningTime="2025-12-02 23:01:54.376268308 +0000 UTC m=+1177.256948309" Dec 02 23:01:54 crc kubenswrapper[4696]: I1202 23:01:54.436397 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 23:01:54 crc kubenswrapper[4696]: I1202 23:01:54.462570 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-b5c6d7897-zxfvr"] Dec 02 23:01:54 crc kubenswrapper[4696]: I1202 23:01:54.688899 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"] Dec 02 23:01:54 crc kubenswrapper[4696]: I1202 23:01:54.719597 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-b2zfs"] Dec 02 23:01:54 crc kubenswrapper[4696]: I1202 23:01:54.727807 4696 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-hnh2k"] Dec 02 23:01:54 crc kubenswrapper[4696]: I1202 23:01:54.737661 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5f7898bd8f-xjlfq"] Dec 02 23:01:54 crc kubenswrapper[4696]: I1202 23:01:54.746943 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-9cwtk"] Dec 02 23:01:54 crc kubenswrapper[4696]: I1202 23:01:54.959177 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-fnjtl"] Dec 02 23:01:54 crc kubenswrapper[4696]: I1202 23:01:54.968438 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-qdkb4"] Dec 02 23:01:55 crc kubenswrapper[4696]: I1202 23:01:55.203506 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"] Dec 02 23:01:55 crc kubenswrapper[4696]: I1202 23:01:55.259697 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 23:01:55 crc kubenswrapper[4696]: I1202 23:01:55.282610 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 23:01:55 crc kubenswrapper[4696]: I1202 23:01:55.330212 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"dc73e02f-1d29-4919-ac92-dec31ae5a5da","Type":"ContainerStarted","Data":"e0b4da2a357e7295795cdc78badc4aa2f484e96f7840899de5c9939518039049"} Dec 02 23:01:55 crc kubenswrapper[4696]: I1202 23:01:55.330953 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Dec 02 23:01:55 crc kubenswrapper[4696]: I1202 23:01:55.330987 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5f7898bd8f-xjlfq"] Dec 02 23:01:55 crc kubenswrapper[4696]: I1202 23:01:55.352977 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 23:01:55 crc 
kubenswrapper[4696]: I1202 23:01:55.410827 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-api-0" podStartSLOduration=4.410807882 podStartE2EDuration="4.410807882s" podCreationTimestamp="2025-12-02 23:01:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:01:55.402618319 +0000 UTC m=+1178.283298320" watchObservedRunningTime="2025-12-02 23:01:55.410807882 +0000 UTC m=+1178.291487883" Dec 02 23:01:55 crc kubenswrapper[4696]: I1202 23:01:55.461086 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e066271b-ef9d-4c8d-8c06-d9f8a8afbac6" path="/var/lib/kubelet/pods/e066271b-ef9d-4c8d-8c06-d9f8a8afbac6/volumes" Dec 02 23:01:55 crc kubenswrapper[4696]: I1202 23:01:55.461751 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 23:01:55 crc kubenswrapper[4696]: I1202 23:01:55.475176 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5bcb6c6d6f-rv22m"] Dec 02 23:01:55 crc kubenswrapper[4696]: E1202 23:01:55.476579 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e066271b-ef9d-4c8d-8c06-d9f8a8afbac6" containerName="dnsmasq-dns" Dec 02 23:01:55 crc kubenswrapper[4696]: I1202 23:01:55.476607 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="e066271b-ef9d-4c8d-8c06-d9f8a8afbac6" containerName="dnsmasq-dns" Dec 02 23:01:55 crc kubenswrapper[4696]: E1202 23:01:55.476648 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e066271b-ef9d-4c8d-8c06-d9f8a8afbac6" containerName="init" Dec 02 23:01:55 crc kubenswrapper[4696]: I1202 23:01:55.476655 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="e066271b-ef9d-4c8d-8c06-d9f8a8afbac6" containerName="init" Dec 02 23:01:55 crc kubenswrapper[4696]: I1202 23:01:55.476868 4696 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="e066271b-ef9d-4c8d-8c06-d9f8a8afbac6" containerName="dnsmasq-dns" Dec 02 23:01:55 crc kubenswrapper[4696]: I1202 23:01:55.478031 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5bcb6c6d6f-rv22m" Dec 02 23:01:55 crc kubenswrapper[4696]: I1202 23:01:55.517812 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5bcb6c6d6f-rv22m"] Dec 02 23:01:55 crc kubenswrapper[4696]: I1202 23:01:55.573118 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/043d1721-fd18-48b0-86b3-2831e34f775e-logs\") pod \"horizon-5bcb6c6d6f-rv22m\" (UID: \"043d1721-fd18-48b0-86b3-2831e34f775e\") " pod="openstack/horizon-5bcb6c6d6f-rv22m" Dec 02 23:01:55 crc kubenswrapper[4696]: I1202 23:01:55.573200 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/043d1721-fd18-48b0-86b3-2831e34f775e-scripts\") pod \"horizon-5bcb6c6d6f-rv22m\" (UID: \"043d1721-fd18-48b0-86b3-2831e34f775e\") " pod="openstack/horizon-5bcb6c6d6f-rv22m" Dec 02 23:01:55 crc kubenswrapper[4696]: I1202 23:01:55.573244 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/043d1721-fd18-48b0-86b3-2831e34f775e-horizon-secret-key\") pod \"horizon-5bcb6c6d6f-rv22m\" (UID: \"043d1721-fd18-48b0-86b3-2831e34f775e\") " pod="openstack/horizon-5bcb6c6d6f-rv22m" Dec 02 23:01:55 crc kubenswrapper[4696]: I1202 23:01:55.573327 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/043d1721-fd18-48b0-86b3-2831e34f775e-config-data\") pod \"horizon-5bcb6c6d6f-rv22m\" (UID: \"043d1721-fd18-48b0-86b3-2831e34f775e\") " pod="openstack/horizon-5bcb6c6d6f-rv22m" Dec 02 23:01:55 crc 
kubenswrapper[4696]: I1202 23:01:55.573409 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hn8px\" (UniqueName: \"kubernetes.io/projected/043d1721-fd18-48b0-86b3-2831e34f775e-kube-api-access-hn8px\") pod \"horizon-5bcb6c6d6f-rv22m\" (UID: \"043d1721-fd18-48b0-86b3-2831e34f775e\") " pod="openstack/horizon-5bcb6c6d6f-rv22m" Dec 02 23:01:55 crc kubenswrapper[4696]: I1202 23:01:55.682059 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/043d1721-fd18-48b0-86b3-2831e34f775e-horizon-secret-key\") pod \"horizon-5bcb6c6d6f-rv22m\" (UID: \"043d1721-fd18-48b0-86b3-2831e34f775e\") " pod="openstack/horizon-5bcb6c6d6f-rv22m" Dec 02 23:01:55 crc kubenswrapper[4696]: I1202 23:01:55.682168 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/043d1721-fd18-48b0-86b3-2831e34f775e-config-data\") pod \"horizon-5bcb6c6d6f-rv22m\" (UID: \"043d1721-fd18-48b0-86b3-2831e34f775e\") " pod="openstack/horizon-5bcb6c6d6f-rv22m" Dec 02 23:01:55 crc kubenswrapper[4696]: I1202 23:01:55.682245 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hn8px\" (UniqueName: \"kubernetes.io/projected/043d1721-fd18-48b0-86b3-2831e34f775e-kube-api-access-hn8px\") pod \"horizon-5bcb6c6d6f-rv22m\" (UID: \"043d1721-fd18-48b0-86b3-2831e34f775e\") " pod="openstack/horizon-5bcb6c6d6f-rv22m" Dec 02 23:01:55 crc kubenswrapper[4696]: I1202 23:01:55.682266 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/043d1721-fd18-48b0-86b3-2831e34f775e-logs\") pod \"horizon-5bcb6c6d6f-rv22m\" (UID: \"043d1721-fd18-48b0-86b3-2831e34f775e\") " pod="openstack/horizon-5bcb6c6d6f-rv22m" Dec 02 23:01:55 crc kubenswrapper[4696]: I1202 23:01:55.682294 4696 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/043d1721-fd18-48b0-86b3-2831e34f775e-scripts\") pod \"horizon-5bcb6c6d6f-rv22m\" (UID: \"043d1721-fd18-48b0-86b3-2831e34f775e\") " pod="openstack/horizon-5bcb6c6d6f-rv22m" Dec 02 23:01:55 crc kubenswrapper[4696]: I1202 23:01:55.683006 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/043d1721-fd18-48b0-86b3-2831e34f775e-scripts\") pod \"horizon-5bcb6c6d6f-rv22m\" (UID: \"043d1721-fd18-48b0-86b3-2831e34f775e\") " pod="openstack/horizon-5bcb6c6d6f-rv22m" Dec 02 23:01:55 crc kubenswrapper[4696]: I1202 23:01:55.683932 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/043d1721-fd18-48b0-86b3-2831e34f775e-config-data\") pod \"horizon-5bcb6c6d6f-rv22m\" (UID: \"043d1721-fd18-48b0-86b3-2831e34f775e\") " pod="openstack/horizon-5bcb6c6d6f-rv22m" Dec 02 23:01:55 crc kubenswrapper[4696]: I1202 23:01:55.684166 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/043d1721-fd18-48b0-86b3-2831e34f775e-logs\") pod \"horizon-5bcb6c6d6f-rv22m\" (UID: \"043d1721-fd18-48b0-86b3-2831e34f775e\") " pod="openstack/horizon-5bcb6c6d6f-rv22m" Dec 02 23:01:55 crc kubenswrapper[4696]: I1202 23:01:55.698872 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/043d1721-fd18-48b0-86b3-2831e34f775e-horizon-secret-key\") pod \"horizon-5bcb6c6d6f-rv22m\" (UID: \"043d1721-fd18-48b0-86b3-2831e34f775e\") " pod="openstack/horizon-5bcb6c6d6f-rv22m" Dec 02 23:01:55 crc kubenswrapper[4696]: I1202 23:01:55.757415 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hn8px\" (UniqueName: 
\"kubernetes.io/projected/043d1721-fd18-48b0-86b3-2831e34f775e-kube-api-access-hn8px\") pod \"horizon-5bcb6c6d6f-rv22m\" (UID: \"043d1721-fd18-48b0-86b3-2831e34f775e\") " pod="openstack/horizon-5bcb6c6d6f-rv22m" Dec 02 23:01:55 crc kubenswrapper[4696]: W1202 23:01:55.765316 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda47e8eff_6d2e_4140_9513_bd2e3ff6a153.slice/crio-591829cd75a7e4d7991ce6d80efc2fab65c99c6b0a9be641dedef0177dac6893 WatchSource:0}: Error finding container 591829cd75a7e4d7991ce6d80efc2fab65c99c6b0a9be641dedef0177dac6893: Status 404 returned error can't find the container with id 591829cd75a7e4d7991ce6d80efc2fab65c99c6b0a9be641dedef0177dac6893 Dec 02 23:01:55 crc kubenswrapper[4696]: I1202 23:01:55.835721 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5bcb6c6d6f-rv22m" Dec 02 23:01:56 crc kubenswrapper[4696]: I1202 23:01:56.007065 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-982dr" Dec 02 23:01:56 crc kubenswrapper[4696]: I1202 23:01:56.097821 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6mxhn\" (UniqueName: \"kubernetes.io/projected/4c640324-4b47-498d-ae9c-8b66a5de9618-kube-api-access-6mxhn\") pod \"4c640324-4b47-498d-ae9c-8b66a5de9618\" (UID: \"4c640324-4b47-498d-ae9c-8b66a5de9618\") " Dec 02 23:01:56 crc kubenswrapper[4696]: I1202 23:01:56.098087 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4c640324-4b47-498d-ae9c-8b66a5de9618-ovsdbserver-sb\") pod \"4c640324-4b47-498d-ae9c-8b66a5de9618\" (UID: \"4c640324-4b47-498d-ae9c-8b66a5de9618\") " Dec 02 23:01:56 crc kubenswrapper[4696]: I1202 23:01:56.098118 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4c640324-4b47-498d-ae9c-8b66a5de9618-dns-swift-storage-0\") pod \"4c640324-4b47-498d-ae9c-8b66a5de9618\" (UID: \"4c640324-4b47-498d-ae9c-8b66a5de9618\") " Dec 02 23:01:56 crc kubenswrapper[4696]: I1202 23:01:56.098207 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c640324-4b47-498d-ae9c-8b66a5de9618-config\") pod \"4c640324-4b47-498d-ae9c-8b66a5de9618\" (UID: \"4c640324-4b47-498d-ae9c-8b66a5de9618\") " Dec 02 23:01:56 crc kubenswrapper[4696]: I1202 23:01:56.098261 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4c640324-4b47-498d-ae9c-8b66a5de9618-ovsdbserver-nb\") pod \"4c640324-4b47-498d-ae9c-8b66a5de9618\" (UID: \"4c640324-4b47-498d-ae9c-8b66a5de9618\") " Dec 02 23:01:56 crc kubenswrapper[4696]: I1202 23:01:56.098334 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4c640324-4b47-498d-ae9c-8b66a5de9618-dns-svc\") pod \"4c640324-4b47-498d-ae9c-8b66a5de9618\" (UID: \"4c640324-4b47-498d-ae9c-8b66a5de9618\") " Dec 02 23:01:56 crc kubenswrapper[4696]: I1202 23:01:56.123446 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c640324-4b47-498d-ae9c-8b66a5de9618-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4c640324-4b47-498d-ae9c-8b66a5de9618" (UID: "4c640324-4b47-498d-ae9c-8b66a5de9618"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:01:56 crc kubenswrapper[4696]: I1202 23:01:56.168460 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c640324-4b47-498d-ae9c-8b66a5de9618-config" (OuterVolumeSpecName: "config") pod "4c640324-4b47-498d-ae9c-8b66a5de9618" (UID: "4c640324-4b47-498d-ae9c-8b66a5de9618"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:01:56 crc kubenswrapper[4696]: I1202 23:01:56.170274 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c640324-4b47-498d-ae9c-8b66a5de9618-kube-api-access-6mxhn" (OuterVolumeSpecName: "kube-api-access-6mxhn") pod "4c640324-4b47-498d-ae9c-8b66a5de9618" (UID: "4c640324-4b47-498d-ae9c-8b66a5de9618"). InnerVolumeSpecName "kube-api-access-6mxhn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:01:56 crc kubenswrapper[4696]: I1202 23:01:56.203043 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6mxhn\" (UniqueName: \"kubernetes.io/projected/4c640324-4b47-498d-ae9c-8b66a5de9618-kube-api-access-6mxhn\") on node \"crc\" DevicePath \"\"" Dec 02 23:01:56 crc kubenswrapper[4696]: I1202 23:01:56.203089 4696 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4c640324-4b47-498d-ae9c-8b66a5de9618-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 02 23:01:56 crc kubenswrapper[4696]: I1202 23:01:56.203098 4696 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c640324-4b47-498d-ae9c-8b66a5de9618-config\") on node \"crc\" DevicePath \"\"" Dec 02 23:01:56 crc kubenswrapper[4696]: I1202 23:01:56.204924 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c640324-4b47-498d-ae9c-8b66a5de9618-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4c640324-4b47-498d-ae9c-8b66a5de9618" (UID: "4c640324-4b47-498d-ae9c-8b66a5de9618"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:01:56 crc kubenswrapper[4696]: I1202 23:01:56.232391 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c640324-4b47-498d-ae9c-8b66a5de9618-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "4c640324-4b47-498d-ae9c-8b66a5de9618" (UID: "4c640324-4b47-498d-ae9c-8b66a5de9618"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:01:56 crc kubenswrapper[4696]: I1202 23:01:56.234819 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 23:01:56 crc kubenswrapper[4696]: I1202 23:01:56.250650 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c640324-4b47-498d-ae9c-8b66a5de9618-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4c640324-4b47-498d-ae9c-8b66a5de9618" (UID: "4c640324-4b47-498d-ae9c-8b66a5de9618"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:01:56 crc kubenswrapper[4696]: I1202 23:01:56.308073 4696 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4c640324-4b47-498d-ae9c-8b66a5de9618-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 02 23:01:56 crc kubenswrapper[4696]: I1202 23:01:56.308318 4696 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4c640324-4b47-498d-ae9c-8b66a5de9618-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 02 23:01:56 crc kubenswrapper[4696]: I1202 23:01:56.308375 4696 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4c640324-4b47-498d-ae9c-8b66a5de9618-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 23:01:56 crc kubenswrapper[4696]: I1202 23:01:56.361947 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"73fa8b91-c646-46ae-8d9b-1ec7b93ae933","Type":"ContainerStarted","Data":"b31340bb1090d6f70eae8945f80494bd91c93a041d9f538f3e6e19406633fbfa"} Dec 02 23:01:56 crc kubenswrapper[4696]: I1202 23:01:56.370549 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-9cwtk" 
event={"ID":"d1f5ea7d-03ba-43ab-8863-9547b016bb0a","Type":"ContainerStarted","Data":"35ee60b6b7516dbeb7b9afc433010c0563b80e80aa7e5ad5dfe21a670e0e4290"} Dec 02 23:01:56 crc kubenswrapper[4696]: I1202 23:01:56.370588 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-9cwtk" event={"ID":"d1f5ea7d-03ba-43ab-8863-9547b016bb0a","Type":"ContainerStarted","Data":"e2ff65e5d4d34dde4de40e7d6677d2e6113a135e4da6ec6243cf8d5fb522781a"} Dec 02 23:01:56 crc kubenswrapper[4696]: I1202 23:01:56.373167 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-qdkb4" event={"ID":"cbc9aaae-ff97-4d62-806d-6823b2cc6da8","Type":"ContainerStarted","Data":"3061dc201acc3c3c79b62772aa0b829fe5d3344f64967b419e3592ef1290561c"} Dec 02 23:01:56 crc kubenswrapper[4696]: I1202 23:01:56.375640 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-b2zfs" event={"ID":"b1780f21-2d95-4fdb-9a7c-c2d5b7a9b143","Type":"ContainerStarted","Data":"4ca326dcd7da11fa6df645954d4aa9f5a75e9be9c5c684269c406c4e8cefdddd"} Dec 02 23:01:56 crc kubenswrapper[4696]: I1202 23:01:56.377817 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad","Type":"ContainerStarted","Data":"fac2ed5656a4a994857c132d054c6384082ff81fc904f65653d08b1a8f1bc54c"} Dec 02 23:01:56 crc kubenswrapper[4696]: I1202 23:01:56.379643 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-hnh2k" event={"ID":"359eca54-19ad-4e8d-b580-29a37d8f38c8","Type":"ContainerStarted","Data":"5d5bdda3021bc7f6c80fb39da47a76fc40a3b74b05462efbb48eae842d2f2071"} Dec 02 23:01:56 crc kubenswrapper[4696]: I1202 23:01:56.381667 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bbf5cc879-982dr" 
event={"ID":"4c640324-4b47-498d-ae9c-8b66a5de9618","Type":"ContainerDied","Data":"4efbb34bb4fc1c901eefe33dbd71317c3d16beae6c435940c2cd4dbf56e69f84"} Dec 02 23:01:56 crc kubenswrapper[4696]: I1202 23:01:56.381699 4696 scope.go:117] "RemoveContainer" containerID="d597224c6c6f4dcf40e9831f9f65e7e5de88deea6dd39deee998be4ba86558f8" Dec 02 23:01:56 crc kubenswrapper[4696]: I1202 23:01:56.381842 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-982dr" Dec 02 23:01:56 crc kubenswrapper[4696]: I1202 23:01:56.390123 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-b5c6d7897-zxfvr" event={"ID":"19ee26f0-388e-41d0-9d56-7ce761896e0b","Type":"ContainerStarted","Data":"95f2f57f6c6fdcfb59efb29f0a75c3eeeb7c2db64f5d0fab289bc80e22227267"} Dec 02 23:01:56 crc kubenswrapper[4696]: I1202 23:01:56.403183 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-9cwtk" podStartSLOduration=5.403156238 podStartE2EDuration="5.403156238s" podCreationTimestamp="2025-12-02 23:01:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:01:56.396859569 +0000 UTC m=+1179.277539570" watchObservedRunningTime="2025-12-02 23:01:56.403156238 +0000 UTC m=+1179.283836239" Dec 02 23:01:56 crc kubenswrapper[4696]: I1202 23:01:56.408126 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"084de0c6-eeff-49e4-b5b0-8010d88eb3a9","Type":"ContainerStarted","Data":"d917e672283333f1239069c9349c4601619848cacf25bf9499bd9fe29da61dd7"} Dec 02 23:01:56 crc kubenswrapper[4696]: I1202 23:01:56.409888 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"266b9a3e-8e6e-4130-8d6e-b7ad5cb9b604","Type":"ContainerStarted","Data":"87cf43a03ebf6e2b35f6efb38cbdefccfddc47186f42b6cf1042f05a2e0d0a95"} Dec 02 23:01:56 crc kubenswrapper[4696]: I1202 23:01:56.437644 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5f7898bd8f-xjlfq" event={"ID":"a47e8eff-6d2e-4140-9513-bd2e3ff6a153","Type":"ContainerStarted","Data":"591829cd75a7e4d7991ce6d80efc2fab65c99c6b0a9be641dedef0177dac6893"} Dec 02 23:01:56 crc kubenswrapper[4696]: I1202 23:01:56.446635 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="dc73e02f-1d29-4919-ac92-dec31ae5a5da" containerName="watcher-api-log" containerID="cri-o://ec475b8931aca6fad9d336a1a17848f524e7b307de88e88447bb912d47dae6a6" gracePeriod=30 Dec 02 23:01:56 crc kubenswrapper[4696]: I1202 23:01:56.447286 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-fnjtl" event={"ID":"f066d064-95ba-42b3-ba9f-5e859533c93c","Type":"ContainerStarted","Data":"f1133c98fbeb59fe4866f96c42a5e18ee0687c6efd4b0a00350eaebdacbecca1"} Dec 02 23:01:56 crc kubenswrapper[4696]: I1202 23:01:56.447357 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="dc73e02f-1d29-4919-ac92-dec31ae5a5da" containerName="watcher-api" containerID="cri-o://e0b4da2a357e7295795cdc78badc4aa2f484e96f7840899de5c9939518039049" gracePeriod=30 Dec 02 23:01:56 crc kubenswrapper[4696]: I1202 23:01:56.465588 4696 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="dc73e02f-1d29-4919-ac92-dec31ae5a5da" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.150:9322/\": EOF" Dec 02 23:01:56 crc kubenswrapper[4696]: I1202 23:01:56.487295 4696 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="dc73e02f-1d29-4919-ac92-dec31ae5a5da" containerName="watcher-api" 
probeResult="failure" output="Get \"http://10.217.0.150:9322/\": EOF" Dec 02 23:01:56 crc kubenswrapper[4696]: I1202 23:01:56.488544 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-982dr"] Dec 02 23:01:56 crc kubenswrapper[4696]: I1202 23:01:56.496176 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-982dr"] Dec 02 23:01:56 crc kubenswrapper[4696]: I1202 23:01:56.602292 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5bcb6c6d6f-rv22m"] Dec 02 23:01:57 crc kubenswrapper[4696]: I1202 23:01:57.054769 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Dec 02 23:01:57 crc kubenswrapper[4696]: I1202 23:01:57.470341 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c640324-4b47-498d-ae9c-8b66a5de9618" path="/var/lib/kubelet/pods/4c640324-4b47-498d-ae9c-8b66a5de9618/volumes" Dec 02 23:01:57 crc kubenswrapper[4696]: I1202 23:01:57.522998 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"e07c8b14-2cf9-4a6c-8fb4-d9f362d85f49","Type":"ContainerStarted","Data":"47055e3f201494569f436f62a7822eb485cfef0a1a3f5efa24b582a2517fbc54"} Dec 02 23:01:57 crc kubenswrapper[4696]: I1202 23:01:57.554788 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5bcb6c6d6f-rv22m" event={"ID":"043d1721-fd18-48b0-86b3-2831e34f775e","Type":"ContainerStarted","Data":"1356cc2753332e45130207005c1c67b7fc62f419c6a34cdbdcd3aa24fba04c04"} Dec 02 23:01:57 crc kubenswrapper[4696]: I1202 23:01:57.569871 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"084de0c6-eeff-49e4-b5b0-8010d88eb3a9","Type":"ContainerStarted","Data":"dea32683740f2249c5e43d39c13cdd71423e053db027d5638a2cbd4f9875d30d"} Dec 02 23:01:57 crc kubenswrapper[4696]: I1202 23:01:57.577973 4696 generic.go:334] "Generic 
(PLEG): container finished" podID="cbc9aaae-ff97-4d62-806d-6823b2cc6da8" containerID="0a2b179520dbce686f4792963bffd2d611593f11a805aa18c553c02cd483dfd0" exitCode=0 Dec 02 23:01:57 crc kubenswrapper[4696]: I1202 23:01:57.578046 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-qdkb4" event={"ID":"cbc9aaae-ff97-4d62-806d-6823b2cc6da8","Type":"ContainerDied","Data":"0a2b179520dbce686f4792963bffd2d611593f11a805aa18c553c02cd483dfd0"} Dec 02 23:01:57 crc kubenswrapper[4696]: I1202 23:01:57.608692 4696 generic.go:334] "Generic (PLEG): container finished" podID="dc73e02f-1d29-4919-ac92-dec31ae5a5da" containerID="ec475b8931aca6fad9d336a1a17848f524e7b307de88e88447bb912d47dae6a6" exitCode=143 Dec 02 23:01:57 crc kubenswrapper[4696]: I1202 23:01:57.609774 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"dc73e02f-1d29-4919-ac92-dec31ae5a5da","Type":"ContainerDied","Data":"ec475b8931aca6fad9d336a1a17848f524e7b307de88e88447bb912d47dae6a6"} Dec 02 23:01:57 crc kubenswrapper[4696]: I1202 23:01:57.839551 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-decision-engine-0" podStartSLOduration=4.59536399 podStartE2EDuration="6.839535136s" podCreationTimestamp="2025-12-02 23:01:51 +0000 UTC" firstStartedPulling="2025-12-02 23:01:53.70936038 +0000 UTC m=+1176.590040381" lastFinishedPulling="2025-12-02 23:01:55.953531526 +0000 UTC m=+1178.834211527" observedRunningTime="2025-12-02 23:01:57.833075312 +0000 UTC m=+1180.713755313" watchObservedRunningTime="2025-12-02 23:01:57.839535136 +0000 UTC m=+1180.720215137" Dec 02 23:01:58 crc kubenswrapper[4696]: I1202 23:01:58.638967 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"084de0c6-eeff-49e4-b5b0-8010d88eb3a9","Type":"ContainerStarted","Data":"d30abf7da634441d89ce890ac9a19737e28b9c0c29f0f4e4ff9ce8ed407db12b"} Dec 02 23:01:58 crc 
kubenswrapper[4696]: I1202 23:01:58.639515 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="084de0c6-eeff-49e4-b5b0-8010d88eb3a9" containerName="glance-log" containerID="cri-o://dea32683740f2249c5e43d39c13cdd71423e053db027d5638a2cbd4f9875d30d" gracePeriod=30 Dec 02 23:01:58 crc kubenswrapper[4696]: I1202 23:01:58.640122 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="084de0c6-eeff-49e4-b5b0-8010d88eb3a9" containerName="glance-httpd" containerID="cri-o://d30abf7da634441d89ce890ac9a19737e28b9c0c29f0f4e4ff9ce8ed407db12b" gracePeriod=30 Dec 02 23:01:58 crc kubenswrapper[4696]: I1202 23:01:58.656052 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"266b9a3e-8e6e-4130-8d6e-b7ad5cb9b604","Type":"ContainerStarted","Data":"d7c660807a714eebb09c333f9c4eba8c74111d369b051290641bd9d2ad6dc472"} Dec 02 23:01:58 crc kubenswrapper[4696]: I1202 23:01:58.667049 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=6.667029032 podStartE2EDuration="6.667029032s" podCreationTimestamp="2025-12-02 23:01:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:01:58.661358451 +0000 UTC m=+1181.542038452" watchObservedRunningTime="2025-12-02 23:01:58.667029032 +0000 UTC m=+1181.547709033" Dec 02 23:01:59 crc kubenswrapper[4696]: I1202 23:01:59.695007 4696 generic.go:334] "Generic (PLEG): container finished" podID="7f93d253-e832-436c-8839-1dbb94d58f86" containerID="abaf4949223eb7ae80e1cea1e57969b340c4095d0c83f03d0effb1be18e79890" exitCode=0 Dec 02 23:01:59 crc kubenswrapper[4696]: I1202 23:01:59.695507 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/keystone-bootstrap-l6c4c" event={"ID":"7f93d253-e832-436c-8839-1dbb94d58f86","Type":"ContainerDied","Data":"abaf4949223eb7ae80e1cea1e57969b340c4095d0c83f03d0effb1be18e79890"} Dec 02 23:01:59 crc kubenswrapper[4696]: I1202 23:01:59.702123 4696 generic.go:334] "Generic (PLEG): container finished" podID="084de0c6-eeff-49e4-b5b0-8010d88eb3a9" containerID="d30abf7da634441d89ce890ac9a19737e28b9c0c29f0f4e4ff9ce8ed407db12b" exitCode=143 Dec 02 23:01:59 crc kubenswrapper[4696]: I1202 23:01:59.702158 4696 generic.go:334] "Generic (PLEG): container finished" podID="084de0c6-eeff-49e4-b5b0-8010d88eb3a9" containerID="dea32683740f2249c5e43d39c13cdd71423e053db027d5638a2cbd4f9875d30d" exitCode=143 Dec 02 23:01:59 crc kubenswrapper[4696]: I1202 23:01:59.702189 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"084de0c6-eeff-49e4-b5b0-8010d88eb3a9","Type":"ContainerDied","Data":"d30abf7da634441d89ce890ac9a19737e28b9c0c29f0f4e4ff9ce8ed407db12b"} Dec 02 23:01:59 crc kubenswrapper[4696]: I1202 23:01:59.702222 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"084de0c6-eeff-49e4-b5b0-8010d88eb3a9","Type":"ContainerDied","Data":"dea32683740f2249c5e43d39c13cdd71423e053db027d5638a2cbd4f9875d30d"} Dec 02 23:01:59 crc kubenswrapper[4696]: I1202 23:01:59.740143 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 02 23:01:59 crc kubenswrapper[4696]: I1202 23:01:59.841719 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/084de0c6-eeff-49e4-b5b0-8010d88eb3a9-scripts\") pod \"084de0c6-eeff-49e4-b5b0-8010d88eb3a9\" (UID: \"084de0c6-eeff-49e4-b5b0-8010d88eb3a9\") " Dec 02 23:01:59 crc kubenswrapper[4696]: I1202 23:01:59.845386 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/084de0c6-eeff-49e4-b5b0-8010d88eb3a9-combined-ca-bundle\") pod \"084de0c6-eeff-49e4-b5b0-8010d88eb3a9\" (UID: \"084de0c6-eeff-49e4-b5b0-8010d88eb3a9\") " Dec 02 23:01:59 crc kubenswrapper[4696]: I1202 23:01:59.845521 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"084de0c6-eeff-49e4-b5b0-8010d88eb3a9\" (UID: \"084de0c6-eeff-49e4-b5b0-8010d88eb3a9\") " Dec 02 23:01:59 crc kubenswrapper[4696]: I1202 23:01:59.845553 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/084de0c6-eeff-49e4-b5b0-8010d88eb3a9-config-data\") pod \"084de0c6-eeff-49e4-b5b0-8010d88eb3a9\" (UID: \"084de0c6-eeff-49e4-b5b0-8010d88eb3a9\") " Dec 02 23:01:59 crc kubenswrapper[4696]: I1202 23:01:59.845577 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/084de0c6-eeff-49e4-b5b0-8010d88eb3a9-logs\") pod \"084de0c6-eeff-49e4-b5b0-8010d88eb3a9\" (UID: \"084de0c6-eeff-49e4-b5b0-8010d88eb3a9\") " Dec 02 23:01:59 crc kubenswrapper[4696]: I1202 23:01:59.845621 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/084de0c6-eeff-49e4-b5b0-8010d88eb3a9-public-tls-certs\") pod \"084de0c6-eeff-49e4-b5b0-8010d88eb3a9\" (UID: \"084de0c6-eeff-49e4-b5b0-8010d88eb3a9\") " Dec 02 23:01:59 crc kubenswrapper[4696]: I1202 23:01:59.845696 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nw9cz\" (UniqueName: \"kubernetes.io/projected/084de0c6-eeff-49e4-b5b0-8010d88eb3a9-kube-api-access-nw9cz\") pod \"084de0c6-eeff-49e4-b5b0-8010d88eb3a9\" (UID: \"084de0c6-eeff-49e4-b5b0-8010d88eb3a9\") " Dec 02 23:01:59 crc kubenswrapper[4696]: I1202 23:01:59.845761 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/084de0c6-eeff-49e4-b5b0-8010d88eb3a9-httpd-run\") pod \"084de0c6-eeff-49e4-b5b0-8010d88eb3a9\" (UID: \"084de0c6-eeff-49e4-b5b0-8010d88eb3a9\") " Dec 02 23:01:59 crc kubenswrapper[4696]: I1202 23:01:59.846642 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/084de0c6-eeff-49e4-b5b0-8010d88eb3a9-logs" (OuterVolumeSpecName: "logs") pod "084de0c6-eeff-49e4-b5b0-8010d88eb3a9" (UID: "084de0c6-eeff-49e4-b5b0-8010d88eb3a9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:01:59 crc kubenswrapper[4696]: I1202 23:01:59.846656 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/084de0c6-eeff-49e4-b5b0-8010d88eb3a9-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "084de0c6-eeff-49e4-b5b0-8010d88eb3a9" (UID: "084de0c6-eeff-49e4-b5b0-8010d88eb3a9"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:01:59 crc kubenswrapper[4696]: I1202 23:01:59.854114 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/084de0c6-eeff-49e4-b5b0-8010d88eb3a9-kube-api-access-nw9cz" (OuterVolumeSpecName: "kube-api-access-nw9cz") pod "084de0c6-eeff-49e4-b5b0-8010d88eb3a9" (UID: "084de0c6-eeff-49e4-b5b0-8010d88eb3a9"). InnerVolumeSpecName "kube-api-access-nw9cz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:01:59 crc kubenswrapper[4696]: I1202 23:01:59.855418 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "084de0c6-eeff-49e4-b5b0-8010d88eb3a9" (UID: "084de0c6-eeff-49e4-b5b0-8010d88eb3a9"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 02 23:01:59 crc kubenswrapper[4696]: I1202 23:01:59.856820 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/084de0c6-eeff-49e4-b5b0-8010d88eb3a9-scripts" (OuterVolumeSpecName: "scripts") pod "084de0c6-eeff-49e4-b5b0-8010d88eb3a9" (UID: "084de0c6-eeff-49e4-b5b0-8010d88eb3a9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:01:59 crc kubenswrapper[4696]: I1202 23:01:59.899349 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/084de0c6-eeff-49e4-b5b0-8010d88eb3a9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "084de0c6-eeff-49e4-b5b0-8010d88eb3a9" (UID: "084de0c6-eeff-49e4-b5b0-8010d88eb3a9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:01:59 crc kubenswrapper[4696]: I1202 23:01:59.911858 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/084de0c6-eeff-49e4-b5b0-8010d88eb3a9-config-data" (OuterVolumeSpecName: "config-data") pod "084de0c6-eeff-49e4-b5b0-8010d88eb3a9" (UID: "084de0c6-eeff-49e4-b5b0-8010d88eb3a9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:01:59 crc kubenswrapper[4696]: I1202 23:01:59.930965 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/084de0c6-eeff-49e4-b5b0-8010d88eb3a9-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "084de0c6-eeff-49e4-b5b0-8010d88eb3a9" (UID: "084de0c6-eeff-49e4-b5b0-8010d88eb3a9"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:01:59 crc kubenswrapper[4696]: I1202 23:01:59.948377 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nw9cz\" (UniqueName: \"kubernetes.io/projected/084de0c6-eeff-49e4-b5b0-8010d88eb3a9-kube-api-access-nw9cz\") on node \"crc\" DevicePath \"\"" Dec 02 23:01:59 crc kubenswrapper[4696]: I1202 23:01:59.948424 4696 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/084de0c6-eeff-49e4-b5b0-8010d88eb3a9-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 02 23:01:59 crc kubenswrapper[4696]: I1202 23:01:59.948441 4696 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/084de0c6-eeff-49e4-b5b0-8010d88eb3a9-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 23:01:59 crc kubenswrapper[4696]: I1202 23:01:59.948455 4696 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/084de0c6-eeff-49e4-b5b0-8010d88eb3a9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 
23:01:59 crc kubenswrapper[4696]: I1202 23:01:59.948505 4696 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Dec 02 23:01:59 crc kubenswrapper[4696]: I1202 23:01:59.948522 4696 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/084de0c6-eeff-49e4-b5b0-8010d88eb3a9-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 23:01:59 crc kubenswrapper[4696]: I1202 23:01:59.948535 4696 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/084de0c6-eeff-49e4-b5b0-8010d88eb3a9-logs\") on node \"crc\" DevicePath \"\"" Dec 02 23:01:59 crc kubenswrapper[4696]: I1202 23:01:59.948547 4696 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/084de0c6-eeff-49e4-b5b0-8010d88eb3a9-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 02 23:01:59 crc kubenswrapper[4696]: I1202 23:01:59.981905 4696 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Dec 02 23:02:00 crc kubenswrapper[4696]: I1202 23:02:00.050159 4696 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Dec 02 23:02:00 crc kubenswrapper[4696]: I1202 23:02:00.757155 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-qdkb4" event={"ID":"cbc9aaae-ff97-4d62-806d-6823b2cc6da8","Type":"ContainerStarted","Data":"984a87197dd761ce11008eeb165b37ee156f5be8fbe75aa87c71e67ad095363b"} Dec 02 23:02:00 crc kubenswrapper[4696]: I1202 23:02:00.757655 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-56df8fb6b7-qdkb4" Dec 02 
23:02:00 crc kubenswrapper[4696]: I1202 23:02:00.802909 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"266b9a3e-8e6e-4130-8d6e-b7ad5cb9b604","Type":"ContainerStarted","Data":"4bfb19b97cf856834313da9df5e9b696ff7b6748b5adb44897277ce5b4e4fd93"} Dec 02 23:02:00 crc kubenswrapper[4696]: I1202 23:02:00.802978 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="266b9a3e-8e6e-4130-8d6e-b7ad5cb9b604" containerName="glance-log" containerID="cri-o://d7c660807a714eebb09c333f9c4eba8c74111d369b051290641bd9d2ad6dc472" gracePeriod=30 Dec 02 23:02:00 crc kubenswrapper[4696]: I1202 23:02:00.803070 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="266b9a3e-8e6e-4130-8d6e-b7ad5cb9b604" containerName="glance-httpd" containerID="cri-o://4bfb19b97cf856834313da9df5e9b696ff7b6748b5adb44897277ce5b4e4fd93" gracePeriod=30 Dec 02 23:02:00 crc kubenswrapper[4696]: I1202 23:02:00.814541 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-56df8fb6b7-qdkb4" podStartSLOduration=8.814518985 podStartE2EDuration="8.814518985s" podCreationTimestamp="2025-12-02 23:01:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:02:00.803546183 +0000 UTC m=+1183.684226184" watchObservedRunningTime="2025-12-02 23:02:00.814518985 +0000 UTC m=+1183.695198986" Dec 02 23:02:00 crc kubenswrapper[4696]: I1202 23:02:00.859552 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=9.859515562 podStartE2EDuration="9.859515562s" podCreationTimestamp="2025-12-02 23:01:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2025-12-02 23:02:00.848966143 +0000 UTC m=+1183.729646144" watchObservedRunningTime="2025-12-02 23:02:00.859515562 +0000 UTC m=+1183.740195563" Dec 02 23:02:00 crc kubenswrapper[4696]: I1202 23:02:00.883238 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"73fa8b91-c646-46ae-8d9b-1ec7b93ae933","Type":"ContainerStarted","Data":"c8e5033c289c85268f1ce5c69f0401a769627fa2f63f5d375832254d76857a71"} Dec 02 23:02:00 crc kubenswrapper[4696]: I1202 23:02:00.907142 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"084de0c6-eeff-49e4-b5b0-8010d88eb3a9","Type":"ContainerDied","Data":"d917e672283333f1239069c9349c4601619848cacf25bf9499bd9fe29da61dd7"} Dec 02 23:02:00 crc kubenswrapper[4696]: I1202 23:02:00.907207 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 02 23:02:00 crc kubenswrapper[4696]: I1202 23:02:00.907255 4696 scope.go:117] "RemoveContainer" containerID="d30abf7da634441d89ce890ac9a19737e28b9c0c29f0f4e4ff9ce8ed407db12b" Dec 02 23:02:01 crc kubenswrapper[4696]: I1202 23:02:01.006498 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-applier-0" podStartSLOduration=6.476909124 podStartE2EDuration="10.006465523s" podCreationTimestamp="2025-12-02 23:01:51 +0000 UTC" firstStartedPulling="2025-12-02 23:01:55.821604221 +0000 UTC m=+1178.702284212" lastFinishedPulling="2025-12-02 23:01:59.3511606 +0000 UTC m=+1182.231840611" observedRunningTime="2025-12-02 23:02:00.924021373 +0000 UTC m=+1183.804701374" watchObservedRunningTime="2025-12-02 23:02:01.006465523 +0000 UTC m=+1183.887145524" Dec 02 23:02:01 crc kubenswrapper[4696]: I1202 23:02:01.073895 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 23:02:01 crc kubenswrapper[4696]: I1202 23:02:01.109912 
4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 23:02:01 crc kubenswrapper[4696]: I1202 23:02:01.137607 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 23:02:01 crc kubenswrapper[4696]: E1202 23:02:01.139004 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c640324-4b47-498d-ae9c-8b66a5de9618" containerName="init" Dec 02 23:02:01 crc kubenswrapper[4696]: I1202 23:02:01.139029 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c640324-4b47-498d-ae9c-8b66a5de9618" containerName="init" Dec 02 23:02:01 crc kubenswrapper[4696]: E1202 23:02:01.139196 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="084de0c6-eeff-49e4-b5b0-8010d88eb3a9" containerName="glance-httpd" Dec 02 23:02:01 crc kubenswrapper[4696]: I1202 23:02:01.139205 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="084de0c6-eeff-49e4-b5b0-8010d88eb3a9" containerName="glance-httpd" Dec 02 23:02:01 crc kubenswrapper[4696]: E1202 23:02:01.139228 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="084de0c6-eeff-49e4-b5b0-8010d88eb3a9" containerName="glance-log" Dec 02 23:02:01 crc kubenswrapper[4696]: I1202 23:02:01.139234 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="084de0c6-eeff-49e4-b5b0-8010d88eb3a9" containerName="glance-log" Dec 02 23:02:01 crc kubenswrapper[4696]: I1202 23:02:01.139436 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="084de0c6-eeff-49e4-b5b0-8010d88eb3a9" containerName="glance-log" Dec 02 23:02:01 crc kubenswrapper[4696]: I1202 23:02:01.139452 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c640324-4b47-498d-ae9c-8b66a5de9618" containerName="init" Dec 02 23:02:01 crc kubenswrapper[4696]: I1202 23:02:01.139469 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="084de0c6-eeff-49e4-b5b0-8010d88eb3a9" 
containerName="glance-httpd" Dec 02 23:02:01 crc kubenswrapper[4696]: I1202 23:02:01.141099 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 02 23:02:01 crc kubenswrapper[4696]: I1202 23:02:01.149132 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 02 23:02:01 crc kubenswrapper[4696]: I1202 23:02:01.149466 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 02 23:02:01 crc kubenswrapper[4696]: I1202 23:02:01.163077 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 23:02:01 crc kubenswrapper[4696]: I1202 23:02:01.285191 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"f8e3be6e-0ab2-414e-98ca-9c2ba29f485d\") " pod="openstack/glance-default-external-api-0" Dec 02 23:02:01 crc kubenswrapper[4696]: I1202 23:02:01.285296 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8e3be6e-0ab2-414e-98ca-9c2ba29f485d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f8e3be6e-0ab2-414e-98ca-9c2ba29f485d\") " pod="openstack/glance-default-external-api-0" Dec 02 23:02:01 crc kubenswrapper[4696]: I1202 23:02:01.285327 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8e3be6e-0ab2-414e-98ca-9c2ba29f485d-config-data\") pod \"glance-default-external-api-0\" (UID: \"f8e3be6e-0ab2-414e-98ca-9c2ba29f485d\") " pod="openstack/glance-default-external-api-0" Dec 02 23:02:01 crc kubenswrapper[4696]: I1202 23:02:01.285357 
4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f8e3be6e-0ab2-414e-98ca-9c2ba29f485d-scripts\") pod \"glance-default-external-api-0\" (UID: \"f8e3be6e-0ab2-414e-98ca-9c2ba29f485d\") " pod="openstack/glance-default-external-api-0" Dec 02 23:02:01 crc kubenswrapper[4696]: I1202 23:02:01.285562 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8e3be6e-0ab2-414e-98ca-9c2ba29f485d-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"f8e3be6e-0ab2-414e-98ca-9c2ba29f485d\") " pod="openstack/glance-default-external-api-0" Dec 02 23:02:01 crc kubenswrapper[4696]: I1202 23:02:01.285608 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f8e3be6e-0ab2-414e-98ca-9c2ba29f485d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f8e3be6e-0ab2-414e-98ca-9c2ba29f485d\") " pod="openstack/glance-default-external-api-0" Dec 02 23:02:01 crc kubenswrapper[4696]: I1202 23:02:01.285948 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9vnf\" (UniqueName: \"kubernetes.io/projected/f8e3be6e-0ab2-414e-98ca-9c2ba29f485d-kube-api-access-t9vnf\") pod \"glance-default-external-api-0\" (UID: \"f8e3be6e-0ab2-414e-98ca-9c2ba29f485d\") " pod="openstack/glance-default-external-api-0" Dec 02 23:02:01 crc kubenswrapper[4696]: I1202 23:02:01.286025 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f8e3be6e-0ab2-414e-98ca-9c2ba29f485d-logs\") pod \"glance-default-external-api-0\" (UID: \"f8e3be6e-0ab2-414e-98ca-9c2ba29f485d\") " pod="openstack/glance-default-external-api-0" Dec 02 23:02:01 crc kubenswrapper[4696]: 
I1202 23:02:01.388718 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9vnf\" (UniqueName: \"kubernetes.io/projected/f8e3be6e-0ab2-414e-98ca-9c2ba29f485d-kube-api-access-t9vnf\") pod \"glance-default-external-api-0\" (UID: \"f8e3be6e-0ab2-414e-98ca-9c2ba29f485d\") " pod="openstack/glance-default-external-api-0" Dec 02 23:02:01 crc kubenswrapper[4696]: I1202 23:02:01.388807 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f8e3be6e-0ab2-414e-98ca-9c2ba29f485d-logs\") pod \"glance-default-external-api-0\" (UID: \"f8e3be6e-0ab2-414e-98ca-9c2ba29f485d\") " pod="openstack/glance-default-external-api-0" Dec 02 23:02:01 crc kubenswrapper[4696]: I1202 23:02:01.388879 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"f8e3be6e-0ab2-414e-98ca-9c2ba29f485d\") " pod="openstack/glance-default-external-api-0" Dec 02 23:02:01 crc kubenswrapper[4696]: I1202 23:02:01.388910 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8e3be6e-0ab2-414e-98ca-9c2ba29f485d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f8e3be6e-0ab2-414e-98ca-9c2ba29f485d\") " pod="openstack/glance-default-external-api-0" Dec 02 23:02:01 crc kubenswrapper[4696]: I1202 23:02:01.388935 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8e3be6e-0ab2-414e-98ca-9c2ba29f485d-config-data\") pod \"glance-default-external-api-0\" (UID: \"f8e3be6e-0ab2-414e-98ca-9c2ba29f485d\") " pod="openstack/glance-default-external-api-0" Dec 02 23:02:01 crc kubenswrapper[4696]: I1202 23:02:01.388956 4696 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f8e3be6e-0ab2-414e-98ca-9c2ba29f485d-scripts\") pod \"glance-default-external-api-0\" (UID: \"f8e3be6e-0ab2-414e-98ca-9c2ba29f485d\") " pod="openstack/glance-default-external-api-0" Dec 02 23:02:01 crc kubenswrapper[4696]: I1202 23:02:01.389123 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8e3be6e-0ab2-414e-98ca-9c2ba29f485d-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"f8e3be6e-0ab2-414e-98ca-9c2ba29f485d\") " pod="openstack/glance-default-external-api-0" Dec 02 23:02:01 crc kubenswrapper[4696]: I1202 23:02:01.389153 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f8e3be6e-0ab2-414e-98ca-9c2ba29f485d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f8e3be6e-0ab2-414e-98ca-9c2ba29f485d\") " pod="openstack/glance-default-external-api-0" Dec 02 23:02:01 crc kubenswrapper[4696]: I1202 23:02:01.389469 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f8e3be6e-0ab2-414e-98ca-9c2ba29f485d-logs\") pod \"glance-default-external-api-0\" (UID: \"f8e3be6e-0ab2-414e-98ca-9c2ba29f485d\") " pod="openstack/glance-default-external-api-0" Dec 02 23:02:01 crc kubenswrapper[4696]: I1202 23:02:01.389820 4696 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"f8e3be6e-0ab2-414e-98ca-9c2ba29f485d\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-external-api-0" Dec 02 23:02:01 crc kubenswrapper[4696]: I1202 23:02:01.389969 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/f8e3be6e-0ab2-414e-98ca-9c2ba29f485d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f8e3be6e-0ab2-414e-98ca-9c2ba29f485d\") " pod="openstack/glance-default-external-api-0" Dec 02 23:02:01 crc kubenswrapper[4696]: I1202 23:02:01.398796 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8e3be6e-0ab2-414e-98ca-9c2ba29f485d-config-data\") pod \"glance-default-external-api-0\" (UID: \"f8e3be6e-0ab2-414e-98ca-9c2ba29f485d\") " pod="openstack/glance-default-external-api-0" Dec 02 23:02:01 crc kubenswrapper[4696]: I1202 23:02:01.409655 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8e3be6e-0ab2-414e-98ca-9c2ba29f485d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f8e3be6e-0ab2-414e-98ca-9c2ba29f485d\") " pod="openstack/glance-default-external-api-0" Dec 02 23:02:01 crc kubenswrapper[4696]: I1202 23:02:01.415595 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f8e3be6e-0ab2-414e-98ca-9c2ba29f485d-scripts\") pod \"glance-default-external-api-0\" (UID: \"f8e3be6e-0ab2-414e-98ca-9c2ba29f485d\") " pod="openstack/glance-default-external-api-0" Dec 02 23:02:01 crc kubenswrapper[4696]: I1202 23:02:01.434660 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9vnf\" (UniqueName: \"kubernetes.io/projected/f8e3be6e-0ab2-414e-98ca-9c2ba29f485d-kube-api-access-t9vnf\") pod \"glance-default-external-api-0\" (UID: \"f8e3be6e-0ab2-414e-98ca-9c2ba29f485d\") " pod="openstack/glance-default-external-api-0" Dec 02 23:02:01 crc kubenswrapper[4696]: I1202 23:02:01.455335 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" 
(UID: \"f8e3be6e-0ab2-414e-98ca-9c2ba29f485d\") " pod="openstack/glance-default-external-api-0" Dec 02 23:02:01 crc kubenswrapper[4696]: I1202 23:02:01.459661 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="084de0c6-eeff-49e4-b5b0-8010d88eb3a9" path="/var/lib/kubelet/pods/084de0c6-eeff-49e4-b5b0-8010d88eb3a9/volumes" Dec 02 23:02:01 crc kubenswrapper[4696]: I1202 23:02:01.460131 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8e3be6e-0ab2-414e-98ca-9c2ba29f485d-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"f8e3be6e-0ab2-414e-98ca-9c2ba29f485d\") " pod="openstack/glance-default-external-api-0" Dec 02 23:02:01 crc kubenswrapper[4696]: I1202 23:02:01.504069 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 02 23:02:01 crc kubenswrapper[4696]: I1202 23:02:01.859676 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-b5c6d7897-zxfvr"] Dec 02 23:02:01 crc kubenswrapper[4696]: I1202 23:02:01.866032 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Dec 02 23:02:01 crc kubenswrapper[4696]: I1202 23:02:01.905620 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7b448778f6-q69jq"] Dec 02 23:02:01 crc kubenswrapper[4696]: I1202 23:02:01.908157 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7b448778f6-q69jq" Dec 02 23:02:01 crc kubenswrapper[4696]: I1202 23:02:01.912850 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Dec 02 23:02:01 crc kubenswrapper[4696]: I1202 23:02:01.935329 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 23:02:01 crc kubenswrapper[4696]: I1202 23:02:01.953324 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7b448778f6-q69jq"] Dec 02 23:02:02 crc kubenswrapper[4696]: I1202 23:02:01.999087 4696 generic.go:334] "Generic (PLEG): container finished" podID="266b9a3e-8e6e-4130-8d6e-b7ad5cb9b604" containerID="4bfb19b97cf856834313da9df5e9b696ff7b6748b5adb44897277ce5b4e4fd93" exitCode=143 Dec 02 23:02:02 crc kubenswrapper[4696]: I1202 23:02:01.999134 4696 generic.go:334] "Generic (PLEG): container finished" podID="266b9a3e-8e6e-4130-8d6e-b7ad5cb9b604" containerID="d7c660807a714eebb09c333f9c4eba8c74111d369b051290641bd9d2ad6dc472" exitCode=143 Dec 02 23:02:02 crc kubenswrapper[4696]: I1202 23:02:02.000291 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"266b9a3e-8e6e-4130-8d6e-b7ad5cb9b604","Type":"ContainerDied","Data":"4bfb19b97cf856834313da9df5e9b696ff7b6748b5adb44897277ce5b4e4fd93"} Dec 02 23:02:02 crc kubenswrapper[4696]: I1202 23:02:02.000327 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"266b9a3e-8e6e-4130-8d6e-b7ad5cb9b604","Type":"ContainerDied","Data":"d7c660807a714eebb09c333f9c4eba8c74111d369b051290641bd9d2ad6dc472"} Dec 02 23:02:02 crc kubenswrapper[4696]: I1202 23:02:02.005250 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7b25bdab-8c46-43b8-be48-0e3df0f48c57-logs\") pod \"horizon-7b448778f6-q69jq\" (UID: 
\"7b25bdab-8c46-43b8-be48-0e3df0f48c57\") " pod="openstack/horizon-7b448778f6-q69jq" Dec 02 23:02:02 crc kubenswrapper[4696]: I1202 23:02:02.005314 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b25bdab-8c46-43b8-be48-0e3df0f48c57-combined-ca-bundle\") pod \"horizon-7b448778f6-q69jq\" (UID: \"7b25bdab-8c46-43b8-be48-0e3df0f48c57\") " pod="openstack/horizon-7b448778f6-q69jq" Dec 02 23:02:02 crc kubenswrapper[4696]: I1202 23:02:02.005435 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qrhg\" (UniqueName: \"kubernetes.io/projected/7b25bdab-8c46-43b8-be48-0e3df0f48c57-kube-api-access-8qrhg\") pod \"horizon-7b448778f6-q69jq\" (UID: \"7b25bdab-8c46-43b8-be48-0e3df0f48c57\") " pod="openstack/horizon-7b448778f6-q69jq" Dec 02 23:02:02 crc kubenswrapper[4696]: I1202 23:02:02.005475 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b25bdab-8c46-43b8-be48-0e3df0f48c57-horizon-tls-certs\") pod \"horizon-7b448778f6-q69jq\" (UID: \"7b25bdab-8c46-43b8-be48-0e3df0f48c57\") " pod="openstack/horizon-7b448778f6-q69jq" Dec 02 23:02:02 crc kubenswrapper[4696]: I1202 23:02:02.005530 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/7b25bdab-8c46-43b8-be48-0e3df0f48c57-horizon-secret-key\") pod \"horizon-7b448778f6-q69jq\" (UID: \"7b25bdab-8c46-43b8-be48-0e3df0f48c57\") " pod="openstack/horizon-7b448778f6-q69jq" Dec 02 23:02:02 crc kubenswrapper[4696]: I1202 23:02:02.005583 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7b25bdab-8c46-43b8-be48-0e3df0f48c57-config-data\") pod 
\"horizon-7b448778f6-q69jq\" (UID: \"7b25bdab-8c46-43b8-be48-0e3df0f48c57\") " pod="openstack/horizon-7b448778f6-q69jq" Dec 02 23:02:02 crc kubenswrapper[4696]: I1202 23:02:02.005618 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7b25bdab-8c46-43b8-be48-0e3df0f48c57-scripts\") pod \"horizon-7b448778f6-q69jq\" (UID: \"7b25bdab-8c46-43b8-be48-0e3df0f48c57\") " pod="openstack/horizon-7b448778f6-q69jq" Dec 02 23:02:02 crc kubenswrapper[4696]: I1202 23:02:02.034782 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5bcb6c6d6f-rv22m"] Dec 02 23:02:02 crc kubenswrapper[4696]: I1202 23:02:02.064706 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-decision-engine-0" Dec 02 23:02:02 crc kubenswrapper[4696]: I1202 23:02:02.065843 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Dec 02 23:02:02 crc kubenswrapper[4696]: I1202 23:02:02.099956 4696 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="dc73e02f-1d29-4919-ac92-dec31ae5a5da" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.150:9322/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 02 23:02:02 crc kubenswrapper[4696]: I1202 23:02:02.102144 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7c784657c6-hdbrw"] Dec 02 23:02:02 crc kubenswrapper[4696]: I1202 23:02:02.105246 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7c784657c6-hdbrw" Dec 02 23:02:02 crc kubenswrapper[4696]: I1202 23:02:02.107708 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7b25bdab-8c46-43b8-be48-0e3df0f48c57-config-data\") pod \"horizon-7b448778f6-q69jq\" (UID: \"7b25bdab-8c46-43b8-be48-0e3df0f48c57\") " pod="openstack/horizon-7b448778f6-q69jq" Dec 02 23:02:02 crc kubenswrapper[4696]: I1202 23:02:02.107784 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7b25bdab-8c46-43b8-be48-0e3df0f48c57-scripts\") pod \"horizon-7b448778f6-q69jq\" (UID: \"7b25bdab-8c46-43b8-be48-0e3df0f48c57\") " pod="openstack/horizon-7b448778f6-q69jq" Dec 02 23:02:02 crc kubenswrapper[4696]: I1202 23:02:02.108337 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7b25bdab-8c46-43b8-be48-0e3df0f48c57-logs\") pod \"horizon-7b448778f6-q69jq\" (UID: \"7b25bdab-8c46-43b8-be48-0e3df0f48c57\") " pod="openstack/horizon-7b448778f6-q69jq" Dec 02 23:02:02 crc kubenswrapper[4696]: I1202 23:02:02.108360 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b25bdab-8c46-43b8-be48-0e3df0f48c57-combined-ca-bundle\") pod \"horizon-7b448778f6-q69jq\" (UID: \"7b25bdab-8c46-43b8-be48-0e3df0f48c57\") " pod="openstack/horizon-7b448778f6-q69jq" Dec 02 23:02:02 crc kubenswrapper[4696]: I1202 23:02:02.108499 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qrhg\" (UniqueName: \"kubernetes.io/projected/7b25bdab-8c46-43b8-be48-0e3df0f48c57-kube-api-access-8qrhg\") pod \"horizon-7b448778f6-q69jq\" (UID: \"7b25bdab-8c46-43b8-be48-0e3df0f48c57\") " pod="openstack/horizon-7b448778f6-q69jq" Dec 02 23:02:02 crc kubenswrapper[4696]: I1202 
23:02:02.108541 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b25bdab-8c46-43b8-be48-0e3df0f48c57-horizon-tls-certs\") pod \"horizon-7b448778f6-q69jq\" (UID: \"7b25bdab-8c46-43b8-be48-0e3df0f48c57\") " pod="openstack/horizon-7b448778f6-q69jq" Dec 02 23:02:02 crc kubenswrapper[4696]: I1202 23:02:02.108587 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/7b25bdab-8c46-43b8-be48-0e3df0f48c57-horizon-secret-key\") pod \"horizon-7b448778f6-q69jq\" (UID: \"7b25bdab-8c46-43b8-be48-0e3df0f48c57\") " pod="openstack/horizon-7b448778f6-q69jq" Dec 02 23:02:02 crc kubenswrapper[4696]: I1202 23:02:02.109504 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7b25bdab-8c46-43b8-be48-0e3df0f48c57-config-data\") pod \"horizon-7b448778f6-q69jq\" (UID: \"7b25bdab-8c46-43b8-be48-0e3df0f48c57\") " pod="openstack/horizon-7b448778f6-q69jq" Dec 02 23:02:02 crc kubenswrapper[4696]: I1202 23:02:02.112619 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7b25bdab-8c46-43b8-be48-0e3df0f48c57-logs\") pod \"horizon-7b448778f6-q69jq\" (UID: \"7b25bdab-8c46-43b8-be48-0e3df0f48c57\") " pod="openstack/horizon-7b448778f6-q69jq" Dec 02 23:02:02 crc kubenswrapper[4696]: I1202 23:02:02.112657 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7b25bdab-8c46-43b8-be48-0e3df0f48c57-scripts\") pod \"horizon-7b448778f6-q69jq\" (UID: \"7b25bdab-8c46-43b8-be48-0e3df0f48c57\") " pod="openstack/horizon-7b448778f6-q69jq" Dec 02 23:02:02 crc kubenswrapper[4696]: I1202 23:02:02.116002 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/7b25bdab-8c46-43b8-be48-0e3df0f48c57-horizon-secret-key\") pod \"horizon-7b448778f6-q69jq\" (UID: \"7b25bdab-8c46-43b8-be48-0e3df0f48c57\") " pod="openstack/horizon-7b448778f6-q69jq" Dec 02 23:02:02 crc kubenswrapper[4696]: I1202 23:02:02.123445 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b25bdab-8c46-43b8-be48-0e3df0f48c57-combined-ca-bundle\") pod \"horizon-7b448778f6-q69jq\" (UID: \"7b25bdab-8c46-43b8-be48-0e3df0f48c57\") " pod="openstack/horizon-7b448778f6-q69jq" Dec 02 23:02:02 crc kubenswrapper[4696]: I1202 23:02:02.137671 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b25bdab-8c46-43b8-be48-0e3df0f48c57-horizon-tls-certs\") pod \"horizon-7b448778f6-q69jq\" (UID: \"7b25bdab-8c46-43b8-be48-0e3df0f48c57\") " pod="openstack/horizon-7b448778f6-q69jq" Dec 02 23:02:02 crc kubenswrapper[4696]: I1202 23:02:02.140521 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7c784657c6-hdbrw"] Dec 02 23:02:02 crc kubenswrapper[4696]: I1202 23:02:02.155916 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qrhg\" (UniqueName: \"kubernetes.io/projected/7b25bdab-8c46-43b8-be48-0e3df0f48c57-kube-api-access-8qrhg\") pod \"horizon-7b448778f6-q69jq\" (UID: \"7b25bdab-8c46-43b8-be48-0e3df0f48c57\") " pod="openstack/horizon-7b448778f6-q69jq" Dec 02 23:02:02 crc kubenswrapper[4696]: I1202 23:02:02.204501 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-decision-engine-0" Dec 02 23:02:02 crc kubenswrapper[4696]: I1202 23:02:02.211217 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b414fc10-9d51-456b-aaa9-d6b4dd08af99-horizon-secret-key\") pod \"horizon-7c784657c6-hdbrw\" 
(UID: \"b414fc10-9d51-456b-aaa9-d6b4dd08af99\") " pod="openstack/horizon-7c784657c6-hdbrw" Dec 02 23:02:02 crc kubenswrapper[4696]: I1202 23:02:02.211692 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/b414fc10-9d51-456b-aaa9-d6b4dd08af99-horizon-tls-certs\") pod \"horizon-7c784657c6-hdbrw\" (UID: \"b414fc10-9d51-456b-aaa9-d6b4dd08af99\") " pod="openstack/horizon-7c784657c6-hdbrw" Dec 02 23:02:02 crc kubenswrapper[4696]: I1202 23:02:02.211826 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b414fc10-9d51-456b-aaa9-d6b4dd08af99-config-data\") pod \"horizon-7c784657c6-hdbrw\" (UID: \"b414fc10-9d51-456b-aaa9-d6b4dd08af99\") " pod="openstack/horizon-7c784657c6-hdbrw" Dec 02 23:02:02 crc kubenswrapper[4696]: I1202 23:02:02.211930 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b414fc10-9d51-456b-aaa9-d6b4dd08af99-logs\") pod \"horizon-7c784657c6-hdbrw\" (UID: \"b414fc10-9d51-456b-aaa9-d6b4dd08af99\") " pod="openstack/horizon-7c784657c6-hdbrw" Dec 02 23:02:02 crc kubenswrapper[4696]: I1202 23:02:02.212128 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b414fc10-9d51-456b-aaa9-d6b4dd08af99-scripts\") pod \"horizon-7c784657c6-hdbrw\" (UID: \"b414fc10-9d51-456b-aaa9-d6b4dd08af99\") " pod="openstack/horizon-7c784657c6-hdbrw" Dec 02 23:02:02 crc kubenswrapper[4696]: I1202 23:02:02.212212 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2n2j4\" (UniqueName: \"kubernetes.io/projected/b414fc10-9d51-456b-aaa9-d6b4dd08af99-kube-api-access-2n2j4\") pod \"horizon-7c784657c6-hdbrw\" (UID: 
\"b414fc10-9d51-456b-aaa9-d6b4dd08af99\") " pod="openstack/horizon-7c784657c6-hdbrw" Dec 02 23:02:02 crc kubenswrapper[4696]: I1202 23:02:02.212350 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b414fc10-9d51-456b-aaa9-d6b4dd08af99-combined-ca-bundle\") pod \"horizon-7c784657c6-hdbrw\" (UID: \"b414fc10-9d51-456b-aaa9-d6b4dd08af99\") " pod="openstack/horizon-7c784657c6-hdbrw" Dec 02 23:02:02 crc kubenswrapper[4696]: I1202 23:02:02.256695 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7b448778f6-q69jq" Dec 02 23:02:02 crc kubenswrapper[4696]: I1202 23:02:02.275892 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-applier-0" Dec 02 23:02:02 crc kubenswrapper[4696]: I1202 23:02:02.276428 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-applier-0" Dec 02 23:02:02 crc kubenswrapper[4696]: I1202 23:02:02.308463 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-decision-engine-0"] Dec 02 23:02:02 crc kubenswrapper[4696]: I1202 23:02:02.315071 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b414fc10-9d51-456b-aaa9-d6b4dd08af99-combined-ca-bundle\") pod \"horizon-7c784657c6-hdbrw\" (UID: \"b414fc10-9d51-456b-aaa9-d6b4dd08af99\") " pod="openstack/horizon-7c784657c6-hdbrw" Dec 02 23:02:02 crc kubenswrapper[4696]: I1202 23:02:02.315152 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b414fc10-9d51-456b-aaa9-d6b4dd08af99-horizon-secret-key\") pod \"horizon-7c784657c6-hdbrw\" (UID: \"b414fc10-9d51-456b-aaa9-d6b4dd08af99\") " pod="openstack/horizon-7c784657c6-hdbrw" Dec 02 23:02:02 crc kubenswrapper[4696]: I1202 
23:02:02.315186 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/b414fc10-9d51-456b-aaa9-d6b4dd08af99-horizon-tls-certs\") pod \"horizon-7c784657c6-hdbrw\" (UID: \"b414fc10-9d51-456b-aaa9-d6b4dd08af99\") " pod="openstack/horizon-7c784657c6-hdbrw" Dec 02 23:02:02 crc kubenswrapper[4696]: I1202 23:02:02.315204 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b414fc10-9d51-456b-aaa9-d6b4dd08af99-config-data\") pod \"horizon-7c784657c6-hdbrw\" (UID: \"b414fc10-9d51-456b-aaa9-d6b4dd08af99\") " pod="openstack/horizon-7c784657c6-hdbrw" Dec 02 23:02:02 crc kubenswrapper[4696]: I1202 23:02:02.315244 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b414fc10-9d51-456b-aaa9-d6b4dd08af99-logs\") pod \"horizon-7c784657c6-hdbrw\" (UID: \"b414fc10-9d51-456b-aaa9-d6b4dd08af99\") " pod="openstack/horizon-7c784657c6-hdbrw" Dec 02 23:02:02 crc kubenswrapper[4696]: I1202 23:02:02.315310 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b414fc10-9d51-456b-aaa9-d6b4dd08af99-scripts\") pod \"horizon-7c784657c6-hdbrw\" (UID: \"b414fc10-9d51-456b-aaa9-d6b4dd08af99\") " pod="openstack/horizon-7c784657c6-hdbrw" Dec 02 23:02:02 crc kubenswrapper[4696]: I1202 23:02:02.315337 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2n2j4\" (UniqueName: \"kubernetes.io/projected/b414fc10-9d51-456b-aaa9-d6b4dd08af99-kube-api-access-2n2j4\") pod \"horizon-7c784657c6-hdbrw\" (UID: \"b414fc10-9d51-456b-aaa9-d6b4dd08af99\") " pod="openstack/horizon-7c784657c6-hdbrw" Dec 02 23:02:02 crc kubenswrapper[4696]: I1202 23:02:02.316393 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/b414fc10-9d51-456b-aaa9-d6b4dd08af99-logs\") pod \"horizon-7c784657c6-hdbrw\" (UID: \"b414fc10-9d51-456b-aaa9-d6b4dd08af99\") " pod="openstack/horizon-7c784657c6-hdbrw" Dec 02 23:02:02 crc kubenswrapper[4696]: I1202 23:02:02.316711 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b414fc10-9d51-456b-aaa9-d6b4dd08af99-scripts\") pod \"horizon-7c784657c6-hdbrw\" (UID: \"b414fc10-9d51-456b-aaa9-d6b4dd08af99\") " pod="openstack/horizon-7c784657c6-hdbrw" Dec 02 23:02:02 crc kubenswrapper[4696]: I1202 23:02:02.324348 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b414fc10-9d51-456b-aaa9-d6b4dd08af99-config-data\") pod \"horizon-7c784657c6-hdbrw\" (UID: \"b414fc10-9d51-456b-aaa9-d6b4dd08af99\") " pod="openstack/horizon-7c784657c6-hdbrw" Dec 02 23:02:02 crc kubenswrapper[4696]: I1202 23:02:02.325545 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/b414fc10-9d51-456b-aaa9-d6b4dd08af99-horizon-tls-certs\") pod \"horizon-7c784657c6-hdbrw\" (UID: \"b414fc10-9d51-456b-aaa9-d6b4dd08af99\") " pod="openstack/horizon-7c784657c6-hdbrw" Dec 02 23:02:02 crc kubenswrapper[4696]: I1202 23:02:02.329116 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b414fc10-9d51-456b-aaa9-d6b4dd08af99-combined-ca-bundle\") pod \"horizon-7c784657c6-hdbrw\" (UID: \"b414fc10-9d51-456b-aaa9-d6b4dd08af99\") " pod="openstack/horizon-7c784657c6-hdbrw" Dec 02 23:02:02 crc kubenswrapper[4696]: I1202 23:02:02.329188 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b414fc10-9d51-456b-aaa9-d6b4dd08af99-horizon-secret-key\") pod \"horizon-7c784657c6-hdbrw\" (UID: 
\"b414fc10-9d51-456b-aaa9-d6b4dd08af99\") " pod="openstack/horizon-7c784657c6-hdbrw" Dec 02 23:02:02 crc kubenswrapper[4696]: I1202 23:02:02.332408 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-applier-0" Dec 02 23:02:02 crc kubenswrapper[4696]: I1202 23:02:02.337238 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2n2j4\" (UniqueName: \"kubernetes.io/projected/b414fc10-9d51-456b-aaa9-d6b4dd08af99-kube-api-access-2n2j4\") pod \"horizon-7c784657c6-hdbrw\" (UID: \"b414fc10-9d51-456b-aaa9-d6b4dd08af99\") " pod="openstack/horizon-7c784657c6-hdbrw" Dec 02 23:02:02 crc kubenswrapper[4696]: I1202 23:02:02.566825 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7c784657c6-hdbrw" Dec 02 23:02:02 crc kubenswrapper[4696]: I1202 23:02:02.859663 4696 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="dc73e02f-1d29-4919-ac92-dec31ae5a5da" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.150:9322/\": read tcp 10.217.0.2:47642->10.217.0.150:9322: read: connection reset by peer" Dec 02 23:02:03 crc kubenswrapper[4696]: I1202 23:02:03.066814 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-applier-0" Dec 02 23:02:03 crc kubenswrapper[4696]: I1202 23:02:03.140878 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-applier-0"] Dec 02 23:02:04 crc kubenswrapper[4696]: I1202 23:02:04.033105 4696 generic.go:334] "Generic (PLEG): container finished" podID="dc73e02f-1d29-4919-ac92-dec31ae5a5da" containerID="e0b4da2a357e7295795cdc78badc4aa2f484e96f7840899de5c9939518039049" exitCode=0 Dec 02 23:02:04 crc kubenswrapper[4696]: I1202 23:02:04.033354 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-decision-engine-0" podUID="e07c8b14-2cf9-4a6c-8fb4-d9f362d85f49" 
containerName="watcher-decision-engine" containerID="cri-o://47055e3f201494569f436f62a7822eb485cfef0a1a3f5efa24b582a2517fbc54" gracePeriod=30 Dec 02 23:02:04 crc kubenswrapper[4696]: I1202 23:02:04.033431 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"dc73e02f-1d29-4919-ac92-dec31ae5a5da","Type":"ContainerDied","Data":"e0b4da2a357e7295795cdc78badc4aa2f484e96f7840899de5c9939518039049"} Dec 02 23:02:05 crc kubenswrapper[4696]: I1202 23:02:05.043441 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-applier-0" podUID="73fa8b91-c646-46ae-8d9b-1ec7b93ae933" containerName="watcher-applier" containerID="cri-o://c8e5033c289c85268f1ce5c69f0401a769627fa2f63f5d375832254d76857a71" gracePeriod=30 Dec 02 23:02:06 crc kubenswrapper[4696]: I1202 23:02:06.057565 4696 generic.go:334] "Generic (PLEG): container finished" podID="73fa8b91-c646-46ae-8d9b-1ec7b93ae933" containerID="c8e5033c289c85268f1ce5c69f0401a769627fa2f63f5d375832254d76857a71" exitCode=0 Dec 02 23:02:06 crc kubenswrapper[4696]: I1202 23:02:06.058253 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"73fa8b91-c646-46ae-8d9b-1ec7b93ae933","Type":"ContainerDied","Data":"c8e5033c289c85268f1ce5c69f0401a769627fa2f63f5d375832254d76857a71"} Dec 02 23:02:06 crc kubenswrapper[4696]: I1202 23:02:06.060256 4696 generic.go:334] "Generic (PLEG): container finished" podID="e07c8b14-2cf9-4a6c-8fb4-d9f362d85f49" containerID="47055e3f201494569f436f62a7822eb485cfef0a1a3f5efa24b582a2517fbc54" exitCode=0 Dec 02 23:02:06 crc kubenswrapper[4696]: I1202 23:02:06.060318 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"e07c8b14-2cf9-4a6c-8fb4-d9f362d85f49","Type":"ContainerDied","Data":"47055e3f201494569f436f62a7822eb485cfef0a1a3f5efa24b582a2517fbc54"} Dec 02 23:02:06 crc kubenswrapper[4696]: I1202 23:02:06.061525 4696 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-l6c4c" event={"ID":"7f93d253-e832-436c-8839-1dbb94d58f86","Type":"ContainerDied","Data":"21d78aa881e4da0ff34fd5e9a7b95504b513eb5200bf6e3e30689a438602baf7"} Dec 02 23:02:06 crc kubenswrapper[4696]: I1202 23:02:06.061558 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="21d78aa881e4da0ff34fd5e9a7b95504b513eb5200bf6e3e30689a438602baf7" Dec 02 23:02:06 crc kubenswrapper[4696]: I1202 23:02:06.064846 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"266b9a3e-8e6e-4130-8d6e-b7ad5cb9b604","Type":"ContainerDied","Data":"87cf43a03ebf6e2b35f6efb38cbdefccfddc47186f42b6cf1042f05a2e0d0a95"} Dec 02 23:02:06 crc kubenswrapper[4696]: I1202 23:02:06.064873 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="87cf43a03ebf6e2b35f6efb38cbdefccfddc47186f42b6cf1042f05a2e0d0a95" Dec 02 23:02:06 crc kubenswrapper[4696]: I1202 23:02:06.175586 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 02 23:02:06 crc kubenswrapper[4696]: I1202 23:02:06.183337 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-l6c4c" Dec 02 23:02:06 crc kubenswrapper[4696]: I1202 23:02:06.322247 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/266b9a3e-8e6e-4130-8d6e-b7ad5cb9b604-combined-ca-bundle\") pod \"266b9a3e-8e6e-4130-8d6e-b7ad5cb9b604\" (UID: \"266b9a3e-8e6e-4130-8d6e-b7ad5cb9b604\") " Dec 02 23:02:06 crc kubenswrapper[4696]: I1202 23:02:06.322340 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/266b9a3e-8e6e-4130-8d6e-b7ad5cb9b604-httpd-run\") pod \"266b9a3e-8e6e-4130-8d6e-b7ad5cb9b604\" (UID: \"266b9a3e-8e6e-4130-8d6e-b7ad5cb9b604\") " Dec 02 23:02:06 crc kubenswrapper[4696]: I1202 23:02:06.322366 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/266b9a3e-8e6e-4130-8d6e-b7ad5cb9b604-config-data\") pod \"266b9a3e-8e6e-4130-8d6e-b7ad5cb9b604\" (UID: \"266b9a3e-8e6e-4130-8d6e-b7ad5cb9b604\") " Dec 02 23:02:06 crc kubenswrapper[4696]: I1202 23:02:06.322405 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7f93d253-e832-436c-8839-1dbb94d58f86-credential-keys\") pod \"7f93d253-e832-436c-8839-1dbb94d58f86\" (UID: \"7f93d253-e832-436c-8839-1dbb94d58f86\") " Dec 02 23:02:06 crc kubenswrapper[4696]: I1202 23:02:06.323190 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f93d253-e832-436c-8839-1dbb94d58f86-scripts\") pod \"7f93d253-e832-436c-8839-1dbb94d58f86\" (UID: \"7f93d253-e832-436c-8839-1dbb94d58f86\") " Dec 02 23:02:06 crc kubenswrapper[4696]: I1202 23:02:06.323245 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage11-crc\") pod \"266b9a3e-8e6e-4130-8d6e-b7ad5cb9b604\" (UID: \"266b9a3e-8e6e-4130-8d6e-b7ad5cb9b604\") " Dec 02 23:02:06 crc kubenswrapper[4696]: I1202 23:02:06.323428 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6sqgt\" (UniqueName: \"kubernetes.io/projected/266b9a3e-8e6e-4130-8d6e-b7ad5cb9b604-kube-api-access-6sqgt\") pod \"266b9a3e-8e6e-4130-8d6e-b7ad5cb9b604\" (UID: \"266b9a3e-8e6e-4130-8d6e-b7ad5cb9b604\") " Dec 02 23:02:06 crc kubenswrapper[4696]: I1202 23:02:06.323441 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/266b9a3e-8e6e-4130-8d6e-b7ad5cb9b604-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "266b9a3e-8e6e-4130-8d6e-b7ad5cb9b604" (UID: "266b9a3e-8e6e-4130-8d6e-b7ad5cb9b604"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:02:06 crc kubenswrapper[4696]: I1202 23:02:06.323471 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/266b9a3e-8e6e-4130-8d6e-b7ad5cb9b604-logs\") pod \"266b9a3e-8e6e-4130-8d6e-b7ad5cb9b604\" (UID: \"266b9a3e-8e6e-4130-8d6e-b7ad5cb9b604\") " Dec 02 23:02:06 crc kubenswrapper[4696]: I1202 23:02:06.323705 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/266b9a3e-8e6e-4130-8d6e-b7ad5cb9b604-logs" (OuterVolumeSpecName: "logs") pod "266b9a3e-8e6e-4130-8d6e-b7ad5cb9b604" (UID: "266b9a3e-8e6e-4130-8d6e-b7ad5cb9b604"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:02:06 crc kubenswrapper[4696]: I1202 23:02:06.323839 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7f93d253-e832-436c-8839-1dbb94d58f86-fernet-keys\") pod \"7f93d253-e832-436c-8839-1dbb94d58f86\" (UID: \"7f93d253-e832-436c-8839-1dbb94d58f86\") " Dec 02 23:02:06 crc kubenswrapper[4696]: I1202 23:02:06.323916 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/266b9a3e-8e6e-4130-8d6e-b7ad5cb9b604-internal-tls-certs\") pod \"266b9a3e-8e6e-4130-8d6e-b7ad5cb9b604\" (UID: \"266b9a3e-8e6e-4130-8d6e-b7ad5cb9b604\") " Dec 02 23:02:06 crc kubenswrapper[4696]: I1202 23:02:06.324000 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dvkfc\" (UniqueName: \"kubernetes.io/projected/7f93d253-e832-436c-8839-1dbb94d58f86-kube-api-access-dvkfc\") pod \"7f93d253-e832-436c-8839-1dbb94d58f86\" (UID: \"7f93d253-e832-436c-8839-1dbb94d58f86\") " Dec 02 23:02:06 crc kubenswrapper[4696]: I1202 23:02:06.324074 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f93d253-e832-436c-8839-1dbb94d58f86-config-data\") pod \"7f93d253-e832-436c-8839-1dbb94d58f86\" (UID: \"7f93d253-e832-436c-8839-1dbb94d58f86\") " Dec 02 23:02:06 crc kubenswrapper[4696]: I1202 23:02:06.324120 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f93d253-e832-436c-8839-1dbb94d58f86-combined-ca-bundle\") pod \"7f93d253-e832-436c-8839-1dbb94d58f86\" (UID: \"7f93d253-e832-436c-8839-1dbb94d58f86\") " Dec 02 23:02:06 crc kubenswrapper[4696]: I1202 23:02:06.324160 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/266b9a3e-8e6e-4130-8d6e-b7ad5cb9b604-scripts\") pod \"266b9a3e-8e6e-4130-8d6e-b7ad5cb9b604\" (UID: \"266b9a3e-8e6e-4130-8d6e-b7ad5cb9b604\") " Dec 02 23:02:06 crc kubenswrapper[4696]: I1202 23:02:06.326397 4696 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/266b9a3e-8e6e-4130-8d6e-b7ad5cb9b604-logs\") on node \"crc\" DevicePath \"\"" Dec 02 23:02:06 crc kubenswrapper[4696]: I1202 23:02:06.326436 4696 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/266b9a3e-8e6e-4130-8d6e-b7ad5cb9b604-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 02 23:02:06 crc kubenswrapper[4696]: I1202 23:02:06.331300 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "266b9a3e-8e6e-4130-8d6e-b7ad5cb9b604" (UID: "266b9a3e-8e6e-4130-8d6e-b7ad5cb9b604"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 02 23:02:06 crc kubenswrapper[4696]: I1202 23:02:06.333094 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f93d253-e832-436c-8839-1dbb94d58f86-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "7f93d253-e832-436c-8839-1dbb94d58f86" (UID: "7f93d253-e832-436c-8839-1dbb94d58f86"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:02:06 crc kubenswrapper[4696]: I1202 23:02:06.333511 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f93d253-e832-436c-8839-1dbb94d58f86-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "7f93d253-e832-436c-8839-1dbb94d58f86" (UID: "7f93d253-e832-436c-8839-1dbb94d58f86"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:02:06 crc kubenswrapper[4696]: I1202 23:02:06.333912 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/266b9a3e-8e6e-4130-8d6e-b7ad5cb9b604-kube-api-access-6sqgt" (OuterVolumeSpecName: "kube-api-access-6sqgt") pod "266b9a3e-8e6e-4130-8d6e-b7ad5cb9b604" (UID: "266b9a3e-8e6e-4130-8d6e-b7ad5cb9b604"). InnerVolumeSpecName "kube-api-access-6sqgt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:02:06 crc kubenswrapper[4696]: I1202 23:02:06.334064 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f93d253-e832-436c-8839-1dbb94d58f86-scripts" (OuterVolumeSpecName: "scripts") pod "7f93d253-e832-436c-8839-1dbb94d58f86" (UID: "7f93d253-e832-436c-8839-1dbb94d58f86"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:02:06 crc kubenswrapper[4696]: I1202 23:02:06.335854 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/266b9a3e-8e6e-4130-8d6e-b7ad5cb9b604-scripts" (OuterVolumeSpecName: "scripts") pod "266b9a3e-8e6e-4130-8d6e-b7ad5cb9b604" (UID: "266b9a3e-8e6e-4130-8d6e-b7ad5cb9b604"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:02:06 crc kubenswrapper[4696]: I1202 23:02:06.337517 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f93d253-e832-436c-8839-1dbb94d58f86-kube-api-access-dvkfc" (OuterVolumeSpecName: "kube-api-access-dvkfc") pod "7f93d253-e832-436c-8839-1dbb94d58f86" (UID: "7f93d253-e832-436c-8839-1dbb94d58f86"). InnerVolumeSpecName "kube-api-access-dvkfc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:02:06 crc kubenswrapper[4696]: I1202 23:02:06.377042 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f93d253-e832-436c-8839-1dbb94d58f86-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7f93d253-e832-436c-8839-1dbb94d58f86" (UID: "7f93d253-e832-436c-8839-1dbb94d58f86"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:02:06 crc kubenswrapper[4696]: I1202 23:02:06.382786 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/266b9a3e-8e6e-4130-8d6e-b7ad5cb9b604-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "266b9a3e-8e6e-4130-8d6e-b7ad5cb9b604" (UID: "266b9a3e-8e6e-4130-8d6e-b7ad5cb9b604"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:02:06 crc kubenswrapper[4696]: I1202 23:02:06.391825 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f93d253-e832-436c-8839-1dbb94d58f86-config-data" (OuterVolumeSpecName: "config-data") pod "7f93d253-e832-436c-8839-1dbb94d58f86" (UID: "7f93d253-e832-436c-8839-1dbb94d58f86"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:02:06 crc kubenswrapper[4696]: I1202 23:02:06.399897 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/266b9a3e-8e6e-4130-8d6e-b7ad5cb9b604-config-data" (OuterVolumeSpecName: "config-data") pod "266b9a3e-8e6e-4130-8d6e-b7ad5cb9b604" (UID: "266b9a3e-8e6e-4130-8d6e-b7ad5cb9b604"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:02:06 crc kubenswrapper[4696]: I1202 23:02:06.424433 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/266b9a3e-8e6e-4130-8d6e-b7ad5cb9b604-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "266b9a3e-8e6e-4130-8d6e-b7ad5cb9b604" (UID: "266b9a3e-8e6e-4130-8d6e-b7ad5cb9b604"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:02:06 crc kubenswrapper[4696]: I1202 23:02:06.429066 4696 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f93d253-e832-436c-8839-1dbb94d58f86-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 23:02:06 crc kubenswrapper[4696]: I1202 23:02:06.429215 4696 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Dec 02 23:02:06 crc kubenswrapper[4696]: I1202 23:02:06.429273 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6sqgt\" (UniqueName: \"kubernetes.io/projected/266b9a3e-8e6e-4130-8d6e-b7ad5cb9b604-kube-api-access-6sqgt\") on node \"crc\" DevicePath \"\"" Dec 02 23:02:06 crc kubenswrapper[4696]: I1202 23:02:06.429292 4696 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7f93d253-e832-436c-8839-1dbb94d58f86-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 02 23:02:06 crc kubenswrapper[4696]: I1202 23:02:06.429326 4696 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/266b9a3e-8e6e-4130-8d6e-b7ad5cb9b604-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 02 23:02:06 crc kubenswrapper[4696]: I1202 23:02:06.429340 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dvkfc\" (UniqueName: 
\"kubernetes.io/projected/7f93d253-e832-436c-8839-1dbb94d58f86-kube-api-access-dvkfc\") on node \"crc\" DevicePath \"\"" Dec 02 23:02:06 crc kubenswrapper[4696]: I1202 23:02:06.429353 4696 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f93d253-e832-436c-8839-1dbb94d58f86-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 23:02:06 crc kubenswrapper[4696]: I1202 23:02:06.429365 4696 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f93d253-e832-436c-8839-1dbb94d58f86-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 23:02:06 crc kubenswrapper[4696]: I1202 23:02:06.429379 4696 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/266b9a3e-8e6e-4130-8d6e-b7ad5cb9b604-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 23:02:06 crc kubenswrapper[4696]: I1202 23:02:06.429391 4696 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/266b9a3e-8e6e-4130-8d6e-b7ad5cb9b604-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 23:02:06 crc kubenswrapper[4696]: I1202 23:02:06.429402 4696 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/266b9a3e-8e6e-4130-8d6e-b7ad5cb9b604-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 23:02:06 crc kubenswrapper[4696]: I1202 23:02:06.429413 4696 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7f93d253-e832-436c-8839-1dbb94d58f86-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 02 23:02:06 crc kubenswrapper[4696]: I1202 23:02:06.462272 4696 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Dec 02 23:02:06 crc kubenswrapper[4696]: I1202 23:02:06.543763 
4696 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Dec 02 23:02:07 crc kubenswrapper[4696]: I1202 23:02:07.077346 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 02 23:02:07 crc kubenswrapper[4696]: I1202 23:02:07.077461 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-l6c4c" Dec 02 23:02:07 crc kubenswrapper[4696]: I1202 23:02:07.136364 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 23:02:07 crc kubenswrapper[4696]: I1202 23:02:07.159181 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 23:02:07 crc kubenswrapper[4696]: I1202 23:02:07.173466 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 23:02:07 crc kubenswrapper[4696]: E1202 23:02:07.174124 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="266b9a3e-8e6e-4130-8d6e-b7ad5cb9b604" containerName="glance-httpd" Dec 02 23:02:07 crc kubenswrapper[4696]: I1202 23:02:07.174151 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="266b9a3e-8e6e-4130-8d6e-b7ad5cb9b604" containerName="glance-httpd" Dec 02 23:02:07 crc kubenswrapper[4696]: E1202 23:02:07.174187 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f93d253-e832-436c-8839-1dbb94d58f86" containerName="keystone-bootstrap" Dec 02 23:02:07 crc kubenswrapper[4696]: I1202 23:02:07.174200 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f93d253-e832-436c-8839-1dbb94d58f86" containerName="keystone-bootstrap" Dec 02 23:02:07 crc kubenswrapper[4696]: E1202 23:02:07.174214 4696 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="266b9a3e-8e6e-4130-8d6e-b7ad5cb9b604" containerName="glance-log" Dec 02 23:02:07 crc kubenswrapper[4696]: I1202 23:02:07.174230 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="266b9a3e-8e6e-4130-8d6e-b7ad5cb9b604" containerName="glance-log" Dec 02 23:02:07 crc kubenswrapper[4696]: I1202 23:02:07.174495 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f93d253-e832-436c-8839-1dbb94d58f86" containerName="keystone-bootstrap" Dec 02 23:02:07 crc kubenswrapper[4696]: I1202 23:02:07.174523 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="266b9a3e-8e6e-4130-8d6e-b7ad5cb9b604" containerName="glance-httpd" Dec 02 23:02:07 crc kubenswrapper[4696]: I1202 23:02:07.174545 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="266b9a3e-8e6e-4130-8d6e-b7ad5cb9b604" containerName="glance-log" Dec 02 23:02:07 crc kubenswrapper[4696]: I1202 23:02:07.177860 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 02 23:02:07 crc kubenswrapper[4696]: I1202 23:02:07.180651 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 02 23:02:07 crc kubenswrapper[4696]: I1202 23:02:07.182632 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 02 23:02:07 crc kubenswrapper[4696]: I1202 23:02:07.196968 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 23:02:07 crc kubenswrapper[4696]: E1202 23:02:07.276565 4696 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c8e5033c289c85268f1ce5c69f0401a769627fa2f63f5d375832254d76857a71 is running failed: container process not found" containerID="c8e5033c289c85268f1ce5c69f0401a769627fa2f63f5d375832254d76857a71" 
cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Dec 02 23:02:07 crc kubenswrapper[4696]: E1202 23:02:07.277195 4696 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c8e5033c289c85268f1ce5c69f0401a769627fa2f63f5d375832254d76857a71 is running failed: container process not found" containerID="c8e5033c289c85268f1ce5c69f0401a769627fa2f63f5d375832254d76857a71" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Dec 02 23:02:07 crc kubenswrapper[4696]: E1202 23:02:07.278049 4696 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c8e5033c289c85268f1ce5c69f0401a769627fa2f63f5d375832254d76857a71 is running failed: container process not found" containerID="c8e5033c289c85268f1ce5c69f0401a769627fa2f63f5d375832254d76857a71" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Dec 02 23:02:07 crc kubenswrapper[4696]: E1202 23:02:07.278086 4696 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c8e5033c289c85268f1ce5c69f0401a769627fa2f63f5d375832254d76857a71 is running failed: container process not found" probeType="Readiness" pod="openstack/watcher-applier-0" podUID="73fa8b91-c646-46ae-8d9b-1ec7b93ae933" containerName="watcher-applier" Dec 02 23:02:07 crc kubenswrapper[4696]: I1202 23:02:07.361007 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f23c971-6815-49d3-b1aa-7eb9e23b0b83-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"3f23c971-6815-49d3-b1aa-7eb9e23b0b83\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:02:07 crc kubenswrapper[4696]: I1202 23:02:07.361074 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f23c971-6815-49d3-b1aa-7eb9e23b0b83-logs\") pod \"glance-default-internal-api-0\" (UID: \"3f23c971-6815-49d3-b1aa-7eb9e23b0b83\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:02:07 crc kubenswrapper[4696]: I1202 23:02:07.361109 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3f23c971-6815-49d3-b1aa-7eb9e23b0b83-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"3f23c971-6815-49d3-b1aa-7eb9e23b0b83\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:02:07 crc kubenswrapper[4696]: I1202 23:02:07.361129 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f23c971-6815-49d3-b1aa-7eb9e23b0b83-config-data\") pod \"glance-default-internal-api-0\" (UID: \"3f23c971-6815-49d3-b1aa-7eb9e23b0b83\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:02:07 crc kubenswrapper[4696]: I1202 23:02:07.361297 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pp2z4\" (UniqueName: \"kubernetes.io/projected/3f23c971-6815-49d3-b1aa-7eb9e23b0b83-kube-api-access-pp2z4\") pod \"glance-default-internal-api-0\" (UID: \"3f23c971-6815-49d3-b1aa-7eb9e23b0b83\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:02:07 crc kubenswrapper[4696]: I1202 23:02:07.361614 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f23c971-6815-49d3-b1aa-7eb9e23b0b83-scripts\") pod \"glance-default-internal-api-0\" (UID: \"3f23c971-6815-49d3-b1aa-7eb9e23b0b83\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:02:07 crc kubenswrapper[4696]: I1202 23:02:07.361659 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"3f23c971-6815-49d3-b1aa-7eb9e23b0b83\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:02:07 crc kubenswrapper[4696]: I1202 23:02:07.361778 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f23c971-6815-49d3-b1aa-7eb9e23b0b83-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"3f23c971-6815-49d3-b1aa-7eb9e23b0b83\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:02:07 crc kubenswrapper[4696]: I1202 23:02:07.379714 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-l6c4c"] Dec 02 23:02:07 crc kubenswrapper[4696]: I1202 23:02:07.387330 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-l6c4c"] Dec 02 23:02:07 crc kubenswrapper[4696]: I1202 23:02:07.446690 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="266b9a3e-8e6e-4130-8d6e-b7ad5cb9b604" path="/var/lib/kubelet/pods/266b9a3e-8e6e-4130-8d6e-b7ad5cb9b604/volumes" Dec 02 23:02:07 crc kubenswrapper[4696]: I1202 23:02:07.449175 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f93d253-e832-436c-8839-1dbb94d58f86" path="/var/lib/kubelet/pods/7f93d253-e832-436c-8839-1dbb94d58f86/volumes" Dec 02 23:02:07 crc kubenswrapper[4696]: I1202 23:02:07.464324 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pp2z4\" (UniqueName: \"kubernetes.io/projected/3f23c971-6815-49d3-b1aa-7eb9e23b0b83-kube-api-access-pp2z4\") pod \"glance-default-internal-api-0\" (UID: \"3f23c971-6815-49d3-b1aa-7eb9e23b0b83\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:02:07 crc kubenswrapper[4696]: I1202 23:02:07.464424 4696 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f23c971-6815-49d3-b1aa-7eb9e23b0b83-scripts\") pod \"glance-default-internal-api-0\" (UID: \"3f23c971-6815-49d3-b1aa-7eb9e23b0b83\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:02:07 crc kubenswrapper[4696]: I1202 23:02:07.464460 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"3f23c971-6815-49d3-b1aa-7eb9e23b0b83\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:02:07 crc kubenswrapper[4696]: I1202 23:02:07.464553 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f23c971-6815-49d3-b1aa-7eb9e23b0b83-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"3f23c971-6815-49d3-b1aa-7eb9e23b0b83\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:02:07 crc kubenswrapper[4696]: I1202 23:02:07.464618 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f23c971-6815-49d3-b1aa-7eb9e23b0b83-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"3f23c971-6815-49d3-b1aa-7eb9e23b0b83\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:02:07 crc kubenswrapper[4696]: I1202 23:02:07.464649 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f23c971-6815-49d3-b1aa-7eb9e23b0b83-logs\") pod \"glance-default-internal-api-0\" (UID: \"3f23c971-6815-49d3-b1aa-7eb9e23b0b83\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:02:07 crc kubenswrapper[4696]: I1202 23:02:07.464678 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/3f23c971-6815-49d3-b1aa-7eb9e23b0b83-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"3f23c971-6815-49d3-b1aa-7eb9e23b0b83\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:02:07 crc kubenswrapper[4696]: I1202 23:02:07.464697 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f23c971-6815-49d3-b1aa-7eb9e23b0b83-config-data\") pod \"glance-default-internal-api-0\" (UID: \"3f23c971-6815-49d3-b1aa-7eb9e23b0b83\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:02:07 crc kubenswrapper[4696]: I1202 23:02:07.465417 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f23c971-6815-49d3-b1aa-7eb9e23b0b83-logs\") pod \"glance-default-internal-api-0\" (UID: \"3f23c971-6815-49d3-b1aa-7eb9e23b0b83\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:02:07 crc kubenswrapper[4696]: I1202 23:02:07.466248 4696 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"3f23c971-6815-49d3-b1aa-7eb9e23b0b83\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/glance-default-internal-api-0" Dec 02 23:02:07 crc kubenswrapper[4696]: I1202 23:02:07.467179 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3f23c971-6815-49d3-b1aa-7eb9e23b0b83-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"3f23c971-6815-49d3-b1aa-7eb9e23b0b83\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:02:07 crc kubenswrapper[4696]: I1202 23:02:07.474305 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f23c971-6815-49d3-b1aa-7eb9e23b0b83-internal-tls-certs\") pod 
\"glance-default-internal-api-0\" (UID: \"3f23c971-6815-49d3-b1aa-7eb9e23b0b83\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:02:07 crc kubenswrapper[4696]: I1202 23:02:07.479090 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f23c971-6815-49d3-b1aa-7eb9e23b0b83-scripts\") pod \"glance-default-internal-api-0\" (UID: \"3f23c971-6815-49d3-b1aa-7eb9e23b0b83\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:02:07 crc kubenswrapper[4696]: I1202 23:02:07.489107 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f23c971-6815-49d3-b1aa-7eb9e23b0b83-config-data\") pod \"glance-default-internal-api-0\" (UID: \"3f23c971-6815-49d3-b1aa-7eb9e23b0b83\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:02:07 crc kubenswrapper[4696]: I1202 23:02:07.491374 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f23c971-6815-49d3-b1aa-7eb9e23b0b83-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"3f23c971-6815-49d3-b1aa-7eb9e23b0b83\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:02:07 crc kubenswrapper[4696]: I1202 23:02:07.491514 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pp2z4\" (UniqueName: \"kubernetes.io/projected/3f23c971-6815-49d3-b1aa-7eb9e23b0b83-kube-api-access-pp2z4\") pod \"glance-default-internal-api-0\" (UID: \"3f23c971-6815-49d3-b1aa-7eb9e23b0b83\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:02:07 crc kubenswrapper[4696]: I1202 23:02:07.502257 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-nml6c"] Dec 02 23:02:07 crc kubenswrapper[4696]: I1202 23:02:07.505160 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-nml6c" Dec 02 23:02:07 crc kubenswrapper[4696]: I1202 23:02:07.509273 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 02 23:02:07 crc kubenswrapper[4696]: I1202 23:02:07.509659 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 02 23:02:07 crc kubenswrapper[4696]: I1202 23:02:07.509962 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 02 23:02:07 crc kubenswrapper[4696]: I1202 23:02:07.510140 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 02 23:02:07 crc kubenswrapper[4696]: I1202 23:02:07.509950 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-gwxg6" Dec 02 23:02:07 crc kubenswrapper[4696]: I1202 23:02:07.515915 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-nml6c"] Dec 02 23:02:07 crc kubenswrapper[4696]: I1202 23:02:07.529004 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"3f23c971-6815-49d3-b1aa-7eb9e23b0b83\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:02:07 crc kubenswrapper[4696]: I1202 23:02:07.669129 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/598ec97e-43b1-4d80-866e-4106d1622140-config-data\") pod \"keystone-bootstrap-nml6c\" (UID: \"598ec97e-43b1-4d80-866e-4106d1622140\") " pod="openstack/keystone-bootstrap-nml6c" Dec 02 23:02:07 crc kubenswrapper[4696]: I1202 23:02:07.669186 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/598ec97e-43b1-4d80-866e-4106d1622140-scripts\") pod \"keystone-bootstrap-nml6c\" (UID: \"598ec97e-43b1-4d80-866e-4106d1622140\") " pod="openstack/keystone-bootstrap-nml6c" Dec 02 23:02:07 crc kubenswrapper[4696]: I1202 23:02:07.669208 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4q8l\" (UniqueName: \"kubernetes.io/projected/598ec97e-43b1-4d80-866e-4106d1622140-kube-api-access-f4q8l\") pod \"keystone-bootstrap-nml6c\" (UID: \"598ec97e-43b1-4d80-866e-4106d1622140\") " pod="openstack/keystone-bootstrap-nml6c" Dec 02 23:02:07 crc kubenswrapper[4696]: I1202 23:02:07.669348 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/598ec97e-43b1-4d80-866e-4106d1622140-credential-keys\") pod \"keystone-bootstrap-nml6c\" (UID: \"598ec97e-43b1-4d80-866e-4106d1622140\") " pod="openstack/keystone-bootstrap-nml6c" Dec 02 23:02:07 crc kubenswrapper[4696]: I1202 23:02:07.669402 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/598ec97e-43b1-4d80-866e-4106d1622140-combined-ca-bundle\") pod \"keystone-bootstrap-nml6c\" (UID: \"598ec97e-43b1-4d80-866e-4106d1622140\") " pod="openstack/keystone-bootstrap-nml6c" Dec 02 23:02:07 crc kubenswrapper[4696]: I1202 23:02:07.669484 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/598ec97e-43b1-4d80-866e-4106d1622140-fernet-keys\") pod \"keystone-bootstrap-nml6c\" (UID: \"598ec97e-43b1-4d80-866e-4106d1622140\") " pod="openstack/keystone-bootstrap-nml6c" Dec 02 23:02:07 crc kubenswrapper[4696]: I1202 23:02:07.771885 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/598ec97e-43b1-4d80-866e-4106d1622140-credential-keys\") pod \"keystone-bootstrap-nml6c\" (UID: \"598ec97e-43b1-4d80-866e-4106d1622140\") " pod="openstack/keystone-bootstrap-nml6c" Dec 02 23:02:07 crc kubenswrapper[4696]: I1202 23:02:07.771956 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/598ec97e-43b1-4d80-866e-4106d1622140-combined-ca-bundle\") pod \"keystone-bootstrap-nml6c\" (UID: \"598ec97e-43b1-4d80-866e-4106d1622140\") " pod="openstack/keystone-bootstrap-nml6c" Dec 02 23:02:07 crc kubenswrapper[4696]: I1202 23:02:07.772011 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/598ec97e-43b1-4d80-866e-4106d1622140-fernet-keys\") pod \"keystone-bootstrap-nml6c\" (UID: \"598ec97e-43b1-4d80-866e-4106d1622140\") " pod="openstack/keystone-bootstrap-nml6c" Dec 02 23:02:07 crc kubenswrapper[4696]: I1202 23:02:07.772074 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/598ec97e-43b1-4d80-866e-4106d1622140-config-data\") pod \"keystone-bootstrap-nml6c\" (UID: \"598ec97e-43b1-4d80-866e-4106d1622140\") " pod="openstack/keystone-bootstrap-nml6c" Dec 02 23:02:07 crc kubenswrapper[4696]: I1202 23:02:07.772098 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/598ec97e-43b1-4d80-866e-4106d1622140-scripts\") pod \"keystone-bootstrap-nml6c\" (UID: \"598ec97e-43b1-4d80-866e-4106d1622140\") " pod="openstack/keystone-bootstrap-nml6c" Dec 02 23:02:07 crc kubenswrapper[4696]: I1202 23:02:07.772117 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4q8l\" (UniqueName: \"kubernetes.io/projected/598ec97e-43b1-4d80-866e-4106d1622140-kube-api-access-f4q8l\") pod 
\"keystone-bootstrap-nml6c\" (UID: \"598ec97e-43b1-4d80-866e-4106d1622140\") " pod="openstack/keystone-bootstrap-nml6c" Dec 02 23:02:07 crc kubenswrapper[4696]: I1202 23:02:07.775983 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/598ec97e-43b1-4d80-866e-4106d1622140-fernet-keys\") pod \"keystone-bootstrap-nml6c\" (UID: \"598ec97e-43b1-4d80-866e-4106d1622140\") " pod="openstack/keystone-bootstrap-nml6c" Dec 02 23:02:07 crc kubenswrapper[4696]: I1202 23:02:07.784345 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/598ec97e-43b1-4d80-866e-4106d1622140-scripts\") pod \"keystone-bootstrap-nml6c\" (UID: \"598ec97e-43b1-4d80-866e-4106d1622140\") " pod="openstack/keystone-bootstrap-nml6c" Dec 02 23:02:07 crc kubenswrapper[4696]: I1202 23:02:07.784435 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/598ec97e-43b1-4d80-866e-4106d1622140-combined-ca-bundle\") pod \"keystone-bootstrap-nml6c\" (UID: \"598ec97e-43b1-4d80-866e-4106d1622140\") " pod="openstack/keystone-bootstrap-nml6c" Dec 02 23:02:07 crc kubenswrapper[4696]: I1202 23:02:07.784868 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/598ec97e-43b1-4d80-866e-4106d1622140-credential-keys\") pod \"keystone-bootstrap-nml6c\" (UID: \"598ec97e-43b1-4d80-866e-4106d1622140\") " pod="openstack/keystone-bootstrap-nml6c" Dec 02 23:02:07 crc kubenswrapper[4696]: I1202 23:02:07.785053 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/598ec97e-43b1-4d80-866e-4106d1622140-config-data\") pod \"keystone-bootstrap-nml6c\" (UID: \"598ec97e-43b1-4d80-866e-4106d1622140\") " pod="openstack/keystone-bootstrap-nml6c" Dec 02 23:02:07 crc kubenswrapper[4696]: 
I1202 23:02:07.790644 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4q8l\" (UniqueName: \"kubernetes.io/projected/598ec97e-43b1-4d80-866e-4106d1622140-kube-api-access-f4q8l\") pod \"keystone-bootstrap-nml6c\" (UID: \"598ec97e-43b1-4d80-866e-4106d1622140\") " pod="openstack/keystone-bootstrap-nml6c" Dec 02 23:02:07 crc kubenswrapper[4696]: I1202 23:02:07.828813 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 02 23:02:07 crc kubenswrapper[4696]: I1202 23:02:07.890995 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-56df8fb6b7-qdkb4" Dec 02 23:02:07 crc kubenswrapper[4696]: I1202 23:02:07.923116 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-nml6c" Dec 02 23:02:07 crc kubenswrapper[4696]: I1202 23:02:07.971824 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-fvvbx"] Dec 02 23:02:07 crc kubenswrapper[4696]: I1202 23:02:07.972628 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c79d794d7-fvvbx" podUID="236e2c20-fc39-4643-904a-ab015e8c73ec" containerName="dnsmasq-dns" containerID="cri-o://531befe7b94f2cfd73bc2f342146e69aa7c85e6670fb90f57a19ffae844ed69e" gracePeriod=10 Dec 02 23:02:08 crc kubenswrapper[4696]: I1202 23:02:08.441593 4696 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5c79d794d7-fvvbx" podUID="236e2c20-fc39-4643-904a-ab015e8c73ec" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.143:5353: connect: connection refused" Dec 02 23:02:09 crc kubenswrapper[4696]: I1202 23:02:09.145837 4696 generic.go:334] "Generic (PLEG): container finished" podID="236e2c20-fc39-4643-904a-ab015e8c73ec" containerID="531befe7b94f2cfd73bc2f342146e69aa7c85e6670fb90f57a19ffae844ed69e" exitCode=0 
Dec 02 23:02:09 crc kubenswrapper[4696]: I1202 23:02:09.146089 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-fvvbx" event={"ID":"236e2c20-fc39-4643-904a-ab015e8c73ec","Type":"ContainerDied","Data":"531befe7b94f2cfd73bc2f342146e69aa7c85e6670fb90f57a19ffae844ed69e"} Dec 02 23:02:12 crc kubenswrapper[4696]: I1202 23:02:12.054780 4696 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="dc73e02f-1d29-4919-ac92-dec31ae5a5da" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.150:9322/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 02 23:02:12 crc kubenswrapper[4696]: E1202 23:02:12.275222 4696 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c8e5033c289c85268f1ce5c69f0401a769627fa2f63f5d375832254d76857a71 is running failed: container process not found" containerID="c8e5033c289c85268f1ce5c69f0401a769627fa2f63f5d375832254d76857a71" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Dec 02 23:02:12 crc kubenswrapper[4696]: E1202 23:02:12.276035 4696 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c8e5033c289c85268f1ce5c69f0401a769627fa2f63f5d375832254d76857a71 is running failed: container process not found" containerID="c8e5033c289c85268f1ce5c69f0401a769627fa2f63f5d375832254d76857a71" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Dec 02 23:02:12 crc kubenswrapper[4696]: E1202 23:02:12.276534 4696 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c8e5033c289c85268f1ce5c69f0401a769627fa2f63f5d375832254d76857a71 is running failed: container process not found" 
containerID="c8e5033c289c85268f1ce5c69f0401a769627fa2f63f5d375832254d76857a71" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Dec 02 23:02:12 crc kubenswrapper[4696]: E1202 23:02:12.276579 4696 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c8e5033c289c85268f1ce5c69f0401a769627fa2f63f5d375832254d76857a71 is running failed: container process not found" probeType="Readiness" pod="openstack/watcher-applier-0" podUID="73fa8b91-c646-46ae-8d9b-1ec7b93ae933" containerName="watcher-applier" Dec 02 23:02:13 crc kubenswrapper[4696]: I1202 23:02:13.441125 4696 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5c79d794d7-fvvbx" podUID="236e2c20-fc39-4643-904a-ab015e8c73ec" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.143:5353: connect: connection refused" Dec 02 23:02:16 crc kubenswrapper[4696]: E1202 23:02:16.099777 4696 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Dec 02 23:02:16 crc kubenswrapper[4696]: E1202 23:02:16.100477 4696 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n696h58bh58bh4h695h74h586h566h696h5cfh545h5ffh686h589h654h86h5c8h9bh595h7dh55dh5ddh576h5h649hf9hffh56h676h99h568h67q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:yes,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hqx4h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-5f7898bd8f-xjlfq_openstack(a47e8eff-6d2e-4140-9513-bd2e3ff6a153): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 23:02:16 crc kubenswrapper[4696]: E1202 
23:02:16.103299 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-5f7898bd8f-xjlfq" podUID="a47e8eff-6d2e-4140-9513-bd2e3ff6a153" Dec 02 23:02:17 crc kubenswrapper[4696]: I1202 23:02:17.055999 4696 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="dc73e02f-1d29-4919-ac92-dec31ae5a5da" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.150:9322/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 02 23:02:17 crc kubenswrapper[4696]: E1202 23:02:17.275542 4696 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c8e5033c289c85268f1ce5c69f0401a769627fa2f63f5d375832254d76857a71 is running failed: container process not found" containerID="c8e5033c289c85268f1ce5c69f0401a769627fa2f63f5d375832254d76857a71" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Dec 02 23:02:17 crc kubenswrapper[4696]: E1202 23:02:17.276006 4696 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c8e5033c289c85268f1ce5c69f0401a769627fa2f63f5d375832254d76857a71 is running failed: container process not found" containerID="c8e5033c289c85268f1ce5c69f0401a769627fa2f63f5d375832254d76857a71" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Dec 02 23:02:17 crc kubenswrapper[4696]: E1202 23:02:17.276506 4696 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 
c8e5033c289c85268f1ce5c69f0401a769627fa2f63f5d375832254d76857a71 is running failed: container process not found" containerID="c8e5033c289c85268f1ce5c69f0401a769627fa2f63f5d375832254d76857a71" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Dec 02 23:02:17 crc kubenswrapper[4696]: E1202 23:02:17.276548 4696 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c8e5033c289c85268f1ce5c69f0401a769627fa2f63f5d375832254d76857a71 is running failed: container process not found" probeType="Readiness" pod="openstack/watcher-applier-0" podUID="73fa8b91-c646-46ae-8d9b-1ec7b93ae933" containerName="watcher-applier" Dec 02 23:02:17 crc kubenswrapper[4696]: E1202 23:02:17.615734 4696 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-placement-api:current-podified" Dec 02 23:02:17 crc kubenswrapper[4696]: E1202 23:02:17.616024 4696 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:placement-db-sync,Image:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/placement,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:placement-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mknpj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42482,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
placement-db-sync-fnjtl_openstack(f066d064-95ba-42b3-ba9f-5e859533c93c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 23:02:17 crc kubenswrapper[4696]: E1202 23:02:17.618081 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/placement-db-sync-fnjtl" podUID="f066d064-95ba-42b3-ba9f-5e859533c93c" Dec 02 23:02:17 crc kubenswrapper[4696]: E1202 23:02:17.679189 4696 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Dec 02 23:02:17 crc kubenswrapper[4696]: E1202 23:02:17.679941 4696 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5c5hbch659h8ch5fch558h58bh657hc8h6ch697h56ch99h56ch564h5fh5c7h57ch68chb4h68ch67ch8h586h57ch7fhbhd9h689h554h87h5bq,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:yes,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hn8px,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-5bcb6c6d6f-rv22m_openstack(043d1721-fd18-48b0-86b3-2831e34f775e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 23:02:17 crc kubenswrapper[4696]: E1202 
23:02:17.697612 4696 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Dec 02 23:02:17 crc kubenswrapper[4696]: E1202 23:02:17.697836 4696 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n67fh79h5c9h66dh5d8h5dfh76hf4h5f6h6dh57bhd8h5b4h64fhb6h5b9h7dh556h55ch549h87h87h5ffhfbhd7h564h58ch57bh586h5d7hd4h55fq,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:yes,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pdwjp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,Sec
compProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-b5c6d7897-zxfvr_openstack(19ee26f0-388e-41d0-9d56-7ce761896e0b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 23:02:17 crc kubenswrapper[4696]: E1202 23:02:17.707591 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-b5c6d7897-zxfvr" podUID="19ee26f0-388e-41d0-9d56-7ce761896e0b" Dec 02 23:02:17 crc kubenswrapper[4696]: E1202 23:02:17.707701 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-5bcb6c6d6f-rv22m" podUID="043d1721-fd18-48b0-86b3-2831e34f775e" Dec 02 23:02:17 crc kubenswrapper[4696]: I1202 23:02:17.738518 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Dec 02 23:02:17 crc kubenswrapper[4696]: I1202 23:02:17.843902 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc73e02f-1d29-4919-ac92-dec31ae5a5da-logs\") pod \"dc73e02f-1d29-4919-ac92-dec31ae5a5da\" (UID: \"dc73e02f-1d29-4919-ac92-dec31ae5a5da\") " Dec 02 23:02:17 crc kubenswrapper[4696]: I1202 23:02:17.844451 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/dc73e02f-1d29-4919-ac92-dec31ae5a5da-custom-prometheus-ca\") pod \"dc73e02f-1d29-4919-ac92-dec31ae5a5da\" (UID: \"dc73e02f-1d29-4919-ac92-dec31ae5a5da\") " Dec 02 23:02:17 crc kubenswrapper[4696]: I1202 23:02:17.844692 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc73e02f-1d29-4919-ac92-dec31ae5a5da-config-data\") pod \"dc73e02f-1d29-4919-ac92-dec31ae5a5da\" (UID: \"dc73e02f-1d29-4919-ac92-dec31ae5a5da\") " Dec 02 23:02:17 crc kubenswrapper[4696]: I1202 23:02:17.844802 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc73e02f-1d29-4919-ac92-dec31ae5a5da-logs" (OuterVolumeSpecName: "logs") pod "dc73e02f-1d29-4919-ac92-dec31ae5a5da" (UID: "dc73e02f-1d29-4919-ac92-dec31ae5a5da"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:02:17 crc kubenswrapper[4696]: I1202 23:02:17.847893 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xzqlf\" (UniqueName: \"kubernetes.io/projected/dc73e02f-1d29-4919-ac92-dec31ae5a5da-kube-api-access-xzqlf\") pod \"dc73e02f-1d29-4919-ac92-dec31ae5a5da\" (UID: \"dc73e02f-1d29-4919-ac92-dec31ae5a5da\") " Dec 02 23:02:17 crc kubenswrapper[4696]: I1202 23:02:17.847951 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc73e02f-1d29-4919-ac92-dec31ae5a5da-combined-ca-bundle\") pod \"dc73e02f-1d29-4919-ac92-dec31ae5a5da\" (UID: \"dc73e02f-1d29-4919-ac92-dec31ae5a5da\") " Dec 02 23:02:17 crc kubenswrapper[4696]: I1202 23:02:17.849306 4696 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc73e02f-1d29-4919-ac92-dec31ae5a5da-logs\") on node \"crc\" DevicePath \"\"" Dec 02 23:02:17 crc kubenswrapper[4696]: I1202 23:02:17.856485 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc73e02f-1d29-4919-ac92-dec31ae5a5da-kube-api-access-xzqlf" (OuterVolumeSpecName: "kube-api-access-xzqlf") pod "dc73e02f-1d29-4919-ac92-dec31ae5a5da" (UID: "dc73e02f-1d29-4919-ac92-dec31ae5a5da"). InnerVolumeSpecName "kube-api-access-xzqlf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:02:17 crc kubenswrapper[4696]: I1202 23:02:17.935031 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc73e02f-1d29-4919-ac92-dec31ae5a5da-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "dc73e02f-1d29-4919-ac92-dec31ae5a5da" (UID: "dc73e02f-1d29-4919-ac92-dec31ae5a5da"). InnerVolumeSpecName "custom-prometheus-ca". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:02:17 crc kubenswrapper[4696]: I1202 23:02:17.945261 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc73e02f-1d29-4919-ac92-dec31ae5a5da-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dc73e02f-1d29-4919-ac92-dec31ae5a5da" (UID: "dc73e02f-1d29-4919-ac92-dec31ae5a5da"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:02:17 crc kubenswrapper[4696]: I1202 23:02:17.951903 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xzqlf\" (UniqueName: \"kubernetes.io/projected/dc73e02f-1d29-4919-ac92-dec31ae5a5da-kube-api-access-xzqlf\") on node \"crc\" DevicePath \"\"" Dec 02 23:02:17 crc kubenswrapper[4696]: I1202 23:02:17.951948 4696 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc73e02f-1d29-4919-ac92-dec31ae5a5da-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 23:02:17 crc kubenswrapper[4696]: I1202 23:02:17.951960 4696 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/dc73e02f-1d29-4919-ac92-dec31ae5a5da-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Dec 02 23:02:17 crc kubenswrapper[4696]: I1202 23:02:17.962314 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc73e02f-1d29-4919-ac92-dec31ae5a5da-config-data" (OuterVolumeSpecName: "config-data") pod "dc73e02f-1d29-4919-ac92-dec31ae5a5da" (UID: "dc73e02f-1d29-4919-ac92-dec31ae5a5da"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:02:18 crc kubenswrapper[4696]: I1202 23:02:18.054330 4696 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc73e02f-1d29-4919-ac92-dec31ae5a5da-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 23:02:18 crc kubenswrapper[4696]: I1202 23:02:18.252282 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"dc73e02f-1d29-4919-ac92-dec31ae5a5da","Type":"ContainerDied","Data":"cb99dc7d4eee6487e752956c40ce80efcccce3b44ae74fe98c67ceb24d4314cb"} Dec 02 23:02:18 crc kubenswrapper[4696]: I1202 23:02:18.252354 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Dec 02 23:02:18 crc kubenswrapper[4696]: E1202 23:02:18.254992 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-placement-api:current-podified\\\"\"" pod="openstack/placement-db-sync-fnjtl" podUID="f066d064-95ba-42b3-ba9f-5e859533c93c" Dec 02 23:02:18 crc kubenswrapper[4696]: I1202 23:02:18.350829 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"] Dec 02 23:02:18 crc kubenswrapper[4696]: I1202 23:02:18.377655 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-api-0"] Dec 02 23:02:18 crc kubenswrapper[4696]: I1202 23:02:18.410722 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-api-0"] Dec 02 23:02:18 crc kubenswrapper[4696]: E1202 23:02:18.411418 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc73e02f-1d29-4919-ac92-dec31ae5a5da" containerName="watcher-api" Dec 02 23:02:18 crc kubenswrapper[4696]: I1202 23:02:18.411440 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc73e02f-1d29-4919-ac92-dec31ae5a5da" 
containerName="watcher-api" Dec 02 23:02:18 crc kubenswrapper[4696]: E1202 23:02:18.411487 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc73e02f-1d29-4919-ac92-dec31ae5a5da" containerName="watcher-api-log" Dec 02 23:02:18 crc kubenswrapper[4696]: I1202 23:02:18.411496 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc73e02f-1d29-4919-ac92-dec31ae5a5da" containerName="watcher-api-log" Dec 02 23:02:18 crc kubenswrapper[4696]: I1202 23:02:18.411680 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc73e02f-1d29-4919-ac92-dec31ae5a5da" containerName="watcher-api-log" Dec 02 23:02:18 crc kubenswrapper[4696]: I1202 23:02:18.411701 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc73e02f-1d29-4919-ac92-dec31ae5a5da" containerName="watcher-api" Dec 02 23:02:18 crc kubenswrapper[4696]: I1202 23:02:18.413441 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Dec 02 23:02:18 crc kubenswrapper[4696]: I1202 23:02:18.417765 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-api-config-data" Dec 02 23:02:18 crc kubenswrapper[4696]: I1202 23:02:18.442935 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7b448778f6-q69jq"] Dec 02 23:02:18 crc kubenswrapper[4696]: I1202 23:02:18.462135 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Dec 02 23:02:18 crc kubenswrapper[4696]: I1202 23:02:18.464419 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/22eeafc0-fbad-4836-9c47-0f3e01784d91-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"22eeafc0-fbad-4836-9c47-0f3e01784d91\") " pod="openstack/watcher-api-0" Dec 02 23:02:18 crc kubenswrapper[4696]: I1202 23:02:18.464655 4696 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22eeafc0-fbad-4836-9c47-0f3e01784d91-config-data\") pod \"watcher-api-0\" (UID: \"22eeafc0-fbad-4836-9c47-0f3e01784d91\") " pod="openstack/watcher-api-0" Dec 02 23:02:18 crc kubenswrapper[4696]: I1202 23:02:18.464733 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22eeafc0-fbad-4836-9c47-0f3e01784d91-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"22eeafc0-fbad-4836-9c47-0f3e01784d91\") " pod="openstack/watcher-api-0" Dec 02 23:02:18 crc kubenswrapper[4696]: I1202 23:02:18.464839 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/22eeafc0-fbad-4836-9c47-0f3e01784d91-logs\") pod \"watcher-api-0\" (UID: \"22eeafc0-fbad-4836-9c47-0f3e01784d91\") " pod="openstack/watcher-api-0" Dec 02 23:02:18 crc kubenswrapper[4696]: I1202 23:02:18.464952 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bntgf\" (UniqueName: \"kubernetes.io/projected/22eeafc0-fbad-4836-9c47-0f3e01784d91-kube-api-access-bntgf\") pod \"watcher-api-0\" (UID: \"22eeafc0-fbad-4836-9c47-0f3e01784d91\") " pod="openstack/watcher-api-0" Dec 02 23:02:18 crc kubenswrapper[4696]: I1202 23:02:18.567485 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/22eeafc0-fbad-4836-9c47-0f3e01784d91-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"22eeafc0-fbad-4836-9c47-0f3e01784d91\") " pod="openstack/watcher-api-0" Dec 02 23:02:18 crc kubenswrapper[4696]: I1202 23:02:18.567567 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/22eeafc0-fbad-4836-9c47-0f3e01784d91-config-data\") pod \"watcher-api-0\" (UID: \"22eeafc0-fbad-4836-9c47-0f3e01784d91\") " pod="openstack/watcher-api-0" Dec 02 23:02:18 crc kubenswrapper[4696]: I1202 23:02:18.567598 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22eeafc0-fbad-4836-9c47-0f3e01784d91-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"22eeafc0-fbad-4836-9c47-0f3e01784d91\") " pod="openstack/watcher-api-0" Dec 02 23:02:18 crc kubenswrapper[4696]: I1202 23:02:18.567620 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/22eeafc0-fbad-4836-9c47-0f3e01784d91-logs\") pod \"watcher-api-0\" (UID: \"22eeafc0-fbad-4836-9c47-0f3e01784d91\") " pod="openstack/watcher-api-0" Dec 02 23:02:18 crc kubenswrapper[4696]: I1202 23:02:18.568511 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bntgf\" (UniqueName: \"kubernetes.io/projected/22eeafc0-fbad-4836-9c47-0f3e01784d91-kube-api-access-bntgf\") pod \"watcher-api-0\" (UID: \"22eeafc0-fbad-4836-9c47-0f3e01784d91\") " pod="openstack/watcher-api-0" Dec 02 23:02:18 crc kubenswrapper[4696]: I1202 23:02:18.569354 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/22eeafc0-fbad-4836-9c47-0f3e01784d91-logs\") pod \"watcher-api-0\" (UID: \"22eeafc0-fbad-4836-9c47-0f3e01784d91\") " pod="openstack/watcher-api-0" Dec 02 23:02:18 crc kubenswrapper[4696]: I1202 23:02:18.574996 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22eeafc0-fbad-4836-9c47-0f3e01784d91-config-data\") pod \"watcher-api-0\" (UID: \"22eeafc0-fbad-4836-9c47-0f3e01784d91\") " pod="openstack/watcher-api-0" Dec 02 23:02:18 crc kubenswrapper[4696]: I1202 23:02:18.575608 4696 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22eeafc0-fbad-4836-9c47-0f3e01784d91-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"22eeafc0-fbad-4836-9c47-0f3e01784d91\") " pod="openstack/watcher-api-0" Dec 02 23:02:18 crc kubenswrapper[4696]: I1202 23:02:18.578132 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/22eeafc0-fbad-4836-9c47-0f3e01784d91-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"22eeafc0-fbad-4836-9c47-0f3e01784d91\") " pod="openstack/watcher-api-0" Dec 02 23:02:18 crc kubenswrapper[4696]: I1202 23:02:18.586386 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bntgf\" (UniqueName: \"kubernetes.io/projected/22eeafc0-fbad-4836-9c47-0f3e01784d91-kube-api-access-bntgf\") pod \"watcher-api-0\" (UID: \"22eeafc0-fbad-4836-9c47-0f3e01784d91\") " pod="openstack/watcher-api-0" Dec 02 23:02:18 crc kubenswrapper[4696]: I1202 23:02:18.736065 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Dec 02 23:02:19 crc kubenswrapper[4696]: I1202 23:02:19.445830 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc73e02f-1d29-4919-ac92-dec31ae5a5da" path="/var/lib/kubelet/pods/dc73e02f-1d29-4919-ac92-dec31ae5a5da/volumes" Dec 02 23:02:21 crc kubenswrapper[4696]: E1202 23:02:21.866818 4696 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 47055e3f201494569f436f62a7822eb485cfef0a1a3f5efa24b582a2517fbc54 is running failed: container process not found" containerID="47055e3f201494569f436f62a7822eb485cfef0a1a3f5efa24b582a2517fbc54" cmd=["/usr/bin/pgrep","-f","-r","DRST","watcher-decision-engine"] Dec 02 23:02:21 crc kubenswrapper[4696]: E1202 23:02:21.868169 4696 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 47055e3f201494569f436f62a7822eb485cfef0a1a3f5efa24b582a2517fbc54 is running failed: container process not found" containerID="47055e3f201494569f436f62a7822eb485cfef0a1a3f5efa24b582a2517fbc54" cmd=["/usr/bin/pgrep","-f","-r","DRST","watcher-decision-engine"] Dec 02 23:02:21 crc kubenswrapper[4696]: E1202 23:02:21.868612 4696 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 47055e3f201494569f436f62a7822eb485cfef0a1a3f5efa24b582a2517fbc54 is running failed: container process not found" containerID="47055e3f201494569f436f62a7822eb485cfef0a1a3f5efa24b582a2517fbc54" cmd=["/usr/bin/pgrep","-f","-r","DRST","watcher-decision-engine"] Dec 02 23:02:21 crc kubenswrapper[4696]: E1202 23:02:21.868647 4696 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 47055e3f201494569f436f62a7822eb485cfef0a1a3f5efa24b582a2517fbc54 is running 
failed: container process not found" probeType="Readiness" pod="openstack/watcher-decision-engine-0" podUID="e07c8b14-2cf9-4a6c-8fb4-d9f362d85f49" containerName="watcher-decision-engine" Dec 02 23:02:22 crc kubenswrapper[4696]: I1202 23:02:22.056807 4696 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="dc73e02f-1d29-4919-ac92-dec31ae5a5da" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.150:9322/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 02 23:02:22 crc kubenswrapper[4696]: E1202 23:02:22.276603 4696 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c8e5033c289c85268f1ce5c69f0401a769627fa2f63f5d375832254d76857a71 is running failed: container process not found" containerID="c8e5033c289c85268f1ce5c69f0401a769627fa2f63f5d375832254d76857a71" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Dec 02 23:02:22 crc kubenswrapper[4696]: E1202 23:02:22.281358 4696 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c8e5033c289c85268f1ce5c69f0401a769627fa2f63f5d375832254d76857a71 is running failed: container process not found" containerID="c8e5033c289c85268f1ce5c69f0401a769627fa2f63f5d375832254d76857a71" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Dec 02 23:02:22 crc kubenswrapper[4696]: E1202 23:02:22.287396 4696 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c8e5033c289c85268f1ce5c69f0401a769627fa2f63f5d375832254d76857a71 is running failed: container process not found" containerID="c8e5033c289c85268f1ce5c69f0401a769627fa2f63f5d375832254d76857a71" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Dec 02 23:02:22 crc kubenswrapper[4696]: E1202 
23:02:22.287477 4696 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c8e5033c289c85268f1ce5c69f0401a769627fa2f63f5d375832254d76857a71 is running failed: container process not found" probeType="Readiness" pod="openstack/watcher-applier-0" podUID="73fa8b91-c646-46ae-8d9b-1ec7b93ae933" containerName="watcher-applier" Dec 02 23:02:22 crc kubenswrapper[4696]: I1202 23:02:22.979332 4696 patch_prober.go:28] interesting pod/machine-config-daemon-chq65 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 23:02:22 crc kubenswrapper[4696]: I1202 23:02:22.979410 4696 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 23:02:23 crc kubenswrapper[4696]: I1202 23:02:23.441219 4696 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5c79d794d7-fvvbx" podUID="236e2c20-fc39-4643-904a-ab015e8c73ec" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.143:5353: i/o timeout" Dec 02 23:02:23 crc kubenswrapper[4696]: I1202 23:02:23.446973 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c79d794d7-fvvbx" Dec 02 23:02:27 crc kubenswrapper[4696]: E1202 23:02:27.277455 4696 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c8e5033c289c85268f1ce5c69f0401a769627fa2f63f5d375832254d76857a71 is running failed: container process not found" 
containerID="c8e5033c289c85268f1ce5c69f0401a769627fa2f63f5d375832254d76857a71" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Dec 02 23:02:27 crc kubenswrapper[4696]: E1202 23:02:27.278558 4696 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c8e5033c289c85268f1ce5c69f0401a769627fa2f63f5d375832254d76857a71 is running failed: container process not found" containerID="c8e5033c289c85268f1ce5c69f0401a769627fa2f63f5d375832254d76857a71" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Dec 02 23:02:27 crc kubenswrapper[4696]: E1202 23:02:27.278865 4696 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c8e5033c289c85268f1ce5c69f0401a769627fa2f63f5d375832254d76857a71 is running failed: container process not found" containerID="c8e5033c289c85268f1ce5c69f0401a769627fa2f63f5d375832254d76857a71" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Dec 02 23:02:27 crc kubenswrapper[4696]: E1202 23:02:27.278904 4696 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c8e5033c289c85268f1ce5c69f0401a769627fa2f63f5d375832254d76857a71 is running failed: container process not found" probeType="Readiness" pod="openstack/watcher-applier-0" podUID="73fa8b91-c646-46ae-8d9b-1ec7b93ae933" containerName="watcher-applier" Dec 02 23:02:27 crc kubenswrapper[4696]: E1202 23:02:27.727772 4696 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Dec 02 23:02:27 crc kubenswrapper[4696]: E1202 23:02:27.728008 4696 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-26tll,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-hnh2k_openstack(359eca54-19ad-4e8d-b580-29a37d8f38c8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 23:02:27 crc kubenswrapper[4696]: E1202 23:02:27.729985 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-hnh2k" podUID="359eca54-19ad-4e8d-b580-29a37d8f38c8" Dec 02 23:02:27 crc kubenswrapper[4696]: I1202 23:02:27.790408 4696 scope.go:117] "RemoveContainer" containerID="dea32683740f2249c5e43d39c13cdd71423e053db027d5638a2cbd4f9875d30d" Dec 02 23:02:27 crc kubenswrapper[4696]: I1202 23:02:27.908315 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5f7898bd8f-xjlfq" Dec 02 23:02:27 crc kubenswrapper[4696]: I1202 23:02:27.946513 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5bcb6c6d6f-rv22m" Dec 02 23:02:27 crc kubenswrapper[4696]: I1202 23:02:27.948496 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-b5c6d7897-zxfvr" Dec 02 23:02:27 crc kubenswrapper[4696]: I1202 23:02:27.971496 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-fvvbx" Dec 02 23:02:27 crc kubenswrapper[4696]: I1202 23:02:27.984593 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0" Dec 02 23:02:27 crc kubenswrapper[4696]: I1202 23:02:27.987065 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-decision-engine-0" Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.008105 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a47e8eff-6d2e-4140-9513-bd2e3ff6a153-scripts\") pod \"a47e8eff-6d2e-4140-9513-bd2e3ff6a153\" (UID: \"a47e8eff-6d2e-4140-9513-bd2e3ff6a153\") " Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.008261 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a47e8eff-6d2e-4140-9513-bd2e3ff6a153-config-data\") pod \"a47e8eff-6d2e-4140-9513-bd2e3ff6a153\" (UID: \"a47e8eff-6d2e-4140-9513-bd2e3ff6a153\") " Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.008295 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a47e8eff-6d2e-4140-9513-bd2e3ff6a153-logs\") pod \"a47e8eff-6d2e-4140-9513-bd2e3ff6a153\" (UID: \"a47e8eff-6d2e-4140-9513-bd2e3ff6a153\") " Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.008357 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a47e8eff-6d2e-4140-9513-bd2e3ff6a153-horizon-secret-key\") pod \"a47e8eff-6d2e-4140-9513-bd2e3ff6a153\" (UID: \"a47e8eff-6d2e-4140-9513-bd2e3ff6a153\") " Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.008404 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hqx4h\" (UniqueName: \"kubernetes.io/projected/a47e8eff-6d2e-4140-9513-bd2e3ff6a153-kube-api-access-hqx4h\") pod \"a47e8eff-6d2e-4140-9513-bd2e3ff6a153\" (UID: \"a47e8eff-6d2e-4140-9513-bd2e3ff6a153\") " Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.009284 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/a47e8eff-6d2e-4140-9513-bd2e3ff6a153-scripts" (OuterVolumeSpecName: "scripts") pod "a47e8eff-6d2e-4140-9513-bd2e3ff6a153" (UID: "a47e8eff-6d2e-4140-9513-bd2e3ff6a153"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.009941 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a47e8eff-6d2e-4140-9513-bd2e3ff6a153-config-data" (OuterVolumeSpecName: "config-data") pod "a47e8eff-6d2e-4140-9513-bd2e3ff6a153" (UID: "a47e8eff-6d2e-4140-9513-bd2e3ff6a153"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.010180 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a47e8eff-6d2e-4140-9513-bd2e3ff6a153-logs" (OuterVolumeSpecName: "logs") pod "a47e8eff-6d2e-4140-9513-bd2e3ff6a153" (UID: "a47e8eff-6d2e-4140-9513-bd2e3ff6a153"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.028526 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a47e8eff-6d2e-4140-9513-bd2e3ff6a153-kube-api-access-hqx4h" (OuterVolumeSpecName: "kube-api-access-hqx4h") pod "a47e8eff-6d2e-4140-9513-bd2e3ff6a153" (UID: "a47e8eff-6d2e-4140-9513-bd2e3ff6a153"). InnerVolumeSpecName "kube-api-access-hqx4h". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.028832 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a47e8eff-6d2e-4140-9513-bd2e3ff6a153-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "a47e8eff-6d2e-4140-9513-bd2e3ff6a153" (UID: "a47e8eff-6d2e-4140-9513-bd2e3ff6a153"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.110963 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/19ee26f0-388e-41d0-9d56-7ce761896e0b-scripts\") pod \"19ee26f0-388e-41d0-9d56-7ce761896e0b\" (UID: \"19ee26f0-388e-41d0-9d56-7ce761896e0b\") " Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.111027 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/19ee26f0-388e-41d0-9d56-7ce761896e0b-horizon-secret-key\") pod \"19ee26f0-388e-41d0-9d56-7ce761896e0b\" (UID: \"19ee26f0-388e-41d0-9d56-7ce761896e0b\") " Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.111071 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/043d1721-fd18-48b0-86b3-2831e34f775e-horizon-secret-key\") pod \"043d1721-fd18-48b0-86b3-2831e34f775e\" (UID: \"043d1721-fd18-48b0-86b3-2831e34f775e\") " Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.111111 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e07c8b14-2cf9-4a6c-8fb4-d9f362d85f49-combined-ca-bundle\") pod \"e07c8b14-2cf9-4a6c-8fb4-d9f362d85f49\" (UID: \"e07c8b14-2cf9-4a6c-8fb4-d9f362d85f49\") " Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.111157 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/043d1721-fd18-48b0-86b3-2831e34f775e-logs\") pod \"043d1721-fd18-48b0-86b3-2831e34f775e\" (UID: \"043d1721-fd18-48b0-86b3-2831e34f775e\") " Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.111196 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/236e2c20-fc39-4643-904a-ab015e8c73ec-config\") pod \"236e2c20-fc39-4643-904a-ab015e8c73ec\" (UID: \"236e2c20-fc39-4643-904a-ab015e8c73ec\") " Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.111220 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/73fa8b91-c646-46ae-8d9b-1ec7b93ae933-logs\") pod \"73fa8b91-c646-46ae-8d9b-1ec7b93ae933\" (UID: \"73fa8b91-c646-46ae-8d9b-1ec7b93ae933\") " Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.111242 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-488km\" (UniqueName: \"kubernetes.io/projected/73fa8b91-c646-46ae-8d9b-1ec7b93ae933-kube-api-access-488km\") pod \"73fa8b91-c646-46ae-8d9b-1ec7b93ae933\" (UID: \"73fa8b91-c646-46ae-8d9b-1ec7b93ae933\") " Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.111272 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/043d1721-fd18-48b0-86b3-2831e34f775e-scripts\") pod \"043d1721-fd18-48b0-86b3-2831e34f775e\" (UID: \"043d1721-fd18-48b0-86b3-2831e34f775e\") " Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.111311 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ns2hc\" (UniqueName: \"kubernetes.io/projected/236e2c20-fc39-4643-904a-ab015e8c73ec-kube-api-access-ns2hc\") pod \"236e2c20-fc39-4643-904a-ab015e8c73ec\" (UID: \"236e2c20-fc39-4643-904a-ab015e8c73ec\") " Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.111351 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e07c8b14-2cf9-4a6c-8fb4-d9f362d85f49-logs\") pod \"e07c8b14-2cf9-4a6c-8fb4-d9f362d85f49\" (UID: \"e07c8b14-2cf9-4a6c-8fb4-d9f362d85f49\") " Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.111373 4696 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zwfzh\" (UniqueName: \"kubernetes.io/projected/e07c8b14-2cf9-4a6c-8fb4-d9f362d85f49-kube-api-access-zwfzh\") pod \"e07c8b14-2cf9-4a6c-8fb4-d9f362d85f49\" (UID: \"e07c8b14-2cf9-4a6c-8fb4-d9f362d85f49\") " Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.111398 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/236e2c20-fc39-4643-904a-ab015e8c73ec-dns-svc\") pod \"236e2c20-fc39-4643-904a-ab015e8c73ec\" (UID: \"236e2c20-fc39-4643-904a-ab015e8c73ec\") " Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.111448 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/19ee26f0-388e-41d0-9d56-7ce761896e0b-logs\") pod \"19ee26f0-388e-41d0-9d56-7ce761896e0b\" (UID: \"19ee26f0-388e-41d0-9d56-7ce761896e0b\") " Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.111475 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pdwjp\" (UniqueName: \"kubernetes.io/projected/19ee26f0-388e-41d0-9d56-7ce761896e0b-kube-api-access-pdwjp\") pod \"19ee26f0-388e-41d0-9d56-7ce761896e0b\" (UID: \"19ee26f0-388e-41d0-9d56-7ce761896e0b\") " Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.111491 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/19ee26f0-388e-41d0-9d56-7ce761896e0b-config-data\") pod \"19ee26f0-388e-41d0-9d56-7ce761896e0b\" (UID: \"19ee26f0-388e-41d0-9d56-7ce761896e0b\") " Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.111507 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/043d1721-fd18-48b0-86b3-2831e34f775e-config-data\") pod \"043d1721-fd18-48b0-86b3-2831e34f775e\" (UID: 
\"043d1721-fd18-48b0-86b3-2831e34f775e\") " Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.111546 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/236e2c20-fc39-4643-904a-ab015e8c73ec-ovsdbserver-nb\") pod \"236e2c20-fc39-4643-904a-ab015e8c73ec\" (UID: \"236e2c20-fc39-4643-904a-ab015e8c73ec\") " Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.111587 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e07c8b14-2cf9-4a6c-8fb4-d9f362d85f49-config-data\") pod \"e07c8b14-2cf9-4a6c-8fb4-d9f362d85f49\" (UID: \"e07c8b14-2cf9-4a6c-8fb4-d9f362d85f49\") " Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.111637 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/e07c8b14-2cf9-4a6c-8fb4-d9f362d85f49-custom-prometheus-ca\") pod \"e07c8b14-2cf9-4a6c-8fb4-d9f362d85f49\" (UID: \"e07c8b14-2cf9-4a6c-8fb4-d9f362d85f49\") " Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.111684 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hn8px\" (UniqueName: \"kubernetes.io/projected/043d1721-fd18-48b0-86b3-2831e34f775e-kube-api-access-hn8px\") pod \"043d1721-fd18-48b0-86b3-2831e34f775e\" (UID: \"043d1721-fd18-48b0-86b3-2831e34f775e\") " Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.111703 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/236e2c20-fc39-4643-904a-ab015e8c73ec-ovsdbserver-sb\") pod \"236e2c20-fc39-4643-904a-ab015e8c73ec\" (UID: \"236e2c20-fc39-4643-904a-ab015e8c73ec\") " Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.112245 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/043d1721-fd18-48b0-86b3-2831e34f775e-scripts" (OuterVolumeSpecName: "scripts") pod "043d1721-fd18-48b0-86b3-2831e34f775e" (UID: "043d1721-fd18-48b0-86b3-2831e34f775e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.112660 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19ee26f0-388e-41d0-9d56-7ce761896e0b-scripts" (OuterVolumeSpecName: "scripts") pod "19ee26f0-388e-41d0-9d56-7ce761896e0b" (UID: "19ee26f0-388e-41d0-9d56-7ce761896e0b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.113920 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/236e2c20-fc39-4643-904a-ab015e8c73ec-dns-swift-storage-0\") pod \"236e2c20-fc39-4643-904a-ab015e8c73ec\" (UID: \"236e2c20-fc39-4643-904a-ab015e8c73ec\") " Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.113954 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73fa8b91-c646-46ae-8d9b-1ec7b93ae933-config-data\") pod \"73fa8b91-c646-46ae-8d9b-1ec7b93ae933\" (UID: \"73fa8b91-c646-46ae-8d9b-1ec7b93ae933\") " Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.113985 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73fa8b91-c646-46ae-8d9b-1ec7b93ae933-combined-ca-bundle\") pod \"73fa8b91-c646-46ae-8d9b-1ec7b93ae933\" (UID: \"73fa8b91-c646-46ae-8d9b-1ec7b93ae933\") " Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.115715 4696 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/19ee26f0-388e-41d0-9d56-7ce761896e0b-scripts\") on node 
\"crc\" DevicePath \"\"" Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.115770 4696 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a47e8eff-6d2e-4140-9513-bd2e3ff6a153-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.115784 4696 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a47e8eff-6d2e-4140-9513-bd2e3ff6a153-logs\") on node \"crc\" DevicePath \"\"" Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.115794 4696 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/043d1721-fd18-48b0-86b3-2831e34f775e-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.115846 4696 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a47e8eff-6d2e-4140-9513-bd2e3ff6a153-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.115861 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hqx4h\" (UniqueName: \"kubernetes.io/projected/a47e8eff-6d2e-4140-9513-bd2e3ff6a153-kube-api-access-hqx4h\") on node \"crc\" DevicePath \"\"" Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.115872 4696 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a47e8eff-6d2e-4140-9513-bd2e3ff6a153-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.118970 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19ee26f0-388e-41d0-9d56-7ce761896e0b-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "19ee26f0-388e-41d0-9d56-7ce761896e0b" (UID: "19ee26f0-388e-41d0-9d56-7ce761896e0b"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.122276 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e07c8b14-2cf9-4a6c-8fb4-d9f362d85f49-logs" (OuterVolumeSpecName: "logs") pod "e07c8b14-2cf9-4a6c-8fb4-d9f362d85f49" (UID: "e07c8b14-2cf9-4a6c-8fb4-d9f362d85f49"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.122598 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19ee26f0-388e-41d0-9d56-7ce761896e0b-logs" (OuterVolumeSpecName: "logs") pod "19ee26f0-388e-41d0-9d56-7ce761896e0b" (UID: "19ee26f0-388e-41d0-9d56-7ce761896e0b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.123545 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/043d1721-fd18-48b0-86b3-2831e34f775e-config-data" (OuterVolumeSpecName: "config-data") pod "043d1721-fd18-48b0-86b3-2831e34f775e" (UID: "043d1721-fd18-48b0-86b3-2831e34f775e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.124064 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19ee26f0-388e-41d0-9d56-7ce761896e0b-config-data" (OuterVolumeSpecName: "config-data") pod "19ee26f0-388e-41d0-9d56-7ce761896e0b" (UID: "19ee26f0-388e-41d0-9d56-7ce761896e0b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.124593 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e07c8b14-2cf9-4a6c-8fb4-d9f362d85f49-kube-api-access-zwfzh" (OuterVolumeSpecName: "kube-api-access-zwfzh") pod "e07c8b14-2cf9-4a6c-8fb4-d9f362d85f49" (UID: "e07c8b14-2cf9-4a6c-8fb4-d9f362d85f49"). InnerVolumeSpecName "kube-api-access-zwfzh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.126339 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19ee26f0-388e-41d0-9d56-7ce761896e0b-kube-api-access-pdwjp" (OuterVolumeSpecName: "kube-api-access-pdwjp") pod "19ee26f0-388e-41d0-9d56-7ce761896e0b" (UID: "19ee26f0-388e-41d0-9d56-7ce761896e0b"). InnerVolumeSpecName "kube-api-access-pdwjp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.128902 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73fa8b91-c646-46ae-8d9b-1ec7b93ae933-logs" (OuterVolumeSpecName: "logs") pod "73fa8b91-c646-46ae-8d9b-1ec7b93ae933" (UID: "73fa8b91-c646-46ae-8d9b-1ec7b93ae933"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.129720 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/043d1721-fd18-48b0-86b3-2831e34f775e-logs" (OuterVolumeSpecName: "logs") pod "043d1721-fd18-48b0-86b3-2831e34f775e" (UID: "043d1721-fd18-48b0-86b3-2831e34f775e"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.133632 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/043d1721-fd18-48b0-86b3-2831e34f775e-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "043d1721-fd18-48b0-86b3-2831e34f775e" (UID: "043d1721-fd18-48b0-86b3-2831e34f775e"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.136306 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/043d1721-fd18-48b0-86b3-2831e34f775e-kube-api-access-hn8px" (OuterVolumeSpecName: "kube-api-access-hn8px") pod "043d1721-fd18-48b0-86b3-2831e34f775e" (UID: "043d1721-fd18-48b0-86b3-2831e34f775e"). InnerVolumeSpecName "kube-api-access-hn8px". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.136850 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/236e2c20-fc39-4643-904a-ab015e8c73ec-kube-api-access-ns2hc" (OuterVolumeSpecName: "kube-api-access-ns2hc") pod "236e2c20-fc39-4643-904a-ab015e8c73ec" (UID: "236e2c20-fc39-4643-904a-ab015e8c73ec"). InnerVolumeSpecName "kube-api-access-ns2hc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.146552 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73fa8b91-c646-46ae-8d9b-1ec7b93ae933-kube-api-access-488km" (OuterVolumeSpecName: "kube-api-access-488km") pod "73fa8b91-c646-46ae-8d9b-1ec7b93ae933" (UID: "73fa8b91-c646-46ae-8d9b-1ec7b93ae933"). InnerVolumeSpecName "kube-api-access-488km". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.189228 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73fa8b91-c646-46ae-8d9b-1ec7b93ae933-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "73fa8b91-c646-46ae-8d9b-1ec7b93ae933" (UID: "73fa8b91-c646-46ae-8d9b-1ec7b93ae933"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.189285 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e07c8b14-2cf9-4a6c-8fb4-d9f362d85f49-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e07c8b14-2cf9-4a6c-8fb4-d9f362d85f49" (UID: "e07c8b14-2cf9-4a6c-8fb4-d9f362d85f49"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.201041 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/236e2c20-fc39-4643-904a-ab015e8c73ec-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "236e2c20-fc39-4643-904a-ab015e8c73ec" (UID: "236e2c20-fc39-4643-904a-ab015e8c73ec"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.201182 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e07c8b14-2cf9-4a6c-8fb4-d9f362d85f49-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "e07c8b14-2cf9-4a6c-8fb4-d9f362d85f49" (UID: "e07c8b14-2cf9-4a6c-8fb4-d9f362d85f49"). InnerVolumeSpecName "custom-prometheus-ca". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.220091 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/236e2c20-fc39-4643-904a-ab015e8c73ec-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "236e2c20-fc39-4643-904a-ab015e8c73ec" (UID: "236e2c20-fc39-4643-904a-ab015e8c73ec"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.220695 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/236e2c20-fc39-4643-904a-ab015e8c73ec-config" (OuterVolumeSpecName: "config") pod "236e2c20-fc39-4643-904a-ab015e8c73ec" (UID: "236e2c20-fc39-4643-904a-ab015e8c73ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.221322 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/236e2c20-fc39-4643-904a-ab015e8c73ec-config\") pod \"236e2c20-fc39-4643-904a-ab015e8c73ec\" (UID: \"236e2c20-fc39-4643-904a-ab015e8c73ec\") " Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.222463 4696 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e07c8b14-2cf9-4a6c-8fb4-d9f362d85f49-logs\") on node \"crc\" DevicePath \"\"" Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.222494 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zwfzh\" (UniqueName: \"kubernetes.io/projected/e07c8b14-2cf9-4a6c-8fb4-d9f362d85f49-kube-api-access-zwfzh\") on node \"crc\" DevicePath \"\"" Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.222507 4696 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/236e2c20-fc39-4643-904a-ab015e8c73ec-dns-svc\") on node \"crc\" 
DevicePath \"\"" Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.222522 4696 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/19ee26f0-388e-41d0-9d56-7ce761896e0b-logs\") on node \"crc\" DevicePath \"\"" Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.222531 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pdwjp\" (UniqueName: \"kubernetes.io/projected/19ee26f0-388e-41d0-9d56-7ce761896e0b-kube-api-access-pdwjp\") on node \"crc\" DevicePath \"\"" Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.222541 4696 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/19ee26f0-388e-41d0-9d56-7ce761896e0b-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.222555 4696 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/043d1721-fd18-48b0-86b3-2831e34f775e-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.222566 4696 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/e07c8b14-2cf9-4a6c-8fb4-d9f362d85f49-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.222581 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hn8px\" (UniqueName: \"kubernetes.io/projected/043d1721-fd18-48b0-86b3-2831e34f775e-kube-api-access-hn8px\") on node \"crc\" DevicePath \"\"" Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.222593 4696 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/236e2c20-fc39-4643-904a-ab015e8c73ec-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.222608 4696 reconciler_common.go:293] "Volume 
detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73fa8b91-c646-46ae-8d9b-1ec7b93ae933-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 23:02:28 crc kubenswrapper[4696]: W1202 23:02:28.222537 4696 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/236e2c20-fc39-4643-904a-ab015e8c73ec/volumes/kubernetes.io~configmap/config Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.222620 4696 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/19ee26f0-388e-41d0-9d56-7ce761896e0b-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.222627 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/236e2c20-fc39-4643-904a-ab015e8c73ec-config" (OuterVolumeSpecName: "config") pod "236e2c20-fc39-4643-904a-ab015e8c73ec" (UID: "236e2c20-fc39-4643-904a-ab015e8c73ec"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.222635 4696 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/043d1721-fd18-48b0-86b3-2831e34f775e-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.222648 4696 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e07c8b14-2cf9-4a6c-8fb4-d9f362d85f49-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.222660 4696 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/043d1721-fd18-48b0-86b3-2831e34f775e-logs\") on node \"crc\" DevicePath \"\"" Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.222670 4696 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/236e2c20-fc39-4643-904a-ab015e8c73ec-config\") on node \"crc\" DevicePath \"\"" Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.222679 4696 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/73fa8b91-c646-46ae-8d9b-1ec7b93ae933-logs\") on node \"crc\" DevicePath \"\"" Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.222694 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-488km\" (UniqueName: \"kubernetes.io/projected/73fa8b91-c646-46ae-8d9b-1ec7b93ae933-kube-api-access-488km\") on node \"crc\" DevicePath \"\"" Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.222708 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ns2hc\" (UniqueName: \"kubernetes.io/projected/236e2c20-fc39-4643-904a-ab015e8c73ec-kube-api-access-ns2hc\") on node \"crc\" DevicePath \"\"" Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.224099 4696 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/236e2c20-fc39-4643-904a-ab015e8c73ec-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "236e2c20-fc39-4643-904a-ab015e8c73ec" (UID: "236e2c20-fc39-4643-904a-ab015e8c73ec"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.239346 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73fa8b91-c646-46ae-8d9b-1ec7b93ae933-config-data" (OuterVolumeSpecName: "config-data") pod "73fa8b91-c646-46ae-8d9b-1ec7b93ae933" (UID: "73fa8b91-c646-46ae-8d9b-1ec7b93ae933"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.239340 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e07c8b14-2cf9-4a6c-8fb4-d9f362d85f49-config-data" (OuterVolumeSpecName: "config-data") pod "e07c8b14-2cf9-4a6c-8fb4-d9f362d85f49" (UID: "e07c8b14-2cf9-4a6c-8fb4-d9f362d85f49"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.242244 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/236e2c20-fc39-4643-904a-ab015e8c73ec-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "236e2c20-fc39-4643-904a-ab015e8c73ec" (UID: "236e2c20-fc39-4643-904a-ab015e8c73ec"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.323591 4696 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/236e2c20-fc39-4643-904a-ab015e8c73ec-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.323662 4696 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e07c8b14-2cf9-4a6c-8fb4-d9f362d85f49-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.323674 4696 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/236e2c20-fc39-4643-904a-ab015e8c73ec-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.323689 4696 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73fa8b91-c646-46ae-8d9b-1ec7b93ae933-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.388225 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7b448778f6-q69jq" event={"ID":"7b25bdab-8c46-43b8-be48-0e3df0f48c57","Type":"ContainerStarted","Data":"b31e42b5e1860bbba4d28ace4da0d4572089dba8db721bab7b4bedbc5618e9b7"} Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.391969 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5bcb6c6d6f-rv22m" event={"ID":"043d1721-fd18-48b0-86b3-2831e34f775e","Type":"ContainerDied","Data":"1356cc2753332e45130207005c1c67b7fc62f419c6a34cdbdcd3aa24fba04c04"} Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.392066 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5bcb6c6d6f-rv22m" Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.396314 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-b5c6d7897-zxfvr" event={"ID":"19ee26f0-388e-41d0-9d56-7ce761896e0b","Type":"ContainerDied","Data":"95f2f57f6c6fdcfb59efb29f0a75c3eeeb7c2db64f5d0fab289bc80e22227267"} Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.396325 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-b5c6d7897-zxfvr" Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.401083 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5f7898bd8f-xjlfq" event={"ID":"a47e8eff-6d2e-4140-9513-bd2e3ff6a153","Type":"ContainerDied","Data":"591829cd75a7e4d7991ce6d80efc2fab65c99c6b0a9be641dedef0177dac6893"} Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.402145 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5f7898bd8f-xjlfq" Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.404666 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0" Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.404693 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"73fa8b91-c646-46ae-8d9b-1ec7b93ae933","Type":"ContainerDied","Data":"b31340bb1090d6f70eae8945f80494bd91c93a041d9f538f3e6e19406633fbfa"} Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.408527 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-decision-engine-0" Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.408637 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"e07c8b14-2cf9-4a6c-8fb4-d9f362d85f49","Type":"ContainerDied","Data":"5503afd7a43a7137567621c4682cb5bad013692827c71c08c963230f187b68e2"} Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.412784 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-fvvbx" event={"ID":"236e2c20-fc39-4643-904a-ab015e8c73ec","Type":"ContainerDied","Data":"39dcfa4dc767f0c9bfe664fdcfbd394ebddaf516375eff4eb2cee161340d9ba1"} Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.412891 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-fvvbx" Dec 02 23:02:28 crc kubenswrapper[4696]: E1202 23:02:28.414731 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-hnh2k" podUID="359eca54-19ad-4e8d-b580-29a37d8f38c8" Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.441485 4696 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5c79d794d7-fvvbx" podUID="236e2c20-fc39-4643-904a-ab015e8c73ec" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.143:5353: i/o timeout" Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.470901 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-applier-0"] Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.502784 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-applier-0"] Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.542082 4696 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/horizon-b5c6d7897-zxfvr"] Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.552356 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-b5c6d7897-zxfvr"] Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.566607 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-applier-0"] Dec 02 23:02:28 crc kubenswrapper[4696]: E1202 23:02:28.567130 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="236e2c20-fc39-4643-904a-ab015e8c73ec" containerName="dnsmasq-dns" Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.567150 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="236e2c20-fc39-4643-904a-ab015e8c73ec" containerName="dnsmasq-dns" Dec 02 23:02:28 crc kubenswrapper[4696]: E1202 23:02:28.567171 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73fa8b91-c646-46ae-8d9b-1ec7b93ae933" containerName="watcher-applier" Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.567178 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="73fa8b91-c646-46ae-8d9b-1ec7b93ae933" containerName="watcher-applier" Dec 02 23:02:28 crc kubenswrapper[4696]: E1202 23:02:28.567188 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="236e2c20-fc39-4643-904a-ab015e8c73ec" containerName="init" Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.567198 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="236e2c20-fc39-4643-904a-ab015e8c73ec" containerName="init" Dec 02 23:02:28 crc kubenswrapper[4696]: E1202 23:02:28.567221 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e07c8b14-2cf9-4a6c-8fb4-d9f362d85f49" containerName="watcher-decision-engine" Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.567230 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="e07c8b14-2cf9-4a6c-8fb4-d9f362d85f49" containerName="watcher-decision-engine" Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 
23:02:28.567425 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="236e2c20-fc39-4643-904a-ab015e8c73ec" containerName="dnsmasq-dns" Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.567435 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="73fa8b91-c646-46ae-8d9b-1ec7b93ae933" containerName="watcher-applier" Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.567445 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="e07c8b14-2cf9-4a6c-8fb4-d9f362d85f49" containerName="watcher-decision-engine" Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.570045 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0" Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.572273 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-applier-config-data" Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.585064 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"] Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.608281 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-decision-engine-0"] Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.621203 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-decision-engine-0"] Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.634932 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-decision-engine-0"] Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.636404 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-decision-engine-0" Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.639822 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-decision-engine-config-data" Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.660288 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.669360 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5f7898bd8f-xjlfq"] Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.676642 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-5f7898bd8f-xjlfq"] Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.692238 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5bcb6c6d6f-rv22m"] Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.699788 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-5bcb6c6d6f-rv22m"] Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.707339 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-fvvbx"] Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.714654 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-fvvbx"] Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.745515 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88764a17-d8c0-447f-923a-4afd6c522e43-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"88764a17-d8c0-447f-923a-4afd6c522e43\") " pod="openstack/watcher-applier-0" Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.745582 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3ac42795-270d-403e-8622-d7592294ddff-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"3ac42795-270d-403e-8622-d7592294ddff\") " pod="openstack/watcher-decision-engine-0" Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.745650 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3ac42795-270d-403e-8622-d7592294ddff-logs\") pod \"watcher-decision-engine-0\" (UID: \"3ac42795-270d-403e-8622-d7592294ddff\") " pod="openstack/watcher-decision-engine-0" Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.745757 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnc74\" (UniqueName: \"kubernetes.io/projected/88764a17-d8c0-447f-923a-4afd6c522e43-kube-api-access-lnc74\") pod \"watcher-applier-0\" (UID: \"88764a17-d8c0-447f-923a-4afd6c522e43\") " pod="openstack/watcher-applier-0" Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.745886 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/3ac42795-270d-403e-8622-d7592294ddff-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"3ac42795-270d-403e-8622-d7592294ddff\") " pod="openstack/watcher-decision-engine-0" Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.745932 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-657n6\" (UniqueName: \"kubernetes.io/projected/3ac42795-270d-403e-8622-d7592294ddff-kube-api-access-657n6\") pod \"watcher-decision-engine-0\" (UID: \"3ac42795-270d-403e-8622-d7592294ddff\") " pod="openstack/watcher-decision-engine-0" Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.745962 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/3ac42795-270d-403e-8622-d7592294ddff-config-data\") pod \"watcher-decision-engine-0\" (UID: \"3ac42795-270d-403e-8622-d7592294ddff\") " pod="openstack/watcher-decision-engine-0" Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.745995 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88764a17-d8c0-447f-923a-4afd6c522e43-config-data\") pod \"watcher-applier-0\" (UID: \"88764a17-d8c0-447f-923a-4afd6c522e43\") " pod="openstack/watcher-applier-0" Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.746021 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88764a17-d8c0-447f-923a-4afd6c522e43-logs\") pod \"watcher-applier-0\" (UID: \"88764a17-d8c0-447f-923a-4afd6c522e43\") " pod="openstack/watcher-applier-0" Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.847957 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88764a17-d8c0-447f-923a-4afd6c522e43-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"88764a17-d8c0-447f-923a-4afd6c522e43\") " pod="openstack/watcher-applier-0" Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.848034 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ac42795-270d-403e-8622-d7592294ddff-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"3ac42795-270d-403e-8622-d7592294ddff\") " pod="openstack/watcher-decision-engine-0" Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.848067 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3ac42795-270d-403e-8622-d7592294ddff-logs\") pod \"watcher-decision-engine-0\" (UID: 
\"3ac42795-270d-403e-8622-d7592294ddff\") " pod="openstack/watcher-decision-engine-0" Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.848137 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lnc74\" (UniqueName: \"kubernetes.io/projected/88764a17-d8c0-447f-923a-4afd6c522e43-kube-api-access-lnc74\") pod \"watcher-applier-0\" (UID: \"88764a17-d8c0-447f-923a-4afd6c522e43\") " pod="openstack/watcher-applier-0" Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.848221 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/3ac42795-270d-403e-8622-d7592294ddff-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"3ac42795-270d-403e-8622-d7592294ddff\") " pod="openstack/watcher-decision-engine-0" Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.848255 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-657n6\" (UniqueName: \"kubernetes.io/projected/3ac42795-270d-403e-8622-d7592294ddff-kube-api-access-657n6\") pod \"watcher-decision-engine-0\" (UID: \"3ac42795-270d-403e-8622-d7592294ddff\") " pod="openstack/watcher-decision-engine-0" Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.848282 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ac42795-270d-403e-8622-d7592294ddff-config-data\") pod \"watcher-decision-engine-0\" (UID: \"3ac42795-270d-403e-8622-d7592294ddff\") " pod="openstack/watcher-decision-engine-0" Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.848339 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88764a17-d8c0-447f-923a-4afd6c522e43-config-data\") pod \"watcher-applier-0\" (UID: \"88764a17-d8c0-447f-923a-4afd6c522e43\") " pod="openstack/watcher-applier-0" Dec 02 
23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.848363 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88764a17-d8c0-447f-923a-4afd6c522e43-logs\") pod \"watcher-applier-0\" (UID: \"88764a17-d8c0-447f-923a-4afd6c522e43\") " pod="openstack/watcher-applier-0" Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.848976 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88764a17-d8c0-447f-923a-4afd6c522e43-logs\") pod \"watcher-applier-0\" (UID: \"88764a17-d8c0-447f-923a-4afd6c522e43\") " pod="openstack/watcher-applier-0" Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.853080 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/3ac42795-270d-403e-8622-d7592294ddff-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"3ac42795-270d-403e-8622-d7592294ddff\") " pod="openstack/watcher-decision-engine-0" Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.853450 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3ac42795-270d-403e-8622-d7592294ddff-logs\") pod \"watcher-decision-engine-0\" (UID: \"3ac42795-270d-403e-8622-d7592294ddff\") " pod="openstack/watcher-decision-engine-0" Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.855893 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88764a17-d8c0-447f-923a-4afd6c522e43-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"88764a17-d8c0-447f-923a-4afd6c522e43\") " pod="openstack/watcher-applier-0" Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.856615 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3ac42795-270d-403e-8622-d7592294ddff-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"3ac42795-270d-403e-8622-d7592294ddff\") " pod="openstack/watcher-decision-engine-0" Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.865708 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ac42795-270d-403e-8622-d7592294ddff-config-data\") pod \"watcher-decision-engine-0\" (UID: \"3ac42795-270d-403e-8622-d7592294ddff\") " pod="openstack/watcher-decision-engine-0" Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.868679 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88764a17-d8c0-447f-923a-4afd6c522e43-config-data\") pod \"watcher-applier-0\" (UID: \"88764a17-d8c0-447f-923a-4afd6c522e43\") " pod="openstack/watcher-applier-0" Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.870632 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-657n6\" (UniqueName: \"kubernetes.io/projected/3ac42795-270d-403e-8622-d7592294ddff-kube-api-access-657n6\") pod \"watcher-decision-engine-0\" (UID: \"3ac42795-270d-403e-8622-d7592294ddff\") " pod="openstack/watcher-decision-engine-0" Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.874308 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnc74\" (UniqueName: \"kubernetes.io/projected/88764a17-d8c0-447f-923a-4afd6c522e43-kube-api-access-lnc74\") pod \"watcher-applier-0\" (UID: \"88764a17-d8c0-447f-923a-4afd6c522e43\") " pod="openstack/watcher-applier-0" Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.907808 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0" Dec 02 23:02:28 crc kubenswrapper[4696]: I1202 23:02:28.959510 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-decision-engine-0" Dec 02 23:02:29 crc kubenswrapper[4696]: E1202 23:02:29.371083 4696 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Dec 02 23:02:29 crc kubenswrapper[4696]: E1202 23:02:29.371271 4696 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:t
ls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-z5nlg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-b2zfs_openstack(b1780f21-2d95-4fdb-9a7c-c2d5b7a9b143): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 23:02:29 crc kubenswrapper[4696]: E1202 23:02:29.373276 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-b2zfs" podUID="b1780f21-2d95-4fdb-9a7c-c2d5b7a9b143" Dec 02 23:02:29 crc kubenswrapper[4696]: I1202 23:02:29.375596 4696 scope.go:117] "RemoveContainer" containerID="e0b4da2a357e7295795cdc78badc4aa2f484e96f7840899de5c9939518039049" Dec 02 23:02:29 crc kubenswrapper[4696]: E1202 23:02:29.432098 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-b2zfs" podUID="b1780f21-2d95-4fdb-9a7c-c2d5b7a9b143" Dec 02 23:02:29 crc kubenswrapper[4696]: I1202 
23:02:29.456918 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="043d1721-fd18-48b0-86b3-2831e34f775e" path="/var/lib/kubelet/pods/043d1721-fd18-48b0-86b3-2831e34f775e/volumes" Dec 02 23:02:29 crc kubenswrapper[4696]: I1202 23:02:29.459387 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19ee26f0-388e-41d0-9d56-7ce761896e0b" path="/var/lib/kubelet/pods/19ee26f0-388e-41d0-9d56-7ce761896e0b/volumes" Dec 02 23:02:29 crc kubenswrapper[4696]: I1202 23:02:29.460218 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="236e2c20-fc39-4643-904a-ab015e8c73ec" path="/var/lib/kubelet/pods/236e2c20-fc39-4643-904a-ab015e8c73ec/volumes" Dec 02 23:02:29 crc kubenswrapper[4696]: I1202 23:02:29.461377 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73fa8b91-c646-46ae-8d9b-1ec7b93ae933" path="/var/lib/kubelet/pods/73fa8b91-c646-46ae-8d9b-1ec7b93ae933/volumes" Dec 02 23:02:29 crc kubenswrapper[4696]: I1202 23:02:29.463449 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a47e8eff-6d2e-4140-9513-bd2e3ff6a153" path="/var/lib/kubelet/pods/a47e8eff-6d2e-4140-9513-bd2e3ff6a153/volumes" Dec 02 23:02:29 crc kubenswrapper[4696]: I1202 23:02:29.465264 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e07c8b14-2cf9-4a6c-8fb4-d9f362d85f49" path="/var/lib/kubelet/pods/e07c8b14-2cf9-4a6c-8fb4-d9f362d85f49/volumes" Dec 02 23:02:29 crc kubenswrapper[4696]: I1202 23:02:29.503772 4696 scope.go:117] "RemoveContainer" containerID="ec475b8931aca6fad9d336a1a17848f524e7b307de88e88447bb912d47dae6a6" Dec 02 23:02:29 crc kubenswrapper[4696]: I1202 23:02:29.588145 4696 scope.go:117] "RemoveContainer" containerID="c8e5033c289c85268f1ce5c69f0401a769627fa2f63f5d375832254d76857a71" Dec 02 23:02:29 crc kubenswrapper[4696]: I1202 23:02:29.635283 4696 scope.go:117] "RemoveContainer" containerID="47055e3f201494569f436f62a7822eb485cfef0a1a3f5efa24b582a2517fbc54" Dec 02 
23:02:29 crc kubenswrapper[4696]: I1202 23:02:29.726249 4696 scope.go:117] "RemoveContainer" containerID="531befe7b94f2cfd73bc2f342146e69aa7c85e6670fb90f57a19ffae844ed69e"
Dec 02 23:02:29 crc kubenswrapper[4696]: I1202 23:02:29.808625 4696 scope.go:117] "RemoveContainer" containerID="76cee049afce7aac085cabe9831acad5a5772a29908ae685c8c6ad1a4c0da687"
Dec 02 23:02:29 crc kubenswrapper[4696]: I1202 23:02:29.827703 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-nml6c"]
Dec 02 23:02:29 crc kubenswrapper[4696]: W1202 23:02:29.839530 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod598ec97e_43b1_4d80_866e_4106d1622140.slice/crio-31618b069b8c886a54ccc72513fa633f872ff8cfa734a47f57e2ae167c305a2a WatchSource:0}: Error finding container 31618b069b8c886a54ccc72513fa633f872ff8cfa734a47f57e2ae167c305a2a: Status 404 returned error can't find the container with id 31618b069b8c886a54ccc72513fa633f872ff8cfa734a47f57e2ae167c305a2a
Dec 02 23:02:29 crc kubenswrapper[4696]: I1202 23:02:29.848614 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret"
Dec 02 23:02:29 crc kubenswrapper[4696]: I1202 23:02:29.911961 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7c784657c6-hdbrw"]
Dec 02 23:02:29 crc kubenswrapper[4696]: I1202 23:02:29.932680 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Dec 02 23:02:30 crc kubenswrapper[4696]: I1202 23:02:30.075104 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"]
Dec 02 23:02:30 crc kubenswrapper[4696]: W1202 23:02:30.097345 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod88764a17_d8c0_447f_923a_4afd6c522e43.slice/crio-705f544c7c01a65cd02fee958b9b69c124f478c6f8bcb72665cddbc68ccc7d86 WatchSource:0}: Error finding container 705f544c7c01a65cd02fee958b9b69c124f478c6f8bcb72665cddbc68ccc7d86: Status 404 returned error can't find the container with id 705f544c7c01a65cd02fee958b9b69c124f478c6f8bcb72665cddbc68ccc7d86
Dec 02 23:02:30 crc kubenswrapper[4696]: I1202 23:02:30.256091 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"]
Dec 02 23:02:30 crc kubenswrapper[4696]: I1202 23:02:30.290215 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"]
Dec 02 23:02:30 crc kubenswrapper[4696]: W1202 23:02:30.290940 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod22eeafc0_fbad_4836_9c47_0f3e01784d91.slice/crio-1be4358aab9c8bb3c3a293040846df0feedb346988decf0e6c80e02e16342a37 WatchSource:0}: Error finding container 1be4358aab9c8bb3c3a293040846df0feedb346988decf0e6c80e02e16342a37: Status 404 returned error can't find the container with id 1be4358aab9c8bb3c3a293040846df0feedb346988decf0e6c80e02e16342a37
Dec 02 23:02:30 crc kubenswrapper[4696]: I1202 23:02:30.468099 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-nml6c" event={"ID":"598ec97e-43b1-4d80-866e-4106d1622140","Type":"ContainerStarted","Data":"ad01d3112109f96a6d9cd82bace3c2485a8f8a27bf776427a1996a19ea50e8ee"}
Dec 02 23:02:30 crc kubenswrapper[4696]: I1202 23:02:30.468151 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-nml6c" event={"ID":"598ec97e-43b1-4d80-866e-4106d1622140","Type":"ContainerStarted","Data":"31618b069b8c886a54ccc72513fa633f872ff8cfa734a47f57e2ae167c305a2a"}
Dec 02 23:02:30 crc kubenswrapper[4696]: I1202 23:02:30.474306 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"3ac42795-270d-403e-8622-d7592294ddff","Type":"ContainerStarted","Data":"e7bed1f3c2c5569f7f5929fd065d999067d2c74c2134ef9f04800446c905ffcd"}
Dec 02 23:02:30 crc kubenswrapper[4696]: I1202 23:02:30.479478 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3f23c971-6815-49d3-b1aa-7eb9e23b0b83","Type":"ContainerStarted","Data":"c122c41870303f1e7fbefb18bf35dcdb71c1c717ec6a3131a8e6730b60d07a3b"}
Dec 02 23:02:30 crc kubenswrapper[4696]: I1202 23:02:30.483465 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad","Type":"ContainerStarted","Data":"ca9ab61cbfd53bbb03bd1cc23ce0638e4939961f79c7460c61be3f19dae28b05"}
Dec 02 23:02:30 crc kubenswrapper[4696]: I1202 23:02:30.487387 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"88764a17-d8c0-447f-923a-4afd6c522e43","Type":"ContainerStarted","Data":"705f544c7c01a65cd02fee958b9b69c124f478c6f8bcb72665cddbc68ccc7d86"}
Dec 02 23:02:30 crc kubenswrapper[4696]: I1202 23:02:30.506443 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c784657c6-hdbrw" event={"ID":"b414fc10-9d51-456b-aaa9-d6b4dd08af99","Type":"ContainerStarted","Data":"33129bee3557ec1ec3f6327e6c79747960bc04e9646576f6ab93f3e03765d918"}
Dec 02 23:02:30 crc kubenswrapper[4696]: I1202 23:02:30.506524 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c784657c6-hdbrw" event={"ID":"b414fc10-9d51-456b-aaa9-d6b4dd08af99","Type":"ContainerStarted","Data":"ebbfbce507d446a6eeae4ccbbcbf887c67488a924c5344010e99fe0be848ec77"}
Dec 02 23:02:30 crc kubenswrapper[4696]: I1202 23:02:30.520974 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"22eeafc0-fbad-4836-9c47-0f3e01784d91","Type":"ContainerStarted","Data":"1be4358aab9c8bb3c3a293040846df0feedb346988decf0e6c80e02e16342a37"}
Dec 02 23:02:30 crc kubenswrapper[4696]: I1202 23:02:30.530460 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7b448778f6-q69jq" event={"ID":"7b25bdab-8c46-43b8-be48-0e3df0f48c57","Type":"ContainerStarted","Data":"db7c41afc2c4141b03c10301aee7cd5a2b37a58c34385a505278ac8b3e2f3bf7"}
Dec 02 23:02:30 crc kubenswrapper[4696]: I1202 23:02:30.534049 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Dec 02 23:02:30 crc kubenswrapper[4696]: W1202 23:02:30.541785 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf8e3be6e_0ab2_414e_98ca_9c2ba29f485d.slice/crio-82e1a845c0ab2f6d62bb005aaf49e84a3f8e2c4e2a04a30af5b8c6ce784a501f WatchSource:0}: Error finding container 82e1a845c0ab2f6d62bb005aaf49e84a3f8e2c4e2a04a30af5b8c6ce784a501f: Status 404 returned error can't find the container with id 82e1a845c0ab2f6d62bb005aaf49e84a3f8e2c4e2a04a30af5b8c6ce784a501f
Dec 02 23:02:30 crc kubenswrapper[4696]: I1202 23:02:30.544624 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-nml6c" podStartSLOduration=23.544597528 podStartE2EDuration="23.544597528s" podCreationTimestamp="2025-12-02 23:02:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:02:30.501141135 +0000 UTC m=+1213.381821156" watchObservedRunningTime="2025-12-02 23:02:30.544597528 +0000 UTC m=+1213.425277539"
Dec 02 23:02:31 crc kubenswrapper[4696]: I1202 23:02:31.555865 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f8e3be6e-0ab2-414e-98ca-9c2ba29f485d","Type":"ContainerStarted","Data":"82e1a845c0ab2f6d62bb005aaf49e84a3f8e2c4e2a04a30af5b8c6ce784a501f"}
Dec 02 23:02:31 crc kubenswrapper[4696]: I1202 23:02:31.574191 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3f23c971-6815-49d3-b1aa-7eb9e23b0b83","Type":"ContainerStarted","Data":"3fc870a0e795fd1c83f2b237de124f02e7d049e74c303b3b058fbcf17f5d2c90"}
Dec 02 23:02:31 crc kubenswrapper[4696]: I1202 23:02:31.582613 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c784657c6-hdbrw" event={"ID":"b414fc10-9d51-456b-aaa9-d6b4dd08af99","Type":"ContainerStarted","Data":"cf6543e79ba9e80e3676b0348fef6e5e6b0493c35e19379aa9eac30a8a0106eb"}
Dec 02 23:02:31 crc kubenswrapper[4696]: I1202 23:02:31.589443 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"22eeafc0-fbad-4836-9c47-0f3e01784d91","Type":"ContainerStarted","Data":"44a9e254b6c1dfc6ef34619e2a1d3179f8a6c9b5842f68596e4be12198f93f5c"}
Dec 02 23:02:31 crc kubenswrapper[4696]: I1202 23:02:31.589547 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"22eeafc0-fbad-4836-9c47-0f3e01784d91","Type":"ContainerStarted","Data":"454e5848a04308a2a3ae9f54efdaabd6b80f1ac9cbecc24ca0b89cdd25107675"}
Dec 02 23:02:31 crc kubenswrapper[4696]: I1202 23:02:31.590332 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0"
Dec 02 23:02:31 crc kubenswrapper[4696]: I1202 23:02:31.596305 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"88764a17-d8c0-447f-923a-4afd6c522e43","Type":"ContainerStarted","Data":"a2d4c8b39f23dc222e09ba1ef407ebc01681dbfefeefc311643d8acd2eaffba7"}
Dec 02 23:02:31 crc kubenswrapper[4696]: I1202 23:02:31.613165 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"3ac42795-270d-403e-8622-d7592294ddff","Type":"ContainerStarted","Data":"7388711da6840e9ba77036b562f6a290654e7e1a0679fa5fa6f2d37c4d7f8816"}
Dec 02 23:02:31 crc kubenswrapper[4696]: I1202 23:02:31.624075 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-7c784657c6-hdbrw" podStartSLOduration=29.624052517 podStartE2EDuration="29.624052517s" podCreationTimestamp="2025-12-02 23:02:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:02:31.609173874 +0000 UTC m=+1214.489853875" watchObservedRunningTime="2025-12-02 23:02:31.624052517 +0000 UTC m=+1214.504732518"
Dec 02 23:02:31 crc kubenswrapper[4696]: I1202 23:02:31.638861 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7b448778f6-q69jq" event={"ID":"7b25bdab-8c46-43b8-be48-0e3df0f48c57","Type":"ContainerStarted","Data":"d15abc85ab3659d4083f5707c2acb2b3368eb8c46e42748f7c5227e701ce835e"}
Dec 02 23:02:31 crc kubenswrapper[4696]: I1202 23:02:31.650509 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-api-0" podStartSLOduration=13.650481907 podStartE2EDuration="13.650481907s" podCreationTimestamp="2025-12-02 23:02:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:02:31.631374554 +0000 UTC m=+1214.512054555" watchObservedRunningTime="2025-12-02 23:02:31.650481907 +0000 UTC m=+1214.531161908"
Dec 02 23:02:31 crc kubenswrapper[4696]: I1202 23:02:31.656246 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-applier-0" podStartSLOduration=3.65623792 podStartE2EDuration="3.65623792s" podCreationTimestamp="2025-12-02 23:02:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:02:31.65377419 +0000 UTC m=+1214.534454201" watchObservedRunningTime="2025-12-02 23:02:31.65623792 +0000 UTC m=+1214.536917921"
Dec 02 23:02:31 crc kubenswrapper[4696]: I1202 23:02:31.681088 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-decision-engine-0" podStartSLOduration=3.681067745 podStartE2EDuration="3.681067745s" podCreationTimestamp="2025-12-02 23:02:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:02:31.676105734 +0000 UTC m=+1214.556785735" watchObservedRunningTime="2025-12-02 23:02:31.681067745 +0000 UTC m=+1214.561747746"
Dec 02 23:02:31 crc kubenswrapper[4696]: I1202 23:02:31.708544 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-7b448778f6-q69jq" podStartSLOduration=28.883022121 podStartE2EDuration="30.708517874s" podCreationTimestamp="2025-12-02 23:02:01 +0000 UTC" firstStartedPulling="2025-12-02 23:02:27.824999549 +0000 UTC m=+1210.705679550" lastFinishedPulling="2025-12-02 23:02:29.650495302 +0000 UTC m=+1212.531175303" observedRunningTime="2025-12-02 23:02:31.701630278 +0000 UTC m=+1214.582310279" watchObservedRunningTime="2025-12-02 23:02:31.708517874 +0000 UTC m=+1214.589197875"
Dec 02 23:02:32 crc kubenswrapper[4696]: I1202 23:02:32.256836 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7b448778f6-q69jq"
Dec 02 23:02:32 crc kubenswrapper[4696]: I1202 23:02:32.256902 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-7b448778f6-q69jq"
Dec 02 23:02:32 crc kubenswrapper[4696]: I1202 23:02:32.567363 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7c784657c6-hdbrw"
Dec 02 23:02:32 crc kubenswrapper[4696]: I1202 23:02:32.567973 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-7c784657c6-hdbrw"
Dec 02 23:02:33 crc kubenswrapper[4696]: I1202 23:02:33.669105 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f8e3be6e-0ab2-414e-98ca-9c2ba29f485d","Type":"ContainerStarted","Data":"b6eebd076be285d8ba1700694280ad120d92acfd7695db28a15a7a2959264404"}
Dec 02 23:02:33 crc kubenswrapper[4696]: I1202 23:02:33.671891 4696 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Dec 02 23:02:33 crc kubenswrapper[4696]: I1202 23:02:33.672082 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3f23c971-6815-49d3-b1aa-7eb9e23b0b83","Type":"ContainerStarted","Data":"fb059ff59a354b1671b46b5170a0aefda44680c6afc9df05c28ba6ecd892cfbc"}
Dec 02 23:02:33 crc kubenswrapper[4696]: I1202 23:02:33.713238 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=26.713201312 podStartE2EDuration="26.713201312s" podCreationTimestamp="2025-12-02 23:02:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:02:33.711694179 +0000 UTC m=+1216.592374180" watchObservedRunningTime="2025-12-02 23:02:33.713201312 +0000 UTC m=+1216.593881313"
Dec 02 23:02:33 crc kubenswrapper[4696]: I1202 23:02:33.738406 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0"
Dec 02 23:02:33 crc kubenswrapper[4696]: I1202 23:02:33.907983 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-applier-0"
Dec 02 23:02:33 crc kubenswrapper[4696]: I1202 23:02:33.943500 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0"
Dec 02 23:02:34 crc kubenswrapper[4696]: I1202 23:02:34.684355 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-fnjtl" event={"ID":"f066d064-95ba-42b3-ba9f-5e859533c93c","Type":"ContainerStarted","Data":"171777d49f633d478ff4c9cd39f56f6babed8df8c66ecb007794fd2897eff333"}
Dec 02 23:02:34 crc kubenswrapper[4696]: I1202 23:02:34.687985 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f8e3be6e-0ab2-414e-98ca-9c2ba29f485d","Type":"ContainerStarted","Data":"28c0d3270acfd299dcf0b3d5ee2db658a0cfadddd5a6ab97c3a25a9049757c53"}
Dec 02 23:02:34 crc kubenswrapper[4696]: I1202 23:02:34.688190 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="f8e3be6e-0ab2-414e-98ca-9c2ba29f485d" containerName="glance-log" containerID="cri-o://b6eebd076be285d8ba1700694280ad120d92acfd7695db28a15a7a2959264404" gracePeriod=30
Dec 02 23:02:34 crc kubenswrapper[4696]: I1202 23:02:34.688581 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="f8e3be6e-0ab2-414e-98ca-9c2ba29f485d" containerName="glance-httpd" containerID="cri-o://28c0d3270acfd299dcf0b3d5ee2db658a0cfadddd5a6ab97c3a25a9049757c53" gracePeriod=30
Dec 02 23:02:34 crc kubenswrapper[4696]: I1202 23:02:34.692884 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad","Type":"ContainerStarted","Data":"72aa2ec66948e9e46625b8ea6bbe256cb366995dae9271c36a4a554dc505a5b8"}
Dec 02 23:02:34 crc kubenswrapper[4696]: I1202 23:02:34.739266 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-fnjtl" podStartSLOduration=4.908685406 podStartE2EDuration="42.739237944s" podCreationTimestamp="2025-12-02 23:01:52 +0000 UTC" firstStartedPulling="2025-12-02 23:01:55.784861459 +0000 UTC m=+1178.665541460" lastFinishedPulling="2025-12-02 23:02:33.615413977 +0000 UTC m=+1216.496093998" observedRunningTime="2025-12-02 23:02:34.713949987 +0000 UTC m=+1217.594629988" watchObservedRunningTime="2025-12-02 23:02:34.739237944 +0000 UTC m=+1217.619917945"
Dec 02 23:02:34 crc kubenswrapper[4696]: I1202 23:02:34.761849 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=33.761815205 podStartE2EDuration="33.761815205s" podCreationTimestamp="2025-12-02 23:02:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:02:34.738993447 +0000 UTC m=+1217.619673448" watchObservedRunningTime="2025-12-02 23:02:34.761815205 +0000 UTC m=+1217.642495206"
Dec 02 23:02:35 crc kubenswrapper[4696]: I1202 23:02:35.708481 4696 generic.go:334] "Generic (PLEG): container finished" podID="f8e3be6e-0ab2-414e-98ca-9c2ba29f485d" containerID="b6eebd076be285d8ba1700694280ad120d92acfd7695db28a15a7a2959264404" exitCode=143
Dec 02 23:02:35 crc kubenswrapper[4696]: I1202 23:02:35.708654 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f8e3be6e-0ab2-414e-98ca-9c2ba29f485d","Type":"ContainerDied","Data":"b6eebd076be285d8ba1700694280ad120d92acfd7695db28a15a7a2959264404"}
Dec 02 23:02:37 crc kubenswrapper[4696]: I1202 23:02:37.736973 4696 generic.go:334] "Generic (PLEG): container finished" podID="598ec97e-43b1-4d80-866e-4106d1622140" containerID="ad01d3112109f96a6d9cd82bace3c2485a8f8a27bf776427a1996a19ea50e8ee" exitCode=0
Dec 02 23:02:37 crc kubenswrapper[4696]: I1202 23:02:37.737074 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-nml6c" event={"ID":"598ec97e-43b1-4d80-866e-4106d1622140","Type":"ContainerDied","Data":"ad01d3112109f96a6d9cd82bace3c2485a8f8a27bf776427a1996a19ea50e8ee"}
Dec 02 23:02:37 crc kubenswrapper[4696]: I1202 23:02:37.743087 4696 generic.go:334] "Generic (PLEG): container finished" podID="f8e3be6e-0ab2-414e-98ca-9c2ba29f485d" containerID="28c0d3270acfd299dcf0b3d5ee2db658a0cfadddd5a6ab97c3a25a9049757c53" exitCode=143
Dec 02 23:02:37 crc kubenswrapper[4696]: I1202 23:02:37.743158 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f8e3be6e-0ab2-414e-98ca-9c2ba29f485d","Type":"ContainerDied","Data":"28c0d3270acfd299dcf0b3d5ee2db658a0cfadddd5a6ab97c3a25a9049757c53"}
Dec 02 23:02:37 crc kubenswrapper[4696]: I1202 23:02:37.829943 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Dec 02 23:02:37 crc kubenswrapper[4696]: I1202 23:02:37.830014 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Dec 02 23:02:37 crc kubenswrapper[4696]: I1202 23:02:37.830028 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Dec 02 23:02:37 crc kubenswrapper[4696]: I1202 23:02:37.830453 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Dec 02 23:02:37 crc kubenswrapper[4696]: I1202 23:02:37.878493 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Dec 02 23:02:37 crc kubenswrapper[4696]: I1202 23:02:37.925803 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Dec 02 23:02:38 crc kubenswrapper[4696]: I1202 23:02:38.737828 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-api-0"
Dec 02 23:02:38 crc kubenswrapper[4696]: I1202 23:02:38.742541 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-api-0"
Dec 02 23:02:38 crc kubenswrapper[4696]: I1202 23:02:38.766986 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0"
Dec 02 23:02:38 crc kubenswrapper[4696]: I1202 23:02:38.908233 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-applier-0"
Dec 02 23:02:38 crc kubenswrapper[4696]: I1202 23:02:38.961151 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0"
Dec 02 23:02:38 crc kubenswrapper[4696]: I1202 23:02:38.966337 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-applier-0"
Dec 02 23:02:39 crc kubenswrapper[4696]: I1202 23:02:39.038040 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-decision-engine-0"
Dec 02 23:02:39 crc kubenswrapper[4696]: I1202 23:02:39.412521 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-nml6c"
Dec 02 23:02:39 crc kubenswrapper[4696]: I1202 23:02:39.539518 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f4q8l\" (UniqueName: \"kubernetes.io/projected/598ec97e-43b1-4d80-866e-4106d1622140-kube-api-access-f4q8l\") pod \"598ec97e-43b1-4d80-866e-4106d1622140\" (UID: \"598ec97e-43b1-4d80-866e-4106d1622140\") "
Dec 02 23:02:39 crc kubenswrapper[4696]: I1202 23:02:39.539687 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/598ec97e-43b1-4d80-866e-4106d1622140-scripts\") pod \"598ec97e-43b1-4d80-866e-4106d1622140\" (UID: \"598ec97e-43b1-4d80-866e-4106d1622140\") "
Dec 02 23:02:39 crc kubenswrapper[4696]: I1202 23:02:39.539827 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/598ec97e-43b1-4d80-866e-4106d1622140-combined-ca-bundle\") pod \"598ec97e-43b1-4d80-866e-4106d1622140\" (UID: \"598ec97e-43b1-4d80-866e-4106d1622140\") "
Dec 02 23:02:39 crc kubenswrapper[4696]: I1202 23:02:39.539872 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/598ec97e-43b1-4d80-866e-4106d1622140-credential-keys\") pod \"598ec97e-43b1-4d80-866e-4106d1622140\" (UID: \"598ec97e-43b1-4d80-866e-4106d1622140\") "
Dec 02 23:02:39 crc kubenswrapper[4696]: I1202 23:02:39.539928 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/598ec97e-43b1-4d80-866e-4106d1622140-fernet-keys\") pod \"598ec97e-43b1-4d80-866e-4106d1622140\" (UID: \"598ec97e-43b1-4d80-866e-4106d1622140\") "
Dec 02 23:02:39 crc kubenswrapper[4696]: I1202 23:02:39.539975 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/598ec97e-43b1-4d80-866e-4106d1622140-config-data\") pod \"598ec97e-43b1-4d80-866e-4106d1622140\" (UID: \"598ec97e-43b1-4d80-866e-4106d1622140\") "
Dec 02 23:02:39 crc kubenswrapper[4696]: I1202 23:02:39.550611 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/598ec97e-43b1-4d80-866e-4106d1622140-kube-api-access-f4q8l" (OuterVolumeSpecName: "kube-api-access-f4q8l") pod "598ec97e-43b1-4d80-866e-4106d1622140" (UID: "598ec97e-43b1-4d80-866e-4106d1622140"). InnerVolumeSpecName "kube-api-access-f4q8l". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 23:02:39 crc kubenswrapper[4696]: I1202 23:02:39.565703 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/598ec97e-43b1-4d80-866e-4106d1622140-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "598ec97e-43b1-4d80-866e-4106d1622140" (UID: "598ec97e-43b1-4d80-866e-4106d1622140"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 23:02:39 crc kubenswrapper[4696]: I1202 23:02:39.574913 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/598ec97e-43b1-4d80-866e-4106d1622140-scripts" (OuterVolumeSpecName: "scripts") pod "598ec97e-43b1-4d80-866e-4106d1622140" (UID: "598ec97e-43b1-4d80-866e-4106d1622140"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 23:02:39 crc kubenswrapper[4696]: I1202 23:02:39.577138 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/598ec97e-43b1-4d80-866e-4106d1622140-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "598ec97e-43b1-4d80-866e-4106d1622140" (UID: "598ec97e-43b1-4d80-866e-4106d1622140"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 23:02:39 crc kubenswrapper[4696]: I1202 23:02:39.625499 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/598ec97e-43b1-4d80-866e-4106d1622140-config-data" (OuterVolumeSpecName: "config-data") pod "598ec97e-43b1-4d80-866e-4106d1622140" (UID: "598ec97e-43b1-4d80-866e-4106d1622140"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 23:02:39 crc kubenswrapper[4696]: I1202 23:02:39.625922 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/598ec97e-43b1-4d80-866e-4106d1622140-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "598ec97e-43b1-4d80-866e-4106d1622140" (UID: "598ec97e-43b1-4d80-866e-4106d1622140"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 23:02:39 crc kubenswrapper[4696]: I1202 23:02:39.642025 4696 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/598ec97e-43b1-4d80-866e-4106d1622140-fernet-keys\") on node \"crc\" DevicePath \"\""
Dec 02 23:02:39 crc kubenswrapper[4696]: I1202 23:02:39.642067 4696 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/598ec97e-43b1-4d80-866e-4106d1622140-config-data\") on node \"crc\" DevicePath \"\""
Dec 02 23:02:39 crc kubenswrapper[4696]: I1202 23:02:39.642083 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f4q8l\" (UniqueName: \"kubernetes.io/projected/598ec97e-43b1-4d80-866e-4106d1622140-kube-api-access-f4q8l\") on node \"crc\" DevicePath \"\""
Dec 02 23:02:39 crc kubenswrapper[4696]: I1202 23:02:39.642097 4696 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/598ec97e-43b1-4d80-866e-4106d1622140-scripts\") on node \"crc\" DevicePath \"\""
Dec 02 23:02:39 crc kubenswrapper[4696]: I1202 23:02:39.642109 4696 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/598ec97e-43b1-4d80-866e-4106d1622140-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 02 23:02:39 crc kubenswrapper[4696]: I1202 23:02:39.642118 4696 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/598ec97e-43b1-4d80-866e-4106d1622140-credential-keys\") on node \"crc\" DevicePath \"\""
Dec 02 23:02:39 crc kubenswrapper[4696]: I1202 23:02:39.721629 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Dec 02 23:02:39 crc kubenswrapper[4696]: I1202 23:02:39.744086 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"f8e3be6e-0ab2-414e-98ca-9c2ba29f485d\" (UID: \"f8e3be6e-0ab2-414e-98ca-9c2ba29f485d\") "
Dec 02 23:02:39 crc kubenswrapper[4696]: I1202 23:02:39.744281 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8e3be6e-0ab2-414e-98ca-9c2ba29f485d-config-data\") pod \"f8e3be6e-0ab2-414e-98ca-9c2ba29f485d\" (UID: \"f8e3be6e-0ab2-414e-98ca-9c2ba29f485d\") "
Dec 02 23:02:39 crc kubenswrapper[4696]: I1202 23:02:39.744318 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f8e3be6e-0ab2-414e-98ca-9c2ba29f485d-logs\") pod \"f8e3be6e-0ab2-414e-98ca-9c2ba29f485d\" (UID: \"f8e3be6e-0ab2-414e-98ca-9c2ba29f485d\") "
Dec 02 23:02:39 crc kubenswrapper[4696]: I1202 23:02:39.744374 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t9vnf\" (UniqueName: \"kubernetes.io/projected/f8e3be6e-0ab2-414e-98ca-9c2ba29f485d-kube-api-access-t9vnf\") pod \"f8e3be6e-0ab2-414e-98ca-9c2ba29f485d\" (UID: \"f8e3be6e-0ab2-414e-98ca-9c2ba29f485d\") "
Dec 02 23:02:39 crc kubenswrapper[4696]: I1202 23:02:39.744401 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8e3be6e-0ab2-414e-98ca-9c2ba29f485d-combined-ca-bundle\") pod \"f8e3be6e-0ab2-414e-98ca-9c2ba29f485d\" (UID: \"f8e3be6e-0ab2-414e-98ca-9c2ba29f485d\") "
Dec 02 23:02:39 crc kubenswrapper[4696]: I1202 23:02:39.744480 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f8e3be6e-0ab2-414e-98ca-9c2ba29f485d-scripts\") pod \"f8e3be6e-0ab2-414e-98ca-9c2ba29f485d\" (UID: \"f8e3be6e-0ab2-414e-98ca-9c2ba29f485d\") "
Dec 02 23:02:39 crc kubenswrapper[4696]: I1202 23:02:39.744518 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8e3be6e-0ab2-414e-98ca-9c2ba29f485d-public-tls-certs\") pod \"f8e3be6e-0ab2-414e-98ca-9c2ba29f485d\" (UID: \"f8e3be6e-0ab2-414e-98ca-9c2ba29f485d\") "
Dec 02 23:02:39 crc kubenswrapper[4696]: I1202 23:02:39.744547 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f8e3be6e-0ab2-414e-98ca-9c2ba29f485d-httpd-run\") pod \"f8e3be6e-0ab2-414e-98ca-9c2ba29f485d\" (UID: \"f8e3be6e-0ab2-414e-98ca-9c2ba29f485d\") "
Dec 02 23:02:39 crc kubenswrapper[4696]: I1202 23:02:39.746121 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8e3be6e-0ab2-414e-98ca-9c2ba29f485d-logs" (OuterVolumeSpecName: "logs") pod "f8e3be6e-0ab2-414e-98ca-9c2ba29f485d" (UID: "f8e3be6e-0ab2-414e-98ca-9c2ba29f485d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 02 23:02:39 crc kubenswrapper[4696]: I1202 23:02:39.746973 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8e3be6e-0ab2-414e-98ca-9c2ba29f485d-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "f8e3be6e-0ab2-414e-98ca-9c2ba29f485d" (UID: "f8e3be6e-0ab2-414e-98ca-9c2ba29f485d"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 02 23:02:39 crc kubenswrapper[4696]: I1202 23:02:39.758096 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8e3be6e-0ab2-414e-98ca-9c2ba29f485d-kube-api-access-t9vnf" (OuterVolumeSpecName: "kube-api-access-t9vnf") pod "f8e3be6e-0ab2-414e-98ca-9c2ba29f485d" (UID: "f8e3be6e-0ab2-414e-98ca-9c2ba29f485d"). InnerVolumeSpecName "kube-api-access-t9vnf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 23:02:39 crc kubenswrapper[4696]: I1202 23:02:39.758563 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8e3be6e-0ab2-414e-98ca-9c2ba29f485d-scripts" (OuterVolumeSpecName: "scripts") pod "f8e3be6e-0ab2-414e-98ca-9c2ba29f485d" (UID: "f8e3be6e-0ab2-414e-98ca-9c2ba29f485d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 23:02:39 crc kubenswrapper[4696]: I1202 23:02:39.764021 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "f8e3be6e-0ab2-414e-98ca-9c2ba29f485d" (UID: "f8e3be6e-0ab2-414e-98ca-9c2ba29f485d"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Dec 02 23:02:39 crc kubenswrapper[4696]: I1202 23:02:39.807182 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-nml6c" event={"ID":"598ec97e-43b1-4d80-866e-4106d1622140","Type":"ContainerDied","Data":"31618b069b8c886a54ccc72513fa633f872ff8cfa734a47f57e2ae167c305a2a"}
Dec 02 23:02:39 crc kubenswrapper[4696]: I1202 23:02:39.807238 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="31618b069b8c886a54ccc72513fa633f872ff8cfa734a47f57e2ae167c305a2a"
Dec 02 23:02:39 crc kubenswrapper[4696]: I1202 23:02:39.807319 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-nml6c"
Dec 02 23:02:39 crc kubenswrapper[4696]: I1202 23:02:39.816349 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8e3be6e-0ab2-414e-98ca-9c2ba29f485d-config-data" (OuterVolumeSpecName: "config-data") pod "f8e3be6e-0ab2-414e-98ca-9c2ba29f485d" (UID: "f8e3be6e-0ab2-414e-98ca-9c2ba29f485d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 23:02:39 crc kubenswrapper[4696]: I1202 23:02:39.826244 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8e3be6e-0ab2-414e-98ca-9c2ba29f485d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f8e3be6e-0ab2-414e-98ca-9c2ba29f485d" (UID: "f8e3be6e-0ab2-414e-98ca-9c2ba29f485d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 23:02:39 crc kubenswrapper[4696]: I1202 23:02:39.833141 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Dec 02 23:02:39 crc kubenswrapper[4696]: I1202 23:02:39.833816 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f8e3be6e-0ab2-414e-98ca-9c2ba29f485d","Type":"ContainerDied","Data":"82e1a845c0ab2f6d62bb005aaf49e84a3f8e2c4e2a04a30af5b8c6ce784a501f"}
Dec 02 23:02:39 crc kubenswrapper[4696]: I1202 23:02:39.833854 4696 scope.go:117] "RemoveContainer" containerID="28c0d3270acfd299dcf0b3d5ee2db658a0cfadddd5a6ab97c3a25a9049757c53"
Dec 02 23:02:39 crc kubenswrapper[4696]: I1202 23:02:39.835091 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0"
Dec 02 23:02:39 crc kubenswrapper[4696]: I1202 23:02:39.855093 4696 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" "
Dec 02 23:02:39 crc kubenswrapper[4696]: I1202 23:02:39.861082 4696 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8e3be6e-0ab2-414e-98ca-9c2ba29f485d-config-data\") on node \"crc\" DevicePath \"\""
Dec 02 23:02:39 crc kubenswrapper[4696]: I1202 23:02:39.861124 4696 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f8e3be6e-0ab2-414e-98ca-9c2ba29f485d-logs\") on node \"crc\" DevicePath \"\""
Dec 02 23:02:39 crc kubenswrapper[4696]: I1202 23:02:39.861146 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t9vnf\" (UniqueName: \"kubernetes.io/projected/f8e3be6e-0ab2-414e-98ca-9c2ba29f485d-kube-api-access-t9vnf\") on node \"crc\" DevicePath \"\""
Dec 02 23:02:39 crc kubenswrapper[4696]: I1202 23:02:39.861158 4696 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8e3be6e-0ab2-414e-98ca-9c2ba29f485d-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 02 23:02:39 crc kubenswrapper[4696]: I1202 23:02:39.861170 4696 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f8e3be6e-0ab2-414e-98ca-9c2ba29f485d-scripts\") on node \"crc\" DevicePath \"\""
Dec 02 23:02:39 crc kubenswrapper[4696]: I1202 23:02:39.861182 4696 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f8e3be6e-0ab2-414e-98ca-9c2ba29f485d-httpd-run\") on node \"crc\" DevicePath \"\""
Dec 02 23:02:39 crc kubenswrapper[4696]: I1202 23:02:39.899178 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-76d45f5d76-ptzqb"]
Dec 02 23:02:39 crc kubenswrapper[4696]: E1202 23:02:39.899766 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8e3be6e-0ab2-414e-98ca-9c2ba29f485d" containerName="glance-log"
Dec 02 23:02:39 crc kubenswrapper[4696]: I1202 23:02:39.899783 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8e3be6e-0ab2-414e-98ca-9c2ba29f485d" containerName="glance-log"
Dec 02 23:02:39 crc kubenswrapper[4696]: E1202 23:02:39.899811 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8e3be6e-0ab2-414e-98ca-9c2ba29f485d" containerName="glance-httpd"
Dec 02 23:02:39 crc kubenswrapper[4696]: I1202 23:02:39.899818 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8e3be6e-0ab2-414e-98ca-9c2ba29f485d" containerName="glance-httpd"
Dec 02 23:02:39 crc kubenswrapper[4696]: E1202 23:02:39.899828 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="598ec97e-43b1-4d80-866e-4106d1622140" containerName="keystone-bootstrap"
Dec 02 23:02:39 crc kubenswrapper[4696]: I1202 23:02:39.899836 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="598ec97e-43b1-4d80-866e-4106d1622140" containerName="keystone-bootstrap"
Dec 02 23:02:39 crc kubenswrapper[4696]: I1202 23:02:39.900073 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8e3be6e-0ab2-414e-98ca-9c2ba29f485d" containerName="glance-log"
Dec 02 23:02:39 crc kubenswrapper[4696]: I1202 23:02:39.900110 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8e3be6e-0ab2-414e-98ca-9c2ba29f485d" containerName="glance-httpd"
Dec 02 23:02:39 crc kubenswrapper[4696]: I1202 23:02:39.900125 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="598ec97e-43b1-4d80-866e-4106d1622140" containerName="keystone-bootstrap"
Dec 02 23:02:39 crc kubenswrapper[4696]: I1202 23:02:39.900967 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-76d45f5d76-ptzqb"
Dec 02 23:02:39 crc kubenswrapper[4696]: I1202 23:02:39.909435 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Dec 02 23:02:39 crc kubenswrapper[4696]: I1202 23:02:39.909983 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-gwxg6"
Dec 02 23:02:39 crc kubenswrapper[4696]: I1202 23:02:39.910181 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Dec 02 23:02:39 crc kubenswrapper[4696]: I1202 23:02:39.910394 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc"
Dec 02 23:02:39 crc kubenswrapper[4696]: I1202 23:02:39.910536 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Dec 02 23:02:39 crc kubenswrapper[4696]: I1202 23:02:39.910674 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc"
Dec 02 23:02:39 crc kubenswrapper[4696]: I1202 23:02:39.931114 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-applier-0"
Dec 02 23:02:39 crc kubenswrapper[4696]: I1202 23:02:39.934437
4696 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Dec 02 23:02:39 crc kubenswrapper[4696]: I1202 23:02:39.942524 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8e3be6e-0ab2-414e-98ca-9c2ba29f485d-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "f8e3be6e-0ab2-414e-98ca-9c2ba29f485d" (UID: "f8e3be6e-0ab2-414e-98ca-9c2ba29f485d"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:02:39 crc kubenswrapper[4696]: I1202 23:02:39.959263 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-76d45f5d76-ptzqb"] Dec 02 23:02:39 crc kubenswrapper[4696]: I1202 23:02:39.991355 4696 scope.go:117] "RemoveContainer" containerID="b6eebd076be285d8ba1700694280ad120d92acfd7695db28a15a7a2959264404" Dec 02 23:02:40 crc kubenswrapper[4696]: I1202 23:02:39.999979 4696 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8e3be6e-0ab2-414e-98ca-9c2ba29f485d-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 02 23:02:40 crc kubenswrapper[4696]: I1202 23:02:40.000018 4696 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Dec 02 23:02:40 crc kubenswrapper[4696]: I1202 23:02:40.061422 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-decision-engine-0" Dec 02 23:02:40 crc kubenswrapper[4696]: I1202 23:02:40.103873 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6cc29833-0849-46ec-bc06-1c980ec2dc02-config-data\") pod \"keystone-76d45f5d76-ptzqb\" (UID: \"6cc29833-0849-46ec-bc06-1c980ec2dc02\") " 
pod="openstack/keystone-76d45f5d76-ptzqb" Dec 02 23:02:40 crc kubenswrapper[4696]: I1202 23:02:40.104067 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6cc29833-0849-46ec-bc06-1c980ec2dc02-internal-tls-certs\") pod \"keystone-76d45f5d76-ptzqb\" (UID: \"6cc29833-0849-46ec-bc06-1c980ec2dc02\") " pod="openstack/keystone-76d45f5d76-ptzqb" Dec 02 23:02:40 crc kubenswrapper[4696]: I1202 23:02:40.104117 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6cc29833-0849-46ec-bc06-1c980ec2dc02-public-tls-certs\") pod \"keystone-76d45f5d76-ptzqb\" (UID: \"6cc29833-0849-46ec-bc06-1c980ec2dc02\") " pod="openstack/keystone-76d45f5d76-ptzqb" Dec 02 23:02:40 crc kubenswrapper[4696]: I1202 23:02:40.104229 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2psl\" (UniqueName: \"kubernetes.io/projected/6cc29833-0849-46ec-bc06-1c980ec2dc02-kube-api-access-l2psl\") pod \"keystone-76d45f5d76-ptzqb\" (UID: \"6cc29833-0849-46ec-bc06-1c980ec2dc02\") " pod="openstack/keystone-76d45f5d76-ptzqb" Dec 02 23:02:40 crc kubenswrapper[4696]: I1202 23:02:40.104276 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6cc29833-0849-46ec-bc06-1c980ec2dc02-credential-keys\") pod \"keystone-76d45f5d76-ptzqb\" (UID: \"6cc29833-0849-46ec-bc06-1c980ec2dc02\") " pod="openstack/keystone-76d45f5d76-ptzqb" Dec 02 23:02:40 crc kubenswrapper[4696]: I1202 23:02:40.104317 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6cc29833-0849-46ec-bc06-1c980ec2dc02-fernet-keys\") pod \"keystone-76d45f5d76-ptzqb\" (UID: 
\"6cc29833-0849-46ec-bc06-1c980ec2dc02\") " pod="openstack/keystone-76d45f5d76-ptzqb" Dec 02 23:02:40 crc kubenswrapper[4696]: I1202 23:02:40.104370 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cc29833-0849-46ec-bc06-1c980ec2dc02-combined-ca-bundle\") pod \"keystone-76d45f5d76-ptzqb\" (UID: \"6cc29833-0849-46ec-bc06-1c980ec2dc02\") " pod="openstack/keystone-76d45f5d76-ptzqb" Dec 02 23:02:40 crc kubenswrapper[4696]: I1202 23:02:40.104394 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6cc29833-0849-46ec-bc06-1c980ec2dc02-scripts\") pod \"keystone-76d45f5d76-ptzqb\" (UID: \"6cc29833-0849-46ec-bc06-1c980ec2dc02\") " pod="openstack/keystone-76d45f5d76-ptzqb" Dec 02 23:02:40 crc kubenswrapper[4696]: I1202 23:02:40.213965 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6cc29833-0849-46ec-bc06-1c980ec2dc02-public-tls-certs\") pod \"keystone-76d45f5d76-ptzqb\" (UID: \"6cc29833-0849-46ec-bc06-1c980ec2dc02\") " pod="openstack/keystone-76d45f5d76-ptzqb" Dec 02 23:02:40 crc kubenswrapper[4696]: I1202 23:02:40.214548 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2psl\" (UniqueName: \"kubernetes.io/projected/6cc29833-0849-46ec-bc06-1c980ec2dc02-kube-api-access-l2psl\") pod \"keystone-76d45f5d76-ptzqb\" (UID: \"6cc29833-0849-46ec-bc06-1c980ec2dc02\") " pod="openstack/keystone-76d45f5d76-ptzqb" Dec 02 23:02:40 crc kubenswrapper[4696]: I1202 23:02:40.214628 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6cc29833-0849-46ec-bc06-1c980ec2dc02-credential-keys\") pod \"keystone-76d45f5d76-ptzqb\" (UID: \"6cc29833-0849-46ec-bc06-1c980ec2dc02\") " 
pod="openstack/keystone-76d45f5d76-ptzqb" Dec 02 23:02:40 crc kubenswrapper[4696]: I1202 23:02:40.214703 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6cc29833-0849-46ec-bc06-1c980ec2dc02-fernet-keys\") pod \"keystone-76d45f5d76-ptzqb\" (UID: \"6cc29833-0849-46ec-bc06-1c980ec2dc02\") " pod="openstack/keystone-76d45f5d76-ptzqb" Dec 02 23:02:40 crc kubenswrapper[4696]: I1202 23:02:40.214799 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cc29833-0849-46ec-bc06-1c980ec2dc02-combined-ca-bundle\") pod \"keystone-76d45f5d76-ptzqb\" (UID: \"6cc29833-0849-46ec-bc06-1c980ec2dc02\") " pod="openstack/keystone-76d45f5d76-ptzqb" Dec 02 23:02:40 crc kubenswrapper[4696]: I1202 23:02:40.215068 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6cc29833-0849-46ec-bc06-1c980ec2dc02-scripts\") pod \"keystone-76d45f5d76-ptzqb\" (UID: \"6cc29833-0849-46ec-bc06-1c980ec2dc02\") " pod="openstack/keystone-76d45f5d76-ptzqb" Dec 02 23:02:40 crc kubenswrapper[4696]: I1202 23:02:40.215503 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6cc29833-0849-46ec-bc06-1c980ec2dc02-config-data\") pod \"keystone-76d45f5d76-ptzqb\" (UID: \"6cc29833-0849-46ec-bc06-1c980ec2dc02\") " pod="openstack/keystone-76d45f5d76-ptzqb" Dec 02 23:02:40 crc kubenswrapper[4696]: I1202 23:02:40.215660 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6cc29833-0849-46ec-bc06-1c980ec2dc02-internal-tls-certs\") pod \"keystone-76d45f5d76-ptzqb\" (UID: \"6cc29833-0849-46ec-bc06-1c980ec2dc02\") " pod="openstack/keystone-76d45f5d76-ptzqb" Dec 02 23:02:40 crc kubenswrapper[4696]: I1202 23:02:40.225293 4696 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6cc29833-0849-46ec-bc06-1c980ec2dc02-fernet-keys\") pod \"keystone-76d45f5d76-ptzqb\" (UID: \"6cc29833-0849-46ec-bc06-1c980ec2dc02\") " pod="openstack/keystone-76d45f5d76-ptzqb" Dec 02 23:02:40 crc kubenswrapper[4696]: I1202 23:02:40.232282 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6cc29833-0849-46ec-bc06-1c980ec2dc02-internal-tls-certs\") pod \"keystone-76d45f5d76-ptzqb\" (UID: \"6cc29833-0849-46ec-bc06-1c980ec2dc02\") " pod="openstack/keystone-76d45f5d76-ptzqb" Dec 02 23:02:40 crc kubenswrapper[4696]: I1202 23:02:40.234090 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6cc29833-0849-46ec-bc06-1c980ec2dc02-config-data\") pod \"keystone-76d45f5d76-ptzqb\" (UID: \"6cc29833-0849-46ec-bc06-1c980ec2dc02\") " pod="openstack/keystone-76d45f5d76-ptzqb" Dec 02 23:02:40 crc kubenswrapper[4696]: I1202 23:02:40.234539 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cc29833-0849-46ec-bc06-1c980ec2dc02-combined-ca-bundle\") pod \"keystone-76d45f5d76-ptzqb\" (UID: \"6cc29833-0849-46ec-bc06-1c980ec2dc02\") " pod="openstack/keystone-76d45f5d76-ptzqb" Dec 02 23:02:40 crc kubenswrapper[4696]: I1202 23:02:40.236982 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6cc29833-0849-46ec-bc06-1c980ec2dc02-scripts\") pod \"keystone-76d45f5d76-ptzqb\" (UID: \"6cc29833-0849-46ec-bc06-1c980ec2dc02\") " pod="openstack/keystone-76d45f5d76-ptzqb" Dec 02 23:02:40 crc kubenswrapper[4696]: I1202 23:02:40.239302 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/6cc29833-0849-46ec-bc06-1c980ec2dc02-public-tls-certs\") pod \"keystone-76d45f5d76-ptzqb\" (UID: \"6cc29833-0849-46ec-bc06-1c980ec2dc02\") " pod="openstack/keystone-76d45f5d76-ptzqb" Dec 02 23:02:40 crc kubenswrapper[4696]: I1202 23:02:40.247361 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6cc29833-0849-46ec-bc06-1c980ec2dc02-credential-keys\") pod \"keystone-76d45f5d76-ptzqb\" (UID: \"6cc29833-0849-46ec-bc06-1c980ec2dc02\") " pod="openstack/keystone-76d45f5d76-ptzqb" Dec 02 23:02:40 crc kubenswrapper[4696]: I1202 23:02:40.252796 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2psl\" (UniqueName: \"kubernetes.io/projected/6cc29833-0849-46ec-bc06-1c980ec2dc02-kube-api-access-l2psl\") pod \"keystone-76d45f5d76-ptzqb\" (UID: \"6cc29833-0849-46ec-bc06-1c980ec2dc02\") " pod="openstack/keystone-76d45f5d76-ptzqb" Dec 02 23:02:40 crc kubenswrapper[4696]: I1202 23:02:40.263336 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 23:02:40 crc kubenswrapper[4696]: I1202 23:02:40.294037 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 23:02:40 crc kubenswrapper[4696]: I1202 23:02:40.305673 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 23:02:40 crc kubenswrapper[4696]: I1202 23:02:40.307705 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 02 23:02:40 crc kubenswrapper[4696]: I1202 23:02:40.313568 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 02 23:02:40 crc kubenswrapper[4696]: I1202 23:02:40.313842 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 02 23:02:40 crc kubenswrapper[4696]: I1202 23:02:40.315505 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 23:02:40 crc kubenswrapper[4696]: I1202 23:02:40.424821 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjkgl\" (UniqueName: \"kubernetes.io/projected/6ef01fc5-d6fd-48bd-aa11-4cf840a2d6e3-kube-api-access-pjkgl\") pod \"glance-default-external-api-0\" (UID: \"6ef01fc5-d6fd-48bd-aa11-4cf840a2d6e3\") " pod="openstack/glance-default-external-api-0" Dec 02 23:02:40 crc kubenswrapper[4696]: I1202 23:02:40.424910 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6ef01fc5-d6fd-48bd-aa11-4cf840a2d6e3-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6ef01fc5-d6fd-48bd-aa11-4cf840a2d6e3\") " pod="openstack/glance-default-external-api-0" Dec 02 23:02:40 crc kubenswrapper[4696]: I1202 23:02:40.424954 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ef01fc5-d6fd-48bd-aa11-4cf840a2d6e3-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6ef01fc5-d6fd-48bd-aa11-4cf840a2d6e3\") " pod="openstack/glance-default-external-api-0" Dec 02 23:02:40 crc kubenswrapper[4696]: I1202 23:02:40.425021 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ef01fc5-d6fd-48bd-aa11-4cf840a2d6e3-logs\") pod \"glance-default-external-api-0\" (UID: \"6ef01fc5-d6fd-48bd-aa11-4cf840a2d6e3\") " pod="openstack/glance-default-external-api-0" Dec 02 23:02:40 crc kubenswrapper[4696]: I1202 23:02:40.425051 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"6ef01fc5-d6fd-48bd-aa11-4cf840a2d6e3\") " pod="openstack/glance-default-external-api-0" Dec 02 23:02:40 crc kubenswrapper[4696]: I1202 23:02:40.425087 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ef01fc5-d6fd-48bd-aa11-4cf840a2d6e3-config-data\") pod \"glance-default-external-api-0\" (UID: \"6ef01fc5-d6fd-48bd-aa11-4cf840a2d6e3\") " pod="openstack/glance-default-external-api-0" Dec 02 23:02:40 crc kubenswrapper[4696]: I1202 23:02:40.425121 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ef01fc5-d6fd-48bd-aa11-4cf840a2d6e3-scripts\") pod \"glance-default-external-api-0\" (UID: \"6ef01fc5-d6fd-48bd-aa11-4cf840a2d6e3\") " pod="openstack/glance-default-external-api-0" Dec 02 23:02:40 crc kubenswrapper[4696]: I1202 23:02:40.425148 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ef01fc5-d6fd-48bd-aa11-4cf840a2d6e3-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"6ef01fc5-d6fd-48bd-aa11-4cf840a2d6e3\") " pod="openstack/glance-default-external-api-0" Dec 02 23:02:40 crc kubenswrapper[4696]: I1202 23:02:40.527062 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjkgl\" 
(UniqueName: \"kubernetes.io/projected/6ef01fc5-d6fd-48bd-aa11-4cf840a2d6e3-kube-api-access-pjkgl\") pod \"glance-default-external-api-0\" (UID: \"6ef01fc5-d6fd-48bd-aa11-4cf840a2d6e3\") " pod="openstack/glance-default-external-api-0" Dec 02 23:02:40 crc kubenswrapper[4696]: I1202 23:02:40.527131 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6ef01fc5-d6fd-48bd-aa11-4cf840a2d6e3-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6ef01fc5-d6fd-48bd-aa11-4cf840a2d6e3\") " pod="openstack/glance-default-external-api-0" Dec 02 23:02:40 crc kubenswrapper[4696]: I1202 23:02:40.527189 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ef01fc5-d6fd-48bd-aa11-4cf840a2d6e3-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6ef01fc5-d6fd-48bd-aa11-4cf840a2d6e3\") " pod="openstack/glance-default-external-api-0" Dec 02 23:02:40 crc kubenswrapper[4696]: I1202 23:02:40.527270 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ef01fc5-d6fd-48bd-aa11-4cf840a2d6e3-logs\") pod \"glance-default-external-api-0\" (UID: \"6ef01fc5-d6fd-48bd-aa11-4cf840a2d6e3\") " pod="openstack/glance-default-external-api-0" Dec 02 23:02:40 crc kubenswrapper[4696]: I1202 23:02:40.527314 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"6ef01fc5-d6fd-48bd-aa11-4cf840a2d6e3\") " pod="openstack/glance-default-external-api-0" Dec 02 23:02:40 crc kubenswrapper[4696]: I1202 23:02:40.527379 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ef01fc5-d6fd-48bd-aa11-4cf840a2d6e3-config-data\") 
pod \"glance-default-external-api-0\" (UID: \"6ef01fc5-d6fd-48bd-aa11-4cf840a2d6e3\") " pod="openstack/glance-default-external-api-0" Dec 02 23:02:40 crc kubenswrapper[4696]: I1202 23:02:40.527414 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ef01fc5-d6fd-48bd-aa11-4cf840a2d6e3-scripts\") pod \"glance-default-external-api-0\" (UID: \"6ef01fc5-d6fd-48bd-aa11-4cf840a2d6e3\") " pod="openstack/glance-default-external-api-0" Dec 02 23:02:40 crc kubenswrapper[4696]: I1202 23:02:40.527587 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ef01fc5-d6fd-48bd-aa11-4cf840a2d6e3-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"6ef01fc5-d6fd-48bd-aa11-4cf840a2d6e3\") " pod="openstack/glance-default-external-api-0" Dec 02 23:02:40 crc kubenswrapper[4696]: I1202 23:02:40.527988 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ef01fc5-d6fd-48bd-aa11-4cf840a2d6e3-logs\") pod \"glance-default-external-api-0\" (UID: \"6ef01fc5-d6fd-48bd-aa11-4cf840a2d6e3\") " pod="openstack/glance-default-external-api-0" Dec 02 23:02:40 crc kubenswrapper[4696]: I1202 23:02:40.528532 4696 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"6ef01fc5-d6fd-48bd-aa11-4cf840a2d6e3\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-external-api-0" Dec 02 23:02:40 crc kubenswrapper[4696]: I1202 23:02:40.529847 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6ef01fc5-d6fd-48bd-aa11-4cf840a2d6e3-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6ef01fc5-d6fd-48bd-aa11-4cf840a2d6e3\") " 
pod="openstack/glance-default-external-api-0" Dec 02 23:02:40 crc kubenswrapper[4696]: I1202 23:02:40.533778 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ef01fc5-d6fd-48bd-aa11-4cf840a2d6e3-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"6ef01fc5-d6fd-48bd-aa11-4cf840a2d6e3\") " pod="openstack/glance-default-external-api-0" Dec 02 23:02:40 crc kubenswrapper[4696]: I1202 23:02:40.536781 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ef01fc5-d6fd-48bd-aa11-4cf840a2d6e3-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6ef01fc5-d6fd-48bd-aa11-4cf840a2d6e3\") " pod="openstack/glance-default-external-api-0" Dec 02 23:02:40 crc kubenswrapper[4696]: I1202 23:02:40.548683 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-76d45f5d76-ptzqb" Dec 02 23:02:40 crc kubenswrapper[4696]: I1202 23:02:40.549422 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ef01fc5-d6fd-48bd-aa11-4cf840a2d6e3-scripts\") pod \"glance-default-external-api-0\" (UID: \"6ef01fc5-d6fd-48bd-aa11-4cf840a2d6e3\") " pod="openstack/glance-default-external-api-0" Dec 02 23:02:40 crc kubenswrapper[4696]: I1202 23:02:40.549670 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ef01fc5-d6fd-48bd-aa11-4cf840a2d6e3-config-data\") pod \"glance-default-external-api-0\" (UID: \"6ef01fc5-d6fd-48bd-aa11-4cf840a2d6e3\") " pod="openstack/glance-default-external-api-0" Dec 02 23:02:40 crc kubenswrapper[4696]: I1202 23:02:40.565421 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjkgl\" (UniqueName: 
\"kubernetes.io/projected/6ef01fc5-d6fd-48bd-aa11-4cf840a2d6e3-kube-api-access-pjkgl\") pod \"glance-default-external-api-0\" (UID: \"6ef01fc5-d6fd-48bd-aa11-4cf840a2d6e3\") " pod="openstack/glance-default-external-api-0" Dec 02 23:02:40 crc kubenswrapper[4696]: I1202 23:02:40.579549 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"6ef01fc5-d6fd-48bd-aa11-4cf840a2d6e3\") " pod="openstack/glance-default-external-api-0" Dec 02 23:02:40 crc kubenswrapper[4696]: I1202 23:02:40.661566 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 02 23:02:41 crc kubenswrapper[4696]: I1202 23:02:41.206806 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-76d45f5d76-ptzqb"] Dec 02 23:02:41 crc kubenswrapper[4696]: I1202 23:02:41.218588 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 23:02:41 crc kubenswrapper[4696]: W1202 23:02:41.221622 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6cc29833_0849_46ec_bc06_1c980ec2dc02.slice/crio-bdce724a2081dc5cfcf9b684a95e664e162db35660a93874eb0d34c899d7c0ca WatchSource:0}: Error finding container bdce724a2081dc5cfcf9b684a95e664e162db35660a93874eb0d34c899d7c0ca: Status 404 returned error can't find the container with id bdce724a2081dc5cfcf9b684a95e664e162db35660a93874eb0d34c899d7c0ca Dec 02 23:02:41 crc kubenswrapper[4696]: I1202 23:02:41.446054 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8e3be6e-0ab2-414e-98ca-9c2ba29f485d" path="/var/lib/kubelet/pods/f8e3be6e-0ab2-414e-98ca-9c2ba29f485d/volumes" Dec 02 23:02:41 crc kubenswrapper[4696]: I1202 23:02:41.891898 4696 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/glance-default-external-api-0" event={"ID":"6ef01fc5-d6fd-48bd-aa11-4cf840a2d6e3","Type":"ContainerStarted","Data":"35e9f26aa4e995677cef3022a22a657b76e7e90d09c9d964b9672882b0e5c3fe"} Dec 02 23:02:41 crc kubenswrapper[4696]: I1202 23:02:41.902655 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-76d45f5d76-ptzqb" event={"ID":"6cc29833-0849-46ec-bc06-1c980ec2dc02","Type":"ContainerStarted","Data":"8d1b16e1acd1c762c96f4336a1b2e0fe3c9565e87b61cc369a2f08c7251ad440"} Dec 02 23:02:41 crc kubenswrapper[4696]: I1202 23:02:41.902712 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-76d45f5d76-ptzqb" event={"ID":"6cc29833-0849-46ec-bc06-1c980ec2dc02","Type":"ContainerStarted","Data":"bdce724a2081dc5cfcf9b684a95e664e162db35660a93874eb0d34c899d7c0ca"} Dec 02 23:02:41 crc kubenswrapper[4696]: I1202 23:02:41.902773 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-76d45f5d76-ptzqb" Dec 02 23:02:41 crc kubenswrapper[4696]: I1202 23:02:41.935097 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-76d45f5d76-ptzqb" podStartSLOduration=2.935078942 podStartE2EDuration="2.935078942s" podCreationTimestamp="2025-12-02 23:02:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:02:41.929457983 +0000 UTC m=+1224.810138004" watchObservedRunningTime="2025-12-02 23:02:41.935078942 +0000 UTC m=+1224.815758943" Dec 02 23:02:42 crc kubenswrapper[4696]: I1202 23:02:42.040888 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 02 23:02:42 crc kubenswrapper[4696]: I1202 23:02:42.041072 4696 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 02 23:02:42 crc kubenswrapper[4696]: I1202 23:02:42.286693 4696 prober.go:107] "Probe failed" 
probeType="Startup" pod="openstack/horizon-7b448778f6-q69jq" podUID="7b25bdab-8c46-43b8-be48-0e3df0f48c57" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.162:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.162:8443: connect: connection refused" Dec 02 23:02:42 crc kubenswrapper[4696]: I1202 23:02:42.415844 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 02 23:02:42 crc kubenswrapper[4696]: I1202 23:02:42.576208 4696 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7c784657c6-hdbrw" podUID="b414fc10-9d51-456b-aaa9-d6b4dd08af99" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.163:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.163:8443: connect: connection refused" Dec 02 23:02:42 crc kubenswrapper[4696]: I1202 23:02:42.835294 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"] Dec 02 23:02:42 crc kubenswrapper[4696]: I1202 23:02:42.837113 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="22eeafc0-fbad-4836-9c47-0f3e01784d91" containerName="watcher-api" containerID="cri-o://44a9e254b6c1dfc6ef34619e2a1d3179f8a6c9b5842f68596e4be12198f93f5c" gracePeriod=30 Dec 02 23:02:42 crc kubenswrapper[4696]: I1202 23:02:42.837544 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="22eeafc0-fbad-4836-9c47-0f3e01784d91" containerName="watcher-api-log" containerID="cri-o://454e5848a04308a2a3ae9f54efdaabd6b80f1ac9cbecc24ca0b89cdd25107675" gracePeriod=30 Dec 02 23:02:42 crc kubenswrapper[4696]: I1202 23:02:42.924870 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"6ef01fc5-d6fd-48bd-aa11-4cf840a2d6e3","Type":"ContainerStarted","Data":"b33900e709c3fcbee9e423fd94b361b7475a1131ec2686c3cb3aa18f42bb5e1a"} Dec 02 23:02:42 crc kubenswrapper[4696]: I1202 23:02:42.927084 4696 generic.go:334] "Generic (PLEG): container finished" podID="f066d064-95ba-42b3-ba9f-5e859533c93c" containerID="171777d49f633d478ff4c9cd39f56f6babed8df8c66ecb007794fd2897eff333" exitCode=0 Dec 02 23:02:42 crc kubenswrapper[4696]: I1202 23:02:42.927422 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-fnjtl" event={"ID":"f066d064-95ba-42b3-ba9f-5e859533c93c","Type":"ContainerDied","Data":"171777d49f633d478ff4c9cd39f56f6babed8df8c66ecb007794fd2897eff333"} Dec 02 23:02:43 crc kubenswrapper[4696]: I1202 23:02:43.949972 4696 generic.go:334] "Generic (PLEG): container finished" podID="22eeafc0-fbad-4836-9c47-0f3e01784d91" containerID="454e5848a04308a2a3ae9f54efdaabd6b80f1ac9cbecc24ca0b89cdd25107675" exitCode=143 Dec 02 23:02:43 crc kubenswrapper[4696]: I1202 23:02:43.950067 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"22eeafc0-fbad-4836-9c47-0f3e01784d91","Type":"ContainerDied","Data":"454e5848a04308a2a3ae9f54efdaabd6b80f1ac9cbecc24ca0b89cdd25107675"} Dec 02 23:02:43 crc kubenswrapper[4696]: I1202 23:02:43.955128 4696 generic.go:334] "Generic (PLEG): container finished" podID="d1f5ea7d-03ba-43ab-8863-9547b016bb0a" containerID="35ee60b6b7516dbeb7b9afc433010c0563b80e80aa7e5ad5dfe21a670e0e4290" exitCode=0 Dec 02 23:02:43 crc kubenswrapper[4696]: I1202 23:02:43.955228 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-9cwtk" event={"ID":"d1f5ea7d-03ba-43ab-8863-9547b016bb0a","Type":"ContainerDied","Data":"35ee60b6b7516dbeb7b9afc433010c0563b80e80aa7e5ad5dfe21a670e0e4290"} Dec 02 23:02:43 crc kubenswrapper[4696]: I1202 23:02:43.967844 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"6ef01fc5-d6fd-48bd-aa11-4cf840a2d6e3","Type":"ContainerStarted","Data":"22473735429ac296019a3825fa9930c23f12a28fb1006c66c509b5c9e3458ae3"} Dec 02 23:02:44 crc kubenswrapper[4696]: I1202 23:02:44.451913 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.451892766 podStartE2EDuration="4.451892766s" podCreationTimestamp="2025-12-02 23:02:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:02:44.009938293 +0000 UTC m=+1226.890618294" watchObservedRunningTime="2025-12-02 23:02:44.451892766 +0000 UTC m=+1227.332572767" Dec 02 23:02:45 crc kubenswrapper[4696]: I1202 23:02:45.227726 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-fnjtl" Dec 02 23:02:45 crc kubenswrapper[4696]: I1202 23:02:45.348218 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f066d064-95ba-42b3-ba9f-5e859533c93c-config-data\") pod \"f066d064-95ba-42b3-ba9f-5e859533c93c\" (UID: \"f066d064-95ba-42b3-ba9f-5e859533c93c\") " Dec 02 23:02:45 crc kubenswrapper[4696]: I1202 23:02:45.348356 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mknpj\" (UniqueName: \"kubernetes.io/projected/f066d064-95ba-42b3-ba9f-5e859533c93c-kube-api-access-mknpj\") pod \"f066d064-95ba-42b3-ba9f-5e859533c93c\" (UID: \"f066d064-95ba-42b3-ba9f-5e859533c93c\") " Dec 02 23:02:45 crc kubenswrapper[4696]: I1202 23:02:45.348430 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f066d064-95ba-42b3-ba9f-5e859533c93c-scripts\") pod \"f066d064-95ba-42b3-ba9f-5e859533c93c\" (UID: \"f066d064-95ba-42b3-ba9f-5e859533c93c\") " Dec 02 23:02:45 crc kubenswrapper[4696]: 
I1202 23:02:45.348458 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f066d064-95ba-42b3-ba9f-5e859533c93c-combined-ca-bundle\") pod \"f066d064-95ba-42b3-ba9f-5e859533c93c\" (UID: \"f066d064-95ba-42b3-ba9f-5e859533c93c\") " Dec 02 23:02:45 crc kubenswrapper[4696]: I1202 23:02:45.348559 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f066d064-95ba-42b3-ba9f-5e859533c93c-logs\") pod \"f066d064-95ba-42b3-ba9f-5e859533c93c\" (UID: \"f066d064-95ba-42b3-ba9f-5e859533c93c\") " Dec 02 23:02:45 crc kubenswrapper[4696]: I1202 23:02:45.349511 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f066d064-95ba-42b3-ba9f-5e859533c93c-logs" (OuterVolumeSpecName: "logs") pod "f066d064-95ba-42b3-ba9f-5e859533c93c" (UID: "f066d064-95ba-42b3-ba9f-5e859533c93c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:02:45 crc kubenswrapper[4696]: I1202 23:02:45.355288 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f066d064-95ba-42b3-ba9f-5e859533c93c-scripts" (OuterVolumeSpecName: "scripts") pod "f066d064-95ba-42b3-ba9f-5e859533c93c" (UID: "f066d064-95ba-42b3-ba9f-5e859533c93c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:02:45 crc kubenswrapper[4696]: I1202 23:02:45.358863 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f066d064-95ba-42b3-ba9f-5e859533c93c-kube-api-access-mknpj" (OuterVolumeSpecName: "kube-api-access-mknpj") pod "f066d064-95ba-42b3-ba9f-5e859533c93c" (UID: "f066d064-95ba-42b3-ba9f-5e859533c93c"). InnerVolumeSpecName "kube-api-access-mknpj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:02:45 crc kubenswrapper[4696]: I1202 23:02:45.382918 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f066d064-95ba-42b3-ba9f-5e859533c93c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f066d064-95ba-42b3-ba9f-5e859533c93c" (UID: "f066d064-95ba-42b3-ba9f-5e859533c93c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:02:45 crc kubenswrapper[4696]: I1202 23:02:45.383954 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f066d064-95ba-42b3-ba9f-5e859533c93c-config-data" (OuterVolumeSpecName: "config-data") pod "f066d064-95ba-42b3-ba9f-5e859533c93c" (UID: "f066d064-95ba-42b3-ba9f-5e859533c93c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:02:45 crc kubenswrapper[4696]: I1202 23:02:45.451763 4696 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f066d064-95ba-42b3-ba9f-5e859533c93c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 23:02:45 crc kubenswrapper[4696]: I1202 23:02:45.451807 4696 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f066d064-95ba-42b3-ba9f-5e859533c93c-logs\") on node \"crc\" DevicePath \"\"" Dec 02 23:02:45 crc kubenswrapper[4696]: I1202 23:02:45.451818 4696 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f066d064-95ba-42b3-ba9f-5e859533c93c-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 23:02:45 crc kubenswrapper[4696]: I1202 23:02:45.451830 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mknpj\" (UniqueName: \"kubernetes.io/projected/f066d064-95ba-42b3-ba9f-5e859533c93c-kube-api-access-mknpj\") on node \"crc\" DevicePath \"\"" Dec 02 
23:02:45 crc kubenswrapper[4696]: I1202 23:02:45.451839 4696 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f066d064-95ba-42b3-ba9f-5e859533c93c-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 23:02:45 crc kubenswrapper[4696]: I1202 23:02:45.987025 4696 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="22eeafc0-fbad-4836-9c47-0f3e01784d91" containerName="watcher-api-log" probeResult="failure" output="Get \"http://10.217.0.166:9322/\": read tcp 10.217.0.2:49588->10.217.0.166:9322: read: connection reset by peer" Dec 02 23:02:45 crc kubenswrapper[4696]: I1202 23:02:45.987122 4696 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="22eeafc0-fbad-4836-9c47-0f3e01784d91" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.166:9322/\": read tcp 10.217.0.2:49598->10.217.0.166:9322: read: connection reset by peer" Dec 02 23:02:45 crc kubenswrapper[4696]: I1202 23:02:45.994806 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-fnjtl" event={"ID":"f066d064-95ba-42b3-ba9f-5e859533c93c","Type":"ContainerDied","Data":"f1133c98fbeb59fe4866f96c42a5e18ee0687c6efd4b0a00350eaebdacbecca1"} Dec 02 23:02:45 crc kubenswrapper[4696]: I1202 23:02:45.994850 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f1133c98fbeb59fe4866f96c42a5e18ee0687c6efd4b0a00350eaebdacbecca1" Dec 02 23:02:45 crc kubenswrapper[4696]: I1202 23:02:45.994914 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-fnjtl" Dec 02 23:02:46 crc kubenswrapper[4696]: I1202 23:02:46.345284 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-599746d6dd-mg2dx"] Dec 02 23:02:46 crc kubenswrapper[4696]: E1202 23:02:46.345769 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f066d064-95ba-42b3-ba9f-5e859533c93c" containerName="placement-db-sync" Dec 02 23:02:46 crc kubenswrapper[4696]: I1202 23:02:46.345783 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="f066d064-95ba-42b3-ba9f-5e859533c93c" containerName="placement-db-sync" Dec 02 23:02:46 crc kubenswrapper[4696]: I1202 23:02:46.345995 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="f066d064-95ba-42b3-ba9f-5e859533c93c" containerName="placement-db-sync" Dec 02 23:02:46 crc kubenswrapper[4696]: I1202 23:02:46.347059 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-599746d6dd-mg2dx" Dec 02 23:02:46 crc kubenswrapper[4696]: I1202 23:02:46.349152 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 02 23:02:46 crc kubenswrapper[4696]: I1202 23:02:46.353412 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Dec 02 23:02:46 crc kubenswrapper[4696]: I1202 23:02:46.353491 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Dec 02 23:02:46 crc kubenswrapper[4696]: I1202 23:02:46.355082 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-f9524" Dec 02 23:02:46 crc kubenswrapper[4696]: I1202 23:02:46.356946 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 02 23:02:46 crc kubenswrapper[4696]: I1202 23:02:46.370391 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/placement-599746d6dd-mg2dx"] Dec 02 23:02:46 crc kubenswrapper[4696]: I1202 23:02:46.475602 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe44184e-95f9-4a2e-a6a4-e2534c44e933-combined-ca-bundle\") pod \"placement-599746d6dd-mg2dx\" (UID: \"fe44184e-95f9-4a2e-a6a4-e2534c44e933\") " pod="openstack/placement-599746d6dd-mg2dx" Dec 02 23:02:46 crc kubenswrapper[4696]: I1202 23:02:46.475847 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe44184e-95f9-4a2e-a6a4-e2534c44e933-internal-tls-certs\") pod \"placement-599746d6dd-mg2dx\" (UID: \"fe44184e-95f9-4a2e-a6a4-e2534c44e933\") " pod="openstack/placement-599746d6dd-mg2dx" Dec 02 23:02:46 crc kubenswrapper[4696]: I1202 23:02:46.475980 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe44184e-95f9-4a2e-a6a4-e2534c44e933-public-tls-certs\") pod \"placement-599746d6dd-mg2dx\" (UID: \"fe44184e-95f9-4a2e-a6a4-e2534c44e933\") " pod="openstack/placement-599746d6dd-mg2dx" Dec 02 23:02:46 crc kubenswrapper[4696]: I1202 23:02:46.476039 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe44184e-95f9-4a2e-a6a4-e2534c44e933-scripts\") pod \"placement-599746d6dd-mg2dx\" (UID: \"fe44184e-95f9-4a2e-a6a4-e2534c44e933\") " pod="openstack/placement-599746d6dd-mg2dx" Dec 02 23:02:46 crc kubenswrapper[4696]: I1202 23:02:46.476373 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe44184e-95f9-4a2e-a6a4-e2534c44e933-config-data\") pod \"placement-599746d6dd-mg2dx\" (UID: 
\"fe44184e-95f9-4a2e-a6a4-e2534c44e933\") " pod="openstack/placement-599746d6dd-mg2dx" Dec 02 23:02:46 crc kubenswrapper[4696]: I1202 23:02:46.476417 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mprcc\" (UniqueName: \"kubernetes.io/projected/fe44184e-95f9-4a2e-a6a4-e2534c44e933-kube-api-access-mprcc\") pod \"placement-599746d6dd-mg2dx\" (UID: \"fe44184e-95f9-4a2e-a6a4-e2534c44e933\") " pod="openstack/placement-599746d6dd-mg2dx" Dec 02 23:02:46 crc kubenswrapper[4696]: I1202 23:02:46.476576 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fe44184e-95f9-4a2e-a6a4-e2534c44e933-logs\") pod \"placement-599746d6dd-mg2dx\" (UID: \"fe44184e-95f9-4a2e-a6a4-e2534c44e933\") " pod="openstack/placement-599746d6dd-mg2dx" Dec 02 23:02:46 crc kubenswrapper[4696]: I1202 23:02:46.578644 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe44184e-95f9-4a2e-a6a4-e2534c44e933-config-data\") pod \"placement-599746d6dd-mg2dx\" (UID: \"fe44184e-95f9-4a2e-a6a4-e2534c44e933\") " pod="openstack/placement-599746d6dd-mg2dx" Dec 02 23:02:46 crc kubenswrapper[4696]: I1202 23:02:46.578716 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mprcc\" (UniqueName: \"kubernetes.io/projected/fe44184e-95f9-4a2e-a6a4-e2534c44e933-kube-api-access-mprcc\") pod \"placement-599746d6dd-mg2dx\" (UID: \"fe44184e-95f9-4a2e-a6a4-e2534c44e933\") " pod="openstack/placement-599746d6dd-mg2dx" Dec 02 23:02:46 crc kubenswrapper[4696]: I1202 23:02:46.578809 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fe44184e-95f9-4a2e-a6a4-e2534c44e933-logs\") pod \"placement-599746d6dd-mg2dx\" (UID: \"fe44184e-95f9-4a2e-a6a4-e2534c44e933\") " 
pod="openstack/placement-599746d6dd-mg2dx" Dec 02 23:02:46 crc kubenswrapper[4696]: I1202 23:02:46.578858 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe44184e-95f9-4a2e-a6a4-e2534c44e933-combined-ca-bundle\") pod \"placement-599746d6dd-mg2dx\" (UID: \"fe44184e-95f9-4a2e-a6a4-e2534c44e933\") " pod="openstack/placement-599746d6dd-mg2dx" Dec 02 23:02:46 crc kubenswrapper[4696]: I1202 23:02:46.578897 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe44184e-95f9-4a2e-a6a4-e2534c44e933-internal-tls-certs\") pod \"placement-599746d6dd-mg2dx\" (UID: \"fe44184e-95f9-4a2e-a6a4-e2534c44e933\") " pod="openstack/placement-599746d6dd-mg2dx" Dec 02 23:02:46 crc kubenswrapper[4696]: I1202 23:02:46.578942 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe44184e-95f9-4a2e-a6a4-e2534c44e933-public-tls-certs\") pod \"placement-599746d6dd-mg2dx\" (UID: \"fe44184e-95f9-4a2e-a6a4-e2534c44e933\") " pod="openstack/placement-599746d6dd-mg2dx" Dec 02 23:02:46 crc kubenswrapper[4696]: I1202 23:02:46.578962 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe44184e-95f9-4a2e-a6a4-e2534c44e933-scripts\") pod \"placement-599746d6dd-mg2dx\" (UID: \"fe44184e-95f9-4a2e-a6a4-e2534c44e933\") " pod="openstack/placement-599746d6dd-mg2dx" Dec 02 23:02:46 crc kubenswrapper[4696]: I1202 23:02:46.581393 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fe44184e-95f9-4a2e-a6a4-e2534c44e933-logs\") pod \"placement-599746d6dd-mg2dx\" (UID: \"fe44184e-95f9-4a2e-a6a4-e2534c44e933\") " pod="openstack/placement-599746d6dd-mg2dx" Dec 02 23:02:46 crc kubenswrapper[4696]: I1202 23:02:46.587361 
4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe44184e-95f9-4a2e-a6a4-e2534c44e933-config-data\") pod \"placement-599746d6dd-mg2dx\" (UID: \"fe44184e-95f9-4a2e-a6a4-e2534c44e933\") " pod="openstack/placement-599746d6dd-mg2dx" Dec 02 23:02:46 crc kubenswrapper[4696]: I1202 23:02:46.589325 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe44184e-95f9-4a2e-a6a4-e2534c44e933-internal-tls-certs\") pod \"placement-599746d6dd-mg2dx\" (UID: \"fe44184e-95f9-4a2e-a6a4-e2534c44e933\") " pod="openstack/placement-599746d6dd-mg2dx" Dec 02 23:02:46 crc kubenswrapper[4696]: I1202 23:02:46.589395 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe44184e-95f9-4a2e-a6a4-e2534c44e933-public-tls-certs\") pod \"placement-599746d6dd-mg2dx\" (UID: \"fe44184e-95f9-4a2e-a6a4-e2534c44e933\") " pod="openstack/placement-599746d6dd-mg2dx" Dec 02 23:02:46 crc kubenswrapper[4696]: I1202 23:02:46.590767 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe44184e-95f9-4a2e-a6a4-e2534c44e933-combined-ca-bundle\") pod \"placement-599746d6dd-mg2dx\" (UID: \"fe44184e-95f9-4a2e-a6a4-e2534c44e933\") " pod="openstack/placement-599746d6dd-mg2dx" Dec 02 23:02:46 crc kubenswrapper[4696]: I1202 23:02:46.595131 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe44184e-95f9-4a2e-a6a4-e2534c44e933-scripts\") pod \"placement-599746d6dd-mg2dx\" (UID: \"fe44184e-95f9-4a2e-a6a4-e2534c44e933\") " pod="openstack/placement-599746d6dd-mg2dx" Dec 02 23:02:46 crc kubenswrapper[4696]: I1202 23:02:46.608001 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mprcc\" (UniqueName: 
\"kubernetes.io/projected/fe44184e-95f9-4a2e-a6a4-e2534c44e933-kube-api-access-mprcc\") pod \"placement-599746d6dd-mg2dx\" (UID: \"fe44184e-95f9-4a2e-a6a4-e2534c44e933\") " pod="openstack/placement-599746d6dd-mg2dx" Dec 02 23:02:46 crc kubenswrapper[4696]: I1202 23:02:46.693035 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-599746d6dd-mg2dx" Dec 02 23:02:48 crc kubenswrapper[4696]: I1202 23:02:48.737496 4696 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="22eeafc0-fbad-4836-9c47-0f3e01784d91" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.166:9322/\": dial tcp 10.217.0.166:9322: connect: connection refused" Dec 02 23:02:48 crc kubenswrapper[4696]: I1202 23:02:48.737496 4696 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="22eeafc0-fbad-4836-9c47-0f3e01784d91" containerName="watcher-api-log" probeResult="failure" output="Get \"http://10.217.0.166:9322/\": dial tcp 10.217.0.166:9322: connect: connection refused" Dec 02 23:02:49 crc kubenswrapper[4696]: I1202 23:02:49.037777 4696 generic.go:334] "Generic (PLEG): container finished" podID="22eeafc0-fbad-4836-9c47-0f3e01784d91" containerID="44a9e254b6c1dfc6ef34619e2a1d3179f8a6c9b5842f68596e4be12198f93f5c" exitCode=0 Dec 02 23:02:49 crc kubenswrapper[4696]: I1202 23:02:49.037860 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"22eeafc0-fbad-4836-9c47-0f3e01784d91","Type":"ContainerDied","Data":"44a9e254b6c1dfc6ef34619e2a1d3179f8a6c9b5842f68596e4be12198f93f5c"} Dec 02 23:02:49 crc kubenswrapper[4696]: I1202 23:02:49.042809 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-9cwtk" event={"ID":"d1f5ea7d-03ba-43ab-8863-9547b016bb0a","Type":"ContainerDied","Data":"e2ff65e5d4d34dde4de40e7d6677d2e6113a135e4da6ec6243cf8d5fb522781a"} Dec 02 23:02:49 crc kubenswrapper[4696]: I1202 
23:02:49.042839 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e2ff65e5d4d34dde4de40e7d6677d2e6113a135e4da6ec6243cf8d5fb522781a" Dec 02 23:02:49 crc kubenswrapper[4696]: I1202 23:02:49.085657 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-9cwtk" Dec 02 23:02:49 crc kubenswrapper[4696]: I1202 23:02:49.244700 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6mvz5\" (UniqueName: \"kubernetes.io/projected/d1f5ea7d-03ba-43ab-8863-9547b016bb0a-kube-api-access-6mvz5\") pod \"d1f5ea7d-03ba-43ab-8863-9547b016bb0a\" (UID: \"d1f5ea7d-03ba-43ab-8863-9547b016bb0a\") " Dec 02 23:02:49 crc kubenswrapper[4696]: I1202 23:02:49.245030 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1f5ea7d-03ba-43ab-8863-9547b016bb0a-combined-ca-bundle\") pod \"d1f5ea7d-03ba-43ab-8863-9547b016bb0a\" (UID: \"d1f5ea7d-03ba-43ab-8863-9547b016bb0a\") " Dec 02 23:02:49 crc kubenswrapper[4696]: I1202 23:02:49.245089 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d1f5ea7d-03ba-43ab-8863-9547b016bb0a-config\") pod \"d1f5ea7d-03ba-43ab-8863-9547b016bb0a\" (UID: \"d1f5ea7d-03ba-43ab-8863-9547b016bb0a\") " Dec 02 23:02:49 crc kubenswrapper[4696]: I1202 23:02:49.268340 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1f5ea7d-03ba-43ab-8863-9547b016bb0a-kube-api-access-6mvz5" (OuterVolumeSpecName: "kube-api-access-6mvz5") pod "d1f5ea7d-03ba-43ab-8863-9547b016bb0a" (UID: "d1f5ea7d-03ba-43ab-8863-9547b016bb0a"). InnerVolumeSpecName "kube-api-access-6mvz5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:02:49 crc kubenswrapper[4696]: I1202 23:02:49.293697 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1f5ea7d-03ba-43ab-8863-9547b016bb0a-config" (OuterVolumeSpecName: "config") pod "d1f5ea7d-03ba-43ab-8863-9547b016bb0a" (UID: "d1f5ea7d-03ba-43ab-8863-9547b016bb0a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:02:49 crc kubenswrapper[4696]: I1202 23:02:49.306579 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1f5ea7d-03ba-43ab-8863-9547b016bb0a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d1f5ea7d-03ba-43ab-8863-9547b016bb0a" (UID: "d1f5ea7d-03ba-43ab-8863-9547b016bb0a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:02:49 crc kubenswrapper[4696]: I1202 23:02:49.347791 4696 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1f5ea7d-03ba-43ab-8863-9547b016bb0a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 23:02:49 crc kubenswrapper[4696]: I1202 23:02:49.347826 4696 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/d1f5ea7d-03ba-43ab-8863-9547b016bb0a-config\") on node \"crc\" DevicePath \"\"" Dec 02 23:02:49 crc kubenswrapper[4696]: I1202 23:02:49.347845 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6mvz5\" (UniqueName: \"kubernetes.io/projected/d1f5ea7d-03ba-43ab-8863-9547b016bb0a-kube-api-access-6mvz5\") on node \"crc\" DevicePath \"\"" Dec 02 23:02:49 crc kubenswrapper[4696]: I1202 23:02:49.371404 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Dec 02 23:02:49 crc kubenswrapper[4696]: I1202 23:02:49.572452 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22eeafc0-fbad-4836-9c47-0f3e01784d91-config-data\") pod \"22eeafc0-fbad-4836-9c47-0f3e01784d91\" (UID: \"22eeafc0-fbad-4836-9c47-0f3e01784d91\") " Dec 02 23:02:49 crc kubenswrapper[4696]: I1202 23:02:49.572509 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/22eeafc0-fbad-4836-9c47-0f3e01784d91-custom-prometheus-ca\") pod \"22eeafc0-fbad-4836-9c47-0f3e01784d91\" (UID: \"22eeafc0-fbad-4836-9c47-0f3e01784d91\") " Dec 02 23:02:49 crc kubenswrapper[4696]: I1202 23:02:49.572798 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bntgf\" (UniqueName: \"kubernetes.io/projected/22eeafc0-fbad-4836-9c47-0f3e01784d91-kube-api-access-bntgf\") pod \"22eeafc0-fbad-4836-9c47-0f3e01784d91\" (UID: \"22eeafc0-fbad-4836-9c47-0f3e01784d91\") " Dec 02 23:02:49 crc kubenswrapper[4696]: I1202 23:02:49.572870 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22eeafc0-fbad-4836-9c47-0f3e01784d91-combined-ca-bundle\") pod \"22eeafc0-fbad-4836-9c47-0f3e01784d91\" (UID: \"22eeafc0-fbad-4836-9c47-0f3e01784d91\") " Dec 02 23:02:49 crc kubenswrapper[4696]: I1202 23:02:49.573044 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/22eeafc0-fbad-4836-9c47-0f3e01784d91-logs\") pod \"22eeafc0-fbad-4836-9c47-0f3e01784d91\" (UID: \"22eeafc0-fbad-4836-9c47-0f3e01784d91\") " Dec 02 23:02:49 crc kubenswrapper[4696]: I1202 23:02:49.575545 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/22eeafc0-fbad-4836-9c47-0f3e01784d91-logs" (OuterVolumeSpecName: "logs") pod "22eeafc0-fbad-4836-9c47-0f3e01784d91" (UID: "22eeafc0-fbad-4836-9c47-0f3e01784d91"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:02:49 crc kubenswrapper[4696]: I1202 23:02:49.587113 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22eeafc0-fbad-4836-9c47-0f3e01784d91-kube-api-access-bntgf" (OuterVolumeSpecName: "kube-api-access-bntgf") pod "22eeafc0-fbad-4836-9c47-0f3e01784d91" (UID: "22eeafc0-fbad-4836-9c47-0f3e01784d91"). InnerVolumeSpecName "kube-api-access-bntgf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:02:49 crc kubenswrapper[4696]: I1202 23:02:49.607190 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22eeafc0-fbad-4836-9c47-0f3e01784d91-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "22eeafc0-fbad-4836-9c47-0f3e01784d91" (UID: "22eeafc0-fbad-4836-9c47-0f3e01784d91"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:02:49 crc kubenswrapper[4696]: I1202 23:02:49.614674 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22eeafc0-fbad-4836-9c47-0f3e01784d91-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "22eeafc0-fbad-4836-9c47-0f3e01784d91" (UID: "22eeafc0-fbad-4836-9c47-0f3e01784d91"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:02:49 crc kubenswrapper[4696]: I1202 23:02:49.639983 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22eeafc0-fbad-4836-9c47-0f3e01784d91-config-data" (OuterVolumeSpecName: "config-data") pod "22eeafc0-fbad-4836-9c47-0f3e01784d91" (UID: "22eeafc0-fbad-4836-9c47-0f3e01784d91"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:02:49 crc kubenswrapper[4696]: I1202 23:02:49.677403 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bntgf\" (UniqueName: \"kubernetes.io/projected/22eeafc0-fbad-4836-9c47-0f3e01784d91-kube-api-access-bntgf\") on node \"crc\" DevicePath \"\"" Dec 02 23:02:49 crc kubenswrapper[4696]: I1202 23:02:49.677454 4696 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22eeafc0-fbad-4836-9c47-0f3e01784d91-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 23:02:49 crc kubenswrapper[4696]: I1202 23:02:49.677469 4696 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/22eeafc0-fbad-4836-9c47-0f3e01784d91-logs\") on node \"crc\" DevicePath \"\"" Dec 02 23:02:49 crc kubenswrapper[4696]: I1202 23:02:49.677482 4696 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22eeafc0-fbad-4836-9c47-0f3e01784d91-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 23:02:49 crc kubenswrapper[4696]: I1202 23:02:49.677496 4696 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/22eeafc0-fbad-4836-9c47-0f3e01784d91-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Dec 02 23:02:49 crc kubenswrapper[4696]: I1202 23:02:49.679156 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-599746d6dd-mg2dx"] Dec 02 23:02:50 crc kubenswrapper[4696]: I1202 23:02:50.058630 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-b2zfs" event={"ID":"b1780f21-2d95-4fdb-9a7c-c2d5b7a9b143","Type":"ContainerStarted","Data":"13b6ba914c44051cdda92cf0f314254cff50d93b66d42259c8ace70da7096401"} Dec 02 23:02:50 crc kubenswrapper[4696]: I1202 23:02:50.071189 4696 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/ceilometer-0" event={"ID":"c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad","Type":"ContainerStarted","Data":"4f185263cdb98259c95dc767a780fa658e7f3450756723a579db215acb58cef3"} Dec 02 23:02:50 crc kubenswrapper[4696]: I1202 23:02:50.091035 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"22eeafc0-fbad-4836-9c47-0f3e01784d91","Type":"ContainerDied","Data":"1be4358aab9c8bb3c3a293040846df0feedb346988decf0e6c80e02e16342a37"} Dec 02 23:02:50 crc kubenswrapper[4696]: I1202 23:02:50.091157 4696 scope.go:117] "RemoveContainer" containerID="44a9e254b6c1dfc6ef34619e2a1d3179f8a6c9b5842f68596e4be12198f93f5c" Dec 02 23:02:50 crc kubenswrapper[4696]: I1202 23:02:50.091407 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Dec 02 23:02:50 crc kubenswrapper[4696]: I1202 23:02:50.095077 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-b2zfs" podStartSLOduration=5.782128294 podStartE2EDuration="59.095058606s" podCreationTimestamp="2025-12-02 23:01:51 +0000 UTC" firstStartedPulling="2025-12-02 23:01:55.928947598 +0000 UTC m=+1178.809627599" lastFinishedPulling="2025-12-02 23:02:49.24187791 +0000 UTC m=+1232.122557911" observedRunningTime="2025-12-02 23:02:50.086841863 +0000 UTC m=+1232.967521864" watchObservedRunningTime="2025-12-02 23:02:50.095058606 +0000 UTC m=+1232.975738607" Dec 02 23:02:50 crc kubenswrapper[4696]: I1202 23:02:50.106446 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-hnh2k" event={"ID":"359eca54-19ad-4e8d-b580-29a37d8f38c8","Type":"ContainerStarted","Data":"dffc07df9b78ef87e3c8ee7f5071ef38e9dd2d9b298f7b929cd9b62013f4afab"} Dec 02 23:02:50 crc kubenswrapper[4696]: I1202 23:02:50.124706 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-9cwtk" Dec 02 23:02:50 crc kubenswrapper[4696]: I1202 23:02:50.125658 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-599746d6dd-mg2dx" event={"ID":"fe44184e-95f9-4a2e-a6a4-e2534c44e933","Type":"ContainerStarted","Data":"80b0a881d76da23262a8b38dafcb9bd75781a3b9d290051b8e21c8e1eadb66c2"} Dec 02 23:02:50 crc kubenswrapper[4696]: I1202 23:02:50.125765 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-599746d6dd-mg2dx" event={"ID":"fe44184e-95f9-4a2e-a6a4-e2534c44e933","Type":"ContainerStarted","Data":"b542747042b2306422797f23c1159734a2a7e388de5dde2b48b19658e364481c"} Dec 02 23:02:50 crc kubenswrapper[4696]: I1202 23:02:50.152983 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-hnh2k" podStartSLOduration=6.165958459 podStartE2EDuration="59.152950899s" podCreationTimestamp="2025-12-02 23:01:51 +0000 UTC" firstStartedPulling="2025-12-02 23:01:55.929033361 +0000 UTC m=+1178.809713362" lastFinishedPulling="2025-12-02 23:02:48.916025801 +0000 UTC m=+1231.796705802" observedRunningTime="2025-12-02 23:02:50.140548977 +0000 UTC m=+1233.021228978" watchObservedRunningTime="2025-12-02 23:02:50.152950899 +0000 UTC m=+1233.033630900" Dec 02 23:02:50 crc kubenswrapper[4696]: I1202 23:02:50.154472 4696 scope.go:117] "RemoveContainer" containerID="454e5848a04308a2a3ae9f54efdaabd6b80f1ac9cbecc24ca0b89cdd25107675" Dec 02 23:02:50 crc kubenswrapper[4696]: I1202 23:02:50.201592 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"] Dec 02 23:02:50 crc kubenswrapper[4696]: I1202 23:02:50.223270 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-api-0"] Dec 02 23:02:50 crc kubenswrapper[4696]: I1202 23:02:50.248764 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-api-0"] Dec 02 23:02:50 crc kubenswrapper[4696]: E1202 23:02:50.249310 
4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1f5ea7d-03ba-43ab-8863-9547b016bb0a" containerName="neutron-db-sync" Dec 02 23:02:50 crc kubenswrapper[4696]: I1202 23:02:50.249334 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1f5ea7d-03ba-43ab-8863-9547b016bb0a" containerName="neutron-db-sync" Dec 02 23:02:50 crc kubenswrapper[4696]: E1202 23:02:50.249361 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22eeafc0-fbad-4836-9c47-0f3e01784d91" containerName="watcher-api" Dec 02 23:02:50 crc kubenswrapper[4696]: I1202 23:02:50.249368 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="22eeafc0-fbad-4836-9c47-0f3e01784d91" containerName="watcher-api" Dec 02 23:02:50 crc kubenswrapper[4696]: E1202 23:02:50.249384 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22eeafc0-fbad-4836-9c47-0f3e01784d91" containerName="watcher-api-log" Dec 02 23:02:50 crc kubenswrapper[4696]: I1202 23:02:50.249391 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="22eeafc0-fbad-4836-9c47-0f3e01784d91" containerName="watcher-api-log" Dec 02 23:02:50 crc kubenswrapper[4696]: I1202 23:02:50.249669 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="22eeafc0-fbad-4836-9c47-0f3e01784d91" containerName="watcher-api-log" Dec 02 23:02:50 crc kubenswrapper[4696]: I1202 23:02:50.249686 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1f5ea7d-03ba-43ab-8863-9547b016bb0a" containerName="neutron-db-sync" Dec 02 23:02:50 crc kubenswrapper[4696]: I1202 23:02:50.249696 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="22eeafc0-fbad-4836-9c47-0f3e01784d91" containerName="watcher-api" Dec 02 23:02:50 crc kubenswrapper[4696]: I1202 23:02:50.251008 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Dec 02 23:02:50 crc kubenswrapper[4696]: I1202 23:02:50.257604 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-watcher-internal-svc" Dec 02 23:02:50 crc kubenswrapper[4696]: I1202 23:02:50.258066 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-watcher-public-svc" Dec 02 23:02:50 crc kubenswrapper[4696]: I1202 23:02:50.262123 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-api-config-data" Dec 02 23:02:50 crc kubenswrapper[4696]: I1202 23:02:50.268630 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Dec 02 23:02:50 crc kubenswrapper[4696]: I1202 23:02:50.359666 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-7g8jh"] Dec 02 23:02:50 crc kubenswrapper[4696]: I1202 23:02:50.363075 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-7g8jh" Dec 02 23:02:50 crc kubenswrapper[4696]: I1202 23:02:50.391396 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-7g8jh"] Dec 02 23:02:50 crc kubenswrapper[4696]: I1202 23:02:50.398392 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ae28daa-ab18-478f-ac27-6be4b2d632d3-public-tls-certs\") pod \"watcher-api-0\" (UID: \"3ae28daa-ab18-478f-ac27-6be4b2d632d3\") " pod="openstack/watcher-api-0" Dec 02 23:02:50 crc kubenswrapper[4696]: I1202 23:02:50.398690 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xsxw5\" (UniqueName: \"kubernetes.io/projected/3ae28daa-ab18-478f-ac27-6be4b2d632d3-kube-api-access-xsxw5\") pod \"watcher-api-0\" (UID: \"3ae28daa-ab18-478f-ac27-6be4b2d632d3\") " pod="openstack/watcher-api-0" Dec 02 
23:02:50 crc kubenswrapper[4696]: I1202 23:02:50.398861 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ae28daa-ab18-478f-ac27-6be4b2d632d3-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"3ae28daa-ab18-478f-ac27-6be4b2d632d3\") " pod="openstack/watcher-api-0" Dec 02 23:02:50 crc kubenswrapper[4696]: I1202 23:02:50.398936 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ae28daa-ab18-478f-ac27-6be4b2d632d3-config-data\") pod \"watcher-api-0\" (UID: \"3ae28daa-ab18-478f-ac27-6be4b2d632d3\") " pod="openstack/watcher-api-0" Dec 02 23:02:50 crc kubenswrapper[4696]: I1202 23:02:50.399019 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/3ae28daa-ab18-478f-ac27-6be4b2d632d3-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"3ae28daa-ab18-478f-ac27-6be4b2d632d3\") " pod="openstack/watcher-api-0" Dec 02 23:02:50 crc kubenswrapper[4696]: I1202 23:02:50.399117 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3ae28daa-ab18-478f-ac27-6be4b2d632d3-logs\") pod \"watcher-api-0\" (UID: \"3ae28daa-ab18-478f-ac27-6be4b2d632d3\") " pod="openstack/watcher-api-0" Dec 02 23:02:50 crc kubenswrapper[4696]: I1202 23:02:50.399181 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ae28daa-ab18-478f-ac27-6be4b2d632d3-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"3ae28daa-ab18-478f-ac27-6be4b2d632d3\") " pod="openstack/watcher-api-0" Dec 02 23:02:50 crc kubenswrapper[4696]: I1202 23:02:50.498172 4696 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/neutron-97757dbdd-59bbj"] Dec 02 23:02:50 crc kubenswrapper[4696]: I1202 23:02:50.500049 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-97757dbdd-59bbj" Dec 02 23:02:50 crc kubenswrapper[4696]: I1202 23:02:50.500708 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/45f60e24-013d-4957-b1c9-537e9fd42efe-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7b667979-7g8jh\" (UID: \"45f60e24-013d-4957-b1c9-537e9fd42efe\") " pod="openstack/dnsmasq-dns-6b7b667979-7g8jh" Dec 02 23:02:50 crc kubenswrapper[4696]: I1202 23:02:50.500782 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pb8c\" (UniqueName: \"kubernetes.io/projected/45f60e24-013d-4957-b1c9-537e9fd42efe-kube-api-access-4pb8c\") pod \"dnsmasq-dns-6b7b667979-7g8jh\" (UID: \"45f60e24-013d-4957-b1c9-537e9fd42efe\") " pod="openstack/dnsmasq-dns-6b7b667979-7g8jh" Dec 02 23:02:50 crc kubenswrapper[4696]: I1202 23:02:50.500820 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ae28daa-ab18-478f-ac27-6be4b2d632d3-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"3ae28daa-ab18-478f-ac27-6be4b2d632d3\") " pod="openstack/watcher-api-0" Dec 02 23:02:50 crc kubenswrapper[4696]: I1202 23:02:50.500918 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ae28daa-ab18-478f-ac27-6be4b2d632d3-config-data\") pod \"watcher-api-0\" (UID: \"3ae28daa-ab18-478f-ac27-6be4b2d632d3\") " pod="openstack/watcher-api-0" Dec 02 23:02:50 crc kubenswrapper[4696]: I1202 23:02:50.500966 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/45f60e24-013d-4957-b1c9-537e9fd42efe-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7b667979-7g8jh\" (UID: \"45f60e24-013d-4957-b1c9-537e9fd42efe\") " pod="openstack/dnsmasq-dns-6b7b667979-7g8jh" Dec 02 23:02:50 crc kubenswrapper[4696]: I1202 23:02:50.501026 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/3ae28daa-ab18-478f-ac27-6be4b2d632d3-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"3ae28daa-ab18-478f-ac27-6be4b2d632d3\") " pod="openstack/watcher-api-0" Dec 02 23:02:50 crc kubenswrapper[4696]: I1202 23:02:50.501142 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3ae28daa-ab18-478f-ac27-6be4b2d632d3-logs\") pod \"watcher-api-0\" (UID: \"3ae28daa-ab18-478f-ac27-6be4b2d632d3\") " pod="openstack/watcher-api-0" Dec 02 23:02:50 crc kubenswrapper[4696]: I1202 23:02:50.501168 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ae28daa-ab18-478f-ac27-6be4b2d632d3-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"3ae28daa-ab18-478f-ac27-6be4b2d632d3\") " pod="openstack/watcher-api-0" Dec 02 23:02:50 crc kubenswrapper[4696]: I1202 23:02:50.501224 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/45f60e24-013d-4957-b1c9-537e9fd42efe-dns-svc\") pod \"dnsmasq-dns-6b7b667979-7g8jh\" (UID: \"45f60e24-013d-4957-b1c9-537e9fd42efe\") " pod="openstack/dnsmasq-dns-6b7b667979-7g8jh" Dec 02 23:02:50 crc kubenswrapper[4696]: I1202 23:02:50.501375 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45f60e24-013d-4957-b1c9-537e9fd42efe-config\") pod \"dnsmasq-dns-6b7b667979-7g8jh\" (UID: 
\"45f60e24-013d-4957-b1c9-537e9fd42efe\") " pod="openstack/dnsmasq-dns-6b7b667979-7g8jh" Dec 02 23:02:50 crc kubenswrapper[4696]: I1202 23:02:50.501471 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ae28daa-ab18-478f-ac27-6be4b2d632d3-public-tls-certs\") pod \"watcher-api-0\" (UID: \"3ae28daa-ab18-478f-ac27-6be4b2d632d3\") " pod="openstack/watcher-api-0" Dec 02 23:02:50 crc kubenswrapper[4696]: I1202 23:02:50.501536 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xsxw5\" (UniqueName: \"kubernetes.io/projected/3ae28daa-ab18-478f-ac27-6be4b2d632d3-kube-api-access-xsxw5\") pod \"watcher-api-0\" (UID: \"3ae28daa-ab18-478f-ac27-6be4b2d632d3\") " pod="openstack/watcher-api-0" Dec 02 23:02:50 crc kubenswrapper[4696]: I1202 23:02:50.501573 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/45f60e24-013d-4957-b1c9-537e9fd42efe-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7b667979-7g8jh\" (UID: \"45f60e24-013d-4957-b1c9-537e9fd42efe\") " pod="openstack/dnsmasq-dns-6b7b667979-7g8jh" Dec 02 23:02:50 crc kubenswrapper[4696]: I1202 23:02:50.503468 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 02 23:02:50 crc kubenswrapper[4696]: I1202 23:02:50.503757 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-ft4bp" Dec 02 23:02:50 crc kubenswrapper[4696]: I1202 23:02:50.503891 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 02 23:02:50 crc kubenswrapper[4696]: I1202 23:02:50.504042 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Dec 02 23:02:50 crc kubenswrapper[4696]: I1202 23:02:50.508906 4696 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3ae28daa-ab18-478f-ac27-6be4b2d632d3-logs\") pod \"watcher-api-0\" (UID: \"3ae28daa-ab18-478f-ac27-6be4b2d632d3\") " pod="openstack/watcher-api-0" Dec 02 23:02:50 crc kubenswrapper[4696]: I1202 23:02:50.515312 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ae28daa-ab18-478f-ac27-6be4b2d632d3-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"3ae28daa-ab18-478f-ac27-6be4b2d632d3\") " pod="openstack/watcher-api-0" Dec 02 23:02:50 crc kubenswrapper[4696]: I1202 23:02:50.519618 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ae28daa-ab18-478f-ac27-6be4b2d632d3-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"3ae28daa-ab18-478f-ac27-6be4b2d632d3\") " pod="openstack/watcher-api-0" Dec 02 23:02:50 crc kubenswrapper[4696]: I1202 23:02:50.519864 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/3ae28daa-ab18-478f-ac27-6be4b2d632d3-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"3ae28daa-ab18-478f-ac27-6be4b2d632d3\") " pod="openstack/watcher-api-0" Dec 02 23:02:50 crc kubenswrapper[4696]: I1202 23:02:50.519897 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ae28daa-ab18-478f-ac27-6be4b2d632d3-config-data\") pod \"watcher-api-0\" (UID: \"3ae28daa-ab18-478f-ac27-6be4b2d632d3\") " pod="openstack/watcher-api-0" Dec 02 23:02:50 crc kubenswrapper[4696]: I1202 23:02:50.522960 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xsxw5\" (UniqueName: \"kubernetes.io/projected/3ae28daa-ab18-478f-ac27-6be4b2d632d3-kube-api-access-xsxw5\") pod \"watcher-api-0\" (UID: \"3ae28daa-ab18-478f-ac27-6be4b2d632d3\") " 
pod="openstack/watcher-api-0" Dec 02 23:02:50 crc kubenswrapper[4696]: I1202 23:02:50.527483 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ae28daa-ab18-478f-ac27-6be4b2d632d3-public-tls-certs\") pod \"watcher-api-0\" (UID: \"3ae28daa-ab18-478f-ac27-6be4b2d632d3\") " pod="openstack/watcher-api-0" Dec 02 23:02:50 crc kubenswrapper[4696]: I1202 23:02:50.530972 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-97757dbdd-59bbj"] Dec 02 23:02:50 crc kubenswrapper[4696]: I1202 23:02:50.598254 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Dec 02 23:02:50 crc kubenswrapper[4696]: I1202 23:02:50.604395 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/45f60e24-013d-4957-b1c9-537e9fd42efe-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7b667979-7g8jh\" (UID: \"45f60e24-013d-4957-b1c9-537e9fd42efe\") " pod="openstack/dnsmasq-dns-6b7b667979-7g8jh" Dec 02 23:02:50 crc kubenswrapper[4696]: I1202 23:02:50.604536 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4pb8c\" (UniqueName: \"kubernetes.io/projected/45f60e24-013d-4957-b1c9-537e9fd42efe-kube-api-access-4pb8c\") pod \"dnsmasq-dns-6b7b667979-7g8jh\" (UID: \"45f60e24-013d-4957-b1c9-537e9fd42efe\") " pod="openstack/dnsmasq-dns-6b7b667979-7g8jh" Dec 02 23:02:50 crc kubenswrapper[4696]: I1202 23:02:50.604619 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b08f301e-8c9b-4d88-9a26-431a6c15a6ca-ovndb-tls-certs\") pod \"neutron-97757dbdd-59bbj\" (UID: \"b08f301e-8c9b-4d88-9a26-431a6c15a6ca\") " pod="openstack/neutron-97757dbdd-59bbj" Dec 02 23:02:50 crc kubenswrapper[4696]: I1202 23:02:50.604663 4696 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/45f60e24-013d-4957-b1c9-537e9fd42efe-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7b667979-7g8jh\" (UID: \"45f60e24-013d-4957-b1c9-537e9fd42efe\") " pod="openstack/dnsmasq-dns-6b7b667979-7g8jh" Dec 02 23:02:50 crc kubenswrapper[4696]: I1202 23:02:50.604719 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b08f301e-8c9b-4d88-9a26-431a6c15a6ca-combined-ca-bundle\") pod \"neutron-97757dbdd-59bbj\" (UID: \"b08f301e-8c9b-4d88-9a26-431a6c15a6ca\") " pod="openstack/neutron-97757dbdd-59bbj" Dec 02 23:02:50 crc kubenswrapper[4696]: I1202 23:02:50.604783 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/45f60e24-013d-4957-b1c9-537e9fd42efe-dns-svc\") pod \"dnsmasq-dns-6b7b667979-7g8jh\" (UID: \"45f60e24-013d-4957-b1c9-537e9fd42efe\") " pod="openstack/dnsmasq-dns-6b7b667979-7g8jh" Dec 02 23:02:50 crc kubenswrapper[4696]: I1202 23:02:50.604831 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pq7jx\" (UniqueName: \"kubernetes.io/projected/b08f301e-8c9b-4d88-9a26-431a6c15a6ca-kube-api-access-pq7jx\") pod \"neutron-97757dbdd-59bbj\" (UID: \"b08f301e-8c9b-4d88-9a26-431a6c15a6ca\") " pod="openstack/neutron-97757dbdd-59bbj" Dec 02 23:02:50 crc kubenswrapper[4696]: I1202 23:02:50.604868 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45f60e24-013d-4957-b1c9-537e9fd42efe-config\") pod \"dnsmasq-dns-6b7b667979-7g8jh\" (UID: \"45f60e24-013d-4957-b1c9-537e9fd42efe\") " pod="openstack/dnsmasq-dns-6b7b667979-7g8jh" Dec 02 23:02:50 crc kubenswrapper[4696]: I1202 23:02:50.604901 4696 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b08f301e-8c9b-4d88-9a26-431a6c15a6ca-httpd-config\") pod \"neutron-97757dbdd-59bbj\" (UID: \"b08f301e-8c9b-4d88-9a26-431a6c15a6ca\") " pod="openstack/neutron-97757dbdd-59bbj" Dec 02 23:02:50 crc kubenswrapper[4696]: I1202 23:02:50.604946 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/45f60e24-013d-4957-b1c9-537e9fd42efe-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7b667979-7g8jh\" (UID: \"45f60e24-013d-4957-b1c9-537e9fd42efe\") " pod="openstack/dnsmasq-dns-6b7b667979-7g8jh" Dec 02 23:02:50 crc kubenswrapper[4696]: I1202 23:02:50.604979 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b08f301e-8c9b-4d88-9a26-431a6c15a6ca-config\") pod \"neutron-97757dbdd-59bbj\" (UID: \"b08f301e-8c9b-4d88-9a26-431a6c15a6ca\") " pod="openstack/neutron-97757dbdd-59bbj" Dec 02 23:02:50 crc kubenswrapper[4696]: I1202 23:02:50.606439 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/45f60e24-013d-4957-b1c9-537e9fd42efe-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7b667979-7g8jh\" (UID: \"45f60e24-013d-4957-b1c9-537e9fd42efe\") " pod="openstack/dnsmasq-dns-6b7b667979-7g8jh" Dec 02 23:02:50 crc kubenswrapper[4696]: I1202 23:02:50.607569 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/45f60e24-013d-4957-b1c9-537e9fd42efe-dns-svc\") pod \"dnsmasq-dns-6b7b667979-7g8jh\" (UID: \"45f60e24-013d-4957-b1c9-537e9fd42efe\") " pod="openstack/dnsmasq-dns-6b7b667979-7g8jh" Dec 02 23:02:50 crc kubenswrapper[4696]: I1202 23:02:50.607904 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/45f60e24-013d-4957-b1c9-537e9fd42efe-config\") pod \"dnsmasq-dns-6b7b667979-7g8jh\" (UID: \"45f60e24-013d-4957-b1c9-537e9fd42efe\") " pod="openstack/dnsmasq-dns-6b7b667979-7g8jh" Dec 02 23:02:50 crc kubenswrapper[4696]: I1202 23:02:50.608212 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/45f60e24-013d-4957-b1c9-537e9fd42efe-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7b667979-7g8jh\" (UID: \"45f60e24-013d-4957-b1c9-537e9fd42efe\") " pod="openstack/dnsmasq-dns-6b7b667979-7g8jh" Dec 02 23:02:50 crc kubenswrapper[4696]: I1202 23:02:50.608459 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/45f60e24-013d-4957-b1c9-537e9fd42efe-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7b667979-7g8jh\" (UID: \"45f60e24-013d-4957-b1c9-537e9fd42efe\") " pod="openstack/dnsmasq-dns-6b7b667979-7g8jh" Dec 02 23:02:50 crc kubenswrapper[4696]: I1202 23:02:50.622342 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pb8c\" (UniqueName: \"kubernetes.io/projected/45f60e24-013d-4957-b1c9-537e9fd42efe-kube-api-access-4pb8c\") pod \"dnsmasq-dns-6b7b667979-7g8jh\" (UID: \"45f60e24-013d-4957-b1c9-537e9fd42efe\") " pod="openstack/dnsmasq-dns-6b7b667979-7g8jh" Dec 02 23:02:50 crc kubenswrapper[4696]: I1202 23:02:50.662481 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 02 23:02:50 crc kubenswrapper[4696]: I1202 23:02:50.662551 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 02 23:02:50 crc kubenswrapper[4696]: I1202 23:02:50.686268 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-7g8jh" Dec 02 23:02:50 crc kubenswrapper[4696]: I1202 23:02:50.706890 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b08f301e-8c9b-4d88-9a26-431a6c15a6ca-combined-ca-bundle\") pod \"neutron-97757dbdd-59bbj\" (UID: \"b08f301e-8c9b-4d88-9a26-431a6c15a6ca\") " pod="openstack/neutron-97757dbdd-59bbj" Dec 02 23:02:50 crc kubenswrapper[4696]: I1202 23:02:50.707347 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pq7jx\" (UniqueName: \"kubernetes.io/projected/b08f301e-8c9b-4d88-9a26-431a6c15a6ca-kube-api-access-pq7jx\") pod \"neutron-97757dbdd-59bbj\" (UID: \"b08f301e-8c9b-4d88-9a26-431a6c15a6ca\") " pod="openstack/neutron-97757dbdd-59bbj" Dec 02 23:02:50 crc kubenswrapper[4696]: I1202 23:02:50.707385 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b08f301e-8c9b-4d88-9a26-431a6c15a6ca-httpd-config\") pod \"neutron-97757dbdd-59bbj\" (UID: \"b08f301e-8c9b-4d88-9a26-431a6c15a6ca\") " pod="openstack/neutron-97757dbdd-59bbj" Dec 02 23:02:50 crc kubenswrapper[4696]: I1202 23:02:50.707431 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b08f301e-8c9b-4d88-9a26-431a6c15a6ca-config\") pod \"neutron-97757dbdd-59bbj\" (UID: \"b08f301e-8c9b-4d88-9a26-431a6c15a6ca\") " pod="openstack/neutron-97757dbdd-59bbj" Dec 02 23:02:50 crc kubenswrapper[4696]: I1202 23:02:50.707512 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b08f301e-8c9b-4d88-9a26-431a6c15a6ca-ovndb-tls-certs\") pod \"neutron-97757dbdd-59bbj\" (UID: \"b08f301e-8c9b-4d88-9a26-431a6c15a6ca\") " pod="openstack/neutron-97757dbdd-59bbj" Dec 02 23:02:50 crc kubenswrapper[4696]: 
I1202 23:02:50.713923 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b08f301e-8c9b-4d88-9a26-431a6c15a6ca-ovndb-tls-certs\") pod \"neutron-97757dbdd-59bbj\" (UID: \"b08f301e-8c9b-4d88-9a26-431a6c15a6ca\") " pod="openstack/neutron-97757dbdd-59bbj" Dec 02 23:02:50 crc kubenswrapper[4696]: I1202 23:02:50.714811 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b08f301e-8c9b-4d88-9a26-431a6c15a6ca-combined-ca-bundle\") pod \"neutron-97757dbdd-59bbj\" (UID: \"b08f301e-8c9b-4d88-9a26-431a6c15a6ca\") " pod="openstack/neutron-97757dbdd-59bbj" Dec 02 23:02:50 crc kubenswrapper[4696]: I1202 23:02:50.717435 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b08f301e-8c9b-4d88-9a26-431a6c15a6ca-httpd-config\") pod \"neutron-97757dbdd-59bbj\" (UID: \"b08f301e-8c9b-4d88-9a26-431a6c15a6ca\") " pod="openstack/neutron-97757dbdd-59bbj" Dec 02 23:02:50 crc kubenswrapper[4696]: I1202 23:02:50.721296 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 02 23:02:50 crc kubenswrapper[4696]: I1202 23:02:50.729800 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pq7jx\" (UniqueName: \"kubernetes.io/projected/b08f301e-8c9b-4d88-9a26-431a6c15a6ca-kube-api-access-pq7jx\") pod \"neutron-97757dbdd-59bbj\" (UID: \"b08f301e-8c9b-4d88-9a26-431a6c15a6ca\") " pod="openstack/neutron-97757dbdd-59bbj" Dec 02 23:02:50 crc kubenswrapper[4696]: I1202 23:02:50.735472 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/b08f301e-8c9b-4d88-9a26-431a6c15a6ca-config\") pod \"neutron-97757dbdd-59bbj\" (UID: \"b08f301e-8c9b-4d88-9a26-431a6c15a6ca\") " pod="openstack/neutron-97757dbdd-59bbj" Dec 02 
23:02:50 crc kubenswrapper[4696]: I1202 23:02:50.742100 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 02 23:02:50 crc kubenswrapper[4696]: I1202 23:02:50.920348 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-97757dbdd-59bbj" Dec 02 23:02:51 crc kubenswrapper[4696]: I1202 23:02:51.219495 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-599746d6dd-mg2dx" event={"ID":"fe44184e-95f9-4a2e-a6a4-e2534c44e933","Type":"ContainerStarted","Data":"6ef29987635521a0e68ec43b512e27031172c6a0de5c4ff28425d8ffe4429bd3"} Dec 02 23:02:51 crc kubenswrapper[4696]: I1202 23:02:51.220887 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-599746d6dd-mg2dx" Dec 02 23:02:51 crc kubenswrapper[4696]: I1202 23:02:51.220956 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-599746d6dd-mg2dx" Dec 02 23:02:51 crc kubenswrapper[4696]: I1202 23:02:51.234048 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 02 23:02:51 crc kubenswrapper[4696]: I1202 23:02:51.234455 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 02 23:02:51 crc kubenswrapper[4696]: I1202 23:02:51.258641 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Dec 02 23:02:51 crc kubenswrapper[4696]: I1202 23:02:51.272615 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-599746d6dd-mg2dx" podStartSLOduration=5.272589027 podStartE2EDuration="5.272589027s" podCreationTimestamp="2025-12-02 23:02:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:02:51.263150399 +0000 UTC 
m=+1234.143830400" watchObservedRunningTime="2025-12-02 23:02:51.272589027 +0000 UTC m=+1234.153269028" Dec 02 23:02:51 crc kubenswrapper[4696]: I1202 23:02:51.378835 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-7g8jh"] Dec 02 23:02:51 crc kubenswrapper[4696]: W1202 23:02:51.414187 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod45f60e24_013d_4957_b1c9_537e9fd42efe.slice/crio-1b8651c4b2995c6b8addf07b5003071dce2f5a58d86dc078f382e14a151a077a WatchSource:0}: Error finding container 1b8651c4b2995c6b8addf07b5003071dce2f5a58d86dc078f382e14a151a077a: Status 404 returned error can't find the container with id 1b8651c4b2995c6b8addf07b5003071dce2f5a58d86dc078f382e14a151a077a Dec 02 23:02:51 crc kubenswrapper[4696]: W1202 23:02:51.418863 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3ae28daa_ab18_478f_ac27_6be4b2d632d3.slice/crio-940ac1e8d7d4aefb09d7234ee21c10b4d5915904580a46dfc43ea875b93ca4a6 WatchSource:0}: Error finding container 940ac1e8d7d4aefb09d7234ee21c10b4d5915904580a46dfc43ea875b93ca4a6: Status 404 returned error can't find the container with id 940ac1e8d7d4aefb09d7234ee21c10b4d5915904580a46dfc43ea875b93ca4a6 Dec 02 23:02:51 crc kubenswrapper[4696]: I1202 23:02:51.481075 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22eeafc0-fbad-4836-9c47-0f3e01784d91" path="/var/lib/kubelet/pods/22eeafc0-fbad-4836-9c47-0f3e01784d91/volumes" Dec 02 23:02:51 crc kubenswrapper[4696]: I1202 23:02:51.963759 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-97757dbdd-59bbj"] Dec 02 23:02:52 crc kubenswrapper[4696]: I1202 23:02:52.259317 4696 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7b448778f6-q69jq" podUID="7b25bdab-8c46-43b8-be48-0e3df0f48c57" containerName="horizon" 
probeResult="failure" output="Get \"https://10.217.0.162:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.162:8443: connect: connection refused" Dec 02 23:02:52 crc kubenswrapper[4696]: I1202 23:02:52.268783 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-97757dbdd-59bbj" event={"ID":"b08f301e-8c9b-4d88-9a26-431a6c15a6ca","Type":"ContainerStarted","Data":"5f5a15cb3050ce6bda3d8a9233fdaaabadff59759dd9cf7b5fda7740281e293c"} Dec 02 23:02:52 crc kubenswrapper[4696]: I1202 23:02:52.291657 4696 generic.go:334] "Generic (PLEG): container finished" podID="45f60e24-013d-4957-b1c9-537e9fd42efe" containerID="dafed8e7edfc81bf61b6a5367a567abda1ffc5e333d6a1657b536927d727873d" exitCode=0 Dec 02 23:02:52 crc kubenswrapper[4696]: I1202 23:02:52.292303 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-7g8jh" event={"ID":"45f60e24-013d-4957-b1c9-537e9fd42efe","Type":"ContainerDied","Data":"dafed8e7edfc81bf61b6a5367a567abda1ffc5e333d6a1657b536927d727873d"} Dec 02 23:02:52 crc kubenswrapper[4696]: I1202 23:02:52.292345 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-7g8jh" event={"ID":"45f60e24-013d-4957-b1c9-537e9fd42efe","Type":"ContainerStarted","Data":"1b8651c4b2995c6b8addf07b5003071dce2f5a58d86dc078f382e14a151a077a"} Dec 02 23:02:52 crc kubenswrapper[4696]: I1202 23:02:52.316676 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"3ae28daa-ab18-478f-ac27-6be4b2d632d3","Type":"ContainerStarted","Data":"07f396be75aa9d714dd3bc9439b558c0502733fb64aa52bf23041e57b16bc0e6"} Dec 02 23:02:52 crc kubenswrapper[4696]: I1202 23:02:52.316731 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"3ae28daa-ab18-478f-ac27-6be4b2d632d3","Type":"ContainerStarted","Data":"940ac1e8d7d4aefb09d7234ee21c10b4d5915904580a46dfc43ea875b93ca4a6"} Dec 02 23:02:52 crc kubenswrapper[4696]: 
I1202 23:02:52.567700 4696 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7c784657c6-hdbrw" podUID="b414fc10-9d51-456b-aaa9-d6b4dd08af99" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.163:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.163:8443: connect: connection refused" Dec 02 23:02:52 crc kubenswrapper[4696]: I1202 23:02:52.973468 4696 patch_prober.go:28] interesting pod/machine-config-daemon-chq65 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 23:02:52 crc kubenswrapper[4696]: I1202 23:02:52.973948 4696 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 23:02:52 crc kubenswrapper[4696]: I1202 23:02:52.986090 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-67dbcf9bdf-2hr7m"] Dec 02 23:02:52 crc kubenswrapper[4696]: I1202 23:02:52.988180 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-67dbcf9bdf-2hr7m" Dec 02 23:02:52 crc kubenswrapper[4696]: I1202 23:02:52.993395 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Dec 02 23:02:52 crc kubenswrapper[4696]: I1202 23:02:52.993570 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Dec 02 23:02:53 crc kubenswrapper[4696]: I1202 23:02:53.012768 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-67dbcf9bdf-2hr7m"] Dec 02 23:02:53 crc kubenswrapper[4696]: I1202 23:02:53.114659 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/797ad679-555c-4599-bc0c-21c0254a3a5a-httpd-config\") pod \"neutron-67dbcf9bdf-2hr7m\" (UID: \"797ad679-555c-4599-bc0c-21c0254a3a5a\") " pod="openstack/neutron-67dbcf9bdf-2hr7m" Dec 02 23:02:53 crc kubenswrapper[4696]: I1202 23:02:53.114723 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/797ad679-555c-4599-bc0c-21c0254a3a5a-internal-tls-certs\") pod \"neutron-67dbcf9bdf-2hr7m\" (UID: \"797ad679-555c-4599-bc0c-21c0254a3a5a\") " pod="openstack/neutron-67dbcf9bdf-2hr7m" Dec 02 23:02:53 crc kubenswrapper[4696]: I1202 23:02:53.114815 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/797ad679-555c-4599-bc0c-21c0254a3a5a-ovndb-tls-certs\") pod \"neutron-67dbcf9bdf-2hr7m\" (UID: \"797ad679-555c-4599-bc0c-21c0254a3a5a\") " pod="openstack/neutron-67dbcf9bdf-2hr7m" Dec 02 23:02:53 crc kubenswrapper[4696]: I1202 23:02:53.115132 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/797ad679-555c-4599-bc0c-21c0254a3a5a-combined-ca-bundle\") pod \"neutron-67dbcf9bdf-2hr7m\" (UID: \"797ad679-555c-4599-bc0c-21c0254a3a5a\") " pod="openstack/neutron-67dbcf9bdf-2hr7m" Dec 02 23:02:53 crc kubenswrapper[4696]: I1202 23:02:53.115277 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/797ad679-555c-4599-bc0c-21c0254a3a5a-config\") pod \"neutron-67dbcf9bdf-2hr7m\" (UID: \"797ad679-555c-4599-bc0c-21c0254a3a5a\") " pod="openstack/neutron-67dbcf9bdf-2hr7m" Dec 02 23:02:53 crc kubenswrapper[4696]: I1202 23:02:53.115308 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/797ad679-555c-4599-bc0c-21c0254a3a5a-public-tls-certs\") pod \"neutron-67dbcf9bdf-2hr7m\" (UID: \"797ad679-555c-4599-bc0c-21c0254a3a5a\") " pod="openstack/neutron-67dbcf9bdf-2hr7m" Dec 02 23:02:53 crc kubenswrapper[4696]: I1202 23:02:53.115564 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54t7s\" (UniqueName: \"kubernetes.io/projected/797ad679-555c-4599-bc0c-21c0254a3a5a-kube-api-access-54t7s\") pod \"neutron-67dbcf9bdf-2hr7m\" (UID: \"797ad679-555c-4599-bc0c-21c0254a3a5a\") " pod="openstack/neutron-67dbcf9bdf-2hr7m" Dec 02 23:02:53 crc kubenswrapper[4696]: I1202 23:02:53.218088 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54t7s\" (UniqueName: \"kubernetes.io/projected/797ad679-555c-4599-bc0c-21c0254a3a5a-kube-api-access-54t7s\") pod \"neutron-67dbcf9bdf-2hr7m\" (UID: \"797ad679-555c-4599-bc0c-21c0254a3a5a\") " pod="openstack/neutron-67dbcf9bdf-2hr7m" Dec 02 23:02:53 crc kubenswrapper[4696]: I1202 23:02:53.218167 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: 
\"kubernetes.io/secret/797ad679-555c-4599-bc0c-21c0254a3a5a-httpd-config\") pod \"neutron-67dbcf9bdf-2hr7m\" (UID: \"797ad679-555c-4599-bc0c-21c0254a3a5a\") " pod="openstack/neutron-67dbcf9bdf-2hr7m" Dec 02 23:02:53 crc kubenswrapper[4696]: I1202 23:02:53.218189 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/797ad679-555c-4599-bc0c-21c0254a3a5a-internal-tls-certs\") pod \"neutron-67dbcf9bdf-2hr7m\" (UID: \"797ad679-555c-4599-bc0c-21c0254a3a5a\") " pod="openstack/neutron-67dbcf9bdf-2hr7m" Dec 02 23:02:53 crc kubenswrapper[4696]: I1202 23:02:53.218223 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/797ad679-555c-4599-bc0c-21c0254a3a5a-ovndb-tls-certs\") pod \"neutron-67dbcf9bdf-2hr7m\" (UID: \"797ad679-555c-4599-bc0c-21c0254a3a5a\") " pod="openstack/neutron-67dbcf9bdf-2hr7m" Dec 02 23:02:53 crc kubenswrapper[4696]: I1202 23:02:53.218311 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/797ad679-555c-4599-bc0c-21c0254a3a5a-combined-ca-bundle\") pod \"neutron-67dbcf9bdf-2hr7m\" (UID: \"797ad679-555c-4599-bc0c-21c0254a3a5a\") " pod="openstack/neutron-67dbcf9bdf-2hr7m" Dec 02 23:02:53 crc kubenswrapper[4696]: I1202 23:02:53.218353 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/797ad679-555c-4599-bc0c-21c0254a3a5a-config\") pod \"neutron-67dbcf9bdf-2hr7m\" (UID: \"797ad679-555c-4599-bc0c-21c0254a3a5a\") " pod="openstack/neutron-67dbcf9bdf-2hr7m" Dec 02 23:02:53 crc kubenswrapper[4696]: I1202 23:02:53.218371 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/797ad679-555c-4599-bc0c-21c0254a3a5a-public-tls-certs\") pod 
\"neutron-67dbcf9bdf-2hr7m\" (UID: \"797ad679-555c-4599-bc0c-21c0254a3a5a\") " pod="openstack/neutron-67dbcf9bdf-2hr7m" Dec 02 23:02:53 crc kubenswrapper[4696]: I1202 23:02:53.230489 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/797ad679-555c-4599-bc0c-21c0254a3a5a-httpd-config\") pod \"neutron-67dbcf9bdf-2hr7m\" (UID: \"797ad679-555c-4599-bc0c-21c0254a3a5a\") " pod="openstack/neutron-67dbcf9bdf-2hr7m" Dec 02 23:02:53 crc kubenswrapper[4696]: I1202 23:02:53.231078 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/797ad679-555c-4599-bc0c-21c0254a3a5a-internal-tls-certs\") pod \"neutron-67dbcf9bdf-2hr7m\" (UID: \"797ad679-555c-4599-bc0c-21c0254a3a5a\") " pod="openstack/neutron-67dbcf9bdf-2hr7m" Dec 02 23:02:53 crc kubenswrapper[4696]: I1202 23:02:53.231712 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/797ad679-555c-4599-bc0c-21c0254a3a5a-combined-ca-bundle\") pod \"neutron-67dbcf9bdf-2hr7m\" (UID: \"797ad679-555c-4599-bc0c-21c0254a3a5a\") " pod="openstack/neutron-67dbcf9bdf-2hr7m" Dec 02 23:02:53 crc kubenswrapper[4696]: I1202 23:02:53.234603 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/797ad679-555c-4599-bc0c-21c0254a3a5a-public-tls-certs\") pod \"neutron-67dbcf9bdf-2hr7m\" (UID: \"797ad679-555c-4599-bc0c-21c0254a3a5a\") " pod="openstack/neutron-67dbcf9bdf-2hr7m" Dec 02 23:02:53 crc kubenswrapper[4696]: I1202 23:02:53.244390 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/797ad679-555c-4599-bc0c-21c0254a3a5a-ovndb-tls-certs\") pod \"neutron-67dbcf9bdf-2hr7m\" (UID: \"797ad679-555c-4599-bc0c-21c0254a3a5a\") " pod="openstack/neutron-67dbcf9bdf-2hr7m" Dec 02 
23:02:53 crc kubenswrapper[4696]: I1202 23:02:53.257660 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/797ad679-555c-4599-bc0c-21c0254a3a5a-config\") pod \"neutron-67dbcf9bdf-2hr7m\" (UID: \"797ad679-555c-4599-bc0c-21c0254a3a5a\") " pod="openstack/neutron-67dbcf9bdf-2hr7m" Dec 02 23:02:53 crc kubenswrapper[4696]: I1202 23:02:53.257819 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54t7s\" (UniqueName: \"kubernetes.io/projected/797ad679-555c-4599-bc0c-21c0254a3a5a-kube-api-access-54t7s\") pod \"neutron-67dbcf9bdf-2hr7m\" (UID: \"797ad679-555c-4599-bc0c-21c0254a3a5a\") " pod="openstack/neutron-67dbcf9bdf-2hr7m" Dec 02 23:02:53 crc kubenswrapper[4696]: I1202 23:02:53.348143 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-67dbcf9bdf-2hr7m" Dec 02 23:02:53 crc kubenswrapper[4696]: I1202 23:02:53.349912 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-97757dbdd-59bbj" event={"ID":"b08f301e-8c9b-4d88-9a26-431a6c15a6ca","Type":"ContainerStarted","Data":"d6d5cbdfb9eb49771c7911e8f5aea5c7f87ec56679cfdb4bad9be2184b1f616d"} Dec 02 23:02:53 crc kubenswrapper[4696]: I1202 23:02:53.349966 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-97757dbdd-59bbj" event={"ID":"b08f301e-8c9b-4d88-9a26-431a6c15a6ca","Type":"ContainerStarted","Data":"60ac838900bf957583ef488ecefb176051ebf222807e7c4ff7aa7356a437b6fb"} Dec 02 23:02:53 crc kubenswrapper[4696]: I1202 23:02:53.350909 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-97757dbdd-59bbj" Dec 02 23:02:53 crc kubenswrapper[4696]: I1202 23:02:53.361994 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-7g8jh" 
event={"ID":"45f60e24-013d-4957-b1c9-537e9fd42efe","Type":"ContainerStarted","Data":"d096b3b0632b18a47b4c8018be56a953a4f7cdfd855a6069b6410f4a45a0bb50"} Dec 02 23:02:53 crc kubenswrapper[4696]: I1202 23:02:53.363660 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6b7b667979-7g8jh" Dec 02 23:02:53 crc kubenswrapper[4696]: I1202 23:02:53.366728 4696 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 02 23:02:53 crc kubenswrapper[4696]: I1202 23:02:53.366765 4696 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 02 23:02:53 crc kubenswrapper[4696]: I1202 23:02:53.368068 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"3ae28daa-ab18-478f-ac27-6be4b2d632d3","Type":"ContainerStarted","Data":"db9a75a5131a707370103409187d0bbaf0864cc2932780061375627a8fa6680e"} Dec 02 23:02:53 crc kubenswrapper[4696]: I1202 23:02:53.368095 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Dec 02 23:02:53 crc kubenswrapper[4696]: I1202 23:02:53.388446 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-97757dbdd-59bbj" podStartSLOduration=3.388420311 podStartE2EDuration="3.388420311s" podCreationTimestamp="2025-12-02 23:02:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:02:53.376095001 +0000 UTC m=+1236.256775002" watchObservedRunningTime="2025-12-02 23:02:53.388420311 +0000 UTC m=+1236.269100312" Dec 02 23:02:53 crc kubenswrapper[4696]: I1202 23:02:53.409894 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6b7b667979-7g8jh" podStartSLOduration=3.40987019 podStartE2EDuration="3.40987019s" podCreationTimestamp="2025-12-02 23:02:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:02:53.409114038 +0000 UTC m=+1236.289794039" watchObservedRunningTime="2025-12-02 23:02:53.40987019 +0000 UTC m=+1236.290550191" Dec 02 23:02:53 crc kubenswrapper[4696]: I1202 23:02:53.465949 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-api-0" podStartSLOduration=3.465927331 podStartE2EDuration="3.465927331s" podCreationTimestamp="2025-12-02 23:02:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:02:53.442394003 +0000 UTC m=+1236.323074014" watchObservedRunningTime="2025-12-02 23:02:53.465927331 +0000 UTC m=+1236.346607332" Dec 02 23:02:54 crc kubenswrapper[4696]: I1202 23:02:54.067266 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-67dbcf9bdf-2hr7m"] Dec 02 23:02:54 crc kubenswrapper[4696]: I1202 23:02:54.275911 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 02 23:02:54 crc kubenswrapper[4696]: I1202 23:02:54.385024 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-67dbcf9bdf-2hr7m" event={"ID":"797ad679-555c-4599-bc0c-21c0254a3a5a","Type":"ContainerStarted","Data":"0328bb78be41d736c4ec3808850fed13987be6b549f1fa479060d4a6cacaf445"} Dec 02 23:02:54 crc kubenswrapper[4696]: I1202 23:02:54.385528 4696 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 02 23:02:54 crc kubenswrapper[4696]: I1202 23:02:54.434654 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 02 23:02:55 crc kubenswrapper[4696]: I1202 23:02:55.403908 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-67dbcf9bdf-2hr7m" 
event={"ID":"797ad679-555c-4599-bc0c-21c0254a3a5a","Type":"ContainerStarted","Data":"f682533baf1194dac2a9758aa98d26b91f2ed49aaa5f4b557048025ad90e71ae"} Dec 02 23:02:55 crc kubenswrapper[4696]: I1202 23:02:55.404512 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-67dbcf9bdf-2hr7m" event={"ID":"797ad679-555c-4599-bc0c-21c0254a3a5a","Type":"ContainerStarted","Data":"72a38648ca523be84723f3310ea897be23b8b0e2f8581698f68d4a04a2d357ac"} Dec 02 23:02:55 crc kubenswrapper[4696]: I1202 23:02:55.404549 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-67dbcf9bdf-2hr7m" Dec 02 23:02:55 crc kubenswrapper[4696]: I1202 23:02:55.431354 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-67dbcf9bdf-2hr7m" podStartSLOduration=3.431320554 podStartE2EDuration="3.431320554s" podCreationTimestamp="2025-12-02 23:02:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:02:55.42731919 +0000 UTC m=+1238.307999191" watchObservedRunningTime="2025-12-02 23:02:55.431320554 +0000 UTC m=+1238.312000565" Dec 02 23:02:55 crc kubenswrapper[4696]: I1202 23:02:55.601874 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Dec 02 23:02:55 crc kubenswrapper[4696]: I1202 23:02:55.602436 4696 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 02 23:02:56 crc kubenswrapper[4696]: I1202 23:02:56.054515 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Dec 02 23:02:56 crc kubenswrapper[4696]: I1202 23:02:56.415885 4696 generic.go:334] "Generic (PLEG): container finished" podID="359eca54-19ad-4e8d-b580-29a37d8f38c8" containerID="dffc07df9b78ef87e3c8ee7f5071ef38e9dd2d9b298f7b929cd9b62013f4afab" exitCode=0 Dec 02 23:02:56 crc kubenswrapper[4696]: I1202 23:02:56.416085 4696 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-hnh2k" event={"ID":"359eca54-19ad-4e8d-b580-29a37d8f38c8","Type":"ContainerDied","Data":"dffc07df9b78ef87e3c8ee7f5071ef38e9dd2d9b298f7b929cd9b62013f4afab"} Dec 02 23:02:59 crc kubenswrapper[4696]: I1202 23:02:59.792485 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-hnh2k" Dec 02 23:02:59 crc kubenswrapper[4696]: I1202 23:02:59.905481 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/359eca54-19ad-4e8d-b580-29a37d8f38c8-combined-ca-bundle\") pod \"359eca54-19ad-4e8d-b580-29a37d8f38c8\" (UID: \"359eca54-19ad-4e8d-b580-29a37d8f38c8\") " Dec 02 23:02:59 crc kubenswrapper[4696]: I1202 23:02:59.905717 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26tll\" (UniqueName: \"kubernetes.io/projected/359eca54-19ad-4e8d-b580-29a37d8f38c8-kube-api-access-26tll\") pod \"359eca54-19ad-4e8d-b580-29a37d8f38c8\" (UID: \"359eca54-19ad-4e8d-b580-29a37d8f38c8\") " Dec 02 23:02:59 crc kubenswrapper[4696]: I1202 23:02:59.905843 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/359eca54-19ad-4e8d-b580-29a37d8f38c8-db-sync-config-data\") pod \"359eca54-19ad-4e8d-b580-29a37d8f38c8\" (UID: \"359eca54-19ad-4e8d-b580-29a37d8f38c8\") " Dec 02 23:02:59 crc kubenswrapper[4696]: I1202 23:02:59.915102 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/359eca54-19ad-4e8d-b580-29a37d8f38c8-kube-api-access-26tll" (OuterVolumeSpecName: "kube-api-access-26tll") pod "359eca54-19ad-4e8d-b580-29a37d8f38c8" (UID: "359eca54-19ad-4e8d-b580-29a37d8f38c8"). InnerVolumeSpecName "kube-api-access-26tll". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:02:59 crc kubenswrapper[4696]: I1202 23:02:59.915638 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/359eca54-19ad-4e8d-b580-29a37d8f38c8-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "359eca54-19ad-4e8d-b580-29a37d8f38c8" (UID: "359eca54-19ad-4e8d-b580-29a37d8f38c8"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:02:59 crc kubenswrapper[4696]: I1202 23:02:59.940653 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/359eca54-19ad-4e8d-b580-29a37d8f38c8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "359eca54-19ad-4e8d-b580-29a37d8f38c8" (UID: "359eca54-19ad-4e8d-b580-29a37d8f38c8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:03:00 crc kubenswrapper[4696]: I1202 23:03:00.007923 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-26tll\" (UniqueName: \"kubernetes.io/projected/359eca54-19ad-4e8d-b580-29a37d8f38c8-kube-api-access-26tll\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:00 crc kubenswrapper[4696]: I1202 23:03:00.007969 4696 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/359eca54-19ad-4e8d-b580-29a37d8f38c8-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:00 crc kubenswrapper[4696]: I1202 23:03:00.007980 4696 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/359eca54-19ad-4e8d-b580-29a37d8f38c8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:00 crc kubenswrapper[4696]: I1202 23:03:00.464400 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-hnh2k" 
event={"ID":"359eca54-19ad-4e8d-b580-29a37d8f38c8","Type":"ContainerDied","Data":"5d5bdda3021bc7f6c80fb39da47a76fc40a3b74b05462efbb48eae842d2f2071"} Dec 02 23:03:00 crc kubenswrapper[4696]: I1202 23:03:00.464820 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5d5bdda3021bc7f6c80fb39da47a76fc40a3b74b05462efbb48eae842d2f2071" Dec 02 23:03:00 crc kubenswrapper[4696]: I1202 23:03:00.464549 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-hnh2k" Dec 02 23:03:00 crc kubenswrapper[4696]: I1202 23:03:00.599687 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-api-0" Dec 02 23:03:00 crc kubenswrapper[4696]: I1202 23:03:00.610854 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-api-0" Dec 02 23:03:00 crc kubenswrapper[4696]: I1202 23:03:00.689073 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6b7b667979-7g8jh" Dec 02 23:03:00 crc kubenswrapper[4696]: I1202 23:03:00.789820 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-qdkb4"] Dec 02 23:03:00 crc kubenswrapper[4696]: I1202 23:03:00.790123 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-56df8fb6b7-qdkb4" podUID="cbc9aaae-ff97-4d62-806d-6823b2cc6da8" containerName="dnsmasq-dns" containerID="cri-o://984a87197dd761ce11008eeb165b37ee156f5be8fbe75aa87c71e67ad095363b" gracePeriod=10 Dec 02 23:03:01 crc kubenswrapper[4696]: I1202 23:03:01.148953 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-95b48849f-64t8k"] Dec 02 23:03:01 crc kubenswrapper[4696]: E1202 23:03:01.149919 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="359eca54-19ad-4e8d-b580-29a37d8f38c8" containerName="barbican-db-sync" Dec 02 23:03:01 crc 
kubenswrapper[4696]: I1202 23:03:01.149936 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="359eca54-19ad-4e8d-b580-29a37d8f38c8" containerName="barbican-db-sync" Dec 02 23:03:01 crc kubenswrapper[4696]: I1202 23:03:01.150140 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="359eca54-19ad-4e8d-b580-29a37d8f38c8" containerName="barbican-db-sync" Dec 02 23:03:01 crc kubenswrapper[4696]: I1202 23:03:01.151374 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-95b48849f-64t8k" Dec 02 23:03:01 crc kubenswrapper[4696]: I1202 23:03:01.154643 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Dec 02 23:03:01 crc kubenswrapper[4696]: I1202 23:03:01.155009 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 02 23:03:01 crc kubenswrapper[4696]: I1202 23:03:01.155125 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-2w9bc" Dec 02 23:03:01 crc kubenswrapper[4696]: I1202 23:03:01.175899 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-7d6f88f57d-fkfhk"] Dec 02 23:03:01 crc kubenswrapper[4696]: I1202 23:03:01.178274 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-7d6f88f57d-fkfhk" Dec 02 23:03:01 crc kubenswrapper[4696]: I1202 23:03:01.183549 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Dec 02 23:03:01 crc kubenswrapper[4696]: I1202 23:03:01.202107 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7d6f88f57d-fkfhk"] Dec 02 23:03:01 crc kubenswrapper[4696]: I1202 23:03:01.226847 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-95b48849f-64t8k"] Dec 02 23:03:01 crc kubenswrapper[4696]: I1202 23:03:01.259945 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/314505e6-5f55-4c07-9692-c5698c6e3ff1-combined-ca-bundle\") pod \"barbican-worker-95b48849f-64t8k\" (UID: \"314505e6-5f55-4c07-9692-c5698c6e3ff1\") " pod="openstack/barbican-worker-95b48849f-64t8k" Dec 02 23:03:01 crc kubenswrapper[4696]: I1202 23:03:01.260014 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4q9h8\" (UniqueName: \"kubernetes.io/projected/314505e6-5f55-4c07-9692-c5698c6e3ff1-kube-api-access-4q9h8\") pod \"barbican-worker-95b48849f-64t8k\" (UID: \"314505e6-5f55-4c07-9692-c5698c6e3ff1\") " pod="openstack/barbican-worker-95b48849f-64t8k" Dec 02 23:03:01 crc kubenswrapper[4696]: I1202 23:03:01.260175 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/314505e6-5f55-4c07-9692-c5698c6e3ff1-config-data-custom\") pod \"barbican-worker-95b48849f-64t8k\" (UID: \"314505e6-5f55-4c07-9692-c5698c6e3ff1\") " pod="openstack/barbican-worker-95b48849f-64t8k" Dec 02 23:03:01 crc kubenswrapper[4696]: I1202 23:03:01.260328 4696 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/314505e6-5f55-4c07-9692-c5698c6e3ff1-config-data\") pod \"barbican-worker-95b48849f-64t8k\" (UID: \"314505e6-5f55-4c07-9692-c5698c6e3ff1\") " pod="openstack/barbican-worker-95b48849f-64t8k" Dec 02 23:03:01 crc kubenswrapper[4696]: I1202 23:03:01.260396 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/314505e6-5f55-4c07-9692-c5698c6e3ff1-logs\") pod \"barbican-worker-95b48849f-64t8k\" (UID: \"314505e6-5f55-4c07-9692-c5698c6e3ff1\") " pod="openstack/barbican-worker-95b48849f-64t8k" Dec 02 23:03:01 crc kubenswrapper[4696]: I1202 23:03:01.329829 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-lhwst"] Dec 02 23:03:01 crc kubenswrapper[4696]: I1202 23:03:01.331589 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-lhwst" Dec 02 23:03:01 crc kubenswrapper[4696]: I1202 23:03:01.336842 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-lhwst"] Dec 02 23:03:01 crc kubenswrapper[4696]: I1202 23:03:01.363668 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/314505e6-5f55-4c07-9692-c5698c6e3ff1-config-data\") pod \"barbican-worker-95b48849f-64t8k\" (UID: \"314505e6-5f55-4c07-9692-c5698c6e3ff1\") " pod="openstack/barbican-worker-95b48849f-64t8k" Dec 02 23:03:01 crc kubenswrapper[4696]: I1202 23:03:01.363732 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/314505e6-5f55-4c07-9692-c5698c6e3ff1-logs\") pod \"barbican-worker-95b48849f-64t8k\" (UID: \"314505e6-5f55-4c07-9692-c5698c6e3ff1\") " pod="openstack/barbican-worker-95b48849f-64t8k" Dec 02 23:03:01 crc 
kubenswrapper[4696]: I1202 23:03:01.363825 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a2e48f4-820d-4199-883c-f7d93f5f12c6-combined-ca-bundle\") pod \"barbican-keystone-listener-7d6f88f57d-fkfhk\" (UID: \"6a2e48f4-820d-4199-883c-f7d93f5f12c6\") " pod="openstack/barbican-keystone-listener-7d6f88f57d-fkfhk" Dec 02 23:03:01 crc kubenswrapper[4696]: I1202 23:03:01.363874 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6a2e48f4-820d-4199-883c-f7d93f5f12c6-config-data-custom\") pod \"barbican-keystone-listener-7d6f88f57d-fkfhk\" (UID: \"6a2e48f4-820d-4199-883c-f7d93f5f12c6\") " pod="openstack/barbican-keystone-listener-7d6f88f57d-fkfhk" Dec 02 23:03:01 crc kubenswrapper[4696]: I1202 23:03:01.363907 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/314505e6-5f55-4c07-9692-c5698c6e3ff1-combined-ca-bundle\") pod \"barbican-worker-95b48849f-64t8k\" (UID: \"314505e6-5f55-4c07-9692-c5698c6e3ff1\") " pod="openstack/barbican-worker-95b48849f-64t8k" Dec 02 23:03:01 crc kubenswrapper[4696]: I1202 23:03:01.363932 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4q9h8\" (UniqueName: \"kubernetes.io/projected/314505e6-5f55-4c07-9692-c5698c6e3ff1-kube-api-access-4q9h8\") pod \"barbican-worker-95b48849f-64t8k\" (UID: \"314505e6-5f55-4c07-9692-c5698c6e3ff1\") " pod="openstack/barbican-worker-95b48849f-64t8k" Dec 02 23:03:01 crc kubenswrapper[4696]: I1202 23:03:01.363968 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dh556\" (UniqueName: \"kubernetes.io/projected/6a2e48f4-820d-4199-883c-f7d93f5f12c6-kube-api-access-dh556\") pod 
\"barbican-keystone-listener-7d6f88f57d-fkfhk\" (UID: \"6a2e48f4-820d-4199-883c-f7d93f5f12c6\") " pod="openstack/barbican-keystone-listener-7d6f88f57d-fkfhk" Dec 02 23:03:01 crc kubenswrapper[4696]: I1202 23:03:01.363996 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a2e48f4-820d-4199-883c-f7d93f5f12c6-config-data\") pod \"barbican-keystone-listener-7d6f88f57d-fkfhk\" (UID: \"6a2e48f4-820d-4199-883c-f7d93f5f12c6\") " pod="openstack/barbican-keystone-listener-7d6f88f57d-fkfhk" Dec 02 23:03:01 crc kubenswrapper[4696]: I1202 23:03:01.364027 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/314505e6-5f55-4c07-9692-c5698c6e3ff1-config-data-custom\") pod \"barbican-worker-95b48849f-64t8k\" (UID: \"314505e6-5f55-4c07-9692-c5698c6e3ff1\") " pod="openstack/barbican-worker-95b48849f-64t8k" Dec 02 23:03:01 crc kubenswrapper[4696]: I1202 23:03:01.364054 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a2e48f4-820d-4199-883c-f7d93f5f12c6-logs\") pod \"barbican-keystone-listener-7d6f88f57d-fkfhk\" (UID: \"6a2e48f4-820d-4199-883c-f7d93f5f12c6\") " pod="openstack/barbican-keystone-listener-7d6f88f57d-fkfhk" Dec 02 23:03:01 crc kubenswrapper[4696]: I1202 23:03:01.374326 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/314505e6-5f55-4c07-9692-c5698c6e3ff1-config-data\") pod \"barbican-worker-95b48849f-64t8k\" (UID: \"314505e6-5f55-4c07-9692-c5698c6e3ff1\") " pod="openstack/barbican-worker-95b48849f-64t8k" Dec 02 23:03:01 crc kubenswrapper[4696]: I1202 23:03:01.374633 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/314505e6-5f55-4c07-9692-c5698c6e3ff1-logs\") pod \"barbican-worker-95b48849f-64t8k\" (UID: \"314505e6-5f55-4c07-9692-c5698c6e3ff1\") " pod="openstack/barbican-worker-95b48849f-64t8k" Dec 02 23:03:01 crc kubenswrapper[4696]: I1202 23:03:01.381187 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-7f8795dd98-pn9n4"] Dec 02 23:03:01 crc kubenswrapper[4696]: I1202 23:03:01.391541 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/314505e6-5f55-4c07-9692-c5698c6e3ff1-combined-ca-bundle\") pod \"barbican-worker-95b48849f-64t8k\" (UID: \"314505e6-5f55-4c07-9692-c5698c6e3ff1\") " pod="openstack/barbican-worker-95b48849f-64t8k" Dec 02 23:03:01 crc kubenswrapper[4696]: I1202 23:03:01.413372 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7f8795dd98-pn9n4" Dec 02 23:03:01 crc kubenswrapper[4696]: I1202 23:03:01.416492 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Dec 02 23:03:01 crc kubenswrapper[4696]: I1202 23:03:01.417399 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4q9h8\" (UniqueName: \"kubernetes.io/projected/314505e6-5f55-4c07-9692-c5698c6e3ff1-kube-api-access-4q9h8\") pod \"barbican-worker-95b48849f-64t8k\" (UID: \"314505e6-5f55-4c07-9692-c5698c6e3ff1\") " pod="openstack/barbican-worker-95b48849f-64t8k" Dec 02 23:03:01 crc kubenswrapper[4696]: I1202 23:03:01.430237 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/314505e6-5f55-4c07-9692-c5698c6e3ff1-config-data-custom\") pod \"barbican-worker-95b48849f-64t8k\" (UID: \"314505e6-5f55-4c07-9692-c5698c6e3ff1\") " pod="openstack/barbican-worker-95b48849f-64t8k" Dec 02 23:03:01 crc kubenswrapper[4696]: I1202 23:03:01.474152 4696 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a2e48f4-820d-4199-883c-f7d93f5f12c6-combined-ca-bundle\") pod \"barbican-keystone-listener-7d6f88f57d-fkfhk\" (UID: \"6a2e48f4-820d-4199-883c-f7d93f5f12c6\") " pod="openstack/barbican-keystone-listener-7d6f88f57d-fkfhk" Dec 02 23:03:01 crc kubenswrapper[4696]: I1202 23:03:01.474210 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0621ebe8-bd33-4ec5-a7fa-e2f0378445b1-dns-swift-storage-0\") pod \"dnsmasq-dns-848cf88cfc-lhwst\" (UID: \"0621ebe8-bd33-4ec5-a7fa-e2f0378445b1\") " pod="openstack/dnsmasq-dns-848cf88cfc-lhwst" Dec 02 23:03:01 crc kubenswrapper[4696]: I1202 23:03:01.474257 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0621ebe8-bd33-4ec5-a7fa-e2f0378445b1-ovsdbserver-nb\") pod \"dnsmasq-dns-848cf88cfc-lhwst\" (UID: \"0621ebe8-bd33-4ec5-a7fa-e2f0378445b1\") " pod="openstack/dnsmasq-dns-848cf88cfc-lhwst" Dec 02 23:03:01 crc kubenswrapper[4696]: I1202 23:03:01.474278 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6a2e48f4-820d-4199-883c-f7d93f5f12c6-config-data-custom\") pod \"barbican-keystone-listener-7d6f88f57d-fkfhk\" (UID: \"6a2e48f4-820d-4199-883c-f7d93f5f12c6\") " pod="openstack/barbican-keystone-listener-7d6f88f57d-fkfhk" Dec 02 23:03:01 crc kubenswrapper[4696]: I1202 23:03:01.474304 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0621ebe8-bd33-4ec5-a7fa-e2f0378445b1-ovsdbserver-sb\") pod \"dnsmasq-dns-848cf88cfc-lhwst\" (UID: \"0621ebe8-bd33-4ec5-a7fa-e2f0378445b1\") " 
pod="openstack/dnsmasq-dns-848cf88cfc-lhwst" Dec 02 23:03:01 crc kubenswrapper[4696]: I1202 23:03:01.474324 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0621ebe8-bd33-4ec5-a7fa-e2f0378445b1-config\") pod \"dnsmasq-dns-848cf88cfc-lhwst\" (UID: \"0621ebe8-bd33-4ec5-a7fa-e2f0378445b1\") " pod="openstack/dnsmasq-dns-848cf88cfc-lhwst" Dec 02 23:03:01 crc kubenswrapper[4696]: I1202 23:03:01.474355 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0621ebe8-bd33-4ec5-a7fa-e2f0378445b1-dns-svc\") pod \"dnsmasq-dns-848cf88cfc-lhwst\" (UID: \"0621ebe8-bd33-4ec5-a7fa-e2f0378445b1\") " pod="openstack/dnsmasq-dns-848cf88cfc-lhwst" Dec 02 23:03:01 crc kubenswrapper[4696]: I1202 23:03:01.474390 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dh556\" (UniqueName: \"kubernetes.io/projected/6a2e48f4-820d-4199-883c-f7d93f5f12c6-kube-api-access-dh556\") pod \"barbican-keystone-listener-7d6f88f57d-fkfhk\" (UID: \"6a2e48f4-820d-4199-883c-f7d93f5f12c6\") " pod="openstack/barbican-keystone-listener-7d6f88f57d-fkfhk" Dec 02 23:03:01 crc kubenswrapper[4696]: I1202 23:03:01.474416 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a2e48f4-820d-4199-883c-f7d93f5f12c6-config-data\") pod \"barbican-keystone-listener-7d6f88f57d-fkfhk\" (UID: \"6a2e48f4-820d-4199-883c-f7d93f5f12c6\") " pod="openstack/barbican-keystone-listener-7d6f88f57d-fkfhk" Dec 02 23:03:01 crc kubenswrapper[4696]: I1202 23:03:01.474452 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a2e48f4-820d-4199-883c-f7d93f5f12c6-logs\") pod \"barbican-keystone-listener-7d6f88f57d-fkfhk\" (UID: 
\"6a2e48f4-820d-4199-883c-f7d93f5f12c6\") " pod="openstack/barbican-keystone-listener-7d6f88f57d-fkfhk" Dec 02 23:03:01 crc kubenswrapper[4696]: I1202 23:03:01.474479 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h956d\" (UniqueName: \"kubernetes.io/projected/0621ebe8-bd33-4ec5-a7fa-e2f0378445b1-kube-api-access-h956d\") pod \"dnsmasq-dns-848cf88cfc-lhwst\" (UID: \"0621ebe8-bd33-4ec5-a7fa-e2f0378445b1\") " pod="openstack/dnsmasq-dns-848cf88cfc-lhwst" Dec 02 23:03:01 crc kubenswrapper[4696]: I1202 23:03:01.493778 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7f8795dd98-pn9n4"] Dec 02 23:03:01 crc kubenswrapper[4696]: I1202 23:03:01.506840 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a2e48f4-820d-4199-883c-f7d93f5f12c6-combined-ca-bundle\") pod \"barbican-keystone-listener-7d6f88f57d-fkfhk\" (UID: \"6a2e48f4-820d-4199-883c-f7d93f5f12c6\") " pod="openstack/barbican-keystone-listener-7d6f88f57d-fkfhk" Dec 02 23:03:01 crc kubenswrapper[4696]: I1202 23:03:01.513243 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6a2e48f4-820d-4199-883c-f7d93f5f12c6-config-data-custom\") pod \"barbican-keystone-listener-7d6f88f57d-fkfhk\" (UID: \"6a2e48f4-820d-4199-883c-f7d93f5f12c6\") " pod="openstack/barbican-keystone-listener-7d6f88f57d-fkfhk" Dec 02 23:03:01 crc kubenswrapper[4696]: I1202 23:03:01.515898 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-95b48849f-64t8k" Dec 02 23:03:01 crc kubenswrapper[4696]: I1202 23:03:01.520611 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a2e48f4-820d-4199-883c-f7d93f5f12c6-logs\") pod \"barbican-keystone-listener-7d6f88f57d-fkfhk\" (UID: \"6a2e48f4-820d-4199-883c-f7d93f5f12c6\") " pod="openstack/barbican-keystone-listener-7d6f88f57d-fkfhk" Dec 02 23:03:01 crc kubenswrapper[4696]: I1202 23:03:01.530270 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dh556\" (UniqueName: \"kubernetes.io/projected/6a2e48f4-820d-4199-883c-f7d93f5f12c6-kube-api-access-dh556\") pod \"barbican-keystone-listener-7d6f88f57d-fkfhk\" (UID: \"6a2e48f4-820d-4199-883c-f7d93f5f12c6\") " pod="openstack/barbican-keystone-listener-7d6f88f57d-fkfhk" Dec 02 23:03:01 crc kubenswrapper[4696]: I1202 23:03:01.535872 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a2e48f4-820d-4199-883c-f7d93f5f12c6-config-data\") pod \"barbican-keystone-listener-7d6f88f57d-fkfhk\" (UID: \"6a2e48f4-820d-4199-883c-f7d93f5f12c6\") " pod="openstack/barbican-keystone-listener-7d6f88f57d-fkfhk" Dec 02 23:03:01 crc kubenswrapper[4696]: I1202 23:03:01.571069 4696 generic.go:334] "Generic (PLEG): container finished" podID="cbc9aaae-ff97-4d62-806d-6823b2cc6da8" containerID="984a87197dd761ce11008eeb165b37ee156f5be8fbe75aa87c71e67ad095363b" exitCode=0 Dec 02 23:03:01 crc kubenswrapper[4696]: I1202 23:03:01.573005 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-qdkb4" event={"ID":"cbc9aaae-ff97-4d62-806d-6823b2cc6da8","Type":"ContainerDied","Data":"984a87197dd761ce11008eeb165b37ee156f5be8fbe75aa87c71e67ad095363b"} Dec 02 23:03:01 crc kubenswrapper[4696]: I1202 23:03:01.576220 4696 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpngr\" (UniqueName: \"kubernetes.io/projected/d897a4ce-a62c-4bde-889d-c7c82cab0569-kube-api-access-fpngr\") pod \"barbican-api-7f8795dd98-pn9n4\" (UID: \"d897a4ce-a62c-4bde-889d-c7c82cab0569\") " pod="openstack/barbican-api-7f8795dd98-pn9n4" Dec 02 23:03:01 crc kubenswrapper[4696]: I1202 23:03:01.576303 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h956d\" (UniqueName: \"kubernetes.io/projected/0621ebe8-bd33-4ec5-a7fa-e2f0378445b1-kube-api-access-h956d\") pod \"dnsmasq-dns-848cf88cfc-lhwst\" (UID: \"0621ebe8-bd33-4ec5-a7fa-e2f0378445b1\") " pod="openstack/dnsmasq-dns-848cf88cfc-lhwst" Dec 02 23:03:01 crc kubenswrapper[4696]: I1202 23:03:01.576403 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d897a4ce-a62c-4bde-889d-c7c82cab0569-logs\") pod \"barbican-api-7f8795dd98-pn9n4\" (UID: \"d897a4ce-a62c-4bde-889d-c7c82cab0569\") " pod="openstack/barbican-api-7f8795dd98-pn9n4" Dec 02 23:03:01 crc kubenswrapper[4696]: I1202 23:03:01.576427 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d897a4ce-a62c-4bde-889d-c7c82cab0569-config-data-custom\") pod \"barbican-api-7f8795dd98-pn9n4\" (UID: \"d897a4ce-a62c-4bde-889d-c7c82cab0569\") " pod="openstack/barbican-api-7f8795dd98-pn9n4" Dec 02 23:03:01 crc kubenswrapper[4696]: I1202 23:03:01.576453 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d897a4ce-a62c-4bde-889d-c7c82cab0569-config-data\") pod \"barbican-api-7f8795dd98-pn9n4\" (UID: \"d897a4ce-a62c-4bde-889d-c7c82cab0569\") " pod="openstack/barbican-api-7f8795dd98-pn9n4" Dec 02 23:03:01 crc kubenswrapper[4696]: I1202 
23:03:01.576477 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0621ebe8-bd33-4ec5-a7fa-e2f0378445b1-dns-swift-storage-0\") pod \"dnsmasq-dns-848cf88cfc-lhwst\" (UID: \"0621ebe8-bd33-4ec5-a7fa-e2f0378445b1\") " pod="openstack/dnsmasq-dns-848cf88cfc-lhwst" Dec 02 23:03:01 crc kubenswrapper[4696]: I1202 23:03:01.576517 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0621ebe8-bd33-4ec5-a7fa-e2f0378445b1-ovsdbserver-nb\") pod \"dnsmasq-dns-848cf88cfc-lhwst\" (UID: \"0621ebe8-bd33-4ec5-a7fa-e2f0378445b1\") " pod="openstack/dnsmasq-dns-848cf88cfc-lhwst" Dec 02 23:03:01 crc kubenswrapper[4696]: I1202 23:03:01.576553 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0621ebe8-bd33-4ec5-a7fa-e2f0378445b1-ovsdbserver-sb\") pod \"dnsmasq-dns-848cf88cfc-lhwst\" (UID: \"0621ebe8-bd33-4ec5-a7fa-e2f0378445b1\") " pod="openstack/dnsmasq-dns-848cf88cfc-lhwst" Dec 02 23:03:01 crc kubenswrapper[4696]: I1202 23:03:01.576574 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0621ebe8-bd33-4ec5-a7fa-e2f0378445b1-config\") pod \"dnsmasq-dns-848cf88cfc-lhwst\" (UID: \"0621ebe8-bd33-4ec5-a7fa-e2f0378445b1\") " pod="openstack/dnsmasq-dns-848cf88cfc-lhwst" Dec 02 23:03:01 crc kubenswrapper[4696]: I1202 23:03:01.576613 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0621ebe8-bd33-4ec5-a7fa-e2f0378445b1-dns-svc\") pod \"dnsmasq-dns-848cf88cfc-lhwst\" (UID: \"0621ebe8-bd33-4ec5-a7fa-e2f0378445b1\") " pod="openstack/dnsmasq-dns-848cf88cfc-lhwst" Dec 02 23:03:01 crc kubenswrapper[4696]: I1202 23:03:01.576637 4696 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d897a4ce-a62c-4bde-889d-c7c82cab0569-combined-ca-bundle\") pod \"barbican-api-7f8795dd98-pn9n4\" (UID: \"d897a4ce-a62c-4bde-889d-c7c82cab0569\") " pod="openstack/barbican-api-7f8795dd98-pn9n4" Dec 02 23:03:01 crc kubenswrapper[4696]: I1202 23:03:01.577889 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0621ebe8-bd33-4ec5-a7fa-e2f0378445b1-dns-swift-storage-0\") pod \"dnsmasq-dns-848cf88cfc-lhwst\" (UID: \"0621ebe8-bd33-4ec5-a7fa-e2f0378445b1\") " pod="openstack/dnsmasq-dns-848cf88cfc-lhwst" Dec 02 23:03:01 crc kubenswrapper[4696]: I1202 23:03:01.577963 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0621ebe8-bd33-4ec5-a7fa-e2f0378445b1-ovsdbserver-nb\") pod \"dnsmasq-dns-848cf88cfc-lhwst\" (UID: \"0621ebe8-bd33-4ec5-a7fa-e2f0378445b1\") " pod="openstack/dnsmasq-dns-848cf88cfc-lhwst" Dec 02 23:03:01 crc kubenswrapper[4696]: I1202 23:03:01.587782 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0621ebe8-bd33-4ec5-a7fa-e2f0378445b1-ovsdbserver-sb\") pod \"dnsmasq-dns-848cf88cfc-lhwst\" (UID: \"0621ebe8-bd33-4ec5-a7fa-e2f0378445b1\") " pod="openstack/dnsmasq-dns-848cf88cfc-lhwst" Dec 02 23:03:01 crc kubenswrapper[4696]: I1202 23:03:01.589514 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0621ebe8-bd33-4ec5-a7fa-e2f0378445b1-config\") pod \"dnsmasq-dns-848cf88cfc-lhwst\" (UID: \"0621ebe8-bd33-4ec5-a7fa-e2f0378445b1\") " pod="openstack/dnsmasq-dns-848cf88cfc-lhwst" Dec 02 23:03:01 crc kubenswrapper[4696]: I1202 23:03:01.594113 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/0621ebe8-bd33-4ec5-a7fa-e2f0378445b1-dns-svc\") pod \"dnsmasq-dns-848cf88cfc-lhwst\" (UID: \"0621ebe8-bd33-4ec5-a7fa-e2f0378445b1\") " pod="openstack/dnsmasq-dns-848cf88cfc-lhwst" Dec 02 23:03:01 crc kubenswrapper[4696]: I1202 23:03:01.623875 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h956d\" (UniqueName: \"kubernetes.io/projected/0621ebe8-bd33-4ec5-a7fa-e2f0378445b1-kube-api-access-h956d\") pod \"dnsmasq-dns-848cf88cfc-lhwst\" (UID: \"0621ebe8-bd33-4ec5-a7fa-e2f0378445b1\") " pod="openstack/dnsmasq-dns-848cf88cfc-lhwst" Dec 02 23:03:01 crc kubenswrapper[4696]: I1202 23:03:01.662426 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-lhwst" Dec 02 23:03:01 crc kubenswrapper[4696]: I1202 23:03:01.684321 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d897a4ce-a62c-4bde-889d-c7c82cab0569-logs\") pod \"barbican-api-7f8795dd98-pn9n4\" (UID: \"d897a4ce-a62c-4bde-889d-c7c82cab0569\") " pod="openstack/barbican-api-7f8795dd98-pn9n4" Dec 02 23:03:01 crc kubenswrapper[4696]: I1202 23:03:01.684384 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d897a4ce-a62c-4bde-889d-c7c82cab0569-config-data-custom\") pod \"barbican-api-7f8795dd98-pn9n4\" (UID: \"d897a4ce-a62c-4bde-889d-c7c82cab0569\") " pod="openstack/barbican-api-7f8795dd98-pn9n4" Dec 02 23:03:01 crc kubenswrapper[4696]: I1202 23:03:01.684409 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d897a4ce-a62c-4bde-889d-c7c82cab0569-config-data\") pod \"barbican-api-7f8795dd98-pn9n4\" (UID: \"d897a4ce-a62c-4bde-889d-c7c82cab0569\") " pod="openstack/barbican-api-7f8795dd98-pn9n4" Dec 02 23:03:01 crc kubenswrapper[4696]: I1202 23:03:01.684500 
4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d897a4ce-a62c-4bde-889d-c7c82cab0569-combined-ca-bundle\") pod \"barbican-api-7f8795dd98-pn9n4\" (UID: \"d897a4ce-a62c-4bde-889d-c7c82cab0569\") " pod="openstack/barbican-api-7f8795dd98-pn9n4" Dec 02 23:03:01 crc kubenswrapper[4696]: I1202 23:03:01.684533 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fpngr\" (UniqueName: \"kubernetes.io/projected/d897a4ce-a62c-4bde-889d-c7c82cab0569-kube-api-access-fpngr\") pod \"barbican-api-7f8795dd98-pn9n4\" (UID: \"d897a4ce-a62c-4bde-889d-c7c82cab0569\") " pod="openstack/barbican-api-7f8795dd98-pn9n4" Dec 02 23:03:01 crc kubenswrapper[4696]: I1202 23:03:01.687681 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d897a4ce-a62c-4bde-889d-c7c82cab0569-logs\") pod \"barbican-api-7f8795dd98-pn9n4\" (UID: \"d897a4ce-a62c-4bde-889d-c7c82cab0569\") " pod="openstack/barbican-api-7f8795dd98-pn9n4" Dec 02 23:03:01 crc kubenswrapper[4696]: I1202 23:03:01.700950 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Dec 02 23:03:01 crc kubenswrapper[4696]: I1202 23:03:01.701562 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d897a4ce-a62c-4bde-889d-c7c82cab0569-config-data-custom\") pod \"barbican-api-7f8795dd98-pn9n4\" (UID: \"d897a4ce-a62c-4bde-889d-c7c82cab0569\") " pod="openstack/barbican-api-7f8795dd98-pn9n4" Dec 02 23:03:01 crc kubenswrapper[4696]: I1202 23:03:01.701671 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d897a4ce-a62c-4bde-889d-c7c82cab0569-config-data\") pod \"barbican-api-7f8795dd98-pn9n4\" (UID: \"d897a4ce-a62c-4bde-889d-c7c82cab0569\") " 
pod="openstack/barbican-api-7f8795dd98-pn9n4" Dec 02 23:03:01 crc kubenswrapper[4696]: I1202 23:03:01.702221 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d897a4ce-a62c-4bde-889d-c7c82cab0569-combined-ca-bundle\") pod \"barbican-api-7f8795dd98-pn9n4\" (UID: \"d897a4ce-a62c-4bde-889d-c7c82cab0569\") " pod="openstack/barbican-api-7f8795dd98-pn9n4" Dec 02 23:03:01 crc kubenswrapper[4696]: I1202 23:03:01.730285 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpngr\" (UniqueName: \"kubernetes.io/projected/d897a4ce-a62c-4bde-889d-c7c82cab0569-kube-api-access-fpngr\") pod \"barbican-api-7f8795dd98-pn9n4\" (UID: \"d897a4ce-a62c-4bde-889d-c7c82cab0569\") " pod="openstack/barbican-api-7f8795dd98-pn9n4" Dec 02 23:03:01 crc kubenswrapper[4696]: I1202 23:03:01.818639 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-qdkb4" Dec 02 23:03:01 crc kubenswrapper[4696]: I1202 23:03:01.838649 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-7d6f88f57d-fkfhk" Dec 02 23:03:01 crc kubenswrapper[4696]: I1202 23:03:01.838791 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7f8795dd98-pn9n4" Dec 02 23:03:01 crc kubenswrapper[4696]: I1202 23:03:01.994012 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-95t7t\" (UniqueName: \"kubernetes.io/projected/cbc9aaae-ff97-4d62-806d-6823b2cc6da8-kube-api-access-95t7t\") pod \"cbc9aaae-ff97-4d62-806d-6823b2cc6da8\" (UID: \"cbc9aaae-ff97-4d62-806d-6823b2cc6da8\") " Dec 02 23:03:01 crc kubenswrapper[4696]: I1202 23:03:01.994196 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cbc9aaae-ff97-4d62-806d-6823b2cc6da8-ovsdbserver-sb\") pod \"cbc9aaae-ff97-4d62-806d-6823b2cc6da8\" (UID: \"cbc9aaae-ff97-4d62-806d-6823b2cc6da8\") " Dec 02 23:03:01 crc kubenswrapper[4696]: I1202 23:03:01.994298 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cbc9aaae-ff97-4d62-806d-6823b2cc6da8-dns-swift-storage-0\") pod \"cbc9aaae-ff97-4d62-806d-6823b2cc6da8\" (UID: \"cbc9aaae-ff97-4d62-806d-6823b2cc6da8\") " Dec 02 23:03:01 crc kubenswrapper[4696]: I1202 23:03:01.994329 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cbc9aaae-ff97-4d62-806d-6823b2cc6da8-dns-svc\") pod \"cbc9aaae-ff97-4d62-806d-6823b2cc6da8\" (UID: \"cbc9aaae-ff97-4d62-806d-6823b2cc6da8\") " Dec 02 23:03:01 crc kubenswrapper[4696]: I1202 23:03:01.994364 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cbc9aaae-ff97-4d62-806d-6823b2cc6da8-ovsdbserver-nb\") pod \"cbc9aaae-ff97-4d62-806d-6823b2cc6da8\" (UID: \"cbc9aaae-ff97-4d62-806d-6823b2cc6da8\") " Dec 02 23:03:01 crc kubenswrapper[4696]: I1202 23:03:01.994414 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/cbc9aaae-ff97-4d62-806d-6823b2cc6da8-config\") pod \"cbc9aaae-ff97-4d62-806d-6823b2cc6da8\" (UID: \"cbc9aaae-ff97-4d62-806d-6823b2cc6da8\") " Dec 02 23:03:02 crc kubenswrapper[4696]: I1202 23:03:02.012141 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbc9aaae-ff97-4d62-806d-6823b2cc6da8-kube-api-access-95t7t" (OuterVolumeSpecName: "kube-api-access-95t7t") pod "cbc9aaae-ff97-4d62-806d-6823b2cc6da8" (UID: "cbc9aaae-ff97-4d62-806d-6823b2cc6da8"). InnerVolumeSpecName "kube-api-access-95t7t". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:03:02 crc kubenswrapper[4696]: I1202 23:03:02.099968 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-95t7t\" (UniqueName: \"kubernetes.io/projected/cbc9aaae-ff97-4d62-806d-6823b2cc6da8-kube-api-access-95t7t\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:02 crc kubenswrapper[4696]: I1202 23:03:02.122207 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cbc9aaae-ff97-4d62-806d-6823b2cc6da8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "cbc9aaae-ff97-4d62-806d-6823b2cc6da8" (UID: "cbc9aaae-ff97-4d62-806d-6823b2cc6da8"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:03:02 crc kubenswrapper[4696]: I1202 23:03:02.130442 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cbc9aaae-ff97-4d62-806d-6823b2cc6da8-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "cbc9aaae-ff97-4d62-806d-6823b2cc6da8" (UID: "cbc9aaae-ff97-4d62-806d-6823b2cc6da8"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:03:02 crc kubenswrapper[4696]: I1202 23:03:02.150511 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cbc9aaae-ff97-4d62-806d-6823b2cc6da8-config" (OuterVolumeSpecName: "config") pod "cbc9aaae-ff97-4d62-806d-6823b2cc6da8" (UID: "cbc9aaae-ff97-4d62-806d-6823b2cc6da8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:03:02 crc kubenswrapper[4696]: I1202 23:03:02.158548 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cbc9aaae-ff97-4d62-806d-6823b2cc6da8-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "cbc9aaae-ff97-4d62-806d-6823b2cc6da8" (UID: "cbc9aaae-ff97-4d62-806d-6823b2cc6da8"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:03:02 crc kubenswrapper[4696]: I1202 23:03:02.207732 4696 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cbc9aaae-ff97-4d62-806d-6823b2cc6da8-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:02 crc kubenswrapper[4696]: I1202 23:03:02.207820 4696 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cbc9aaae-ff97-4d62-806d-6823b2cc6da8-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:02 crc kubenswrapper[4696]: I1202 23:03:02.207832 4696 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cbc9aaae-ff97-4d62-806d-6823b2cc6da8-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:02 crc kubenswrapper[4696]: I1202 23:03:02.207843 4696 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cbc9aaae-ff97-4d62-806d-6823b2cc6da8-config\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:02 crc 
kubenswrapper[4696]: I1202 23:03:02.210587 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cbc9aaae-ff97-4d62-806d-6823b2cc6da8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "cbc9aaae-ff97-4d62-806d-6823b2cc6da8" (UID: "cbc9aaae-ff97-4d62-806d-6823b2cc6da8"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:03:02 crc kubenswrapper[4696]: I1202 23:03:02.309964 4696 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cbc9aaae-ff97-4d62-806d-6823b2cc6da8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:02 crc kubenswrapper[4696]: I1202 23:03:02.428014 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-95b48849f-64t8k"] Dec 02 23:03:02 crc kubenswrapper[4696]: W1202 23:03:02.434532 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod314505e6_5f55_4c07_9692_c5698c6e3ff1.slice/crio-d499eac0f19cfe3f674ecc9989f89c5174b410df77fe69965abaf4bce2bc506c WatchSource:0}: Error finding container d499eac0f19cfe3f674ecc9989f89c5174b410df77fe69965abaf4bce2bc506c: Status 404 returned error can't find the container with id d499eac0f19cfe3f674ecc9989f89c5174b410df77fe69965abaf4bce2bc506c Dec 02 23:03:02 crc kubenswrapper[4696]: I1202 23:03:02.586036 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-qdkb4" event={"ID":"cbc9aaae-ff97-4d62-806d-6823b2cc6da8","Type":"ContainerDied","Data":"3061dc201acc3c3c79b62772aa0b829fe5d3344f64967b419e3592ef1290561c"} Dec 02 23:03:02 crc kubenswrapper[4696]: I1202 23:03:02.586116 4696 scope.go:117] "RemoveContainer" containerID="984a87197dd761ce11008eeb165b37ee156f5be8fbe75aa87c71e67ad095363b" Dec 02 23:03:02 crc kubenswrapper[4696]: I1202 23:03:02.586046 4696 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-qdkb4" Dec 02 23:03:02 crc kubenswrapper[4696]: I1202 23:03:02.594223 4696 generic.go:334] "Generic (PLEG): container finished" podID="b1780f21-2d95-4fdb-9a7c-c2d5b7a9b143" containerID="13b6ba914c44051cdda92cf0f314254cff50d93b66d42259c8ace70da7096401" exitCode=0 Dec 02 23:03:02 crc kubenswrapper[4696]: I1202 23:03:02.594296 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-b2zfs" event={"ID":"b1780f21-2d95-4fdb-9a7c-c2d5b7a9b143","Type":"ContainerDied","Data":"13b6ba914c44051cdda92cf0f314254cff50d93b66d42259c8ace70da7096401"} Dec 02 23:03:02 crc kubenswrapper[4696]: I1202 23:03:02.596326 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-95b48849f-64t8k" event={"ID":"314505e6-5f55-4c07-9692-c5698c6e3ff1","Type":"ContainerStarted","Data":"d499eac0f19cfe3f674ecc9989f89c5174b410df77fe69965abaf4bce2bc506c"} Dec 02 23:03:02 crc kubenswrapper[4696]: I1202 23:03:02.603684 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad" containerName="ceilometer-central-agent" containerID="cri-o://ca9ab61cbfd53bbb03bd1cc23ce0638e4939961f79c7460c61be3f19dae28b05" gracePeriod=30 Dec 02 23:03:02 crc kubenswrapper[4696]: I1202 23:03:02.603812 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad","Type":"ContainerStarted","Data":"2b36491f7352bd945b90566e164559bd33ad96b6ad834115b7b26aea0cb6fe59"} Dec 02 23:03:02 crc kubenswrapper[4696]: I1202 23:03:02.603871 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad" containerName="sg-core" containerID="cri-o://4f185263cdb98259c95dc767a780fa658e7f3450756723a579db215acb58cef3" gracePeriod=30 Dec 02 23:03:02 crc kubenswrapper[4696]: I1202 
23:03:02.603897 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 02 23:03:02 crc kubenswrapper[4696]: I1202 23:03:02.603888 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad" containerName="proxy-httpd" containerID="cri-o://2b36491f7352bd945b90566e164559bd33ad96b6ad834115b7b26aea0cb6fe59" gracePeriod=30 Dec 02 23:03:02 crc kubenswrapper[4696]: I1202 23:03:02.603945 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad" containerName="ceilometer-notification-agent" containerID="cri-o://72aa2ec66948e9e46625b8ea6bbe256cb366995dae9271c36a4a554dc505a5b8" gracePeriod=30 Dec 02 23:03:02 crc kubenswrapper[4696]: I1202 23:03:02.647335 4696 scope.go:117] "RemoveContainer" containerID="0a2b179520dbce686f4792963bffd2d611593f11a805aa18c553c02cd483dfd0" Dec 02 23:03:02 crc kubenswrapper[4696]: I1202 23:03:02.656938 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=6.694048986 podStartE2EDuration="1m11.656917646s" podCreationTimestamp="2025-12-02 23:01:51 +0000 UTC" firstStartedPulling="2025-12-02 23:01:55.899012049 +0000 UTC m=+1178.779692050" lastFinishedPulling="2025-12-02 23:03:00.861880709 +0000 UTC m=+1243.742560710" observedRunningTime="2025-12-02 23:03:02.640516151 +0000 UTC m=+1245.521196152" watchObservedRunningTime="2025-12-02 23:03:02.656917646 +0000 UTC m=+1245.537597647" Dec 02 23:03:02 crc kubenswrapper[4696]: I1202 23:03:02.657143 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-lhwst"] Dec 02 23:03:02 crc kubenswrapper[4696]: I1202 23:03:02.683605 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-qdkb4"] Dec 02 23:03:02 crc kubenswrapper[4696]: I1202 
23:03:02.697347 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-qdkb4"] Dec 02 23:03:02 crc kubenswrapper[4696]: W1202 23:03:02.789415 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6a2e48f4_820d_4199_883c_f7d93f5f12c6.slice/crio-2c380898a63108d944a6b05e2053b41e54768e4064a48259441bc59f1ca4497d WatchSource:0}: Error finding container 2c380898a63108d944a6b05e2053b41e54768e4064a48259441bc59f1ca4497d: Status 404 returned error can't find the container with id 2c380898a63108d944a6b05e2053b41e54768e4064a48259441bc59f1ca4497d Dec 02 23:03:02 crc kubenswrapper[4696]: I1202 23:03:02.821245 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7d6f88f57d-fkfhk"] Dec 02 23:03:02 crc kubenswrapper[4696]: I1202 23:03:02.855551 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7f8795dd98-pn9n4"] Dec 02 23:03:03 crc kubenswrapper[4696]: I1202 23:03:03.444906 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cbc9aaae-ff97-4d62-806d-6823b2cc6da8" path="/var/lib/kubelet/pods/cbc9aaae-ff97-4d62-806d-6823b2cc6da8/volumes" Dec 02 23:03:03 crc kubenswrapper[4696]: I1202 23:03:03.618900 4696 generic.go:334] "Generic (PLEG): container finished" podID="0621ebe8-bd33-4ec5-a7fa-e2f0378445b1" containerID="76fc8ae718b2879950480d5cb6e1327194973dd8b6db0fb00b9eb2d27a4d42cd" exitCode=0 Dec 02 23:03:03 crc kubenswrapper[4696]: I1202 23:03:03.618984 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-lhwst" event={"ID":"0621ebe8-bd33-4ec5-a7fa-e2f0378445b1","Type":"ContainerDied","Data":"76fc8ae718b2879950480d5cb6e1327194973dd8b6db0fb00b9eb2d27a4d42cd"} Dec 02 23:03:03 crc kubenswrapper[4696]: I1202 23:03:03.619025 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-lhwst" 
event={"ID":"0621ebe8-bd33-4ec5-a7fa-e2f0378445b1","Type":"ContainerStarted","Data":"e8b7a89a64af0dce25dba005777f3de0f54223f2d758645839531b7267019dbc"} Dec 02 23:03:03 crc kubenswrapper[4696]: I1202 23:03:03.625149 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7f8795dd98-pn9n4" event={"ID":"d897a4ce-a62c-4bde-889d-c7c82cab0569","Type":"ContainerStarted","Data":"76fe1411adece83854009d383d32c2892877828d13f137e8d83fbf17082809e4"} Dec 02 23:03:03 crc kubenswrapper[4696]: I1202 23:03:03.625226 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7f8795dd98-pn9n4" event={"ID":"d897a4ce-a62c-4bde-889d-c7c82cab0569","Type":"ContainerStarted","Data":"5808956f2468d868426eebb3d36fb11961e1d0742b940753b107c5cd877fb67c"} Dec 02 23:03:03 crc kubenswrapper[4696]: I1202 23:03:03.625240 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7f8795dd98-pn9n4" event={"ID":"d897a4ce-a62c-4bde-889d-c7c82cab0569","Type":"ContainerStarted","Data":"ffe626ddd5ba4bd0146debc6551375d48a40363b07620e239e512ee12c072a56"} Dec 02 23:03:03 crc kubenswrapper[4696]: I1202 23:03:03.626252 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7f8795dd98-pn9n4" Dec 02 23:03:03 crc kubenswrapper[4696]: I1202 23:03:03.626288 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7f8795dd98-pn9n4" Dec 02 23:03:03 crc kubenswrapper[4696]: I1202 23:03:03.631133 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad","Type":"ContainerDied","Data":"2b36491f7352bd945b90566e164559bd33ad96b6ad834115b7b26aea0cb6fe59"} Dec 02 23:03:03 crc kubenswrapper[4696]: I1202 23:03:03.631319 4696 generic.go:334] "Generic (PLEG): container finished" podID="c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad" containerID="2b36491f7352bd945b90566e164559bd33ad96b6ad834115b7b26aea0cb6fe59" 
exitCode=0 Dec 02 23:03:03 crc kubenswrapper[4696]: I1202 23:03:03.631461 4696 generic.go:334] "Generic (PLEG): container finished" podID="c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad" containerID="4f185263cdb98259c95dc767a780fa658e7f3450756723a579db215acb58cef3" exitCode=2 Dec 02 23:03:03 crc kubenswrapper[4696]: I1202 23:03:03.631527 4696 generic.go:334] "Generic (PLEG): container finished" podID="c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad" containerID="ca9ab61cbfd53bbb03bd1cc23ce0638e4939961f79c7460c61be3f19dae28b05" exitCode=0 Dec 02 23:03:03 crc kubenswrapper[4696]: I1202 23:03:03.631596 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad","Type":"ContainerDied","Data":"4f185263cdb98259c95dc767a780fa658e7f3450756723a579db215acb58cef3"} Dec 02 23:03:03 crc kubenswrapper[4696]: I1202 23:03:03.631640 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad","Type":"ContainerDied","Data":"ca9ab61cbfd53bbb03bd1cc23ce0638e4939961f79c7460c61be3f19dae28b05"} Dec 02 23:03:03 crc kubenswrapper[4696]: I1202 23:03:03.634928 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7d6f88f57d-fkfhk" event={"ID":"6a2e48f4-820d-4199-883c-f7d93f5f12c6","Type":"ContainerStarted","Data":"2c380898a63108d944a6b05e2053b41e54768e4064a48259441bc59f1ca4497d"} Dec 02 23:03:03 crc kubenswrapper[4696]: I1202 23:03:03.676626 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-7f8795dd98-pn9n4" podStartSLOduration=2.676599528 podStartE2EDuration="2.676599528s" podCreationTimestamp="2025-12-02 23:03:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:03:03.672121101 +0000 UTC m=+1246.552801102" watchObservedRunningTime="2025-12-02 23:03:03.676599528 +0000 UTC 
m=+1246.557279529" Dec 02 23:03:04 crc kubenswrapper[4696]: I1202 23:03:04.364965 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-b2zfs" Dec 02 23:03:04 crc kubenswrapper[4696]: I1202 23:03:04.478631 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1780f21-2d95-4fdb-9a7c-c2d5b7a9b143-config-data\") pod \"b1780f21-2d95-4fdb-9a7c-c2d5b7a9b143\" (UID: \"b1780f21-2d95-4fdb-9a7c-c2d5b7a9b143\") " Dec 02 23:03:04 crc kubenswrapper[4696]: I1202 23:03:04.478704 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z5nlg\" (UniqueName: \"kubernetes.io/projected/b1780f21-2d95-4fdb-9a7c-c2d5b7a9b143-kube-api-access-z5nlg\") pod \"b1780f21-2d95-4fdb-9a7c-c2d5b7a9b143\" (UID: \"b1780f21-2d95-4fdb-9a7c-c2d5b7a9b143\") " Dec 02 23:03:04 crc kubenswrapper[4696]: I1202 23:03:04.478825 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b1780f21-2d95-4fdb-9a7c-c2d5b7a9b143-db-sync-config-data\") pod \"b1780f21-2d95-4fdb-9a7c-c2d5b7a9b143\" (UID: \"b1780f21-2d95-4fdb-9a7c-c2d5b7a9b143\") " Dec 02 23:03:04 crc kubenswrapper[4696]: I1202 23:03:04.478873 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b1780f21-2d95-4fdb-9a7c-c2d5b7a9b143-etc-machine-id\") pod \"b1780f21-2d95-4fdb-9a7c-c2d5b7a9b143\" (UID: \"b1780f21-2d95-4fdb-9a7c-c2d5b7a9b143\") " Dec 02 23:03:04 crc kubenswrapper[4696]: I1202 23:03:04.479056 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1780f21-2d95-4fdb-9a7c-c2d5b7a9b143-combined-ca-bundle\") pod \"b1780f21-2d95-4fdb-9a7c-c2d5b7a9b143\" (UID: \"b1780f21-2d95-4fdb-9a7c-c2d5b7a9b143\") " Dec 02 
23:03:04 crc kubenswrapper[4696]: I1202 23:03:04.479047 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b1780f21-2d95-4fdb-9a7c-c2d5b7a9b143-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "b1780f21-2d95-4fdb-9a7c-c2d5b7a9b143" (UID: "b1780f21-2d95-4fdb-9a7c-c2d5b7a9b143"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 23:03:04 crc kubenswrapper[4696]: I1202 23:03:04.479131 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1780f21-2d95-4fdb-9a7c-c2d5b7a9b143-scripts\") pod \"b1780f21-2d95-4fdb-9a7c-c2d5b7a9b143\" (UID: \"b1780f21-2d95-4fdb-9a7c-c2d5b7a9b143\") " Dec 02 23:03:04 crc kubenswrapper[4696]: I1202 23:03:04.479615 4696 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b1780f21-2d95-4fdb-9a7c-c2d5b7a9b143-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:04 crc kubenswrapper[4696]: I1202 23:03:04.484017 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1780f21-2d95-4fdb-9a7c-c2d5b7a9b143-kube-api-access-z5nlg" (OuterVolumeSpecName: "kube-api-access-z5nlg") pod "b1780f21-2d95-4fdb-9a7c-c2d5b7a9b143" (UID: "b1780f21-2d95-4fdb-9a7c-c2d5b7a9b143"). InnerVolumeSpecName "kube-api-access-z5nlg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:03:04 crc kubenswrapper[4696]: I1202 23:03:04.485176 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1780f21-2d95-4fdb-9a7c-c2d5b7a9b143-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "b1780f21-2d95-4fdb-9a7c-c2d5b7a9b143" (UID: "b1780f21-2d95-4fdb-9a7c-c2d5b7a9b143"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:03:04 crc kubenswrapper[4696]: I1202 23:03:04.487116 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1780f21-2d95-4fdb-9a7c-c2d5b7a9b143-scripts" (OuterVolumeSpecName: "scripts") pod "b1780f21-2d95-4fdb-9a7c-c2d5b7a9b143" (UID: "b1780f21-2d95-4fdb-9a7c-c2d5b7a9b143"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:03:04 crc kubenswrapper[4696]: I1202 23:03:04.511107 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1780f21-2d95-4fdb-9a7c-c2d5b7a9b143-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b1780f21-2d95-4fdb-9a7c-c2d5b7a9b143" (UID: "b1780f21-2d95-4fdb-9a7c-c2d5b7a9b143"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:03:04 crc kubenswrapper[4696]: I1202 23:03:04.571557 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1780f21-2d95-4fdb-9a7c-c2d5b7a9b143-config-data" (OuterVolumeSpecName: "config-data") pod "b1780f21-2d95-4fdb-9a7c-c2d5b7a9b143" (UID: "b1780f21-2d95-4fdb-9a7c-c2d5b7a9b143"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:03:04 crc kubenswrapper[4696]: I1202 23:03:04.584724 4696 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1780f21-2d95-4fdb-9a7c-c2d5b7a9b143-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:04 crc kubenswrapper[4696]: I1202 23:03:04.584781 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z5nlg\" (UniqueName: \"kubernetes.io/projected/b1780f21-2d95-4fdb-9a7c-c2d5b7a9b143-kube-api-access-z5nlg\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:04 crc kubenswrapper[4696]: I1202 23:03:04.584791 4696 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b1780f21-2d95-4fdb-9a7c-c2d5b7a9b143-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:04 crc kubenswrapper[4696]: I1202 23:03:04.584801 4696 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1780f21-2d95-4fdb-9a7c-c2d5b7a9b143-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:04 crc kubenswrapper[4696]: I1202 23:03:04.584810 4696 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1780f21-2d95-4fdb-9a7c-c2d5b7a9b143-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:04 crc kubenswrapper[4696]: I1202 23:03:04.662803 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-b2zfs" Dec 02 23:03:04 crc kubenswrapper[4696]: I1202 23:03:04.663100 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-b2zfs" event={"ID":"b1780f21-2d95-4fdb-9a7c-c2d5b7a9b143","Type":"ContainerDied","Data":"4ca326dcd7da11fa6df645954d4aa9f5a75e9be9c5c684269c406c4e8cefdddd"} Dec 02 23:03:04 crc kubenswrapper[4696]: I1202 23:03:04.663172 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4ca326dcd7da11fa6df645954d4aa9f5a75e9be9c5c684269c406c4e8cefdddd" Dec 02 23:03:04 crc kubenswrapper[4696]: I1202 23:03:04.683687 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-7b448778f6-q69jq" Dec 02 23:03:04 crc kubenswrapper[4696]: I1202 23:03:04.819847 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 02 23:03:04 crc kubenswrapper[4696]: E1202 23:03:04.820553 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbc9aaae-ff97-4d62-806d-6823b2cc6da8" containerName="dnsmasq-dns" Dec 02 23:03:04 crc kubenswrapper[4696]: I1202 23:03:04.820578 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbc9aaae-ff97-4d62-806d-6823b2cc6da8" containerName="dnsmasq-dns" Dec 02 23:03:04 crc kubenswrapper[4696]: E1202 23:03:04.820609 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbc9aaae-ff97-4d62-806d-6823b2cc6da8" containerName="init" Dec 02 23:03:04 crc kubenswrapper[4696]: I1202 23:03:04.820620 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbc9aaae-ff97-4d62-806d-6823b2cc6da8" containerName="init" Dec 02 23:03:04 crc kubenswrapper[4696]: E1202 23:03:04.820652 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1780f21-2d95-4fdb-9a7c-c2d5b7a9b143" containerName="cinder-db-sync" Dec 02 23:03:04 crc kubenswrapper[4696]: I1202 23:03:04.820661 4696 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="b1780f21-2d95-4fdb-9a7c-c2d5b7a9b143" containerName="cinder-db-sync" Dec 02 23:03:04 crc kubenswrapper[4696]: I1202 23:03:04.820995 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1780f21-2d95-4fdb-9a7c-c2d5b7a9b143" containerName="cinder-db-sync" Dec 02 23:03:04 crc kubenswrapper[4696]: I1202 23:03:04.821029 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbc9aaae-ff97-4d62-806d-6823b2cc6da8" containerName="dnsmasq-dns" Dec 02 23:03:04 crc kubenswrapper[4696]: I1202 23:03:04.822672 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 02 23:03:04 crc kubenswrapper[4696]: I1202 23:03:04.830659 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 02 23:03:04 crc kubenswrapper[4696]: I1202 23:03:04.831161 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 02 23:03:04 crc kubenswrapper[4696]: I1202 23:03:04.831228 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-ml2m5" Dec 02 23:03:04 crc kubenswrapper[4696]: I1202 23:03:04.831666 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 02 23:03:04 crc kubenswrapper[4696]: I1202 23:03:04.832016 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 02 23:03:04 crc kubenswrapper[4696]: I1202 23:03:04.915964 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-lhwst"] Dec 02 23:03:04 crc kubenswrapper[4696]: I1202 23:03:04.957393 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-7c784657c6-hdbrw" Dec 02 23:03:04 crc kubenswrapper[4696]: I1202 23:03:04.963096 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-9m9jd"] Dec 02 23:03:04 crc 
kubenswrapper[4696]: I1202 23:03:04.970493 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-9m9jd" Dec 02 23:03:04 crc kubenswrapper[4696]: I1202 23:03:04.980075 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-9m9jd"] Dec 02 23:03:04 crc kubenswrapper[4696]: I1202 23:03:04.996293 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e00a6503-3712-48a7-b008-3aa85beb4445-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e00a6503-3712-48a7-b008-3aa85beb4445\") " pod="openstack/cinder-scheduler-0" Dec 02 23:03:04 crc kubenswrapper[4696]: I1202 23:03:04.996377 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e00a6503-3712-48a7-b008-3aa85beb4445-scripts\") pod \"cinder-scheduler-0\" (UID: \"e00a6503-3712-48a7-b008-3aa85beb4445\") " pod="openstack/cinder-scheduler-0" Dec 02 23:03:04 crc kubenswrapper[4696]: I1202 23:03:04.996421 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e00a6503-3712-48a7-b008-3aa85beb4445-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e00a6503-3712-48a7-b008-3aa85beb4445\") " pod="openstack/cinder-scheduler-0" Dec 02 23:03:04 crc kubenswrapper[4696]: I1202 23:03:04.996462 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tt5g\" (UniqueName: \"kubernetes.io/projected/e00a6503-3712-48a7-b008-3aa85beb4445-kube-api-access-6tt5g\") pod \"cinder-scheduler-0\" (UID: \"e00a6503-3712-48a7-b008-3aa85beb4445\") " pod="openstack/cinder-scheduler-0" Dec 02 23:03:04 crc kubenswrapper[4696]: I1202 23:03:04.996509 4696 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e00a6503-3712-48a7-b008-3aa85beb4445-config-data\") pod \"cinder-scheduler-0\" (UID: \"e00a6503-3712-48a7-b008-3aa85beb4445\") " pod="openstack/cinder-scheduler-0" Dec 02 23:03:04 crc kubenswrapper[4696]: I1202 23:03:04.996536 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e00a6503-3712-48a7-b008-3aa85beb4445-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e00a6503-3712-48a7-b008-3aa85beb4445\") " pod="openstack/cinder-scheduler-0" Dec 02 23:03:05 crc kubenswrapper[4696]: I1202 23:03:05.098996 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/61f3e755-0ed7-4e18-aa16-11e0ebc89957-dns-svc\") pod \"dnsmasq-dns-6578955fd5-9m9jd\" (UID: \"61f3e755-0ed7-4e18-aa16-11e0ebc89957\") " pod="openstack/dnsmasq-dns-6578955fd5-9m9jd" Dec 02 23:03:05 crc kubenswrapper[4696]: I1202 23:03:05.099106 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6tt5g\" (UniqueName: \"kubernetes.io/projected/e00a6503-3712-48a7-b008-3aa85beb4445-kube-api-access-6tt5g\") pod \"cinder-scheduler-0\" (UID: \"e00a6503-3712-48a7-b008-3aa85beb4445\") " pod="openstack/cinder-scheduler-0" Dec 02 23:03:05 crc kubenswrapper[4696]: I1202 23:03:05.099163 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e00a6503-3712-48a7-b008-3aa85beb4445-config-data\") pod \"cinder-scheduler-0\" (UID: \"e00a6503-3712-48a7-b008-3aa85beb4445\") " pod="openstack/cinder-scheduler-0" Dec 02 23:03:05 crc kubenswrapper[4696]: I1202 23:03:05.099186 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-6rm2k\" (UniqueName: \"kubernetes.io/projected/61f3e755-0ed7-4e18-aa16-11e0ebc89957-kube-api-access-6rm2k\") pod \"dnsmasq-dns-6578955fd5-9m9jd\" (UID: \"61f3e755-0ed7-4e18-aa16-11e0ebc89957\") " pod="openstack/dnsmasq-dns-6578955fd5-9m9jd" Dec 02 23:03:05 crc kubenswrapper[4696]: I1202 23:03:05.099219 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e00a6503-3712-48a7-b008-3aa85beb4445-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e00a6503-3712-48a7-b008-3aa85beb4445\") " pod="openstack/cinder-scheduler-0" Dec 02 23:03:05 crc kubenswrapper[4696]: I1202 23:03:05.099276 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e00a6503-3712-48a7-b008-3aa85beb4445-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e00a6503-3712-48a7-b008-3aa85beb4445\") " pod="openstack/cinder-scheduler-0" Dec 02 23:03:05 crc kubenswrapper[4696]: I1202 23:03:05.099292 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/61f3e755-0ed7-4e18-aa16-11e0ebc89957-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-9m9jd\" (UID: \"61f3e755-0ed7-4e18-aa16-11e0ebc89957\") " pod="openstack/dnsmasq-dns-6578955fd5-9m9jd" Dec 02 23:03:05 crc kubenswrapper[4696]: I1202 23:03:05.099350 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/61f3e755-0ed7-4e18-aa16-11e0ebc89957-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-9m9jd\" (UID: \"61f3e755-0ed7-4e18-aa16-11e0ebc89957\") " pod="openstack/dnsmasq-dns-6578955fd5-9m9jd" Dec 02 23:03:05 crc kubenswrapper[4696]: I1202 23:03:05.099368 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/e00a6503-3712-48a7-b008-3aa85beb4445-scripts\") pod \"cinder-scheduler-0\" (UID: \"e00a6503-3712-48a7-b008-3aa85beb4445\") " pod="openstack/cinder-scheduler-0" Dec 02 23:03:05 crc kubenswrapper[4696]: I1202 23:03:05.099400 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/61f3e755-0ed7-4e18-aa16-11e0ebc89957-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-9m9jd\" (UID: \"61f3e755-0ed7-4e18-aa16-11e0ebc89957\") " pod="openstack/dnsmasq-dns-6578955fd5-9m9jd" Dec 02 23:03:05 crc kubenswrapper[4696]: I1202 23:03:05.099427 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e00a6503-3712-48a7-b008-3aa85beb4445-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e00a6503-3712-48a7-b008-3aa85beb4445\") " pod="openstack/cinder-scheduler-0" Dec 02 23:03:05 crc kubenswrapper[4696]: I1202 23:03:05.099459 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61f3e755-0ed7-4e18-aa16-11e0ebc89957-config\") pod \"dnsmasq-dns-6578955fd5-9m9jd\" (UID: \"61f3e755-0ed7-4e18-aa16-11e0ebc89957\") " pod="openstack/dnsmasq-dns-6578955fd5-9m9jd" Dec 02 23:03:05 crc kubenswrapper[4696]: I1202 23:03:05.107191 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e00a6503-3712-48a7-b008-3aa85beb4445-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e00a6503-3712-48a7-b008-3aa85beb4445\") " pod="openstack/cinder-scheduler-0" Dec 02 23:03:05 crc kubenswrapper[4696]: I1202 23:03:05.117660 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e00a6503-3712-48a7-b008-3aa85beb4445-scripts\") pod \"cinder-scheduler-0\" (UID: 
\"e00a6503-3712-48a7-b008-3aa85beb4445\") " pod="openstack/cinder-scheduler-0" Dec 02 23:03:05 crc kubenswrapper[4696]: I1202 23:03:05.122930 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e00a6503-3712-48a7-b008-3aa85beb4445-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e00a6503-3712-48a7-b008-3aa85beb4445\") " pod="openstack/cinder-scheduler-0" Dec 02 23:03:05 crc kubenswrapper[4696]: I1202 23:03:05.123907 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e00a6503-3712-48a7-b008-3aa85beb4445-config-data\") pod \"cinder-scheduler-0\" (UID: \"e00a6503-3712-48a7-b008-3aa85beb4445\") " pod="openstack/cinder-scheduler-0" Dec 02 23:03:05 crc kubenswrapper[4696]: I1202 23:03:05.124293 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e00a6503-3712-48a7-b008-3aa85beb4445-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e00a6503-3712-48a7-b008-3aa85beb4445\") " pod="openstack/cinder-scheduler-0" Dec 02 23:03:05 crc kubenswrapper[4696]: I1202 23:03:05.136734 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tt5g\" (UniqueName: \"kubernetes.io/projected/e00a6503-3712-48a7-b008-3aa85beb4445-kube-api-access-6tt5g\") pod \"cinder-scheduler-0\" (UID: \"e00a6503-3712-48a7-b008-3aa85beb4445\") " pod="openstack/cinder-scheduler-0" Dec 02 23:03:05 crc kubenswrapper[4696]: I1202 23:03:05.143496 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 02 23:03:05 crc kubenswrapper[4696]: I1202 23:03:05.202295 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/61f3e755-0ed7-4e18-aa16-11e0ebc89957-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-9m9jd\" (UID: \"61f3e755-0ed7-4e18-aa16-11e0ebc89957\") " pod="openstack/dnsmasq-dns-6578955fd5-9m9jd" Dec 02 23:03:05 crc kubenswrapper[4696]: I1202 23:03:05.202363 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61f3e755-0ed7-4e18-aa16-11e0ebc89957-config\") pod \"dnsmasq-dns-6578955fd5-9m9jd\" (UID: \"61f3e755-0ed7-4e18-aa16-11e0ebc89957\") " pod="openstack/dnsmasq-dns-6578955fd5-9m9jd" Dec 02 23:03:05 crc kubenswrapper[4696]: I1202 23:03:05.202398 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/61f3e755-0ed7-4e18-aa16-11e0ebc89957-dns-svc\") pod \"dnsmasq-dns-6578955fd5-9m9jd\" (UID: \"61f3e755-0ed7-4e18-aa16-11e0ebc89957\") " pod="openstack/dnsmasq-dns-6578955fd5-9m9jd" Dec 02 23:03:05 crc kubenswrapper[4696]: I1202 23:03:05.202459 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rm2k\" (UniqueName: \"kubernetes.io/projected/61f3e755-0ed7-4e18-aa16-11e0ebc89957-kube-api-access-6rm2k\") pod \"dnsmasq-dns-6578955fd5-9m9jd\" (UID: \"61f3e755-0ed7-4e18-aa16-11e0ebc89957\") " pod="openstack/dnsmasq-dns-6578955fd5-9m9jd" Dec 02 23:03:05 crc kubenswrapper[4696]: I1202 23:03:05.202510 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/61f3e755-0ed7-4e18-aa16-11e0ebc89957-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-9m9jd\" (UID: \"61f3e755-0ed7-4e18-aa16-11e0ebc89957\") " pod="openstack/dnsmasq-dns-6578955fd5-9m9jd" Dec 02 
23:03:05 crc kubenswrapper[4696]: I1202 23:03:05.202557 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/61f3e755-0ed7-4e18-aa16-11e0ebc89957-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-9m9jd\" (UID: \"61f3e755-0ed7-4e18-aa16-11e0ebc89957\") " pod="openstack/dnsmasq-dns-6578955fd5-9m9jd" Dec 02 23:03:05 crc kubenswrapper[4696]: I1202 23:03:05.204285 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/61f3e755-0ed7-4e18-aa16-11e0ebc89957-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-9m9jd\" (UID: \"61f3e755-0ed7-4e18-aa16-11e0ebc89957\") " pod="openstack/dnsmasq-dns-6578955fd5-9m9jd" Dec 02 23:03:05 crc kubenswrapper[4696]: I1202 23:03:05.204781 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/61f3e755-0ed7-4e18-aa16-11e0ebc89957-dns-svc\") pod \"dnsmasq-dns-6578955fd5-9m9jd\" (UID: \"61f3e755-0ed7-4e18-aa16-11e0ebc89957\") " pod="openstack/dnsmasq-dns-6578955fd5-9m9jd" Dec 02 23:03:05 crc kubenswrapper[4696]: I1202 23:03:05.204860 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61f3e755-0ed7-4e18-aa16-11e0ebc89957-config\") pod \"dnsmasq-dns-6578955fd5-9m9jd\" (UID: \"61f3e755-0ed7-4e18-aa16-11e0ebc89957\") " pod="openstack/dnsmasq-dns-6578955fd5-9m9jd" Dec 02 23:03:05 crc kubenswrapper[4696]: I1202 23:03:05.204975 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/61f3e755-0ed7-4e18-aa16-11e0ebc89957-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-9m9jd\" (UID: \"61f3e755-0ed7-4e18-aa16-11e0ebc89957\") " pod="openstack/dnsmasq-dns-6578955fd5-9m9jd" Dec 02 23:03:05 crc kubenswrapper[4696]: I1202 23:03:05.205727 4696 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/61f3e755-0ed7-4e18-aa16-11e0ebc89957-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-9m9jd\" (UID: \"61f3e755-0ed7-4e18-aa16-11e0ebc89957\") " pod="openstack/dnsmasq-dns-6578955fd5-9m9jd" Dec 02 23:03:05 crc kubenswrapper[4696]: I1202 23:03:05.211370 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 02 23:03:05 crc kubenswrapper[4696]: I1202 23:03:05.214033 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 02 23:03:05 crc kubenswrapper[4696]: I1202 23:03:05.217120 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 02 23:03:05 crc kubenswrapper[4696]: I1202 23:03:05.227921 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rm2k\" (UniqueName: \"kubernetes.io/projected/61f3e755-0ed7-4e18-aa16-11e0ebc89957-kube-api-access-6rm2k\") pod \"dnsmasq-dns-6578955fd5-9m9jd\" (UID: \"61f3e755-0ed7-4e18-aa16-11e0ebc89957\") " pod="openstack/dnsmasq-dns-6578955fd5-9m9jd" Dec 02 23:03:05 crc kubenswrapper[4696]: I1202 23:03:05.232769 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 02 23:03:05 crc kubenswrapper[4696]: I1202 23:03:05.318830 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a20e4421-fec8-4f5a-8699-9d17d911f14c-config-data\") pod \"cinder-api-0\" (UID: \"a20e4421-fec8-4f5a-8699-9d17d911f14c\") " pod="openstack/cinder-api-0" Dec 02 23:03:05 crc kubenswrapper[4696]: I1202 23:03:05.318991 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a20e4421-fec8-4f5a-8699-9d17d911f14c-scripts\") pod \"cinder-api-0\" (UID: 
\"a20e4421-fec8-4f5a-8699-9d17d911f14c\") " pod="openstack/cinder-api-0" Dec 02 23:03:05 crc kubenswrapper[4696]: I1202 23:03:05.319225 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5pdw\" (UniqueName: \"kubernetes.io/projected/a20e4421-fec8-4f5a-8699-9d17d911f14c-kube-api-access-p5pdw\") pod \"cinder-api-0\" (UID: \"a20e4421-fec8-4f5a-8699-9d17d911f14c\") " pod="openstack/cinder-api-0" Dec 02 23:03:05 crc kubenswrapper[4696]: I1202 23:03:05.319319 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a20e4421-fec8-4f5a-8699-9d17d911f14c-logs\") pod \"cinder-api-0\" (UID: \"a20e4421-fec8-4f5a-8699-9d17d911f14c\") " pod="openstack/cinder-api-0" Dec 02 23:03:05 crc kubenswrapper[4696]: I1202 23:03:05.319399 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a20e4421-fec8-4f5a-8699-9d17d911f14c-config-data-custom\") pod \"cinder-api-0\" (UID: \"a20e4421-fec8-4f5a-8699-9d17d911f14c\") " pod="openstack/cinder-api-0" Dec 02 23:03:05 crc kubenswrapper[4696]: I1202 23:03:05.319439 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a20e4421-fec8-4f5a-8699-9d17d911f14c-etc-machine-id\") pod \"cinder-api-0\" (UID: \"a20e4421-fec8-4f5a-8699-9d17d911f14c\") " pod="openstack/cinder-api-0" Dec 02 23:03:05 crc kubenswrapper[4696]: I1202 23:03:05.319472 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a20e4421-fec8-4f5a-8699-9d17d911f14c-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"a20e4421-fec8-4f5a-8699-9d17d911f14c\") " pod="openstack/cinder-api-0" Dec 02 23:03:05 crc kubenswrapper[4696]: 
I1202 23:03:05.332192 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-9m9jd" Dec 02 23:03:05 crc kubenswrapper[4696]: I1202 23:03:05.369403 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-748c8487f8-gqxg9"] Dec 02 23:03:05 crc kubenswrapper[4696]: I1202 23:03:05.375397 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-748c8487f8-gqxg9" Dec 02 23:03:05 crc kubenswrapper[4696]: I1202 23:03:05.391422 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Dec 02 23:03:05 crc kubenswrapper[4696]: I1202 23:03:05.391617 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Dec 02 23:03:05 crc kubenswrapper[4696]: I1202 23:03:05.397310 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-748c8487f8-gqxg9"] Dec 02 23:03:05 crc kubenswrapper[4696]: I1202 23:03:05.421030 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a20e4421-fec8-4f5a-8699-9d17d911f14c-etc-machine-id\") pod \"cinder-api-0\" (UID: \"a20e4421-fec8-4f5a-8699-9d17d911f14c\") " pod="openstack/cinder-api-0" Dec 02 23:03:05 crc kubenswrapper[4696]: I1202 23:03:05.421080 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a20e4421-fec8-4f5a-8699-9d17d911f14c-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"a20e4421-fec8-4f5a-8699-9d17d911f14c\") " pod="openstack/cinder-api-0" Dec 02 23:03:05 crc kubenswrapper[4696]: I1202 23:03:05.421128 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a20e4421-fec8-4f5a-8699-9d17d911f14c-config-data\") pod \"cinder-api-0\" (UID: 
\"a20e4421-fec8-4f5a-8699-9d17d911f14c\") " pod="openstack/cinder-api-0" Dec 02 23:03:05 crc kubenswrapper[4696]: I1202 23:03:05.421185 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a20e4421-fec8-4f5a-8699-9d17d911f14c-scripts\") pod \"cinder-api-0\" (UID: \"a20e4421-fec8-4f5a-8699-9d17d911f14c\") " pod="openstack/cinder-api-0" Dec 02 23:03:05 crc kubenswrapper[4696]: I1202 23:03:05.421262 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5pdw\" (UniqueName: \"kubernetes.io/projected/a20e4421-fec8-4f5a-8699-9d17d911f14c-kube-api-access-p5pdw\") pod \"cinder-api-0\" (UID: \"a20e4421-fec8-4f5a-8699-9d17d911f14c\") " pod="openstack/cinder-api-0" Dec 02 23:03:05 crc kubenswrapper[4696]: I1202 23:03:05.421307 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a20e4421-fec8-4f5a-8699-9d17d911f14c-logs\") pod \"cinder-api-0\" (UID: \"a20e4421-fec8-4f5a-8699-9d17d911f14c\") " pod="openstack/cinder-api-0" Dec 02 23:03:05 crc kubenswrapper[4696]: I1202 23:03:05.421340 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a20e4421-fec8-4f5a-8699-9d17d911f14c-config-data-custom\") pod \"cinder-api-0\" (UID: \"a20e4421-fec8-4f5a-8699-9d17d911f14c\") " pod="openstack/cinder-api-0" Dec 02 23:03:05 crc kubenswrapper[4696]: I1202 23:03:05.430983 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a20e4421-fec8-4f5a-8699-9d17d911f14c-logs\") pod \"cinder-api-0\" (UID: \"a20e4421-fec8-4f5a-8699-9d17d911f14c\") " pod="openstack/cinder-api-0" Dec 02 23:03:05 crc kubenswrapper[4696]: I1202 23:03:05.431120 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/a20e4421-fec8-4f5a-8699-9d17d911f14c-config-data-custom\") pod \"cinder-api-0\" (UID: \"a20e4421-fec8-4f5a-8699-9d17d911f14c\") " pod="openstack/cinder-api-0" Dec 02 23:03:05 crc kubenswrapper[4696]: I1202 23:03:05.435292 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a20e4421-fec8-4f5a-8699-9d17d911f14c-etc-machine-id\") pod \"cinder-api-0\" (UID: \"a20e4421-fec8-4f5a-8699-9d17d911f14c\") " pod="openstack/cinder-api-0" Dec 02 23:03:05 crc kubenswrapper[4696]: I1202 23:03:05.436595 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a20e4421-fec8-4f5a-8699-9d17d911f14c-config-data\") pod \"cinder-api-0\" (UID: \"a20e4421-fec8-4f5a-8699-9d17d911f14c\") " pod="openstack/cinder-api-0" Dec 02 23:03:05 crc kubenswrapper[4696]: I1202 23:03:05.442453 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a20e4421-fec8-4f5a-8699-9d17d911f14c-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"a20e4421-fec8-4f5a-8699-9d17d911f14c\") " pod="openstack/cinder-api-0" Dec 02 23:03:05 crc kubenswrapper[4696]: I1202 23:03:05.451273 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a20e4421-fec8-4f5a-8699-9d17d911f14c-scripts\") pod \"cinder-api-0\" (UID: \"a20e4421-fec8-4f5a-8699-9d17d911f14c\") " pod="openstack/cinder-api-0" Dec 02 23:03:05 crc kubenswrapper[4696]: I1202 23:03:05.471852 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5pdw\" (UniqueName: \"kubernetes.io/projected/a20e4421-fec8-4f5a-8699-9d17d911f14c-kube-api-access-p5pdw\") pod \"cinder-api-0\" (UID: \"a20e4421-fec8-4f5a-8699-9d17d911f14c\") " pod="openstack/cinder-api-0" Dec 02 23:03:05 crc kubenswrapper[4696]: I1202 23:03:05.524031 4696 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4c68604e-222e-4a20-b829-c2f4e3c6923e-config-data-custom\") pod \"barbican-api-748c8487f8-gqxg9\" (UID: \"4c68604e-222e-4a20-b829-c2f4e3c6923e\") " pod="openstack/barbican-api-748c8487f8-gqxg9" Dec 02 23:03:05 crc kubenswrapper[4696]: I1202 23:03:05.524441 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4c68604e-222e-4a20-b829-c2f4e3c6923e-logs\") pod \"barbican-api-748c8487f8-gqxg9\" (UID: \"4c68604e-222e-4a20-b829-c2f4e3c6923e\") " pod="openstack/barbican-api-748c8487f8-gqxg9" Dec 02 23:03:05 crc kubenswrapper[4696]: I1202 23:03:05.524515 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c68604e-222e-4a20-b829-c2f4e3c6923e-combined-ca-bundle\") pod \"barbican-api-748c8487f8-gqxg9\" (UID: \"4c68604e-222e-4a20-b829-c2f4e3c6923e\") " pod="openstack/barbican-api-748c8487f8-gqxg9" Dec 02 23:03:05 crc kubenswrapper[4696]: I1202 23:03:05.524562 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c68604e-222e-4a20-b829-c2f4e3c6923e-public-tls-certs\") pod \"barbican-api-748c8487f8-gqxg9\" (UID: \"4c68604e-222e-4a20-b829-c2f4e3c6923e\") " pod="openstack/barbican-api-748c8487f8-gqxg9" Dec 02 23:03:05 crc kubenswrapper[4696]: I1202 23:03:05.524633 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c68604e-222e-4a20-b829-c2f4e3c6923e-internal-tls-certs\") pod \"barbican-api-748c8487f8-gqxg9\" (UID: \"4c68604e-222e-4a20-b829-c2f4e3c6923e\") " pod="openstack/barbican-api-748c8487f8-gqxg9" Dec 02 23:03:05 crc 
kubenswrapper[4696]: I1202 23:03:05.524660 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g66jq\" (UniqueName: \"kubernetes.io/projected/4c68604e-222e-4a20-b829-c2f4e3c6923e-kube-api-access-g66jq\") pod \"barbican-api-748c8487f8-gqxg9\" (UID: \"4c68604e-222e-4a20-b829-c2f4e3c6923e\") " pod="openstack/barbican-api-748c8487f8-gqxg9" Dec 02 23:03:05 crc kubenswrapper[4696]: I1202 23:03:05.524690 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c68604e-222e-4a20-b829-c2f4e3c6923e-config-data\") pod \"barbican-api-748c8487f8-gqxg9\" (UID: \"4c68604e-222e-4a20-b829-c2f4e3c6923e\") " pod="openstack/barbican-api-748c8487f8-gqxg9" Dec 02 23:03:05 crc kubenswrapper[4696]: I1202 23:03:05.603721 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 02 23:03:05 crc kubenswrapper[4696]: I1202 23:03:05.627216 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4c68604e-222e-4a20-b829-c2f4e3c6923e-config-data-custom\") pod \"barbican-api-748c8487f8-gqxg9\" (UID: \"4c68604e-222e-4a20-b829-c2f4e3c6923e\") " pod="openstack/barbican-api-748c8487f8-gqxg9" Dec 02 23:03:05 crc kubenswrapper[4696]: I1202 23:03:05.627278 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4c68604e-222e-4a20-b829-c2f4e3c6923e-logs\") pod \"barbican-api-748c8487f8-gqxg9\" (UID: \"4c68604e-222e-4a20-b829-c2f4e3c6923e\") " pod="openstack/barbican-api-748c8487f8-gqxg9" Dec 02 23:03:05 crc kubenswrapper[4696]: I1202 23:03:05.627323 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/4c68604e-222e-4a20-b829-c2f4e3c6923e-combined-ca-bundle\") pod \"barbican-api-748c8487f8-gqxg9\" (UID: \"4c68604e-222e-4a20-b829-c2f4e3c6923e\") " pod="openstack/barbican-api-748c8487f8-gqxg9" Dec 02 23:03:05 crc kubenswrapper[4696]: I1202 23:03:05.627365 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c68604e-222e-4a20-b829-c2f4e3c6923e-public-tls-certs\") pod \"barbican-api-748c8487f8-gqxg9\" (UID: \"4c68604e-222e-4a20-b829-c2f4e3c6923e\") " pod="openstack/barbican-api-748c8487f8-gqxg9" Dec 02 23:03:05 crc kubenswrapper[4696]: I1202 23:03:05.627415 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c68604e-222e-4a20-b829-c2f4e3c6923e-internal-tls-certs\") pod \"barbican-api-748c8487f8-gqxg9\" (UID: \"4c68604e-222e-4a20-b829-c2f4e3c6923e\") " pod="openstack/barbican-api-748c8487f8-gqxg9" Dec 02 23:03:05 crc kubenswrapper[4696]: I1202 23:03:05.627447 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g66jq\" (UniqueName: \"kubernetes.io/projected/4c68604e-222e-4a20-b829-c2f4e3c6923e-kube-api-access-g66jq\") pod \"barbican-api-748c8487f8-gqxg9\" (UID: \"4c68604e-222e-4a20-b829-c2f4e3c6923e\") " pod="openstack/barbican-api-748c8487f8-gqxg9" Dec 02 23:03:05 crc kubenswrapper[4696]: I1202 23:03:05.627489 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c68604e-222e-4a20-b829-c2f4e3c6923e-config-data\") pod \"barbican-api-748c8487f8-gqxg9\" (UID: \"4c68604e-222e-4a20-b829-c2f4e3c6923e\") " pod="openstack/barbican-api-748c8487f8-gqxg9" Dec 02 23:03:05 crc kubenswrapper[4696]: I1202 23:03:05.629481 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/4c68604e-222e-4a20-b829-c2f4e3c6923e-logs\") pod \"barbican-api-748c8487f8-gqxg9\" (UID: \"4c68604e-222e-4a20-b829-c2f4e3c6923e\") " pod="openstack/barbican-api-748c8487f8-gqxg9" Dec 02 23:03:05 crc kubenswrapper[4696]: I1202 23:03:05.637055 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c68604e-222e-4a20-b829-c2f4e3c6923e-combined-ca-bundle\") pod \"barbican-api-748c8487f8-gqxg9\" (UID: \"4c68604e-222e-4a20-b829-c2f4e3c6923e\") " pod="openstack/barbican-api-748c8487f8-gqxg9" Dec 02 23:03:05 crc kubenswrapper[4696]: I1202 23:03:05.642576 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c68604e-222e-4a20-b829-c2f4e3c6923e-public-tls-certs\") pod \"barbican-api-748c8487f8-gqxg9\" (UID: \"4c68604e-222e-4a20-b829-c2f4e3c6923e\") " pod="openstack/barbican-api-748c8487f8-gqxg9" Dec 02 23:03:05 crc kubenswrapper[4696]: I1202 23:03:05.643146 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4c68604e-222e-4a20-b829-c2f4e3c6923e-config-data-custom\") pod \"barbican-api-748c8487f8-gqxg9\" (UID: \"4c68604e-222e-4a20-b829-c2f4e3c6923e\") " pod="openstack/barbican-api-748c8487f8-gqxg9" Dec 02 23:03:05 crc kubenswrapper[4696]: I1202 23:03:05.648297 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c68604e-222e-4a20-b829-c2f4e3c6923e-config-data\") pod \"barbican-api-748c8487f8-gqxg9\" (UID: \"4c68604e-222e-4a20-b829-c2f4e3c6923e\") " pod="openstack/barbican-api-748c8487f8-gqxg9" Dec 02 23:03:05 crc kubenswrapper[4696]: I1202 23:03:05.649218 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c68604e-222e-4a20-b829-c2f4e3c6923e-internal-tls-certs\") pod 
\"barbican-api-748c8487f8-gqxg9\" (UID: \"4c68604e-222e-4a20-b829-c2f4e3c6923e\") " pod="openstack/barbican-api-748c8487f8-gqxg9" Dec 02 23:03:05 crc kubenswrapper[4696]: I1202 23:03:05.654331 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g66jq\" (UniqueName: \"kubernetes.io/projected/4c68604e-222e-4a20-b829-c2f4e3c6923e-kube-api-access-g66jq\") pod \"barbican-api-748c8487f8-gqxg9\" (UID: \"4c68604e-222e-4a20-b829-c2f4e3c6923e\") " pod="openstack/barbican-api-748c8487f8-gqxg9" Dec 02 23:03:05 crc kubenswrapper[4696]: I1202 23:03:05.797462 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-748c8487f8-gqxg9" Dec 02 23:03:06 crc kubenswrapper[4696]: I1202 23:03:06.004987 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 02 23:03:06 crc kubenswrapper[4696]: I1202 23:03:06.265589 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-9m9jd"] Dec 02 23:03:06 crc kubenswrapper[4696]: I1202 23:03:06.619181 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-748c8487f8-gqxg9"] Dec 02 23:03:06 crc kubenswrapper[4696]: I1202 23:03:06.697237 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 02 23:03:06 crc kubenswrapper[4696]: I1202 23:03:06.732077 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-748c8487f8-gqxg9" event={"ID":"4c68604e-222e-4a20-b829-c2f4e3c6923e","Type":"ContainerStarted","Data":"07633fd8c1f864d229b160597739bce6c81ae5c4fcb4af4c02cde71a5613c762"} Dec 02 23:03:06 crc kubenswrapper[4696]: I1202 23:03:06.733238 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a20e4421-fec8-4f5a-8699-9d17d911f14c","Type":"ContainerStarted","Data":"062edb233c0c07fd1b83c75d55340df53205e72d5e80f61bed85c3f5751b57dd"} Dec 02 23:03:06 crc kubenswrapper[4696]: 
I1202 23:03:06.737351 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-9m9jd" event={"ID":"61f3e755-0ed7-4e18-aa16-11e0ebc89957","Type":"ContainerStarted","Data":"8ac3e712e59d18648562672a4f91ae51c75e5a20916f0449a3bd9ff26ee29bf0"} Dec 02 23:03:06 crc kubenswrapper[4696]: I1202 23:03:06.742171 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7d6f88f57d-fkfhk" event={"ID":"6a2e48f4-820d-4199-883c-f7d93f5f12c6","Type":"ContainerStarted","Data":"a33bda1763f8932a92aa608f8e06819219bca7e57f2526e54fe67b26a3d28fa4"} Dec 02 23:03:06 crc kubenswrapper[4696]: I1202 23:03:06.747504 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e00a6503-3712-48a7-b008-3aa85beb4445","Type":"ContainerStarted","Data":"74b11490e2cfd3269bd5a126ab330cf100dae8dd9843804c32867851439a8a4c"} Dec 02 23:03:06 crc kubenswrapper[4696]: I1202 23:03:06.755137 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-lhwst" event={"ID":"0621ebe8-bd33-4ec5-a7fa-e2f0378445b1","Type":"ContainerStarted","Data":"d4f971a6419259408f20d638bcf57239e144c38d9574a1b26b7a8bbd12dc82f3"} Dec 02 23:03:07 crc kubenswrapper[4696]: I1202 23:03:07.757644 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-7b448778f6-q69jq" Dec 02 23:03:07 crc kubenswrapper[4696]: I1202 23:03:07.805681 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7d6f88f57d-fkfhk" event={"ID":"6a2e48f4-820d-4199-883c-f7d93f5f12c6","Type":"ContainerStarted","Data":"c45ea1458a55b7fa4a35da9a1be0c430cad65472d655bfe2b7d2eb508f73ccbf"} Dec 02 23:03:07 crc kubenswrapper[4696]: I1202 23:03:07.828778 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-7c784657c6-hdbrw" Dec 02 23:03:07 crc kubenswrapper[4696]: I1202 23:03:07.832388 4696 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-95b48849f-64t8k" event={"ID":"314505e6-5f55-4c07-9692-c5698c6e3ff1","Type":"ContainerStarted","Data":"84e8bc1d322ddddc69cb0e841d3844d0d4bff5e4ae4a1a122fcabb28ba03197e"} Dec 02 23:03:07 crc kubenswrapper[4696]: I1202 23:03:07.832446 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-95b48849f-64t8k" event={"ID":"314505e6-5f55-4c07-9692-c5698c6e3ff1","Type":"ContainerStarted","Data":"237e6f867c0e59ced5b43b54b2873c162a5707ec307acba9551ccf8e0b96715a"} Dec 02 23:03:07 crc kubenswrapper[4696]: I1202 23:03:07.851566 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-748c8487f8-gqxg9" event={"ID":"4c68604e-222e-4a20-b829-c2f4e3c6923e","Type":"ContainerStarted","Data":"bd67d859e7531b4049c8c5c8054fd948a159f6bf9cffdb67b6c5cf53aff143ac"} Dec 02 23:03:07 crc kubenswrapper[4696]: I1202 23:03:07.851999 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-748c8487f8-gqxg9" event={"ID":"4c68604e-222e-4a20-b829-c2f4e3c6923e","Type":"ContainerStarted","Data":"cb3660b0359802b7b42d138b5ad329a1d28bb4459e0ef240a5d7a7af07e5cc7e"} Dec 02 23:03:07 crc kubenswrapper[4696]: I1202 23:03:07.852037 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-748c8487f8-gqxg9" Dec 02 23:03:07 crc kubenswrapper[4696]: I1202 23:03:07.852061 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-748c8487f8-gqxg9" Dec 02 23:03:07 crc kubenswrapper[4696]: I1202 23:03:07.853947 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-7d6f88f57d-fkfhk" podStartSLOduration=4.182465868 podStartE2EDuration="6.853924733s" podCreationTimestamp="2025-12-02 23:03:01 +0000 UTC" firstStartedPulling="2025-12-02 23:03:02.795939632 +0000 UTC m=+1245.676619633" lastFinishedPulling="2025-12-02 23:03:05.467398497 +0000 UTC 
m=+1248.348078498" observedRunningTime="2025-12-02 23:03:07.841042117 +0000 UTC m=+1250.721722118" watchObservedRunningTime="2025-12-02 23:03:07.853924733 +0000 UTC m=+1250.734604734" Dec 02 23:03:07 crc kubenswrapper[4696]: I1202 23:03:07.870073 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a20e4421-fec8-4f5a-8699-9d17d911f14c","Type":"ContainerStarted","Data":"84eb3da8d9726989cf23638271ecf20b06b0c5ee0ddc8acf16c0cc25deccfa4b"} Dec 02 23:03:07 crc kubenswrapper[4696]: I1202 23:03:07.902077 4696 generic.go:334] "Generic (PLEG): container finished" podID="61f3e755-0ed7-4e18-aa16-11e0ebc89957" containerID="28cbb32f93941bbfbf01804d2c5281defe14b3fe2e5cdd8e75e81fdb27bf40b2" exitCode=0 Dec 02 23:03:07 crc kubenswrapper[4696]: I1202 23:03:07.902264 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-848cf88cfc-lhwst" podUID="0621ebe8-bd33-4ec5-a7fa-e2f0378445b1" containerName="dnsmasq-dns" containerID="cri-o://d4f971a6419259408f20d638bcf57239e144c38d9574a1b26b7a8bbd12dc82f3" gracePeriod=10 Dec 02 23:03:07 crc kubenswrapper[4696]: I1202 23:03:07.903595 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-9m9jd" event={"ID":"61f3e755-0ed7-4e18-aa16-11e0ebc89957","Type":"ContainerDied","Data":"28cbb32f93941bbfbf01804d2c5281defe14b3fe2e5cdd8e75e81fdb27bf40b2"} Dec 02 23:03:07 crc kubenswrapper[4696]: I1202 23:03:07.903676 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-848cf88cfc-lhwst" Dec 02 23:03:07 crc kubenswrapper[4696]: I1202 23:03:07.936645 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-95b48849f-64t8k" podStartSLOduration=3.907111833 podStartE2EDuration="6.93661608s" podCreationTimestamp="2025-12-02 23:03:01 +0000 UTC" firstStartedPulling="2025-12-02 23:03:02.437936551 +0000 UTC m=+1245.318616552" 
lastFinishedPulling="2025-12-02 23:03:05.467440798 +0000 UTC m=+1248.348120799" observedRunningTime="2025-12-02 23:03:07.905710932 +0000 UTC m=+1250.786390933" watchObservedRunningTime="2025-12-02 23:03:07.93661608 +0000 UTC m=+1250.817296081" Dec 02 23:03:08 crc kubenswrapper[4696]: I1202 23:03:08.051814 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7b448778f6-q69jq"] Dec 02 23:03:08 crc kubenswrapper[4696]: I1202 23:03:08.083363 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-748c8487f8-gqxg9" podStartSLOduration=3.083338354 podStartE2EDuration="3.083338354s" podCreationTimestamp="2025-12-02 23:03:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:03:07.94718245 +0000 UTC m=+1250.827862441" watchObservedRunningTime="2025-12-02 23:03:08.083338354 +0000 UTC m=+1250.964018355" Dec 02 23:03:08 crc kubenswrapper[4696]: I1202 23:03:08.118940 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 02 23:03:08 crc kubenswrapper[4696]: I1202 23:03:08.133186 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-848cf88cfc-lhwst" podStartSLOduration=7.133166248 podStartE2EDuration="7.133166248s" podCreationTimestamp="2025-12-02 23:03:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:03:08.042783933 +0000 UTC m=+1250.923463944" watchObservedRunningTime="2025-12-02 23:03:08.133166248 +0000 UTC m=+1251.013846249" Dec 02 23:03:08 crc kubenswrapper[4696]: E1202 23:03:08.469926 4696 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0621ebe8_bd33_4ec5_a7fa_e2f0378445b1.slice/crio-conmon-d4f971a6419259408f20d638bcf57239e144c38d9574a1b26b7a8bbd12dc82f3.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc3b9bc17_d5a1_45c6_83e3_ac1f7417a3ad.slice/crio-72aa2ec66948e9e46625b8ea6bbe256cb366995dae9271c36a4a554dc505a5b8.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc3b9bc17_d5a1_45c6_83e3_ac1f7417a3ad.slice/crio-conmon-72aa2ec66948e9e46625b8ea6bbe256cb366995dae9271c36a4a554dc505a5b8.scope\": RecentStats: unable to find data in memory cache]" Dec 02 23:03:08 crc kubenswrapper[4696]: I1202 23:03:08.726015 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-lhwst" Dec 02 23:03:08 crc kubenswrapper[4696]: I1202 23:03:08.872544 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h956d\" (UniqueName: \"kubernetes.io/projected/0621ebe8-bd33-4ec5-a7fa-e2f0378445b1-kube-api-access-h956d\") pod \"0621ebe8-bd33-4ec5-a7fa-e2f0378445b1\" (UID: \"0621ebe8-bd33-4ec5-a7fa-e2f0378445b1\") " Dec 02 23:03:08 crc kubenswrapper[4696]: I1202 23:03:08.873106 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0621ebe8-bd33-4ec5-a7fa-e2f0378445b1-ovsdbserver-sb\") pod \"0621ebe8-bd33-4ec5-a7fa-e2f0378445b1\" (UID: \"0621ebe8-bd33-4ec5-a7fa-e2f0378445b1\") " Dec 02 23:03:08 crc kubenswrapper[4696]: I1202 23:03:08.873148 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0621ebe8-bd33-4ec5-a7fa-e2f0378445b1-config\") pod \"0621ebe8-bd33-4ec5-a7fa-e2f0378445b1\" (UID: \"0621ebe8-bd33-4ec5-a7fa-e2f0378445b1\") " Dec 02 
23:03:08 crc kubenswrapper[4696]: I1202 23:03:08.873208 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0621ebe8-bd33-4ec5-a7fa-e2f0378445b1-dns-swift-storage-0\") pod \"0621ebe8-bd33-4ec5-a7fa-e2f0378445b1\" (UID: \"0621ebe8-bd33-4ec5-a7fa-e2f0378445b1\") " Dec 02 23:03:08 crc kubenswrapper[4696]: I1202 23:03:08.873380 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0621ebe8-bd33-4ec5-a7fa-e2f0378445b1-ovsdbserver-nb\") pod \"0621ebe8-bd33-4ec5-a7fa-e2f0378445b1\" (UID: \"0621ebe8-bd33-4ec5-a7fa-e2f0378445b1\") " Dec 02 23:03:08 crc kubenswrapper[4696]: I1202 23:03:08.873455 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0621ebe8-bd33-4ec5-a7fa-e2f0378445b1-dns-svc\") pod \"0621ebe8-bd33-4ec5-a7fa-e2f0378445b1\" (UID: \"0621ebe8-bd33-4ec5-a7fa-e2f0378445b1\") " Dec 02 23:03:08 crc kubenswrapper[4696]: I1202 23:03:08.892158 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0621ebe8-bd33-4ec5-a7fa-e2f0378445b1-kube-api-access-h956d" (OuterVolumeSpecName: "kube-api-access-h956d") pod "0621ebe8-bd33-4ec5-a7fa-e2f0378445b1" (UID: "0621ebe8-bd33-4ec5-a7fa-e2f0378445b1"). InnerVolumeSpecName "kube-api-access-h956d". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:03:08 crc kubenswrapper[4696]: I1202 23:03:08.893245 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 23:03:08 crc kubenswrapper[4696]: I1202 23:03:08.943860 4696 generic.go:334] "Generic (PLEG): container finished" podID="c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad" containerID="72aa2ec66948e9e46625b8ea6bbe256cb366995dae9271c36a4a554dc505a5b8" exitCode=0 Dec 02 23:03:08 crc kubenswrapper[4696]: I1202 23:03:08.944020 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad","Type":"ContainerDied","Data":"72aa2ec66948e9e46625b8ea6bbe256cb366995dae9271c36a4a554dc505a5b8"} Dec 02 23:03:08 crc kubenswrapper[4696]: I1202 23:03:08.944067 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad","Type":"ContainerDied","Data":"fac2ed5656a4a994857c132d054c6384082ff81fc904f65653d08b1a8f1bc54c"} Dec 02 23:03:08 crc kubenswrapper[4696]: I1202 23:03:08.944091 4696 scope.go:117] "RemoveContainer" containerID="2b36491f7352bd945b90566e164559bd33ad96b6ad834115b7b26aea0cb6fe59" Dec 02 23:03:08 crc kubenswrapper[4696]: I1202 23:03:08.944303 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 23:03:08 crc kubenswrapper[4696]: I1202 23:03:08.967939 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a20e4421-fec8-4f5a-8699-9d17d911f14c","Type":"ContainerStarted","Data":"310dfbca22cf7acbe78369c9d15f3b0816fd2bbde6d845cfdc16889f994d00a0"} Dec 02 23:03:08 crc kubenswrapper[4696]: I1202 23:03:08.968307 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="a20e4421-fec8-4f5a-8699-9d17d911f14c" containerName="cinder-api-log" containerID="cri-o://84eb3da8d9726989cf23638271ecf20b06b0c5ee0ddc8acf16c0cc25deccfa4b" gracePeriod=30 Dec 02 23:03:08 crc kubenswrapper[4696]: I1202 23:03:08.968416 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="a20e4421-fec8-4f5a-8699-9d17d911f14c" containerName="cinder-api" containerID="cri-o://310dfbca22cf7acbe78369c9d15f3b0816fd2bbde6d845cfdc16889f994d00a0" gracePeriod=30 Dec 02 23:03:08 crc kubenswrapper[4696]: I1202 23:03:08.968725 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 02 23:03:08 crc kubenswrapper[4696]: I1202 23:03:08.978031 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad-combined-ca-bundle\") pod \"c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad\" (UID: \"c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad\") " Dec 02 23:03:08 crc kubenswrapper[4696]: I1202 23:03:08.978299 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad-config-data\") pod \"c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad\" (UID: \"c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad\") " Dec 02 23:03:08 crc kubenswrapper[4696]: I1202 23:03:08.978348 4696 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad-log-httpd\") pod \"c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad\" (UID: \"c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad\") " Dec 02 23:03:08 crc kubenswrapper[4696]: I1202 23:03:08.978384 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad-run-httpd\") pod \"c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad\" (UID: \"c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad\") " Dec 02 23:03:08 crc kubenswrapper[4696]: I1202 23:03:08.978689 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2rjl5\" (UniqueName: \"kubernetes.io/projected/c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad-kube-api-access-2rjl5\") pod \"c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad\" (UID: \"c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad\") " Dec 02 23:03:08 crc kubenswrapper[4696]: I1202 23:03:08.978730 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad-scripts\") pod \"c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad\" (UID: \"c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad\") " Dec 02 23:03:08 crc kubenswrapper[4696]: I1202 23:03:08.979804 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad" (UID: "c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:03:08 crc kubenswrapper[4696]: I1202 23:03:08.981276 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad-sg-core-conf-yaml\") pod \"c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad\" (UID: \"c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad\") " Dec 02 23:03:08 crc kubenswrapper[4696]: I1202 23:03:08.982286 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-9m9jd" event={"ID":"61f3e755-0ed7-4e18-aa16-11e0ebc89957","Type":"ContainerStarted","Data":"fd28de3f291c164d4ca345a48403ee4d226db9bcaf74f6bbdebe048067e455e1"} Dec 02 23:03:08 crc kubenswrapper[4696]: I1202 23:03:08.982361 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6578955fd5-9m9jd" Dec 02 23:03:08 crc kubenswrapper[4696]: I1202 23:03:08.984471 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad" (UID: "c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:03:08 crc kubenswrapper[4696]: I1202 23:03:08.986122 4696 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:08 crc kubenswrapper[4696]: I1202 23:03:08.986146 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h956d\" (UniqueName: \"kubernetes.io/projected/0621ebe8-bd33-4ec5-a7fa-e2f0378445b1-kube-api-access-h956d\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:08 crc kubenswrapper[4696]: I1202 23:03:08.986158 4696 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:08 crc kubenswrapper[4696]: I1202 23:03:08.992674 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e00a6503-3712-48a7-b008-3aa85beb4445","Type":"ContainerStarted","Data":"e058b93f2782ea4b482a16769e49147966e6b05f9a938c6eb9c84dcf5881976d"} Dec 02 23:03:09 crc kubenswrapper[4696]: I1202 23:03:09.002415 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad-scripts" (OuterVolumeSpecName: "scripts") pod "c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad" (UID: "c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:03:09 crc kubenswrapper[4696]: I1202 23:03:09.016733 4696 generic.go:334] "Generic (PLEG): container finished" podID="0621ebe8-bd33-4ec5-a7fa-e2f0378445b1" containerID="d4f971a6419259408f20d638bcf57239e144c38d9574a1b26b7a8bbd12dc82f3" exitCode=0 Dec 02 23:03:09 crc kubenswrapper[4696]: I1202 23:03:09.018065 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7b448778f6-q69jq" podUID="7b25bdab-8c46-43b8-be48-0e3df0f48c57" containerName="horizon-log" containerID="cri-o://db7c41afc2c4141b03c10301aee7cd5a2b37a58c34385a505278ac8b3e2f3bf7" gracePeriod=30 Dec 02 23:03:09 crc kubenswrapper[4696]: I1202 23:03:09.018212 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7b448778f6-q69jq" podUID="7b25bdab-8c46-43b8-be48-0e3df0f48c57" containerName="horizon" containerID="cri-o://d15abc85ab3659d4083f5707c2acb2b3368eb8c46e42748f7c5227e701ce835e" gracePeriod=30 Dec 02 23:03:09 crc kubenswrapper[4696]: I1202 23:03:09.016988 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-lhwst" event={"ID":"0621ebe8-bd33-4ec5-a7fa-e2f0378445b1","Type":"ContainerDied","Data":"d4f971a6419259408f20d638bcf57239e144c38d9574a1b26b7a8bbd12dc82f3"} Dec 02 23:03:09 crc kubenswrapper[4696]: I1202 23:03:09.018509 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-lhwst" event={"ID":"0621ebe8-bd33-4ec5-a7fa-e2f0378445b1","Type":"ContainerDied","Data":"e8b7a89a64af0dce25dba005777f3de0f54223f2d758645839531b7267019dbc"} Dec 02 23:03:09 crc kubenswrapper[4696]: I1202 23:03:09.016962 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-lhwst" Dec 02 23:03:09 crc kubenswrapper[4696]: I1202 23:03:09.026661 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad-kube-api-access-2rjl5" (OuterVolumeSpecName: "kube-api-access-2rjl5") pod "c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad" (UID: "c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad"). InnerVolumeSpecName "kube-api-access-2rjl5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:03:09 crc kubenswrapper[4696]: I1202 23:03:09.045973 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0621ebe8-bd33-4ec5-a7fa-e2f0378445b1-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0621ebe8-bd33-4ec5-a7fa-e2f0378445b1" (UID: "0621ebe8-bd33-4ec5-a7fa-e2f0378445b1"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:03:09 crc kubenswrapper[4696]: I1202 23:03:09.054784 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.054730075 podStartE2EDuration="4.054730075s" podCreationTimestamp="2025-12-02 23:03:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:03:09.002514013 +0000 UTC m=+1251.883194014" watchObservedRunningTime="2025-12-02 23:03:09.054730075 +0000 UTC m=+1251.935410076" Dec 02 23:03:09 crc kubenswrapper[4696]: I1202 23:03:09.066428 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0621ebe8-bd33-4ec5-a7fa-e2f0378445b1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0621ebe8-bd33-4ec5-a7fa-e2f0378445b1" (UID: "0621ebe8-bd33-4ec5-a7fa-e2f0378445b1"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:03:09 crc kubenswrapper[4696]: I1202 23:03:09.070050 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6578955fd5-9m9jd" podStartSLOduration=5.070024869 podStartE2EDuration="5.070024869s" podCreationTimestamp="2025-12-02 23:03:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:03:09.044077293 +0000 UTC m=+1251.924757294" watchObservedRunningTime="2025-12-02 23:03:09.070024869 +0000 UTC m=+1251.950704870" Dec 02 23:03:09 crc kubenswrapper[4696]: I1202 23:03:09.070494 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0621ebe8-bd33-4ec5-a7fa-e2f0378445b1-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "0621ebe8-bd33-4ec5-a7fa-e2f0378445b1" (UID: "0621ebe8-bd33-4ec5-a7fa-e2f0378445b1"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:03:09 crc kubenswrapper[4696]: I1202 23:03:09.089492 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0621ebe8-bd33-4ec5-a7fa-e2f0378445b1-config" (OuterVolumeSpecName: "config") pod "0621ebe8-bd33-4ec5-a7fa-e2f0378445b1" (UID: "0621ebe8-bd33-4ec5-a7fa-e2f0378445b1"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:03:09 crc kubenswrapper[4696]: I1202 23:03:09.089990 4696 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0621ebe8-bd33-4ec5-a7fa-e2f0378445b1-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:09 crc kubenswrapper[4696]: I1202 23:03:09.090010 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2rjl5\" (UniqueName: \"kubernetes.io/projected/c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad-kube-api-access-2rjl5\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:09 crc kubenswrapper[4696]: I1202 23:03:09.090101 4696 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:09 crc kubenswrapper[4696]: I1202 23:03:09.090131 4696 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0621ebe8-bd33-4ec5-a7fa-e2f0378445b1-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:09 crc kubenswrapper[4696]: I1202 23:03:09.090147 4696 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0621ebe8-bd33-4ec5-a7fa-e2f0378445b1-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:09 crc kubenswrapper[4696]: I1202 23:03:09.094179 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0621ebe8-bd33-4ec5-a7fa-e2f0378445b1-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0621ebe8-bd33-4ec5-a7fa-e2f0378445b1" (UID: "0621ebe8-bd33-4ec5-a7fa-e2f0378445b1"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:03:09 crc kubenswrapper[4696]: I1202 23:03:09.107765 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad" (UID: "c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:03:09 crc kubenswrapper[4696]: I1202 23:03:09.176737 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad" (UID: "c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:03:09 crc kubenswrapper[4696]: I1202 23:03:09.193421 4696 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0621ebe8-bd33-4ec5-a7fa-e2f0378445b1-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:09 crc kubenswrapper[4696]: I1202 23:03:09.193705 4696 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0621ebe8-bd33-4ec5-a7fa-e2f0378445b1-config\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:09 crc kubenswrapper[4696]: I1202 23:03:09.193820 4696 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:09 crc kubenswrapper[4696]: I1202 23:03:09.193892 4696 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad-combined-ca-bundle\") on node \"crc\" 
DevicePath \"\"" Dec 02 23:03:09 crc kubenswrapper[4696]: I1202 23:03:09.241866 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad-config-data" (OuterVolumeSpecName: "config-data") pod "c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad" (UID: "c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:03:09 crc kubenswrapper[4696]: I1202 23:03:09.285359 4696 scope.go:117] "RemoveContainer" containerID="4f185263cdb98259c95dc767a780fa658e7f3450756723a579db215acb58cef3" Dec 02 23:03:09 crc kubenswrapper[4696]: I1202 23:03:09.298362 4696 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:09 crc kubenswrapper[4696]: I1202 23:03:09.337845 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 23:03:09 crc kubenswrapper[4696]: I1202 23:03:09.361987 4696 scope.go:117] "RemoveContainer" containerID="72aa2ec66948e9e46625b8ea6bbe256cb366995dae9271c36a4a554dc505a5b8" Dec 02 23:03:09 crc kubenswrapper[4696]: I1202 23:03:09.371501 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 02 23:03:09 crc kubenswrapper[4696]: I1202 23:03:09.384042 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 02 23:03:09 crc kubenswrapper[4696]: E1202 23:03:09.384730 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad" containerName="sg-core" Dec 02 23:03:09 crc kubenswrapper[4696]: I1202 23:03:09.384769 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad" containerName="sg-core" Dec 02 23:03:09 crc kubenswrapper[4696]: E1202 23:03:09.384800 4696 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad" containerName="proxy-httpd" Dec 02 23:03:09 crc kubenswrapper[4696]: I1202 23:03:09.384809 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad" containerName="proxy-httpd" Dec 02 23:03:09 crc kubenswrapper[4696]: E1202 23:03:09.384826 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0621ebe8-bd33-4ec5-a7fa-e2f0378445b1" containerName="init" Dec 02 23:03:09 crc kubenswrapper[4696]: I1202 23:03:09.384859 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="0621ebe8-bd33-4ec5-a7fa-e2f0378445b1" containerName="init" Dec 02 23:03:09 crc kubenswrapper[4696]: E1202 23:03:09.384878 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad" containerName="ceilometer-central-agent" Dec 02 23:03:09 crc kubenswrapper[4696]: I1202 23:03:09.384886 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad" containerName="ceilometer-central-agent" Dec 02 23:03:09 crc kubenswrapper[4696]: E1202 23:03:09.384903 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad" containerName="ceilometer-notification-agent" Dec 02 23:03:09 crc kubenswrapper[4696]: I1202 23:03:09.384913 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad" containerName="ceilometer-notification-agent" Dec 02 23:03:09 crc kubenswrapper[4696]: E1202 23:03:09.384941 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0621ebe8-bd33-4ec5-a7fa-e2f0378445b1" containerName="dnsmasq-dns" Dec 02 23:03:09 crc kubenswrapper[4696]: I1202 23:03:09.384948 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="0621ebe8-bd33-4ec5-a7fa-e2f0378445b1" containerName="dnsmasq-dns" Dec 02 23:03:09 crc kubenswrapper[4696]: I1202 23:03:09.385202 4696 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad" containerName="ceilometer-central-agent" Dec 02 23:03:09 crc kubenswrapper[4696]: I1202 23:03:09.385220 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="0621ebe8-bd33-4ec5-a7fa-e2f0378445b1" containerName="dnsmasq-dns" Dec 02 23:03:09 crc kubenswrapper[4696]: I1202 23:03:09.385246 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad" containerName="ceilometer-notification-agent" Dec 02 23:03:09 crc kubenswrapper[4696]: I1202 23:03:09.385268 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad" containerName="proxy-httpd" Dec 02 23:03:09 crc kubenswrapper[4696]: I1202 23:03:09.385280 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad" containerName="sg-core" Dec 02 23:03:09 crc kubenswrapper[4696]: I1202 23:03:09.388861 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 23:03:09 crc kubenswrapper[4696]: I1202 23:03:09.392358 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 02 23:03:09 crc kubenswrapper[4696]: I1202 23:03:09.392605 4696 scope.go:117] "RemoveContainer" containerID="ca9ab61cbfd53bbb03bd1cc23ce0638e4939961f79c7460c61be3f19dae28b05" Dec 02 23:03:09 crc kubenswrapper[4696]: I1202 23:03:09.393116 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 02 23:03:09 crc kubenswrapper[4696]: I1202 23:03:09.393517 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-lhwst"] Dec 02 23:03:09 crc kubenswrapper[4696]: I1202 23:03:09.410256 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-lhwst"] Dec 02 23:03:09 crc kubenswrapper[4696]: I1202 23:03:09.423204 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 23:03:09 crc kubenswrapper[4696]: I1202 23:03:09.430006 4696 scope.go:117] "RemoveContainer" containerID="2b36491f7352bd945b90566e164559bd33ad96b6ad834115b7b26aea0cb6fe59" Dec 02 23:03:09 crc kubenswrapper[4696]: E1202 23:03:09.432959 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b36491f7352bd945b90566e164559bd33ad96b6ad834115b7b26aea0cb6fe59\": container with ID starting with 2b36491f7352bd945b90566e164559bd33ad96b6ad834115b7b26aea0cb6fe59 not found: ID does not exist" containerID="2b36491f7352bd945b90566e164559bd33ad96b6ad834115b7b26aea0cb6fe59" Dec 02 23:03:09 crc kubenswrapper[4696]: I1202 23:03:09.433012 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b36491f7352bd945b90566e164559bd33ad96b6ad834115b7b26aea0cb6fe59"} err="failed to get container status 
\"2b36491f7352bd945b90566e164559bd33ad96b6ad834115b7b26aea0cb6fe59\": rpc error: code = NotFound desc = could not find container \"2b36491f7352bd945b90566e164559bd33ad96b6ad834115b7b26aea0cb6fe59\": container with ID starting with 2b36491f7352bd945b90566e164559bd33ad96b6ad834115b7b26aea0cb6fe59 not found: ID does not exist" Dec 02 23:03:09 crc kubenswrapper[4696]: I1202 23:03:09.433049 4696 scope.go:117] "RemoveContainer" containerID="4f185263cdb98259c95dc767a780fa658e7f3450756723a579db215acb58cef3" Dec 02 23:03:09 crc kubenswrapper[4696]: E1202 23:03:09.436921 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f185263cdb98259c95dc767a780fa658e7f3450756723a579db215acb58cef3\": container with ID starting with 4f185263cdb98259c95dc767a780fa658e7f3450756723a579db215acb58cef3 not found: ID does not exist" containerID="4f185263cdb98259c95dc767a780fa658e7f3450756723a579db215acb58cef3" Dec 02 23:03:09 crc kubenswrapper[4696]: I1202 23:03:09.436952 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f185263cdb98259c95dc767a780fa658e7f3450756723a579db215acb58cef3"} err="failed to get container status \"4f185263cdb98259c95dc767a780fa658e7f3450756723a579db215acb58cef3\": rpc error: code = NotFound desc = could not find container \"4f185263cdb98259c95dc767a780fa658e7f3450756723a579db215acb58cef3\": container with ID starting with 4f185263cdb98259c95dc767a780fa658e7f3450756723a579db215acb58cef3 not found: ID does not exist" Dec 02 23:03:09 crc kubenswrapper[4696]: I1202 23:03:09.436972 4696 scope.go:117] "RemoveContainer" containerID="72aa2ec66948e9e46625b8ea6bbe256cb366995dae9271c36a4a554dc505a5b8" Dec 02 23:03:09 crc kubenswrapper[4696]: E1202 23:03:09.439261 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"72aa2ec66948e9e46625b8ea6bbe256cb366995dae9271c36a4a554dc505a5b8\": container with ID starting with 72aa2ec66948e9e46625b8ea6bbe256cb366995dae9271c36a4a554dc505a5b8 not found: ID does not exist" containerID="72aa2ec66948e9e46625b8ea6bbe256cb366995dae9271c36a4a554dc505a5b8" Dec 02 23:03:09 crc kubenswrapper[4696]: I1202 23:03:09.439338 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72aa2ec66948e9e46625b8ea6bbe256cb366995dae9271c36a4a554dc505a5b8"} err="failed to get container status \"72aa2ec66948e9e46625b8ea6bbe256cb366995dae9271c36a4a554dc505a5b8\": rpc error: code = NotFound desc = could not find container \"72aa2ec66948e9e46625b8ea6bbe256cb366995dae9271c36a4a554dc505a5b8\": container with ID starting with 72aa2ec66948e9e46625b8ea6bbe256cb366995dae9271c36a4a554dc505a5b8 not found: ID does not exist" Dec 02 23:03:09 crc kubenswrapper[4696]: I1202 23:03:09.439393 4696 scope.go:117] "RemoveContainer" containerID="ca9ab61cbfd53bbb03bd1cc23ce0638e4939961f79c7460c61be3f19dae28b05" Dec 02 23:03:09 crc kubenswrapper[4696]: E1202 23:03:09.442207 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca9ab61cbfd53bbb03bd1cc23ce0638e4939961f79c7460c61be3f19dae28b05\": container with ID starting with ca9ab61cbfd53bbb03bd1cc23ce0638e4939961f79c7460c61be3f19dae28b05 not found: ID does not exist" containerID="ca9ab61cbfd53bbb03bd1cc23ce0638e4939961f79c7460c61be3f19dae28b05" Dec 02 23:03:09 crc kubenswrapper[4696]: I1202 23:03:09.442243 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca9ab61cbfd53bbb03bd1cc23ce0638e4939961f79c7460c61be3f19dae28b05"} err="failed to get container status \"ca9ab61cbfd53bbb03bd1cc23ce0638e4939961f79c7460c61be3f19dae28b05\": rpc error: code = NotFound desc = could not find container \"ca9ab61cbfd53bbb03bd1cc23ce0638e4939961f79c7460c61be3f19dae28b05\": container with ID 
starting with ca9ab61cbfd53bbb03bd1cc23ce0638e4939961f79c7460c61be3f19dae28b05 not found: ID does not exist" Dec 02 23:03:09 crc kubenswrapper[4696]: I1202 23:03:09.442260 4696 scope.go:117] "RemoveContainer" containerID="d4f971a6419259408f20d638bcf57239e144c38d9574a1b26b7a8bbd12dc82f3" Dec 02 23:03:09 crc kubenswrapper[4696]: I1202 23:03:09.456400 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0621ebe8-bd33-4ec5-a7fa-e2f0378445b1" path="/var/lib/kubelet/pods/0621ebe8-bd33-4ec5-a7fa-e2f0378445b1/volumes" Dec 02 23:03:09 crc kubenswrapper[4696]: I1202 23:03:09.461961 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad" path="/var/lib/kubelet/pods/c3b9bc17-d5a1-45c6-83e3-ac1f7417a3ad/volumes" Dec 02 23:03:09 crc kubenswrapper[4696]: I1202 23:03:09.492714 4696 scope.go:117] "RemoveContainer" containerID="76fc8ae718b2879950480d5cb6e1327194973dd8b6db0fb00b9eb2d27a4d42cd" Dec 02 23:03:09 crc kubenswrapper[4696]: I1202 23:03:09.504703 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkz4p\" (UniqueName: \"kubernetes.io/projected/92dde787-a077-43f6-a220-7862b3b296b1-kube-api-access-mkz4p\") pod \"ceilometer-0\" (UID: \"92dde787-a077-43f6-a220-7862b3b296b1\") " pod="openstack/ceilometer-0" Dec 02 23:03:09 crc kubenswrapper[4696]: I1202 23:03:09.504758 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/92dde787-a077-43f6-a220-7862b3b296b1-run-httpd\") pod \"ceilometer-0\" (UID: \"92dde787-a077-43f6-a220-7862b3b296b1\") " pod="openstack/ceilometer-0" Dec 02 23:03:09 crc kubenswrapper[4696]: I1202 23:03:09.504789 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/92dde787-a077-43f6-a220-7862b3b296b1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"92dde787-a077-43f6-a220-7862b3b296b1\") " pod="openstack/ceilometer-0" Dec 02 23:03:09 crc kubenswrapper[4696]: I1202 23:03:09.504849 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92dde787-a077-43f6-a220-7862b3b296b1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"92dde787-a077-43f6-a220-7862b3b296b1\") " pod="openstack/ceilometer-0" Dec 02 23:03:09 crc kubenswrapper[4696]: I1202 23:03:09.504930 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/92dde787-a077-43f6-a220-7862b3b296b1-scripts\") pod \"ceilometer-0\" (UID: \"92dde787-a077-43f6-a220-7862b3b296b1\") " pod="openstack/ceilometer-0" Dec 02 23:03:09 crc kubenswrapper[4696]: I1202 23:03:09.504985 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/92dde787-a077-43f6-a220-7862b3b296b1-log-httpd\") pod \"ceilometer-0\" (UID: \"92dde787-a077-43f6-a220-7862b3b296b1\") " pod="openstack/ceilometer-0" Dec 02 23:03:09 crc kubenswrapper[4696]: I1202 23:03:09.505013 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92dde787-a077-43f6-a220-7862b3b296b1-config-data\") pod \"ceilometer-0\" (UID: \"92dde787-a077-43f6-a220-7862b3b296b1\") " pod="openstack/ceilometer-0" Dec 02 23:03:09 crc kubenswrapper[4696]: I1202 23:03:09.522284 4696 scope.go:117] "RemoveContainer" containerID="d4f971a6419259408f20d638bcf57239e144c38d9574a1b26b7a8bbd12dc82f3" Dec 02 23:03:09 crc kubenswrapper[4696]: E1202 23:03:09.525211 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = 
could not find container \"d4f971a6419259408f20d638bcf57239e144c38d9574a1b26b7a8bbd12dc82f3\": container with ID starting with d4f971a6419259408f20d638bcf57239e144c38d9574a1b26b7a8bbd12dc82f3 not found: ID does not exist" containerID="d4f971a6419259408f20d638bcf57239e144c38d9574a1b26b7a8bbd12dc82f3" Dec 02 23:03:09 crc kubenswrapper[4696]: I1202 23:03:09.525575 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4f971a6419259408f20d638bcf57239e144c38d9574a1b26b7a8bbd12dc82f3"} err="failed to get container status \"d4f971a6419259408f20d638bcf57239e144c38d9574a1b26b7a8bbd12dc82f3\": rpc error: code = NotFound desc = could not find container \"d4f971a6419259408f20d638bcf57239e144c38d9574a1b26b7a8bbd12dc82f3\": container with ID starting with d4f971a6419259408f20d638bcf57239e144c38d9574a1b26b7a8bbd12dc82f3 not found: ID does not exist" Dec 02 23:03:09 crc kubenswrapper[4696]: I1202 23:03:09.525667 4696 scope.go:117] "RemoveContainer" containerID="76fc8ae718b2879950480d5cb6e1327194973dd8b6db0fb00b9eb2d27a4d42cd" Dec 02 23:03:09 crc kubenswrapper[4696]: E1202 23:03:09.526011 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76fc8ae718b2879950480d5cb6e1327194973dd8b6db0fb00b9eb2d27a4d42cd\": container with ID starting with 76fc8ae718b2879950480d5cb6e1327194973dd8b6db0fb00b9eb2d27a4d42cd not found: ID does not exist" containerID="76fc8ae718b2879950480d5cb6e1327194973dd8b6db0fb00b9eb2d27a4d42cd" Dec 02 23:03:09 crc kubenswrapper[4696]: I1202 23:03:09.526042 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76fc8ae718b2879950480d5cb6e1327194973dd8b6db0fb00b9eb2d27a4d42cd"} err="failed to get container status \"76fc8ae718b2879950480d5cb6e1327194973dd8b6db0fb00b9eb2d27a4d42cd\": rpc error: code = NotFound desc = could not find container 
\"76fc8ae718b2879950480d5cb6e1327194973dd8b6db0fb00b9eb2d27a4d42cd\": container with ID starting with 76fc8ae718b2879950480d5cb6e1327194973dd8b6db0fb00b9eb2d27a4d42cd not found: ID does not exist" Dec 02 23:03:09 crc kubenswrapper[4696]: I1202 23:03:09.607299 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/92dde787-a077-43f6-a220-7862b3b296b1-scripts\") pod \"ceilometer-0\" (UID: \"92dde787-a077-43f6-a220-7862b3b296b1\") " pod="openstack/ceilometer-0" Dec 02 23:03:09 crc kubenswrapper[4696]: I1202 23:03:09.607399 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/92dde787-a077-43f6-a220-7862b3b296b1-log-httpd\") pod \"ceilometer-0\" (UID: \"92dde787-a077-43f6-a220-7862b3b296b1\") " pod="openstack/ceilometer-0" Dec 02 23:03:09 crc kubenswrapper[4696]: I1202 23:03:09.607431 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92dde787-a077-43f6-a220-7862b3b296b1-config-data\") pod \"ceilometer-0\" (UID: \"92dde787-a077-43f6-a220-7862b3b296b1\") " pod="openstack/ceilometer-0" Dec 02 23:03:09 crc kubenswrapper[4696]: I1202 23:03:09.607472 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mkz4p\" (UniqueName: \"kubernetes.io/projected/92dde787-a077-43f6-a220-7862b3b296b1-kube-api-access-mkz4p\") pod \"ceilometer-0\" (UID: \"92dde787-a077-43f6-a220-7862b3b296b1\") " pod="openstack/ceilometer-0" Dec 02 23:03:09 crc kubenswrapper[4696]: I1202 23:03:09.607494 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/92dde787-a077-43f6-a220-7862b3b296b1-run-httpd\") pod \"ceilometer-0\" (UID: \"92dde787-a077-43f6-a220-7862b3b296b1\") " pod="openstack/ceilometer-0" Dec 02 23:03:09 crc kubenswrapper[4696]: I1202 
23:03:09.607514 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/92dde787-a077-43f6-a220-7862b3b296b1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"92dde787-a077-43f6-a220-7862b3b296b1\") " pod="openstack/ceilometer-0" Dec 02 23:03:09 crc kubenswrapper[4696]: I1202 23:03:09.607552 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92dde787-a077-43f6-a220-7862b3b296b1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"92dde787-a077-43f6-a220-7862b3b296b1\") " pod="openstack/ceilometer-0" Dec 02 23:03:09 crc kubenswrapper[4696]: I1202 23:03:09.609062 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/92dde787-a077-43f6-a220-7862b3b296b1-log-httpd\") pod \"ceilometer-0\" (UID: \"92dde787-a077-43f6-a220-7862b3b296b1\") " pod="openstack/ceilometer-0" Dec 02 23:03:09 crc kubenswrapper[4696]: I1202 23:03:09.609263 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/92dde787-a077-43f6-a220-7862b3b296b1-run-httpd\") pod \"ceilometer-0\" (UID: \"92dde787-a077-43f6-a220-7862b3b296b1\") " pod="openstack/ceilometer-0" Dec 02 23:03:09 crc kubenswrapper[4696]: I1202 23:03:09.614582 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92dde787-a077-43f6-a220-7862b3b296b1-config-data\") pod \"ceilometer-0\" (UID: \"92dde787-a077-43f6-a220-7862b3b296b1\") " pod="openstack/ceilometer-0" Dec 02 23:03:09 crc kubenswrapper[4696]: I1202 23:03:09.615161 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/92dde787-a077-43f6-a220-7862b3b296b1-scripts\") pod \"ceilometer-0\" (UID: \"92dde787-a077-43f6-a220-7862b3b296b1\") " 
pod="openstack/ceilometer-0" Dec 02 23:03:09 crc kubenswrapper[4696]: I1202 23:03:09.615504 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/92dde787-a077-43f6-a220-7862b3b296b1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"92dde787-a077-43f6-a220-7862b3b296b1\") " pod="openstack/ceilometer-0" Dec 02 23:03:09 crc kubenswrapper[4696]: I1202 23:03:09.618540 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92dde787-a077-43f6-a220-7862b3b296b1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"92dde787-a077-43f6-a220-7862b3b296b1\") " pod="openstack/ceilometer-0" Dec 02 23:03:09 crc kubenswrapper[4696]: I1202 23:03:09.628256 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkz4p\" (UniqueName: \"kubernetes.io/projected/92dde787-a077-43f6-a220-7862b3b296b1-kube-api-access-mkz4p\") pod \"ceilometer-0\" (UID: \"92dde787-a077-43f6-a220-7862b3b296b1\") " pod="openstack/ceilometer-0" Dec 02 23:03:09 crc kubenswrapper[4696]: I1202 23:03:09.738309 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 23:03:10 crc kubenswrapper[4696]: I1202 23:03:10.037242 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e00a6503-3712-48a7-b008-3aa85beb4445","Type":"ContainerStarted","Data":"7618743797a549f7d6b5daec7d244c0950b5e97e9811429ba9af5651b14f8839"} Dec 02 23:03:10 crc kubenswrapper[4696]: I1202 23:03:10.047106 4696 generic.go:334] "Generic (PLEG): container finished" podID="a20e4421-fec8-4f5a-8699-9d17d911f14c" containerID="84eb3da8d9726989cf23638271ecf20b06b0c5ee0ddc8acf16c0cc25deccfa4b" exitCode=143 Dec 02 23:03:10 crc kubenswrapper[4696]: I1202 23:03:10.047296 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a20e4421-fec8-4f5a-8699-9d17d911f14c","Type":"ContainerDied","Data":"84eb3da8d9726989cf23638271ecf20b06b0c5ee0ddc8acf16c0cc25deccfa4b"} Dec 02 23:03:10 crc kubenswrapper[4696]: I1202 23:03:10.062375 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.692402171 podStartE2EDuration="6.062354264s" podCreationTimestamp="2025-12-02 23:03:04 +0000 UTC" firstStartedPulling="2025-12-02 23:03:06.035911772 +0000 UTC m=+1248.916591773" lastFinishedPulling="2025-12-02 23:03:07.405863865 +0000 UTC m=+1250.286543866" observedRunningTime="2025-12-02 23:03:10.056295662 +0000 UTC m=+1252.936975663" watchObservedRunningTime="2025-12-02 23:03:10.062354264 +0000 UTC m=+1252.943034265" Dec 02 23:03:10 crc kubenswrapper[4696]: I1202 23:03:10.144256 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 02 23:03:10 crc kubenswrapper[4696]: I1202 23:03:10.208037 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 23:03:11 crc kubenswrapper[4696]: I1202 23:03:11.066023 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"92dde787-a077-43f6-a220-7862b3b296b1","Type":"ContainerStarted","Data":"491b661aa31e58772cb121b7938accf2914b85231a58835d3396d9ff42692ec6"} Dec 02 23:03:12 crc kubenswrapper[4696]: I1202 23:03:12.260337 4696 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7b448778f6-q69jq" podUID="7b25bdab-8c46-43b8-be48-0e3df0f48c57" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.162:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.162:8443: connect: connection refused" Dec 02 23:03:12 crc kubenswrapper[4696]: I1202 23:03:12.864523 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-76d45f5d76-ptzqb" Dec 02 23:03:13 crc kubenswrapper[4696]: I1202 23:03:13.094291 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"92dde787-a077-43f6-a220-7862b3b296b1","Type":"ContainerStarted","Data":"eedf30edd100f822a25c5db9c6414d00352ced217607db5662ea488255d326d8"} Dec 02 23:03:13 crc kubenswrapper[4696]: I1202 23:03:13.096534 4696 generic.go:334] "Generic (PLEG): container finished" podID="7b25bdab-8c46-43b8-be48-0e3df0f48c57" containerID="d15abc85ab3659d4083f5707c2acb2b3368eb8c46e42748f7c5227e701ce835e" exitCode=0 Dec 02 23:03:13 crc kubenswrapper[4696]: I1202 23:03:13.096558 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7b448778f6-q69jq" event={"ID":"7b25bdab-8c46-43b8-be48-0e3df0f48c57","Type":"ContainerDied","Data":"d15abc85ab3659d4083f5707c2acb2b3368eb8c46e42748f7c5227e701ce835e"} Dec 02 23:03:13 crc kubenswrapper[4696]: I1202 23:03:13.159827 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Dec 02 23:03:13 crc kubenswrapper[4696]: I1202 23:03:13.161617 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 02 23:03:13 crc kubenswrapper[4696]: I1202 23:03:13.163772 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Dec 02 23:03:13 crc kubenswrapper[4696]: I1202 23:03:13.165076 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Dec 02 23:03:13 crc kubenswrapper[4696]: I1202 23:03:13.165308 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-bw9h9" Dec 02 23:03:13 crc kubenswrapper[4696]: I1202 23:03:13.170581 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 02 23:03:13 crc kubenswrapper[4696]: I1202 23:03:13.293416 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bce2a48e-57d0-496b-9127-10bb64c5c48b-combined-ca-bundle\") pod \"openstackclient\" (UID: \"bce2a48e-57d0-496b-9127-10bb64c5c48b\") " pod="openstack/openstackclient" Dec 02 23:03:13 crc kubenswrapper[4696]: I1202 23:03:13.293472 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/bce2a48e-57d0-496b-9127-10bb64c5c48b-openstack-config-secret\") pod \"openstackclient\" (UID: \"bce2a48e-57d0-496b-9127-10bb64c5c48b\") " pod="openstack/openstackclient" Dec 02 23:03:13 crc kubenswrapper[4696]: I1202 23:03:13.293952 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t77m7\" (UniqueName: \"kubernetes.io/projected/bce2a48e-57d0-496b-9127-10bb64c5c48b-kube-api-access-t77m7\") pod \"openstackclient\" (UID: \"bce2a48e-57d0-496b-9127-10bb64c5c48b\") " pod="openstack/openstackclient" Dec 02 23:03:13 crc kubenswrapper[4696]: I1202 23:03:13.294440 4696 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/bce2a48e-57d0-496b-9127-10bb64c5c48b-openstack-config\") pod \"openstackclient\" (UID: \"bce2a48e-57d0-496b-9127-10bb64c5c48b\") " pod="openstack/openstackclient" Dec 02 23:03:13 crc kubenswrapper[4696]: I1202 23:03:13.396984 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/bce2a48e-57d0-496b-9127-10bb64c5c48b-openstack-config\") pod \"openstackclient\" (UID: \"bce2a48e-57d0-496b-9127-10bb64c5c48b\") " pod="openstack/openstackclient" Dec 02 23:03:13 crc kubenswrapper[4696]: I1202 23:03:13.397061 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bce2a48e-57d0-496b-9127-10bb64c5c48b-combined-ca-bundle\") pod \"openstackclient\" (UID: \"bce2a48e-57d0-496b-9127-10bb64c5c48b\") " pod="openstack/openstackclient" Dec 02 23:03:13 crc kubenswrapper[4696]: I1202 23:03:13.397081 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/bce2a48e-57d0-496b-9127-10bb64c5c48b-openstack-config-secret\") pod \"openstackclient\" (UID: \"bce2a48e-57d0-496b-9127-10bb64c5c48b\") " pod="openstack/openstackclient" Dec 02 23:03:13 crc kubenswrapper[4696]: I1202 23:03:13.397157 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t77m7\" (UniqueName: \"kubernetes.io/projected/bce2a48e-57d0-496b-9127-10bb64c5c48b-kube-api-access-t77m7\") pod \"openstackclient\" (UID: \"bce2a48e-57d0-496b-9127-10bb64c5c48b\") " pod="openstack/openstackclient" Dec 02 23:03:13 crc kubenswrapper[4696]: I1202 23:03:13.398157 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/bce2a48e-57d0-496b-9127-10bb64c5c48b-openstack-config\") pod \"openstackclient\" (UID: \"bce2a48e-57d0-496b-9127-10bb64c5c48b\") " pod="openstack/openstackclient" Dec 02 23:03:13 crc kubenswrapper[4696]: I1202 23:03:13.408287 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bce2a48e-57d0-496b-9127-10bb64c5c48b-combined-ca-bundle\") pod \"openstackclient\" (UID: \"bce2a48e-57d0-496b-9127-10bb64c5c48b\") " pod="openstack/openstackclient" Dec 02 23:03:13 crc kubenswrapper[4696]: I1202 23:03:13.409274 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/bce2a48e-57d0-496b-9127-10bb64c5c48b-openstack-config-secret\") pod \"openstackclient\" (UID: \"bce2a48e-57d0-496b-9127-10bb64c5c48b\") " pod="openstack/openstackclient" Dec 02 23:03:13 crc kubenswrapper[4696]: I1202 23:03:13.413462 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7f8795dd98-pn9n4" Dec 02 23:03:13 crc kubenswrapper[4696]: I1202 23:03:13.426853 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t77m7\" (UniqueName: \"kubernetes.io/projected/bce2a48e-57d0-496b-9127-10bb64c5c48b-kube-api-access-t77m7\") pod \"openstackclient\" (UID: \"bce2a48e-57d0-496b-9127-10bb64c5c48b\") " pod="openstack/openstackclient" Dec 02 23:03:13 crc kubenswrapper[4696]: I1202 23:03:13.490918 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 02 23:03:13 crc kubenswrapper[4696]: I1202 23:03:13.494646 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7f8795dd98-pn9n4" Dec 02 23:03:13 crc kubenswrapper[4696]: I1202 23:03:13.644809 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Dec 02 23:03:13 crc kubenswrapper[4696]: I1202 23:03:13.661620 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Dec 02 23:03:13 crc kubenswrapper[4696]: I1202 23:03:13.705188 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Dec 02 23:03:13 crc kubenswrapper[4696]: I1202 23:03:13.706718 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 02 23:03:13 crc kubenswrapper[4696]: I1202 23:03:13.729849 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 02 23:03:13 crc kubenswrapper[4696]: I1202 23:03:13.807471 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/3e0a050b-c652-4ef2-8f1a-19c8f4732a0c-openstack-config\") pod \"openstackclient\" (UID: \"3e0a050b-c652-4ef2-8f1a-19c8f4732a0c\") " pod="openstack/openstackclient" Dec 02 23:03:13 crc kubenswrapper[4696]: I1202 23:03:13.807598 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/3e0a050b-c652-4ef2-8f1a-19c8f4732a0c-openstack-config-secret\") pod \"openstackclient\" (UID: \"3e0a050b-c652-4ef2-8f1a-19c8f4732a0c\") " pod="openstack/openstackclient" Dec 02 23:03:13 crc kubenswrapper[4696]: I1202 23:03:13.807669 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3e0a050b-c652-4ef2-8f1a-19c8f4732a0c-combined-ca-bundle\") pod \"openstackclient\" (UID: \"3e0a050b-c652-4ef2-8f1a-19c8f4732a0c\") " pod="openstack/openstackclient" Dec 02 23:03:13 crc kubenswrapper[4696]: I1202 23:03:13.807699 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxrnj\" (UniqueName: \"kubernetes.io/projected/3e0a050b-c652-4ef2-8f1a-19c8f4732a0c-kube-api-access-fxrnj\") pod \"openstackclient\" (UID: \"3e0a050b-c652-4ef2-8f1a-19c8f4732a0c\") " pod="openstack/openstackclient" Dec 02 23:03:13 crc kubenswrapper[4696]: I1202 23:03:13.909138 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e0a050b-c652-4ef2-8f1a-19c8f4732a0c-combined-ca-bundle\") pod \"openstackclient\" (UID: \"3e0a050b-c652-4ef2-8f1a-19c8f4732a0c\") " pod="openstack/openstackclient" Dec 02 23:03:13 crc kubenswrapper[4696]: I1202 23:03:13.909204 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxrnj\" (UniqueName: \"kubernetes.io/projected/3e0a050b-c652-4ef2-8f1a-19c8f4732a0c-kube-api-access-fxrnj\") pod \"openstackclient\" (UID: \"3e0a050b-c652-4ef2-8f1a-19c8f4732a0c\") " pod="openstack/openstackclient" Dec 02 23:03:13 crc kubenswrapper[4696]: I1202 23:03:13.909284 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/3e0a050b-c652-4ef2-8f1a-19c8f4732a0c-openstack-config\") pod \"openstackclient\" (UID: \"3e0a050b-c652-4ef2-8f1a-19c8f4732a0c\") " pod="openstack/openstackclient" Dec 02 23:03:13 crc kubenswrapper[4696]: I1202 23:03:13.909350 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/3e0a050b-c652-4ef2-8f1a-19c8f4732a0c-openstack-config-secret\") pod \"openstackclient\" (UID: 
\"3e0a050b-c652-4ef2-8f1a-19c8f4732a0c\") " pod="openstack/openstackclient" Dec 02 23:03:13 crc kubenswrapper[4696]: I1202 23:03:13.910334 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/3e0a050b-c652-4ef2-8f1a-19c8f4732a0c-openstack-config\") pod \"openstackclient\" (UID: \"3e0a050b-c652-4ef2-8f1a-19c8f4732a0c\") " pod="openstack/openstackclient" Dec 02 23:03:13 crc kubenswrapper[4696]: I1202 23:03:13.916335 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/3e0a050b-c652-4ef2-8f1a-19c8f4732a0c-openstack-config-secret\") pod \"openstackclient\" (UID: \"3e0a050b-c652-4ef2-8f1a-19c8f4732a0c\") " pod="openstack/openstackclient" Dec 02 23:03:13 crc kubenswrapper[4696]: I1202 23:03:13.916503 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e0a050b-c652-4ef2-8f1a-19c8f4732a0c-combined-ca-bundle\") pod \"openstackclient\" (UID: \"3e0a050b-c652-4ef2-8f1a-19c8f4732a0c\") " pod="openstack/openstackclient" Dec 02 23:03:13 crc kubenswrapper[4696]: I1202 23:03:13.931720 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxrnj\" (UniqueName: \"kubernetes.io/projected/3e0a050b-c652-4ef2-8f1a-19c8f4732a0c-kube-api-access-fxrnj\") pod \"openstackclient\" (UID: \"3e0a050b-c652-4ef2-8f1a-19c8f4732a0c\") " pod="openstack/openstackclient" Dec 02 23:03:13 crc kubenswrapper[4696]: E1202 23:03:13.979288 4696 log.go:32] "RunPodSandbox from runtime service failed" err=< Dec 02 23:03:13 crc kubenswrapper[4696]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_bce2a48e-57d0-496b-9127-10bb64c5c48b_0(9147d383df28c7d3cf3359ec0489051999f337fd85f016b13deafdf5a3d8fae5): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin 
type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"9147d383df28c7d3cf3359ec0489051999f337fd85f016b13deafdf5a3d8fae5" Netns:"/var/run/netns/dc2f1664-0779-4d00-b82e-dd6b4e923da0" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=9147d383df28c7d3cf3359ec0489051999f337fd85f016b13deafdf5a3d8fae5;K8S_POD_UID=bce2a48e-57d0-496b-9127-10bb64c5c48b" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/bce2a48e-57d0-496b-9127-10bb64c5c48b]: expected pod UID "bce2a48e-57d0-496b-9127-10bb64c5c48b" but got "3e0a050b-c652-4ef2-8f1a-19c8f4732a0c" from Kube API Dec 02 23:03:13 crc kubenswrapper[4696]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 02 23:03:13 crc kubenswrapper[4696]: > Dec 02 23:03:13 crc kubenswrapper[4696]: E1202 23:03:13.979404 4696 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Dec 02 23:03:13 crc kubenswrapper[4696]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_bce2a48e-57d0-496b-9127-10bb64c5c48b_0(9147d383df28c7d3cf3359ec0489051999f337fd85f016b13deafdf5a3d8fae5): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"9147d383df28c7d3cf3359ec0489051999f337fd85f016b13deafdf5a3d8fae5" Netns:"/var/run/netns/dc2f1664-0779-4d00-b82e-dd6b4e923da0" IfName:"eth0" 
Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=9147d383df28c7d3cf3359ec0489051999f337fd85f016b13deafdf5a3d8fae5;K8S_POD_UID=bce2a48e-57d0-496b-9127-10bb64c5c48b" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/bce2a48e-57d0-496b-9127-10bb64c5c48b]: expected pod UID "bce2a48e-57d0-496b-9127-10bb64c5c48b" but got "3e0a050b-c652-4ef2-8f1a-19c8f4732a0c" from Kube API Dec 02 23:03:13 crc kubenswrapper[4696]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 02 23:03:13 crc kubenswrapper[4696]: > pod="openstack/openstackclient" Dec 02 23:03:14 crc kubenswrapper[4696]: I1202 23:03:14.053093 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 02 23:03:14 crc kubenswrapper[4696]: I1202 23:03:14.110848 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 02 23:03:14 crc kubenswrapper[4696]: I1202 23:03:14.125288 4696 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="bce2a48e-57d0-496b-9127-10bb64c5c48b" podUID="3e0a050b-c652-4ef2-8f1a-19c8f4732a0c" Dec 02 23:03:14 crc kubenswrapper[4696]: I1202 23:03:14.132154 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 02 23:03:14 crc kubenswrapper[4696]: I1202 23:03:14.216768 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/bce2a48e-57d0-496b-9127-10bb64c5c48b-openstack-config\") pod \"bce2a48e-57d0-496b-9127-10bb64c5c48b\" (UID: \"bce2a48e-57d0-496b-9127-10bb64c5c48b\") " Dec 02 23:03:14 crc kubenswrapper[4696]: I1202 23:03:14.216838 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bce2a48e-57d0-496b-9127-10bb64c5c48b-combined-ca-bundle\") pod \"bce2a48e-57d0-496b-9127-10bb64c5c48b\" (UID: \"bce2a48e-57d0-496b-9127-10bb64c5c48b\") " Dec 02 23:03:14 crc kubenswrapper[4696]: I1202 23:03:14.216972 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/bce2a48e-57d0-496b-9127-10bb64c5c48b-openstack-config-secret\") pod \"bce2a48e-57d0-496b-9127-10bb64c5c48b\" (UID: \"bce2a48e-57d0-496b-9127-10bb64c5c48b\") " Dec 02 23:03:14 crc kubenswrapper[4696]: I1202 23:03:14.217073 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t77m7\" (UniqueName: \"kubernetes.io/projected/bce2a48e-57d0-496b-9127-10bb64c5c48b-kube-api-access-t77m7\") pod \"bce2a48e-57d0-496b-9127-10bb64c5c48b\" (UID: \"bce2a48e-57d0-496b-9127-10bb64c5c48b\") " Dec 02 23:03:14 crc kubenswrapper[4696]: I1202 23:03:14.217527 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bce2a48e-57d0-496b-9127-10bb64c5c48b-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "bce2a48e-57d0-496b-9127-10bb64c5c48b" (UID: "bce2a48e-57d0-496b-9127-10bb64c5c48b"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:03:14 crc kubenswrapper[4696]: I1202 23:03:14.217893 4696 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/bce2a48e-57d0-496b-9127-10bb64c5c48b-openstack-config\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:14 crc kubenswrapper[4696]: I1202 23:03:14.229426 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bce2a48e-57d0-496b-9127-10bb64c5c48b-kube-api-access-t77m7" (OuterVolumeSpecName: "kube-api-access-t77m7") pod "bce2a48e-57d0-496b-9127-10bb64c5c48b" (UID: "bce2a48e-57d0-496b-9127-10bb64c5c48b"). InnerVolumeSpecName "kube-api-access-t77m7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:03:14 crc kubenswrapper[4696]: I1202 23:03:14.230895 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bce2a48e-57d0-496b-9127-10bb64c5c48b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bce2a48e-57d0-496b-9127-10bb64c5c48b" (UID: "bce2a48e-57d0-496b-9127-10bb64c5c48b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:03:14 crc kubenswrapper[4696]: I1202 23:03:14.231027 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bce2a48e-57d0-496b-9127-10bb64c5c48b-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "bce2a48e-57d0-496b-9127-10bb64c5c48b" (UID: "bce2a48e-57d0-496b-9127-10bb64c5c48b"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:03:14 crc kubenswrapper[4696]: I1202 23:03:14.321185 4696 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/bce2a48e-57d0-496b-9127-10bb64c5c48b-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:14 crc kubenswrapper[4696]: I1202 23:03:14.321231 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t77m7\" (UniqueName: \"kubernetes.io/projected/bce2a48e-57d0-496b-9127-10bb64c5c48b-kube-api-access-t77m7\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:14 crc kubenswrapper[4696]: I1202 23:03:14.321242 4696 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bce2a48e-57d0-496b-9127-10bb64c5c48b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:14 crc kubenswrapper[4696]: I1202 23:03:14.558973 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 02 23:03:14 crc kubenswrapper[4696]: W1202 23:03:14.560919 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3e0a050b_c652_4ef2_8f1a_19c8f4732a0c.slice/crio-9337803928ef5759494410fb477933080dd264e92a0d67a4495a8d53ea676bf3 WatchSource:0}: Error finding container 9337803928ef5759494410fb477933080dd264e92a0d67a4495a8d53ea676bf3: Status 404 returned error can't find the container with id 9337803928ef5759494410fb477933080dd264e92a0d67a4495a8d53ea676bf3 Dec 02 23:03:15 crc kubenswrapper[4696]: I1202 23:03:15.122383 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"92dde787-a077-43f6-a220-7862b3b296b1","Type":"ContainerStarted","Data":"57355eed1547b03775dcae1050e96bfe01be9a37a75ae3f193337bc37344c75b"} Dec 02 23:03:15 crc kubenswrapper[4696]: I1202 23:03:15.123600 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/openstackclient" event={"ID":"3e0a050b-c652-4ef2-8f1a-19c8f4732a0c","Type":"ContainerStarted","Data":"9337803928ef5759494410fb477933080dd264e92a0d67a4495a8d53ea676bf3"} Dec 02 23:03:15 crc kubenswrapper[4696]: I1202 23:03:15.123640 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 02 23:03:15 crc kubenswrapper[4696]: I1202 23:03:15.137830 4696 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="bce2a48e-57d0-496b-9127-10bb64c5c48b" podUID="3e0a050b-c652-4ef2-8f1a-19c8f4732a0c" Dec 02 23:03:15 crc kubenswrapper[4696]: I1202 23:03:15.337171 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6578955fd5-9m9jd" Dec 02 23:03:15 crc kubenswrapper[4696]: I1202 23:03:15.415006 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-7g8jh"] Dec 02 23:03:15 crc kubenswrapper[4696]: I1202 23:03:15.415302 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6b7b667979-7g8jh" podUID="45f60e24-013d-4957-b1c9-537e9fd42efe" containerName="dnsmasq-dns" containerID="cri-o://d096b3b0632b18a47b4c8018be56a953a4f7cdfd855a6069b6410f4a45a0bb50" gracePeriod=10 Dec 02 23:03:15 crc kubenswrapper[4696]: I1202 23:03:15.534080 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bce2a48e-57d0-496b-9127-10bb64c5c48b" path="/var/lib/kubelet/pods/bce2a48e-57d0-496b-9127-10bb64c5c48b/volumes" Dec 02 23:03:15 crc kubenswrapper[4696]: I1202 23:03:15.552028 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 02 23:03:15 crc kubenswrapper[4696]: I1202 23:03:15.692353 4696 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6b7b667979-7g8jh" podUID="45f60e24-013d-4957-b1c9-537e9fd42efe" containerName="dnsmasq-dns" 
probeResult="failure" output="dial tcp 10.217.0.173:5353: connect: connection refused" Dec 02 23:03:15 crc kubenswrapper[4696]: I1202 23:03:15.697217 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 02 23:03:16 crc kubenswrapper[4696]: I1202 23:03:16.150630 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"92dde787-a077-43f6-a220-7862b3b296b1","Type":"ContainerStarted","Data":"c16413dcbb0a7708424b716c40c5fdd25b814a6800d0d3ca9c029a003b17da46"} Dec 02 23:03:16 crc kubenswrapper[4696]: I1202 23:03:16.155391 4696 generic.go:334] "Generic (PLEG): container finished" podID="45f60e24-013d-4957-b1c9-537e9fd42efe" containerID="d096b3b0632b18a47b4c8018be56a953a4f7cdfd855a6069b6410f4a45a0bb50" exitCode=0 Dec 02 23:03:16 crc kubenswrapper[4696]: I1202 23:03:16.155721 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="e00a6503-3712-48a7-b008-3aa85beb4445" containerName="cinder-scheduler" containerID="cri-o://e058b93f2782ea4b482a16769e49147966e6b05f9a938c6eb9c84dcf5881976d" gracePeriod=30 Dec 02 23:03:16 crc kubenswrapper[4696]: I1202 23:03:16.155941 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-7g8jh" event={"ID":"45f60e24-013d-4957-b1c9-537e9fd42efe","Type":"ContainerDied","Data":"d096b3b0632b18a47b4c8018be56a953a4f7cdfd855a6069b6410f4a45a0bb50"} Dec 02 23:03:16 crc kubenswrapper[4696]: I1202 23:03:16.156561 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="e00a6503-3712-48a7-b008-3aa85beb4445" containerName="probe" containerID="cri-o://7618743797a549f7d6b5daec7d244c0950b5e97e9811429ba9af5651b14f8839" gracePeriod=30 Dec 02 23:03:16 crc kubenswrapper[4696]: I1202 23:03:16.309182 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-7g8jh"
Dec 02 23:03:16 crc kubenswrapper[4696]: I1202 23:03:16.381444 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/45f60e24-013d-4957-b1c9-537e9fd42efe-dns-swift-storage-0\") pod \"45f60e24-013d-4957-b1c9-537e9fd42efe\" (UID: \"45f60e24-013d-4957-b1c9-537e9fd42efe\") "
Dec 02 23:03:16 crc kubenswrapper[4696]: I1202 23:03:16.381526 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/45f60e24-013d-4957-b1c9-537e9fd42efe-ovsdbserver-nb\") pod \"45f60e24-013d-4957-b1c9-537e9fd42efe\" (UID: \"45f60e24-013d-4957-b1c9-537e9fd42efe\") "
Dec 02 23:03:16 crc kubenswrapper[4696]: I1202 23:03:16.381700 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/45f60e24-013d-4957-b1c9-537e9fd42efe-dns-svc\") pod \"45f60e24-013d-4957-b1c9-537e9fd42efe\" (UID: \"45f60e24-013d-4957-b1c9-537e9fd42efe\") "
Dec 02 23:03:16 crc kubenswrapper[4696]: I1202 23:03:16.381826 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45f60e24-013d-4957-b1c9-537e9fd42efe-config\") pod \"45f60e24-013d-4957-b1c9-537e9fd42efe\" (UID: \"45f60e24-013d-4957-b1c9-537e9fd42efe\") "
Dec 02 23:03:16 crc kubenswrapper[4696]: I1202 23:03:16.381855 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4pb8c\" (UniqueName: \"kubernetes.io/projected/45f60e24-013d-4957-b1c9-537e9fd42efe-kube-api-access-4pb8c\") pod \"45f60e24-013d-4957-b1c9-537e9fd42efe\" (UID: \"45f60e24-013d-4957-b1c9-537e9fd42efe\") "
Dec 02 23:03:16 crc kubenswrapper[4696]: I1202 23:03:16.381872 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/45f60e24-013d-4957-b1c9-537e9fd42efe-ovsdbserver-sb\") pod \"45f60e24-013d-4957-b1c9-537e9fd42efe\" (UID: \"45f60e24-013d-4957-b1c9-537e9fd42efe\") "
Dec 02 23:03:16 crc kubenswrapper[4696]: I1202 23:03:16.405849 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45f60e24-013d-4957-b1c9-537e9fd42efe-kube-api-access-4pb8c" (OuterVolumeSpecName: "kube-api-access-4pb8c") pod "45f60e24-013d-4957-b1c9-537e9fd42efe" (UID: "45f60e24-013d-4957-b1c9-537e9fd42efe"). InnerVolumeSpecName "kube-api-access-4pb8c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 23:03:16 crc kubenswrapper[4696]: I1202 23:03:16.482550 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45f60e24-013d-4957-b1c9-537e9fd42efe-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "45f60e24-013d-4957-b1c9-537e9fd42efe" (UID: "45f60e24-013d-4957-b1c9-537e9fd42efe"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 23:03:16 crc kubenswrapper[4696]: I1202 23:03:16.485107 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4pb8c\" (UniqueName: \"kubernetes.io/projected/45f60e24-013d-4957-b1c9-537e9fd42efe-kube-api-access-4pb8c\") on node \"crc\" DevicePath \"\""
Dec 02 23:03:16 crc kubenswrapper[4696]: I1202 23:03:16.485138 4696 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/45f60e24-013d-4957-b1c9-537e9fd42efe-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Dec 02 23:03:16 crc kubenswrapper[4696]: I1202 23:03:16.493446 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45f60e24-013d-4957-b1c9-537e9fd42efe-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "45f60e24-013d-4957-b1c9-537e9fd42efe" (UID: "45f60e24-013d-4957-b1c9-537e9fd42efe"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 23:03:16 crc kubenswrapper[4696]: I1202 23:03:16.508503 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45f60e24-013d-4957-b1c9-537e9fd42efe-config" (OuterVolumeSpecName: "config") pod "45f60e24-013d-4957-b1c9-537e9fd42efe" (UID: "45f60e24-013d-4957-b1c9-537e9fd42efe"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 23:03:16 crc kubenswrapper[4696]: I1202 23:03:16.514880 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45f60e24-013d-4957-b1c9-537e9fd42efe-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "45f60e24-013d-4957-b1c9-537e9fd42efe" (UID: "45f60e24-013d-4957-b1c9-537e9fd42efe"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 23:03:16 crc kubenswrapper[4696]: I1202 23:03:16.535514 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45f60e24-013d-4957-b1c9-537e9fd42efe-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "45f60e24-013d-4957-b1c9-537e9fd42efe" (UID: "45f60e24-013d-4957-b1c9-537e9fd42efe"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 23:03:16 crc kubenswrapper[4696]: I1202 23:03:16.587174 4696 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45f60e24-013d-4957-b1c9-537e9fd42efe-config\") on node \"crc\" DevicePath \"\""
Dec 02 23:03:16 crc kubenswrapper[4696]: I1202 23:03:16.587225 4696 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/45f60e24-013d-4957-b1c9-537e9fd42efe-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Dec 02 23:03:16 crc kubenswrapper[4696]: I1202 23:03:16.587240 4696 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/45f60e24-013d-4957-b1c9-537e9fd42efe-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Dec 02 23:03:16 crc kubenswrapper[4696]: I1202 23:03:16.587249 4696 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/45f60e24-013d-4957-b1c9-537e9fd42efe-dns-svc\") on node \"crc\" DevicePath \"\""
Dec 02 23:03:17 crc kubenswrapper[4696]: I1202 23:03:17.192449 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-7g8jh" event={"ID":"45f60e24-013d-4957-b1c9-537e9fd42efe","Type":"ContainerDied","Data":"1b8651c4b2995c6b8addf07b5003071dce2f5a58d86dc078f382e14a151a077a"}
Dec 02 23:03:17 crc kubenswrapper[4696]: I1202 23:03:17.192515 4696 scope.go:117] "RemoveContainer" containerID="d096b3b0632b18a47b4c8018be56a953a4f7cdfd855a6069b6410f4a45a0bb50"
Dec 02 23:03:17 crc kubenswrapper[4696]: I1202 23:03:17.197902 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-7g8jh"
Dec 02 23:03:17 crc kubenswrapper[4696]: I1202 23:03:17.275792 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-7g8jh"]
Dec 02 23:03:17 crc kubenswrapper[4696]: I1202 23:03:17.284888 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-7g8jh"]
Dec 02 23:03:17 crc kubenswrapper[4696]: I1202 23:03:17.297242 4696 scope.go:117] "RemoveContainer" containerID="dafed8e7edfc81bf61b6a5367a567abda1ffc5e333d6a1657b536927d727873d"
Dec 02 23:03:17 crc kubenswrapper[4696]: I1202 23:03:17.456114 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45f60e24-013d-4957-b1c9-537e9fd42efe" path="/var/lib/kubelet/pods/45f60e24-013d-4957-b1c9-537e9fd42efe/volumes"
Dec 02 23:03:18 crc kubenswrapper[4696]: I1202 23:03:18.209220 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"92dde787-a077-43f6-a220-7862b3b296b1","Type":"ContainerStarted","Data":"804daa95a3aad3872c09836efc0fb67463a6bfdbf3e36ed78514da9e876e450e"}
Dec 02 23:03:18 crc kubenswrapper[4696]: I1202 23:03:18.209683 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Dec 02 23:03:18 crc kubenswrapper[4696]: I1202 23:03:18.216093 4696 generic.go:334] "Generic (PLEG): container finished" podID="e00a6503-3712-48a7-b008-3aa85beb4445" containerID="7618743797a549f7d6b5daec7d244c0950b5e97e9811429ba9af5651b14f8839" exitCode=0
Dec 02 23:03:18 crc kubenswrapper[4696]: I1202 23:03:18.216129 4696 generic.go:334] "Generic (PLEG): container finished" podID="e00a6503-3712-48a7-b008-3aa85beb4445" containerID="e058b93f2782ea4b482a16769e49147966e6b05f9a938c6eb9c84dcf5881976d" exitCode=0
Dec 02 23:03:18 crc kubenswrapper[4696]: I1202 23:03:18.216194 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e00a6503-3712-48a7-b008-3aa85beb4445","Type":"ContainerDied","Data":"7618743797a549f7d6b5daec7d244c0950b5e97e9811429ba9af5651b14f8839"}
Dec 02 23:03:18 crc kubenswrapper[4696]: I1202 23:03:18.216227 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e00a6503-3712-48a7-b008-3aa85beb4445","Type":"ContainerDied","Data":"e058b93f2782ea4b482a16769e49147966e6b05f9a938c6eb9c84dcf5881976d"}
Dec 02 23:03:18 crc kubenswrapper[4696]: I1202 23:03:18.255194 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.225882268 podStartE2EDuration="9.255163729s" podCreationTimestamp="2025-12-02 23:03:09 +0000 UTC" firstStartedPulling="2025-12-02 23:03:10.228074787 +0000 UTC m=+1253.108754788" lastFinishedPulling="2025-12-02 23:03:17.257356258 +0000 UTC m=+1260.138036249" observedRunningTime="2025-12-02 23:03:18.243446206 +0000 UTC m=+1261.124126207" watchObservedRunningTime="2025-12-02 23:03:18.255163729 +0000 UTC m=+1261.135843730"
Dec 02 23:03:18 crc kubenswrapper[4696]: I1202 23:03:18.630024 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Dec 02 23:03:18 crc kubenswrapper[4696]: I1202 23:03:18.640144 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-748c8487f8-gqxg9"
Dec 02 23:03:18 crc kubenswrapper[4696]: I1202 23:03:18.786949 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e00a6503-3712-48a7-b008-3aa85beb4445-config-data-custom\") pod \"e00a6503-3712-48a7-b008-3aa85beb4445\" (UID: \"e00a6503-3712-48a7-b008-3aa85beb4445\") "
Dec 02 23:03:18 crc kubenswrapper[4696]: I1202 23:03:18.787090 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e00a6503-3712-48a7-b008-3aa85beb4445-config-data\") pod \"e00a6503-3712-48a7-b008-3aa85beb4445\" (UID: \"e00a6503-3712-48a7-b008-3aa85beb4445\") "
Dec 02 23:03:18 crc kubenswrapper[4696]: I1202 23:03:18.787184 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e00a6503-3712-48a7-b008-3aa85beb4445-etc-machine-id\") pod \"e00a6503-3712-48a7-b008-3aa85beb4445\" (UID: \"e00a6503-3712-48a7-b008-3aa85beb4445\") "
Dec 02 23:03:18 crc kubenswrapper[4696]: I1202 23:03:18.787209 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e00a6503-3712-48a7-b008-3aa85beb4445-combined-ca-bundle\") pod \"e00a6503-3712-48a7-b008-3aa85beb4445\" (UID: \"e00a6503-3712-48a7-b008-3aa85beb4445\") "
Dec 02 23:03:18 crc kubenswrapper[4696]: I1202 23:03:18.787236 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e00a6503-3712-48a7-b008-3aa85beb4445-scripts\") pod \"e00a6503-3712-48a7-b008-3aa85beb4445\" (UID: \"e00a6503-3712-48a7-b008-3aa85beb4445\") "
Dec 02 23:03:18 crc kubenswrapper[4696]: I1202 23:03:18.787291 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6tt5g\" (UniqueName: \"kubernetes.io/projected/e00a6503-3712-48a7-b008-3aa85beb4445-kube-api-access-6tt5g\") pod \"e00a6503-3712-48a7-b008-3aa85beb4445\" (UID: \"e00a6503-3712-48a7-b008-3aa85beb4445\") "
Dec 02 23:03:18 crc kubenswrapper[4696]: I1202 23:03:18.788842 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e00a6503-3712-48a7-b008-3aa85beb4445-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "e00a6503-3712-48a7-b008-3aa85beb4445" (UID: "e00a6503-3712-48a7-b008-3aa85beb4445"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 02 23:03:18 crc kubenswrapper[4696]: I1202 23:03:18.790957 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-748c8487f8-gqxg9"
Dec 02 23:03:18 crc kubenswrapper[4696]: I1202 23:03:18.802461 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e00a6503-3712-48a7-b008-3aa85beb4445-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e00a6503-3712-48a7-b008-3aa85beb4445" (UID: "e00a6503-3712-48a7-b008-3aa85beb4445"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 23:03:18 crc kubenswrapper[4696]: I1202 23:03:18.806155 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e00a6503-3712-48a7-b008-3aa85beb4445-scripts" (OuterVolumeSpecName: "scripts") pod "e00a6503-3712-48a7-b008-3aa85beb4445" (UID: "e00a6503-3712-48a7-b008-3aa85beb4445"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 23:03:18 crc kubenswrapper[4696]: I1202 23:03:18.808931 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e00a6503-3712-48a7-b008-3aa85beb4445-kube-api-access-6tt5g" (OuterVolumeSpecName: "kube-api-access-6tt5g") pod "e00a6503-3712-48a7-b008-3aa85beb4445" (UID: "e00a6503-3712-48a7-b008-3aa85beb4445"). InnerVolumeSpecName "kube-api-access-6tt5g". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 23:03:18 crc kubenswrapper[4696]: I1202 23:03:18.889347 4696 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e00a6503-3712-48a7-b008-3aa85beb4445-etc-machine-id\") on node \"crc\" DevicePath \"\""
Dec 02 23:03:18 crc kubenswrapper[4696]: I1202 23:03:18.889388 4696 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e00a6503-3712-48a7-b008-3aa85beb4445-scripts\") on node \"crc\" DevicePath \"\""
Dec 02 23:03:18 crc kubenswrapper[4696]: I1202 23:03:18.889402 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6tt5g\" (UniqueName: \"kubernetes.io/projected/e00a6503-3712-48a7-b008-3aa85beb4445-kube-api-access-6tt5g\") on node \"crc\" DevicePath \"\""
Dec 02 23:03:18 crc kubenswrapper[4696]: I1202 23:03:18.889413 4696 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e00a6503-3712-48a7-b008-3aa85beb4445-config-data-custom\") on node \"crc\" DevicePath \"\""
Dec 02 23:03:18 crc kubenswrapper[4696]: I1202 23:03:18.902014 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7f8795dd98-pn9n4"]
Dec 02 23:03:18 crc kubenswrapper[4696]: I1202 23:03:18.902281 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-7f8795dd98-pn9n4" podUID="d897a4ce-a62c-4bde-889d-c7c82cab0569" containerName="barbican-api-log" containerID="cri-o://5808956f2468d868426eebb3d36fb11961e1d0742b940753b107c5cd877fb67c" gracePeriod=30
Dec 02 23:03:18 crc kubenswrapper[4696]: I1202 23:03:18.902458 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-7f8795dd98-pn9n4" podUID="d897a4ce-a62c-4bde-889d-c7c82cab0569" containerName="barbican-api" containerID="cri-o://76fe1411adece83854009d383d32c2892877828d13f137e8d83fbf17082809e4" gracePeriod=30
Dec 02 23:03:18 crc kubenswrapper[4696]: I1202 23:03:18.941246 4696 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7f8795dd98-pn9n4" podUID="d897a4ce-a62c-4bde-889d-c7c82cab0569" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.179:9311/healthcheck\": EOF"
Dec 02 23:03:18 crc kubenswrapper[4696]: I1202 23:03:18.994003 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e00a6503-3712-48a7-b008-3aa85beb4445-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e00a6503-3712-48a7-b008-3aa85beb4445" (UID: "e00a6503-3712-48a7-b008-3aa85beb4445"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 23:03:18 crc kubenswrapper[4696]: I1202 23:03:18.995091 4696 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e00a6503-3712-48a7-b008-3aa85beb4445-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 02 23:03:19 crc kubenswrapper[4696]: I1202 23:03:19.049853 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e00a6503-3712-48a7-b008-3aa85beb4445-config-data" (OuterVolumeSpecName: "config-data") pod "e00a6503-3712-48a7-b008-3aa85beb4445" (UID: "e00a6503-3712-48a7-b008-3aa85beb4445"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 23:03:19 crc kubenswrapper[4696]: I1202 23:03:19.064835 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-599746d6dd-mg2dx"
Dec 02 23:03:19 crc kubenswrapper[4696]: I1202 23:03:19.099709 4696 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e00a6503-3712-48a7-b008-3aa85beb4445-config-data\") on node \"crc\" DevicePath \"\""
Dec 02 23:03:19 crc kubenswrapper[4696]: I1202 23:03:19.123154 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-599746d6dd-mg2dx"
Dec 02 23:03:19 crc kubenswrapper[4696]: I1202 23:03:19.269437 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Dec 02 23:03:19 crc kubenswrapper[4696]: I1202 23:03:19.269501 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e00a6503-3712-48a7-b008-3aa85beb4445","Type":"ContainerDied","Data":"74b11490e2cfd3269bd5a126ab330cf100dae8dd9843804c32867851439a8a4c"}
Dec 02 23:03:19 crc kubenswrapper[4696]: I1202 23:03:19.269584 4696 scope.go:117] "RemoveContainer" containerID="7618743797a549f7d6b5daec7d244c0950b5e97e9811429ba9af5651b14f8839"
Dec 02 23:03:19 crc kubenswrapper[4696]: I1202 23:03:19.273421 4696 generic.go:334] "Generic (PLEG): container finished" podID="d897a4ce-a62c-4bde-889d-c7c82cab0569" containerID="5808956f2468d868426eebb3d36fb11961e1d0742b940753b107c5cd877fb67c" exitCode=143
Dec 02 23:03:19 crc kubenswrapper[4696]: I1202 23:03:19.277756 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7f8795dd98-pn9n4" event={"ID":"d897a4ce-a62c-4bde-889d-c7c82cab0569","Type":"ContainerDied","Data":"5808956f2468d868426eebb3d36fb11961e1d0742b940753b107c5cd877fb67c"}
Dec 02 23:03:19 crc kubenswrapper[4696]: I1202 23:03:19.312306 4696 scope.go:117] "RemoveContainer" containerID="e058b93f2782ea4b482a16769e49147966e6b05f9a938c6eb9c84dcf5881976d"
Dec 02 23:03:19 crc kubenswrapper[4696]: I1202 23:03:19.315927 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Dec 02 23:03:19 crc kubenswrapper[4696]: I1202 23:03:19.326876 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"]
Dec 02 23:03:19 crc kubenswrapper[4696]: I1202 23:03:19.342971 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"]
Dec 02 23:03:19 crc kubenswrapper[4696]: E1202 23:03:19.343412 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45f60e24-013d-4957-b1c9-537e9fd42efe" containerName="init"
Dec 02 23:03:19 crc kubenswrapper[4696]: I1202 23:03:19.343428 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="45f60e24-013d-4957-b1c9-537e9fd42efe" containerName="init"
Dec 02 23:03:19 crc kubenswrapper[4696]: E1202 23:03:19.343439 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e00a6503-3712-48a7-b008-3aa85beb4445" containerName="probe"
Dec 02 23:03:19 crc kubenswrapper[4696]: I1202 23:03:19.343445 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="e00a6503-3712-48a7-b008-3aa85beb4445" containerName="probe"
Dec 02 23:03:19 crc kubenswrapper[4696]: E1202 23:03:19.343481 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e00a6503-3712-48a7-b008-3aa85beb4445" containerName="cinder-scheduler"
Dec 02 23:03:19 crc kubenswrapper[4696]: I1202 23:03:19.343488 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="e00a6503-3712-48a7-b008-3aa85beb4445" containerName="cinder-scheduler"
Dec 02 23:03:19 crc kubenswrapper[4696]: E1202 23:03:19.343503 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45f60e24-013d-4957-b1c9-537e9fd42efe" containerName="dnsmasq-dns"
Dec 02 23:03:19 crc kubenswrapper[4696]: I1202 23:03:19.343508 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="45f60e24-013d-4957-b1c9-537e9fd42efe" containerName="dnsmasq-dns"
Dec 02 23:03:19 crc kubenswrapper[4696]: I1202 23:03:19.343687 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="e00a6503-3712-48a7-b008-3aa85beb4445" containerName="probe"
Dec 02 23:03:19 crc kubenswrapper[4696]: I1202 23:03:19.343708 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="45f60e24-013d-4957-b1c9-537e9fd42efe" containerName="dnsmasq-dns"
Dec 02 23:03:19 crc kubenswrapper[4696]: I1202 23:03:19.343727 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="e00a6503-3712-48a7-b008-3aa85beb4445" containerName="cinder-scheduler"
Dec 02 23:03:19 crc kubenswrapper[4696]: I1202 23:03:19.344866 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Dec 02 23:03:19 crc kubenswrapper[4696]: I1202 23:03:19.348456 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data"
Dec 02 23:03:19 crc kubenswrapper[4696]: I1202 23:03:19.378709 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Dec 02 23:03:19 crc kubenswrapper[4696]: I1202 23:03:19.453448 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e00a6503-3712-48a7-b008-3aa85beb4445" path="/var/lib/kubelet/pods/e00a6503-3712-48a7-b008-3aa85beb4445/volumes"
Dec 02 23:03:19 crc kubenswrapper[4696]: I1202 23:03:19.513571 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17a7a7a1-8e9c-4b77-8a09-783b8b465cf5-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"17a7a7a1-8e9c-4b77-8a09-783b8b465cf5\") " pod="openstack/cinder-scheduler-0"
Dec 02 23:03:19 crc kubenswrapper[4696]: I1202 23:03:19.513655 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/17a7a7a1-8e9c-4b77-8a09-783b8b465cf5-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"17a7a7a1-8e9c-4b77-8a09-783b8b465cf5\") " pod="openstack/cinder-scheduler-0"
Dec 02 23:03:19 crc kubenswrapper[4696]: I1202 23:03:19.513726 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/17a7a7a1-8e9c-4b77-8a09-783b8b465cf5-scripts\") pod \"cinder-scheduler-0\" (UID: \"17a7a7a1-8e9c-4b77-8a09-783b8b465cf5\") " pod="openstack/cinder-scheduler-0"
Dec 02 23:03:19 crc kubenswrapper[4696]: I1202 23:03:19.515006 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17a7a7a1-8e9c-4b77-8a09-783b8b465cf5-config-data\") pod \"cinder-scheduler-0\" (UID: \"17a7a7a1-8e9c-4b77-8a09-783b8b465cf5\") " pod="openstack/cinder-scheduler-0"
Dec 02 23:03:19 crc kubenswrapper[4696]: I1202 23:03:19.515196 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/17a7a7a1-8e9c-4b77-8a09-783b8b465cf5-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"17a7a7a1-8e9c-4b77-8a09-783b8b465cf5\") " pod="openstack/cinder-scheduler-0"
Dec 02 23:03:19 crc kubenswrapper[4696]: I1202 23:03:19.516180 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56mss\" (UniqueName: \"kubernetes.io/projected/17a7a7a1-8e9c-4b77-8a09-783b8b465cf5-kube-api-access-56mss\") pod \"cinder-scheduler-0\" (UID: \"17a7a7a1-8e9c-4b77-8a09-783b8b465cf5\") " pod="openstack/cinder-scheduler-0"
Dec 02 23:03:19 crc kubenswrapper[4696]: I1202 23:03:19.582229 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0"
Dec 02 23:03:19 crc kubenswrapper[4696]: I1202 23:03:19.619317 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17a7a7a1-8e9c-4b77-8a09-783b8b465cf5-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"17a7a7a1-8e9c-4b77-8a09-783b8b465cf5\") " pod="openstack/cinder-scheduler-0"
Dec 02 23:03:19 crc kubenswrapper[4696]: I1202 23:03:19.619372 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/17a7a7a1-8e9c-4b77-8a09-783b8b465cf5-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"17a7a7a1-8e9c-4b77-8a09-783b8b465cf5\") " pod="openstack/cinder-scheduler-0"
Dec 02 23:03:19 crc kubenswrapper[4696]: I1202 23:03:19.619415 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/17a7a7a1-8e9c-4b77-8a09-783b8b465cf5-scripts\") pod \"cinder-scheduler-0\" (UID: \"17a7a7a1-8e9c-4b77-8a09-783b8b465cf5\") " pod="openstack/cinder-scheduler-0"
Dec 02 23:03:19 crc kubenswrapper[4696]: I1202 23:03:19.619441 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17a7a7a1-8e9c-4b77-8a09-783b8b465cf5-config-data\") pod \"cinder-scheduler-0\" (UID: \"17a7a7a1-8e9c-4b77-8a09-783b8b465cf5\") " pod="openstack/cinder-scheduler-0"
Dec 02 23:03:19 crc kubenswrapper[4696]: I1202 23:03:19.619509 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/17a7a7a1-8e9c-4b77-8a09-783b8b465cf5-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"17a7a7a1-8e9c-4b77-8a09-783b8b465cf5\") " pod="openstack/cinder-scheduler-0"
Dec 02 23:03:19 crc kubenswrapper[4696]: I1202 23:03:19.619594 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56mss\" (UniqueName: \"kubernetes.io/projected/17a7a7a1-8e9c-4b77-8a09-783b8b465cf5-kube-api-access-56mss\") pod \"cinder-scheduler-0\" (UID: \"17a7a7a1-8e9c-4b77-8a09-783b8b465cf5\") " pod="openstack/cinder-scheduler-0"
Dec 02 23:03:19 crc kubenswrapper[4696]: I1202 23:03:19.624704 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/17a7a7a1-8e9c-4b77-8a09-783b8b465cf5-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"17a7a7a1-8e9c-4b77-8a09-783b8b465cf5\") " pod="openstack/cinder-scheduler-0"
Dec 02 23:03:19 crc kubenswrapper[4696]: I1202 23:03:19.642131 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17a7a7a1-8e9c-4b77-8a09-783b8b465cf5-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"17a7a7a1-8e9c-4b77-8a09-783b8b465cf5\") " pod="openstack/cinder-scheduler-0"
Dec 02 23:03:19 crc kubenswrapper[4696]: I1202 23:03:19.645047 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/17a7a7a1-8e9c-4b77-8a09-783b8b465cf5-scripts\") pod \"cinder-scheduler-0\" (UID: \"17a7a7a1-8e9c-4b77-8a09-783b8b465cf5\") " pod="openstack/cinder-scheduler-0"
Dec 02 23:03:19 crc kubenswrapper[4696]: I1202 23:03:19.645984 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17a7a7a1-8e9c-4b77-8a09-783b8b465cf5-config-data\") pod \"cinder-scheduler-0\" (UID: \"17a7a7a1-8e9c-4b77-8a09-783b8b465cf5\") " pod="openstack/cinder-scheduler-0"
Dec 02 23:03:19 crc kubenswrapper[4696]: I1202 23:03:19.647827 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56mss\" (UniqueName: \"kubernetes.io/projected/17a7a7a1-8e9c-4b77-8a09-783b8b465cf5-kube-api-access-56mss\") pod \"cinder-scheduler-0\" (UID: \"17a7a7a1-8e9c-4b77-8a09-783b8b465cf5\") " pod="openstack/cinder-scheduler-0"
Dec 02 23:03:19 crc kubenswrapper[4696]: I1202 23:03:19.648122 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/17a7a7a1-8e9c-4b77-8a09-783b8b465cf5-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"17a7a7a1-8e9c-4b77-8a09-783b8b465cf5\") " pod="openstack/cinder-scheduler-0"
Dec 02 23:03:19 crc kubenswrapper[4696]: I1202 23:03:19.681219 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Dec 02 23:03:20 crc kubenswrapper[4696]: I1202 23:03:20.279323 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Dec 02 23:03:20 crc kubenswrapper[4696]: I1202 23:03:20.312395 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"17a7a7a1-8e9c-4b77-8a09-783b8b465cf5","Type":"ContainerStarted","Data":"a82368026de52d68e298dc64a0110d3a8b6d9d39a22ed06bc11e44b577b116d1"}
Dec 02 23:03:20 crc kubenswrapper[4696]: I1202 23:03:20.929923 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-97757dbdd-59bbj"
Dec 02 23:03:21 crc kubenswrapper[4696]: I1202 23:03:21.378579 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"17a7a7a1-8e9c-4b77-8a09-783b8b465cf5","Type":"ContainerStarted","Data":"0bee08efe3636364209948261badc5be9aef3b5567d742095c34c4177f313fd8"}
Dec 02 23:03:22 crc kubenswrapper[4696]: I1202 23:03:22.130447 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-tkf2x"]
Dec 02 23:03:22 crc kubenswrapper[4696]: I1202 23:03:22.136140 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-tkf2x"
Dec 02 23:03:22 crc kubenswrapper[4696]: I1202 23:03:22.151816 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-tkf2x"]
Dec 02 23:03:22 crc kubenswrapper[4696]: I1202 23:03:22.241343 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-tc9xp"]
Dec 02 23:03:22 crc kubenswrapper[4696]: I1202 23:03:22.243156 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-tc9xp"
Dec 02 23:03:22 crc kubenswrapper[4696]: I1202 23:03:22.258138 4696 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7b448778f6-q69jq" podUID="7b25bdab-8c46-43b8-be48-0e3df0f48c57" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.162:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.162:8443: connect: connection refused"
Dec 02 23:03:22 crc kubenswrapper[4696]: I1202 23:03:22.272435 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-4f32-account-create-update-fg4wn"]
Dec 02 23:03:22 crc kubenswrapper[4696]: I1202 23:03:22.274031 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-4f32-account-create-update-fg4wn"
Dec 02 23:03:22 crc kubenswrapper[4696]: I1202 23:03:22.281734 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret"
Dec 02 23:03:22 crc kubenswrapper[4696]: I1202 23:03:22.299914 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-tc9xp"]
Dec 02 23:03:22 crc kubenswrapper[4696]: I1202 23:03:22.306400 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gp2q\" (UniqueName: \"kubernetes.io/projected/9f4f26b7-e810-4c1e-98b1-57fc3b417e60-kube-api-access-2gp2q\") pod \"nova-api-db-create-tkf2x\" (UID: \"9f4f26b7-e810-4c1e-98b1-57fc3b417e60\") " pod="openstack/nova-api-db-create-tkf2x"
Dec 02 23:03:22 crc kubenswrapper[4696]: I1202 23:03:22.307156 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9f4f26b7-e810-4c1e-98b1-57fc3b417e60-operator-scripts\") pod \"nova-api-db-create-tkf2x\" (UID: \"9f4f26b7-e810-4c1e-98b1-57fc3b417e60\") " pod="openstack/nova-api-db-create-tkf2x"
Dec 02 23:03:22 crc kubenswrapper[4696]: I1202 23:03:22.329628 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-4f32-account-create-update-fg4wn"]
Dec 02 23:03:22 crc kubenswrapper[4696]: I1202 23:03:22.397068 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-svgf5"]
Dec 02 23:03:22 crc kubenswrapper[4696]: I1202 23:03:22.398579 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-svgf5"
Dec 02 23:03:22 crc kubenswrapper[4696]: I1202 23:03:22.408990 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gp2q\" (UniqueName: \"kubernetes.io/projected/9f4f26b7-e810-4c1e-98b1-57fc3b417e60-kube-api-access-2gp2q\") pod \"nova-api-db-create-tkf2x\" (UID: \"9f4f26b7-e810-4c1e-98b1-57fc3b417e60\") " pod="openstack/nova-api-db-create-tkf2x"
Dec 02 23:03:22 crc kubenswrapper[4696]: I1202 23:03:22.409075 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5d3c7135-a850-4cd0-b5b6-2561b75cd09b-operator-scripts\") pod \"nova-cell0-db-create-tc9xp\" (UID: \"5d3c7135-a850-4cd0-b5b6-2561b75cd09b\") " pod="openstack/nova-cell0-db-create-tc9xp"
Dec 02 23:03:22 crc kubenswrapper[4696]: I1202 23:03:22.409129 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjv9d\" (UniqueName: \"kubernetes.io/projected/5d3c7135-a850-4cd0-b5b6-2561b75cd09b-kube-api-access-fjv9d\") pod \"nova-cell0-db-create-tc9xp\" (UID: \"5d3c7135-a850-4cd0-b5b6-2561b75cd09b\") " pod="openstack/nova-cell0-db-create-tc9xp"
Dec 02 23:03:22 crc kubenswrapper[4696]: I1202 23:03:22.409156 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9f4f26b7-e810-4c1e-98b1-57fc3b417e60-operator-scripts\") pod \"nova-api-db-create-tkf2x\" (UID: \"9f4f26b7-e810-4c1e-98b1-57fc3b417e60\") " pod="openstack/nova-api-db-create-tkf2x"
Dec 02 23:03:22 crc kubenswrapper[4696]: I1202 23:03:22.409177 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4s5zv\" (UniqueName: \"kubernetes.io/projected/ec5ac1a2-65d0-4222-8e6e-cce2d0fe6d5a-kube-api-access-4s5zv\") pod \"nova-api-4f32-account-create-update-fg4wn\" (UID: \"ec5ac1a2-65d0-4222-8e6e-cce2d0fe6d5a\") " pod="openstack/nova-api-4f32-account-create-update-fg4wn"
Dec 02 23:03:22 crc kubenswrapper[4696]: I1202 23:03:22.409217 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ec5ac1a2-65d0-4222-8e6e-cce2d0fe6d5a-operator-scripts\") pod \"nova-api-4f32-account-create-update-fg4wn\" (UID: \"ec5ac1a2-65d0-4222-8e6e-cce2d0fe6d5a\") " pod="openstack/nova-api-4f32-account-create-update-fg4wn"
Dec 02 23:03:22 crc kubenswrapper[4696]: I1202 23:03:22.414444 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9f4f26b7-e810-4c1e-98b1-57fc3b417e60-operator-scripts\") pod \"nova-api-db-create-tkf2x\" (UID: \"9f4f26b7-e810-4c1e-98b1-57fc3b417e60\") " pod="openstack/nova-api-db-create-tkf2x"
Dec 02 23:03:22 crc kubenswrapper[4696]: I1202 23:03:22.433956 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"17a7a7a1-8e9c-4b77-8a09-783b8b465cf5","Type":"ContainerStarted","Data":"bb3e5f5f806d94a0166b92ebcb0d1fe0e4241aa379bc9cf0e0a2604675004461"}
Dec 02 23:03:22 crc kubenswrapper[4696]: I1202 23:03:22.460817 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-svgf5"]
Dec 02 23:03:22 crc kubenswrapper[4696]: I1202 23:03:22.491719 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gp2q\" (UniqueName: \"kubernetes.io/projected/9f4f26b7-e810-4c1e-98b1-57fc3b417e60-kube-api-access-2gp2q\") pod \"nova-api-db-create-tkf2x\" (UID: \"9f4f26b7-e810-4c1e-98b1-57fc3b417e60\") " pod="openstack/nova-api-db-create-tkf2x"
Dec 02 23:03:22 crc kubenswrapper[4696]: I1202 23:03:22.519601 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5d3c7135-a850-4cd0-b5b6-2561b75cd09b-operator-scripts\") pod \"nova-cell0-db-create-tc9xp\" (UID: \"5d3c7135-a850-4cd0-b5b6-2561b75cd09b\") " pod="openstack/nova-cell0-db-create-tc9xp"
Dec 02 23:03:22 crc kubenswrapper[4696]: I1202 23:03:22.519976 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjv9d\" (UniqueName: \"kubernetes.io/projected/5d3c7135-a850-4cd0-b5b6-2561b75cd09b-kube-api-access-fjv9d\") pod \"nova-cell0-db-create-tc9xp\" (UID: \"5d3c7135-a850-4cd0-b5b6-2561b75cd09b\") " pod="openstack/nova-cell0-db-create-tc9xp"
Dec 02 23:03:22 crc kubenswrapper[4696]: I1202 23:03:22.520087 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4s5zv\" (UniqueName: \"kubernetes.io/projected/ec5ac1a2-65d0-4222-8e6e-cce2d0fe6d5a-kube-api-access-4s5zv\") pod \"nova-api-4f32-account-create-update-fg4wn\" (UID: \"ec5ac1a2-65d0-4222-8e6e-cce2d0fe6d5a\") " pod="openstack/nova-api-4f32-account-create-update-fg4wn"
Dec 02 23:03:22 crc kubenswrapper[4696]: I1202 23:03:22.521815 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fw8zk\" (UniqueName: \"kubernetes.io/projected/b9d71614-bd9c-4b09-b813-d9d6f01fdc92-kube-api-access-fw8zk\") pod \"nova-cell1-db-create-svgf5\" (UID: \"b9d71614-bd9c-4b09-b813-d9d6f01fdc92\") " pod="openstack/nova-cell1-db-create-svgf5"
Dec 02 23:03:22 crc kubenswrapper[4696]: I1202 23:03:22.521975 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ec5ac1a2-65d0-4222-8e6e-cce2d0fe6d5a-operator-scripts\") pod \"nova-api-4f32-account-create-update-fg4wn\" (UID: \"ec5ac1a2-65d0-4222-8e6e-cce2d0fe6d5a\") " pod="openstack/nova-api-4f32-account-create-update-fg4wn"
Dec 02 23:03:22 crc kubenswrapper[4696]: I1202 23:03:22.522101 4696 reconciler_common.go:245]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b9d71614-bd9c-4b09-b813-d9d6f01fdc92-operator-scripts\") pod \"nova-cell1-db-create-svgf5\" (UID: \"b9d71614-bd9c-4b09-b813-d9d6f01fdc92\") " pod="openstack/nova-cell1-db-create-svgf5" Dec 02 23:03:22 crc kubenswrapper[4696]: I1202 23:03:22.521853 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5d3c7135-a850-4cd0-b5b6-2561b75cd09b-operator-scripts\") pod \"nova-cell0-db-create-tc9xp\" (UID: \"5d3c7135-a850-4cd0-b5b6-2561b75cd09b\") " pod="openstack/nova-cell0-db-create-tc9xp" Dec 02 23:03:22 crc kubenswrapper[4696]: I1202 23:03:22.522853 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ec5ac1a2-65d0-4222-8e6e-cce2d0fe6d5a-operator-scripts\") pod \"nova-api-4f32-account-create-update-fg4wn\" (UID: \"ec5ac1a2-65d0-4222-8e6e-cce2d0fe6d5a\") " pod="openstack/nova-api-4f32-account-create-update-fg4wn" Dec 02 23:03:22 crc kubenswrapper[4696]: I1202 23:03:22.539862 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-eff7-account-create-update-7ks4n"] Dec 02 23:03:22 crc kubenswrapper[4696]: I1202 23:03:22.541421 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-eff7-account-create-update-7ks4n" Dec 02 23:03:22 crc kubenswrapper[4696]: I1202 23:03:22.546500 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4s5zv\" (UniqueName: \"kubernetes.io/projected/ec5ac1a2-65d0-4222-8e6e-cce2d0fe6d5a-kube-api-access-4s5zv\") pod \"nova-api-4f32-account-create-update-fg4wn\" (UID: \"ec5ac1a2-65d0-4222-8e6e-cce2d0fe6d5a\") " pod="openstack/nova-api-4f32-account-create-update-fg4wn" Dec 02 23:03:22 crc kubenswrapper[4696]: I1202 23:03:22.549232 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Dec 02 23:03:22 crc kubenswrapper[4696]: I1202 23:03:22.550973 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.550948525 podStartE2EDuration="3.550948525s" podCreationTimestamp="2025-12-02 23:03:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:03:22.469328419 +0000 UTC m=+1265.350008410" watchObservedRunningTime="2025-12-02 23:03:22.550948525 +0000 UTC m=+1265.431628526" Dec 02 23:03:22 crc kubenswrapper[4696]: I1202 23:03:22.551516 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjv9d\" (UniqueName: \"kubernetes.io/projected/5d3c7135-a850-4cd0-b5b6-2561b75cd09b-kube-api-access-fjv9d\") pod \"nova-cell0-db-create-tc9xp\" (UID: \"5d3c7135-a850-4cd0-b5b6-2561b75cd09b\") " pod="openstack/nova-cell0-db-create-tc9xp" Dec 02 23:03:22 crc kubenswrapper[4696]: I1202 23:03:22.584108 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-tc9xp" Dec 02 23:03:22 crc kubenswrapper[4696]: I1202 23:03:22.602843 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-eff7-account-create-update-7ks4n"] Dec 02 23:03:22 crc kubenswrapper[4696]: I1202 23:03:22.626249 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b9d71614-bd9c-4b09-b813-d9d6f01fdc92-operator-scripts\") pod \"nova-cell1-db-create-svgf5\" (UID: \"b9d71614-bd9c-4b09-b813-d9d6f01fdc92\") " pod="openstack/nova-cell1-db-create-svgf5" Dec 02 23:03:22 crc kubenswrapper[4696]: I1202 23:03:22.627529 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fw8zk\" (UniqueName: \"kubernetes.io/projected/b9d71614-bd9c-4b09-b813-d9d6f01fdc92-kube-api-access-fw8zk\") pod \"nova-cell1-db-create-svgf5\" (UID: \"b9d71614-bd9c-4b09-b813-d9d6f01fdc92\") " pod="openstack/nova-cell1-db-create-svgf5" Dec 02 23:03:22 crc kubenswrapper[4696]: I1202 23:03:22.629862 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b9d71614-bd9c-4b09-b813-d9d6f01fdc92-operator-scripts\") pod \"nova-cell1-db-create-svgf5\" (UID: \"b9d71614-bd9c-4b09-b813-d9d6f01fdc92\") " pod="openstack/nova-cell1-db-create-svgf5" Dec 02 23:03:22 crc kubenswrapper[4696]: I1202 23:03:22.634445 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-4f32-account-create-update-fg4wn" Dec 02 23:03:22 crc kubenswrapper[4696]: I1202 23:03:22.677093 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fw8zk\" (UniqueName: \"kubernetes.io/projected/b9d71614-bd9c-4b09-b813-d9d6f01fdc92-kube-api-access-fw8zk\") pod \"nova-cell1-db-create-svgf5\" (UID: \"b9d71614-bd9c-4b09-b813-d9d6f01fdc92\") " pod="openstack/nova-cell1-db-create-svgf5" Dec 02 23:03:22 crc kubenswrapper[4696]: I1202 23:03:22.720514 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-5964f98dd9-7q2kj"] Dec 02 23:03:22 crc kubenswrapper[4696]: I1202 23:03:22.729618 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8w9h6\" (UniqueName: \"kubernetes.io/projected/a4756827-862e-446f-a149-b3a541a656b5-kube-api-access-8w9h6\") pod \"nova-cell0-eff7-account-create-update-7ks4n\" (UID: \"a4756827-862e-446f-a149-b3a541a656b5\") " pod="openstack/nova-cell0-eff7-account-create-update-7ks4n" Dec 02 23:03:22 crc kubenswrapper[4696]: I1202 23:03:22.729764 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a4756827-862e-446f-a149-b3a541a656b5-operator-scripts\") pod \"nova-cell0-eff7-account-create-update-7ks4n\" (UID: \"a4756827-862e-446f-a149-b3a541a656b5\") " pod="openstack/nova-cell0-eff7-account-create-update-7ks4n" Dec 02 23:03:22 crc kubenswrapper[4696]: I1202 23:03:22.753946 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-5964f98dd9-7q2kj" Dec 02 23:03:22 crc kubenswrapper[4696]: I1202 23:03:22.761083 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Dec 02 23:03:22 crc kubenswrapper[4696]: I1202 23:03:22.767464 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-svgf5" Dec 02 23:03:22 crc kubenswrapper[4696]: I1202 23:03:22.773851 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Dec 02 23:03:22 crc kubenswrapper[4696]: I1202 23:03:22.785872 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-5964f98dd9-7q2kj"] Dec 02 23:03:22 crc kubenswrapper[4696]: I1202 23:03:22.787919 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Dec 02 23:03:22 crc kubenswrapper[4696]: I1202 23:03:22.797178 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-tkf2x" Dec 02 23:03:22 crc kubenswrapper[4696]: I1202 23:03:22.841299 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a76ff35f-36d6-48df-94ed-337199547cd5-internal-tls-certs\") pod \"swift-proxy-5964f98dd9-7q2kj\" (UID: \"a76ff35f-36d6-48df-94ed-337199547cd5\") " pod="openstack/swift-proxy-5964f98dd9-7q2kj" Dec 02 23:03:22 crc kubenswrapper[4696]: I1202 23:03:22.842665 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a4756827-862e-446f-a149-b3a541a656b5-operator-scripts\") pod \"nova-cell0-eff7-account-create-update-7ks4n\" (UID: \"a4756827-862e-446f-a149-b3a541a656b5\") " pod="openstack/nova-cell0-eff7-account-create-update-7ks4n" Dec 02 23:03:22 crc kubenswrapper[4696]: I1202 23:03:22.842859 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a76ff35f-36d6-48df-94ed-337199547cd5-config-data\") pod \"swift-proxy-5964f98dd9-7q2kj\" (UID: \"a76ff35f-36d6-48df-94ed-337199547cd5\") " pod="openstack/swift-proxy-5964f98dd9-7q2kj" Dec 02 23:03:22 
crc kubenswrapper[4696]: I1202 23:03:22.842962 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a76ff35f-36d6-48df-94ed-337199547cd5-combined-ca-bundle\") pod \"swift-proxy-5964f98dd9-7q2kj\" (UID: \"a76ff35f-36d6-48df-94ed-337199547cd5\") " pod="openstack/swift-proxy-5964f98dd9-7q2kj" Dec 02 23:03:22 crc kubenswrapper[4696]: I1202 23:03:22.843036 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a76ff35f-36d6-48df-94ed-337199547cd5-log-httpd\") pod \"swift-proxy-5964f98dd9-7q2kj\" (UID: \"a76ff35f-36d6-48df-94ed-337199547cd5\") " pod="openstack/swift-proxy-5964f98dd9-7q2kj" Dec 02 23:03:22 crc kubenswrapper[4696]: I1202 23:03:22.843135 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a76ff35f-36d6-48df-94ed-337199547cd5-public-tls-certs\") pod \"swift-proxy-5964f98dd9-7q2kj\" (UID: \"a76ff35f-36d6-48df-94ed-337199547cd5\") " pod="openstack/swift-proxy-5964f98dd9-7q2kj" Dec 02 23:03:22 crc kubenswrapper[4696]: I1202 23:03:22.843264 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a76ff35f-36d6-48df-94ed-337199547cd5-etc-swift\") pod \"swift-proxy-5964f98dd9-7q2kj\" (UID: \"a76ff35f-36d6-48df-94ed-337199547cd5\") " pod="openstack/swift-proxy-5964f98dd9-7q2kj" Dec 02 23:03:22 crc kubenswrapper[4696]: I1202 23:03:22.843282 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a76ff35f-36d6-48df-94ed-337199547cd5-run-httpd\") pod \"swift-proxy-5964f98dd9-7q2kj\" (UID: \"a76ff35f-36d6-48df-94ed-337199547cd5\") " pod="openstack/swift-proxy-5964f98dd9-7q2kj" 
Dec 02 23:03:22 crc kubenswrapper[4696]: I1202 23:03:22.843406 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rj6w6\" (UniqueName: \"kubernetes.io/projected/a76ff35f-36d6-48df-94ed-337199547cd5-kube-api-access-rj6w6\") pod \"swift-proxy-5964f98dd9-7q2kj\" (UID: \"a76ff35f-36d6-48df-94ed-337199547cd5\") " pod="openstack/swift-proxy-5964f98dd9-7q2kj" Dec 02 23:03:22 crc kubenswrapper[4696]: I1202 23:03:22.843512 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8w9h6\" (UniqueName: \"kubernetes.io/projected/a4756827-862e-446f-a149-b3a541a656b5-kube-api-access-8w9h6\") pod \"nova-cell0-eff7-account-create-update-7ks4n\" (UID: \"a4756827-862e-446f-a149-b3a541a656b5\") " pod="openstack/nova-cell0-eff7-account-create-update-7ks4n" Dec 02 23:03:22 crc kubenswrapper[4696]: I1202 23:03:22.847456 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a4756827-862e-446f-a149-b3a541a656b5-operator-scripts\") pod \"nova-cell0-eff7-account-create-update-7ks4n\" (UID: \"a4756827-862e-446f-a149-b3a541a656b5\") " pod="openstack/nova-cell0-eff7-account-create-update-7ks4n" Dec 02 23:03:22 crc kubenswrapper[4696]: I1202 23:03:22.885579 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8w9h6\" (UniqueName: \"kubernetes.io/projected/a4756827-862e-446f-a149-b3a541a656b5-kube-api-access-8w9h6\") pod \"nova-cell0-eff7-account-create-update-7ks4n\" (UID: \"a4756827-862e-446f-a149-b3a541a656b5\") " pod="openstack/nova-cell0-eff7-account-create-update-7ks4n" Dec 02 23:03:22 crc kubenswrapper[4696]: I1202 23:03:22.938872 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-eff7-account-create-update-7ks4n" Dec 02 23:03:22 crc kubenswrapper[4696]: I1202 23:03:22.952658 4696 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7f8795dd98-pn9n4" podUID="d897a4ce-a62c-4bde-889d-c7c82cab0569" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.179:9311/healthcheck\": read tcp 10.217.0.2:59494->10.217.0.179:9311: read: connection reset by peer" Dec 02 23:03:22 crc kubenswrapper[4696]: I1202 23:03:22.953059 4696 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7f8795dd98-pn9n4" podUID="d897a4ce-a62c-4bde-889d-c7c82cab0569" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.179:9311/healthcheck\": read tcp 10.217.0.2:59492->10.217.0.179:9311: read: connection reset by peer" Dec 02 23:03:22 crc kubenswrapper[4696]: I1202 23:03:22.955196 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rj6w6\" (UniqueName: \"kubernetes.io/projected/a76ff35f-36d6-48df-94ed-337199547cd5-kube-api-access-rj6w6\") pod \"swift-proxy-5964f98dd9-7q2kj\" (UID: \"a76ff35f-36d6-48df-94ed-337199547cd5\") " pod="openstack/swift-proxy-5964f98dd9-7q2kj" Dec 02 23:03:22 crc kubenswrapper[4696]: I1202 23:03:22.955279 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a76ff35f-36d6-48df-94ed-337199547cd5-internal-tls-certs\") pod \"swift-proxy-5964f98dd9-7q2kj\" (UID: \"a76ff35f-36d6-48df-94ed-337199547cd5\") " pod="openstack/swift-proxy-5964f98dd9-7q2kj" Dec 02 23:03:22 crc kubenswrapper[4696]: I1202 23:03:22.955311 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a76ff35f-36d6-48df-94ed-337199547cd5-config-data\") pod \"swift-proxy-5964f98dd9-7q2kj\" (UID: 
\"a76ff35f-36d6-48df-94ed-337199547cd5\") " pod="openstack/swift-proxy-5964f98dd9-7q2kj" Dec 02 23:03:22 crc kubenswrapper[4696]: I1202 23:03:22.955334 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a76ff35f-36d6-48df-94ed-337199547cd5-combined-ca-bundle\") pod \"swift-proxy-5964f98dd9-7q2kj\" (UID: \"a76ff35f-36d6-48df-94ed-337199547cd5\") " pod="openstack/swift-proxy-5964f98dd9-7q2kj" Dec 02 23:03:22 crc kubenswrapper[4696]: I1202 23:03:22.955356 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a76ff35f-36d6-48df-94ed-337199547cd5-log-httpd\") pod \"swift-proxy-5964f98dd9-7q2kj\" (UID: \"a76ff35f-36d6-48df-94ed-337199547cd5\") " pod="openstack/swift-proxy-5964f98dd9-7q2kj" Dec 02 23:03:22 crc kubenswrapper[4696]: I1202 23:03:22.955384 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a76ff35f-36d6-48df-94ed-337199547cd5-public-tls-certs\") pod \"swift-proxy-5964f98dd9-7q2kj\" (UID: \"a76ff35f-36d6-48df-94ed-337199547cd5\") " pod="openstack/swift-proxy-5964f98dd9-7q2kj" Dec 02 23:03:22 crc kubenswrapper[4696]: I1202 23:03:22.955417 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a76ff35f-36d6-48df-94ed-337199547cd5-etc-swift\") pod \"swift-proxy-5964f98dd9-7q2kj\" (UID: \"a76ff35f-36d6-48df-94ed-337199547cd5\") " pod="openstack/swift-proxy-5964f98dd9-7q2kj" Dec 02 23:03:22 crc kubenswrapper[4696]: I1202 23:03:22.955437 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a76ff35f-36d6-48df-94ed-337199547cd5-run-httpd\") pod \"swift-proxy-5964f98dd9-7q2kj\" (UID: \"a76ff35f-36d6-48df-94ed-337199547cd5\") " 
pod="openstack/swift-proxy-5964f98dd9-7q2kj" Dec 02 23:03:22 crc kubenswrapper[4696]: I1202 23:03:22.955953 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a76ff35f-36d6-48df-94ed-337199547cd5-run-httpd\") pod \"swift-proxy-5964f98dd9-7q2kj\" (UID: \"a76ff35f-36d6-48df-94ed-337199547cd5\") " pod="openstack/swift-proxy-5964f98dd9-7q2kj" Dec 02 23:03:22 crc kubenswrapper[4696]: I1202 23:03:22.956187 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a76ff35f-36d6-48df-94ed-337199547cd5-log-httpd\") pod \"swift-proxy-5964f98dd9-7q2kj\" (UID: \"a76ff35f-36d6-48df-94ed-337199547cd5\") " pod="openstack/swift-proxy-5964f98dd9-7q2kj" Dec 02 23:03:22 crc kubenswrapper[4696]: I1202 23:03:22.969058 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-111f-account-create-update-j4mlh"] Dec 02 23:03:22 crc kubenswrapper[4696]: I1202 23:03:22.974183 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a76ff35f-36d6-48df-94ed-337199547cd5-combined-ca-bundle\") pod \"swift-proxy-5964f98dd9-7q2kj\" (UID: \"a76ff35f-36d6-48df-94ed-337199547cd5\") " pod="openstack/swift-proxy-5964f98dd9-7q2kj" Dec 02 23:03:22 crc kubenswrapper[4696]: I1202 23:03:22.976179 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a76ff35f-36d6-48df-94ed-337199547cd5-internal-tls-certs\") pod \"swift-proxy-5964f98dd9-7q2kj\" (UID: \"a76ff35f-36d6-48df-94ed-337199547cd5\") " pod="openstack/swift-proxy-5964f98dd9-7q2kj" Dec 02 23:03:22 crc kubenswrapper[4696]: I1202 23:03:22.978058 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-111f-account-create-update-j4mlh" Dec 02 23:03:22 crc kubenswrapper[4696]: I1202 23:03:22.982445 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a76ff35f-36d6-48df-94ed-337199547cd5-public-tls-certs\") pod \"swift-proxy-5964f98dd9-7q2kj\" (UID: \"a76ff35f-36d6-48df-94ed-337199547cd5\") " pod="openstack/swift-proxy-5964f98dd9-7q2kj" Dec 02 23:03:22 crc kubenswrapper[4696]: I1202 23:03:22.984373 4696 patch_prober.go:28] interesting pod/machine-config-daemon-chq65 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 23:03:22 crc kubenswrapper[4696]: I1202 23:03:22.984445 4696 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 23:03:22 crc kubenswrapper[4696]: I1202 23:03:22.984499 4696 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-chq65" Dec 02 23:03:22 crc kubenswrapper[4696]: I1202 23:03:22.985539 4696 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3ad056c5d440f52aaf3e529aaaa0adb5466b2661f6219a6364c0d70692a5e85b"} pod="openshift-machine-config-operator/machine-config-daemon-chq65" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 23:03:22 crc kubenswrapper[4696]: I1202 23:03:22.985602 4696 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" containerName="machine-config-daemon" containerID="cri-o://3ad056c5d440f52aaf3e529aaaa0adb5466b2661f6219a6364c0d70692a5e85b" gracePeriod=600 Dec 02 23:03:22 crc kubenswrapper[4696]: I1202 23:03:22.987423 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a76ff35f-36d6-48df-94ed-337199547cd5-etc-swift\") pod \"swift-proxy-5964f98dd9-7q2kj\" (UID: \"a76ff35f-36d6-48df-94ed-337199547cd5\") " pod="openstack/swift-proxy-5964f98dd9-7q2kj" Dec 02 23:03:22 crc kubenswrapper[4696]: I1202 23:03:22.987617 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Dec 02 23:03:22 crc kubenswrapper[4696]: I1202 23:03:22.988838 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a76ff35f-36d6-48df-94ed-337199547cd5-config-data\") pod \"swift-proxy-5964f98dd9-7q2kj\" (UID: \"a76ff35f-36d6-48df-94ed-337199547cd5\") " pod="openstack/swift-proxy-5964f98dd9-7q2kj" Dec 02 23:03:22 crc kubenswrapper[4696]: I1202 23:03:22.998377 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rj6w6\" (UniqueName: \"kubernetes.io/projected/a76ff35f-36d6-48df-94ed-337199547cd5-kube-api-access-rj6w6\") pod \"swift-proxy-5964f98dd9-7q2kj\" (UID: \"a76ff35f-36d6-48df-94ed-337199547cd5\") " pod="openstack/swift-proxy-5964f98dd9-7q2kj" Dec 02 23:03:23 crc kubenswrapper[4696]: I1202 23:03:23.000808 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-111f-account-create-update-j4mlh"] Dec 02 23:03:23 crc kubenswrapper[4696]: I1202 23:03:23.062445 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/1172a4a0-f1c5-49f2-b91a-e691b431c471-operator-scripts\") pod \"nova-cell1-111f-account-create-update-j4mlh\" (UID: \"1172a4a0-f1c5-49f2-b91a-e691b431c471\") " pod="openstack/nova-cell1-111f-account-create-update-j4mlh" Dec 02 23:03:23 crc kubenswrapper[4696]: I1202 23:03:23.063135 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hms6\" (UniqueName: \"kubernetes.io/projected/1172a4a0-f1c5-49f2-b91a-e691b431c471-kube-api-access-9hms6\") pod \"nova-cell1-111f-account-create-update-j4mlh\" (UID: \"1172a4a0-f1c5-49f2-b91a-e691b431c471\") " pod="openstack/nova-cell1-111f-account-create-update-j4mlh" Dec 02 23:03:23 crc kubenswrapper[4696]: I1202 23:03:23.159363 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-5964f98dd9-7q2kj" Dec 02 23:03:23 crc kubenswrapper[4696]: I1202 23:03:23.167358 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1172a4a0-f1c5-49f2-b91a-e691b431c471-operator-scripts\") pod \"nova-cell1-111f-account-create-update-j4mlh\" (UID: \"1172a4a0-f1c5-49f2-b91a-e691b431c471\") " pod="openstack/nova-cell1-111f-account-create-update-j4mlh" Dec 02 23:03:23 crc kubenswrapper[4696]: I1202 23:03:23.167437 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hms6\" (UniqueName: \"kubernetes.io/projected/1172a4a0-f1c5-49f2-b91a-e691b431c471-kube-api-access-9hms6\") pod \"nova-cell1-111f-account-create-update-j4mlh\" (UID: \"1172a4a0-f1c5-49f2-b91a-e691b431c471\") " pod="openstack/nova-cell1-111f-account-create-update-j4mlh" Dec 02 23:03:23 crc kubenswrapper[4696]: I1202 23:03:23.168584 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1172a4a0-f1c5-49f2-b91a-e691b431c471-operator-scripts\") pod 
\"nova-cell1-111f-account-create-update-j4mlh\" (UID: \"1172a4a0-f1c5-49f2-b91a-e691b431c471\") " pod="openstack/nova-cell1-111f-account-create-update-j4mlh" Dec 02 23:03:23 crc kubenswrapper[4696]: I1202 23:03:23.195507 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hms6\" (UniqueName: \"kubernetes.io/projected/1172a4a0-f1c5-49f2-b91a-e691b431c471-kube-api-access-9hms6\") pod \"nova-cell1-111f-account-create-update-j4mlh\" (UID: \"1172a4a0-f1c5-49f2-b91a-e691b431c471\") " pod="openstack/nova-cell1-111f-account-create-update-j4mlh" Dec 02 23:03:23 crc kubenswrapper[4696]: I1202 23:03:23.346428 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-111f-account-create-update-j4mlh" Dec 02 23:03:23 crc kubenswrapper[4696]: I1202 23:03:23.366068 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-67dbcf9bdf-2hr7m" Dec 02 23:03:23 crc kubenswrapper[4696]: I1202 23:03:23.486639 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-tc9xp"] Dec 02 23:03:23 crc kubenswrapper[4696]: I1202 23:03:23.491755 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-97757dbdd-59bbj"] Dec 02 23:03:23 crc kubenswrapper[4696]: I1202 23:03:23.492342 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-97757dbdd-59bbj" podUID="b08f301e-8c9b-4d88-9a26-431a6c15a6ca" containerName="neutron-api" containerID="cri-o://60ac838900bf957583ef488ecefb176051ebf222807e7c4ff7aa7356a437b6fb" gracePeriod=30 Dec 02 23:03:23 crc kubenswrapper[4696]: I1202 23:03:23.492990 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-97757dbdd-59bbj" podUID="b08f301e-8c9b-4d88-9a26-431a6c15a6ca" containerName="neutron-httpd" containerID="cri-o://d6d5cbdfb9eb49771c7911e8f5aea5c7f87ec56679cfdb4bad9be2184b1f616d" gracePeriod=30 Dec 02 
23:03:23 crc kubenswrapper[4696]: I1202 23:03:23.493887 4696 generic.go:334] "Generic (PLEG): container finished" podID="53353260-c7c9-435c-91eb-3d5a1b441c4a" containerID="3ad056c5d440f52aaf3e529aaaa0adb5466b2661f6219a6364c0d70692a5e85b" exitCode=0 Dec 02 23:03:23 crc kubenswrapper[4696]: I1202 23:03:23.493977 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-chq65" event={"ID":"53353260-c7c9-435c-91eb-3d5a1b441c4a","Type":"ContainerDied","Data":"3ad056c5d440f52aaf3e529aaaa0adb5466b2661f6219a6364c0d70692a5e85b"} Dec 02 23:03:23 crc kubenswrapper[4696]: I1202 23:03:23.494020 4696 scope.go:117] "RemoveContainer" containerID="d851463a087e8da5113eee7095bcc5e11085a475884d43c64676423d484437b6" Dec 02 23:03:23 crc kubenswrapper[4696]: I1202 23:03:23.551618 4696 generic.go:334] "Generic (PLEG): container finished" podID="d897a4ce-a62c-4bde-889d-c7c82cab0569" containerID="76fe1411adece83854009d383d32c2892877828d13f137e8d83fbf17082809e4" exitCode=0 Dec 02 23:03:23 crc kubenswrapper[4696]: I1202 23:03:23.555479 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7f8795dd98-pn9n4" event={"ID":"d897a4ce-a62c-4bde-889d-c7c82cab0569","Type":"ContainerDied","Data":"76fe1411adece83854009d383d32c2892877828d13f137e8d83fbf17082809e4"} Dec 02 23:03:23 crc kubenswrapper[4696]: I1202 23:03:23.559569 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-4f32-account-create-update-fg4wn"] Dec 02 23:03:24 crc kubenswrapper[4696]: I1202 23:03:24.019951 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-svgf5"] Dec 02 23:03:24 crc kubenswrapper[4696]: I1202 23:03:24.060350 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7f8795dd98-pn9n4" Dec 02 23:03:24 crc kubenswrapper[4696]: I1202 23:03:24.107443 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d897a4ce-a62c-4bde-889d-c7c82cab0569-combined-ca-bundle\") pod \"d897a4ce-a62c-4bde-889d-c7c82cab0569\" (UID: \"d897a4ce-a62c-4bde-889d-c7c82cab0569\") " Dec 02 23:03:24 crc kubenswrapper[4696]: I1202 23:03:24.107849 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d897a4ce-a62c-4bde-889d-c7c82cab0569-config-data-custom\") pod \"d897a4ce-a62c-4bde-889d-c7c82cab0569\" (UID: \"d897a4ce-a62c-4bde-889d-c7c82cab0569\") " Dec 02 23:03:24 crc kubenswrapper[4696]: I1202 23:03:24.108029 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d897a4ce-a62c-4bde-889d-c7c82cab0569-logs\") pod \"d897a4ce-a62c-4bde-889d-c7c82cab0569\" (UID: \"d897a4ce-a62c-4bde-889d-c7c82cab0569\") " Dec 02 23:03:24 crc kubenswrapper[4696]: I1202 23:03:24.114390 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d897a4ce-a62c-4bde-889d-c7c82cab0569-logs" (OuterVolumeSpecName: "logs") pod "d897a4ce-a62c-4bde-889d-c7c82cab0569" (UID: "d897a4ce-a62c-4bde-889d-c7c82cab0569"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:03:24 crc kubenswrapper[4696]: I1202 23:03:24.118915 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d897a4ce-a62c-4bde-889d-c7c82cab0569-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "d897a4ce-a62c-4bde-889d-c7c82cab0569" (UID: "d897a4ce-a62c-4bde-889d-c7c82cab0569"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:03:24 crc kubenswrapper[4696]: I1202 23:03:24.158898 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d897a4ce-a62c-4bde-889d-c7c82cab0569-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d897a4ce-a62c-4bde-889d-c7c82cab0569" (UID: "d897a4ce-a62c-4bde-889d-c7c82cab0569"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:03:24 crc kubenswrapper[4696]: I1202 23:03:24.214610 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fpngr\" (UniqueName: \"kubernetes.io/projected/d897a4ce-a62c-4bde-889d-c7c82cab0569-kube-api-access-fpngr\") pod \"d897a4ce-a62c-4bde-889d-c7c82cab0569\" (UID: \"d897a4ce-a62c-4bde-889d-c7c82cab0569\") " Dec 02 23:03:24 crc kubenswrapper[4696]: I1202 23:03:24.215085 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d897a4ce-a62c-4bde-889d-c7c82cab0569-config-data\") pod \"d897a4ce-a62c-4bde-889d-c7c82cab0569\" (UID: \"d897a4ce-a62c-4bde-889d-c7c82cab0569\") " Dec 02 23:03:24 crc kubenswrapper[4696]: I1202 23:03:24.216073 4696 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d897a4ce-a62c-4bde-889d-c7c82cab0569-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:24 crc kubenswrapper[4696]: I1202 23:03:24.216092 4696 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d897a4ce-a62c-4bde-889d-c7c82cab0569-logs\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:24 crc kubenswrapper[4696]: I1202 23:03:24.216106 4696 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d897a4ce-a62c-4bde-889d-c7c82cab0569-combined-ca-bundle\") on node \"crc\" 
DevicePath \"\"" Dec 02 23:03:24 crc kubenswrapper[4696]: I1202 23:03:24.248834 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d897a4ce-a62c-4bde-889d-c7c82cab0569-kube-api-access-fpngr" (OuterVolumeSpecName: "kube-api-access-fpngr") pod "d897a4ce-a62c-4bde-889d-c7c82cab0569" (UID: "d897a4ce-a62c-4bde-889d-c7c82cab0569"). InnerVolumeSpecName "kube-api-access-fpngr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:03:24 crc kubenswrapper[4696]: I1202 23:03:24.318243 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fpngr\" (UniqueName: \"kubernetes.io/projected/d897a4ce-a62c-4bde-889d-c7c82cab0569-kube-api-access-fpngr\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:24 crc kubenswrapper[4696]: I1202 23:03:24.318291 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-eff7-account-create-update-7ks4n"] Dec 02 23:03:24 crc kubenswrapper[4696]: W1202 23:03:24.358329 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda4756827_862e_446f_a149_b3a541a656b5.slice/crio-9bb9e953b0d28d012f277a524e07cb809ee2a57c890d2d5ef74e3088ec01e10e WatchSource:0}: Error finding container 9bb9e953b0d28d012f277a524e07cb809ee2a57c890d2d5ef74e3088ec01e10e: Status 404 returned error can't find the container with id 9bb9e953b0d28d012f277a524e07cb809ee2a57c890d2d5ef74e3088ec01e10e Dec 02 23:03:24 crc kubenswrapper[4696]: I1202 23:03:24.370159 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-tkf2x"] Dec 02 23:03:24 crc kubenswrapper[4696]: I1202 23:03:24.376567 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d897a4ce-a62c-4bde-889d-c7c82cab0569-config-data" (OuterVolumeSpecName: "config-data") pod "d897a4ce-a62c-4bde-889d-c7c82cab0569" (UID: "d897a4ce-a62c-4bde-889d-c7c82cab0569"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:03:24 crc kubenswrapper[4696]: I1202 23:03:24.420701 4696 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d897a4ce-a62c-4bde-889d-c7c82cab0569-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:24 crc kubenswrapper[4696]: I1202 23:03:24.455547 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-5964f98dd9-7q2kj"] Dec 02 23:03:24 crc kubenswrapper[4696]: I1202 23:03:24.464558 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-111f-account-create-update-j4mlh"] Dec 02 23:03:24 crc kubenswrapper[4696]: I1202 23:03:24.591512 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5964f98dd9-7q2kj" event={"ID":"a76ff35f-36d6-48df-94ed-337199547cd5","Type":"ContainerStarted","Data":"1dcd05444939376cadd5c9ac60f420df1291bf33019f86dd2834f0893cc95834"} Dec 02 23:03:24 crc kubenswrapper[4696]: I1202 23:03:24.602045 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-tc9xp" event={"ID":"5d3c7135-a850-4cd0-b5b6-2561b75cd09b","Type":"ContainerStarted","Data":"2466eab4b8fe1cd3b46530dc9514800eb7a59e245135327216bd1f748df79e1d"} Dec 02 23:03:24 crc kubenswrapper[4696]: I1202 23:03:24.602096 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-tc9xp" event={"ID":"5d3c7135-a850-4cd0-b5b6-2561b75cd09b","Type":"ContainerStarted","Data":"f3e5d778b3f6ad2265481d982eb9e4c5ea04d054bdb645a9c9452c0d71e83717"} Dec 02 23:03:24 crc kubenswrapper[4696]: I1202 23:03:24.622678 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-4f32-account-create-update-fg4wn" event={"ID":"ec5ac1a2-65d0-4222-8e6e-cce2d0fe6d5a","Type":"ContainerStarted","Data":"92ccdcd4cebba044da3501499bb42eb368c1a67fbdbca8e88c12c3bdc628b413"} Dec 02 23:03:24 crc kubenswrapper[4696]: 
I1202 23:03:24.622727 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-4f32-account-create-update-fg4wn" event={"ID":"ec5ac1a2-65d0-4222-8e6e-cce2d0fe6d5a","Type":"ContainerStarted","Data":"dfd1b6ee52239d5bfea1c190a326c8164b23d40799629f95aa87d2b142995da7"} Dec 02 23:03:24 crc kubenswrapper[4696]: I1202 23:03:24.645606 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-db-create-tc9xp" podStartSLOduration=2.6455834769999997 podStartE2EDuration="2.645583477s" podCreationTimestamp="2025-12-02 23:03:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:03:24.622456541 +0000 UTC m=+1267.503136542" watchObservedRunningTime="2025-12-02 23:03:24.645583477 +0000 UTC m=+1267.526263478" Dec 02 23:03:24 crc kubenswrapper[4696]: I1202 23:03:24.656386 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-4f32-account-create-update-fg4wn" podStartSLOduration=2.656347143 podStartE2EDuration="2.656347143s" podCreationTimestamp="2025-12-02 23:03:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:03:24.650220879 +0000 UTC m=+1267.530900880" watchObservedRunningTime="2025-12-02 23:03:24.656347143 +0000 UTC m=+1267.537027144" Dec 02 23:03:24 crc kubenswrapper[4696]: I1202 23:03:24.668753 4696 generic.go:334] "Generic (PLEG): container finished" podID="b08f301e-8c9b-4d88-9a26-431a6c15a6ca" containerID="d6d5cbdfb9eb49771c7911e8f5aea5c7f87ec56679cfdb4bad9be2184b1f616d" exitCode=0 Dec 02 23:03:24 crc kubenswrapper[4696]: I1202 23:03:24.668851 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-97757dbdd-59bbj" 
event={"ID":"b08f301e-8c9b-4d88-9a26-431a6c15a6ca","Type":"ContainerDied","Data":"d6d5cbdfb9eb49771c7911e8f5aea5c7f87ec56679cfdb4bad9be2184b1f616d"} Dec 02 23:03:24 crc kubenswrapper[4696]: I1202 23:03:24.689240 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 02 23:03:24 crc kubenswrapper[4696]: I1202 23:03:24.690378 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7f8795dd98-pn9n4" event={"ID":"d897a4ce-a62c-4bde-889d-c7c82cab0569","Type":"ContainerDied","Data":"ffe626ddd5ba4bd0146debc6551375d48a40363b07620e239e512ee12c072a56"} Dec 02 23:03:24 crc kubenswrapper[4696]: I1202 23:03:24.690465 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7f8795dd98-pn9n4" Dec 02 23:03:24 crc kubenswrapper[4696]: I1202 23:03:24.690523 4696 scope.go:117] "RemoveContainer" containerID="76fe1411adece83854009d383d32c2892877828d13f137e8d83fbf17082809e4" Dec 02 23:03:24 crc kubenswrapper[4696]: I1202 23:03:24.704083 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-svgf5" event={"ID":"b9d71614-bd9c-4b09-b813-d9d6f01fdc92","Type":"ContainerStarted","Data":"0f20a6b2e9ef1d274da099a3b52a5fee338af3dcd01d0138a163ada2bde5cfe0"} Dec 02 23:03:24 crc kubenswrapper[4696]: I1202 23:03:24.704164 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-svgf5" event={"ID":"b9d71614-bd9c-4b09-b813-d9d6f01fdc92","Type":"ContainerStarted","Data":"7806710b9331138df92161a336e07eb73c9f4d066b625a36204fbdf98c98a6a3"} Dec 02 23:03:24 crc kubenswrapper[4696]: I1202 23:03:24.719340 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-eff7-account-create-update-7ks4n" event={"ID":"a4756827-862e-446f-a149-b3a541a656b5","Type":"ContainerStarted","Data":"9bb9e953b0d28d012f277a524e07cb809ee2a57c890d2d5ef74e3088ec01e10e"} Dec 02 23:03:24 crc 
kubenswrapper[4696]: I1202 23:03:24.731282 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-svgf5" podStartSLOduration=2.731250629 podStartE2EDuration="2.731250629s" podCreationTimestamp="2025-12-02 23:03:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:03:24.721486092 +0000 UTC m=+1267.602166093" watchObservedRunningTime="2025-12-02 23:03:24.731250629 +0000 UTC m=+1267.611930640" Dec 02 23:03:24 crc kubenswrapper[4696]: I1202 23:03:24.734070 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-111f-account-create-update-j4mlh" event={"ID":"1172a4a0-f1c5-49f2-b91a-e691b431c471","Type":"ContainerStarted","Data":"23891f31fcdf9285d9c19e415a791c9c41cadf93595ae83bfd15f73bb41038d5"} Dec 02 23:03:24 crc kubenswrapper[4696]: I1202 23:03:24.756227 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-tkf2x" event={"ID":"9f4f26b7-e810-4c1e-98b1-57fc3b417e60","Type":"ContainerStarted","Data":"77b50eae0a351194c4cbdacd72cfab6c11208c35fbe95de06af294c82bed46f4"} Dec 02 23:03:24 crc kubenswrapper[4696]: I1202 23:03:24.763098 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-chq65" event={"ID":"53353260-c7c9-435c-91eb-3d5a1b441c4a","Type":"ContainerStarted","Data":"d569701d45b5d99a649219a29d07b9038d47beb7daf9fa209eda0483aa45abb9"} Dec 02 23:03:24 crc kubenswrapper[4696]: I1202 23:03:24.838100 4696 scope.go:117] "RemoveContainer" containerID="5808956f2468d868426eebb3d36fb11961e1d0742b940753b107c5cd877fb67c" Dec 02 23:03:24 crc kubenswrapper[4696]: I1202 23:03:24.985444 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7f8795dd98-pn9n4"] Dec 02 23:03:24 crc kubenswrapper[4696]: I1202 23:03:24.998439 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/barbican-api-7f8795dd98-pn9n4"] Dec 02 23:03:25 crc kubenswrapper[4696]: I1202 23:03:25.448091 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d897a4ce-a62c-4bde-889d-c7c82cab0569" path="/var/lib/kubelet/pods/d897a4ce-a62c-4bde-889d-c7c82cab0569/volumes" Dec 02 23:03:25 crc kubenswrapper[4696]: I1202 23:03:25.717806 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 23:03:25 crc kubenswrapper[4696]: I1202 23:03:25.723836 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="92dde787-a077-43f6-a220-7862b3b296b1" containerName="ceilometer-central-agent" containerID="cri-o://eedf30edd100f822a25c5db9c6414d00352ced217607db5662ea488255d326d8" gracePeriod=30 Dec 02 23:03:25 crc kubenswrapper[4696]: I1202 23:03:25.724123 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="92dde787-a077-43f6-a220-7862b3b296b1" containerName="proxy-httpd" containerID="cri-o://804daa95a3aad3872c09836efc0fb67463a6bfdbf3e36ed78514da9e876e450e" gracePeriod=30 Dec 02 23:03:25 crc kubenswrapper[4696]: I1202 23:03:25.724245 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="92dde787-a077-43f6-a220-7862b3b296b1" containerName="sg-core" containerID="cri-o://c16413dcbb0a7708424b716c40c5fdd25b814a6800d0d3ca9c029a003b17da46" gracePeriod=30 Dec 02 23:03:25 crc kubenswrapper[4696]: I1202 23:03:25.724384 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="92dde787-a077-43f6-a220-7862b3b296b1" containerName="ceilometer-notification-agent" containerID="cri-o://57355eed1547b03775dcae1050e96bfe01be9a37a75ae3f193337bc37344c75b" gracePeriod=30 Dec 02 23:03:25 crc kubenswrapper[4696]: I1202 23:03:25.787897 4696 generic.go:334] "Generic (PLEG): container finished" 
podID="5d3c7135-a850-4cd0-b5b6-2561b75cd09b" containerID="2466eab4b8fe1cd3b46530dc9514800eb7a59e245135327216bd1f748df79e1d" exitCode=0 Dec 02 23:03:25 crc kubenswrapper[4696]: I1202 23:03:25.787973 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-tc9xp" event={"ID":"5d3c7135-a850-4cd0-b5b6-2561b75cd09b","Type":"ContainerDied","Data":"2466eab4b8fe1cd3b46530dc9514800eb7a59e245135327216bd1f748df79e1d"} Dec 02 23:03:25 crc kubenswrapper[4696]: I1202 23:03:25.791873 4696 generic.go:334] "Generic (PLEG): container finished" podID="b9d71614-bd9c-4b09-b813-d9d6f01fdc92" containerID="0f20a6b2e9ef1d274da099a3b52a5fee338af3dcd01d0138a163ada2bde5cfe0" exitCode=0 Dec 02 23:03:25 crc kubenswrapper[4696]: I1202 23:03:25.791920 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-svgf5" event={"ID":"b9d71614-bd9c-4b09-b813-d9d6f01fdc92","Type":"ContainerDied","Data":"0f20a6b2e9ef1d274da099a3b52a5fee338af3dcd01d0138a163ada2bde5cfe0"} Dec 02 23:03:25 crc kubenswrapper[4696]: I1202 23:03:25.796002 4696 generic.go:334] "Generic (PLEG): container finished" podID="ec5ac1a2-65d0-4222-8e6e-cce2d0fe6d5a" containerID="92ccdcd4cebba044da3501499bb42eb368c1a67fbdbca8e88c12c3bdc628b413" exitCode=0 Dec 02 23:03:25 crc kubenswrapper[4696]: I1202 23:03:25.796109 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-4f32-account-create-update-fg4wn" event={"ID":"ec5ac1a2-65d0-4222-8e6e-cce2d0fe6d5a","Type":"ContainerDied","Data":"92ccdcd4cebba044da3501499bb42eb368c1a67fbdbca8e88c12c3bdc628b413"} Dec 02 23:03:25 crc kubenswrapper[4696]: I1202 23:03:25.801249 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-eff7-account-create-update-7ks4n" event={"ID":"a4756827-862e-446f-a149-b3a541a656b5","Type":"ContainerStarted","Data":"8a7494df1d3e76227818c7a36e371393352e7e15f56ad38eb20d721912e42181"} Dec 02 23:03:25 crc kubenswrapper[4696]: I1202 23:03:25.808485 4696 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-111f-account-create-update-j4mlh" event={"ID":"1172a4a0-f1c5-49f2-b91a-e691b431c471","Type":"ContainerStarted","Data":"4ca47c290ac0c7eaea73a12196ebd3a5d3997faa9cdc63ad37d6e2326b0492a7"} Dec 02 23:03:25 crc kubenswrapper[4696]: I1202 23:03:25.813454 4696 generic.go:334] "Generic (PLEG): container finished" podID="9f4f26b7-e810-4c1e-98b1-57fc3b417e60" containerID="17296d467a9aa446ef6f6eb8a02686b348027b45542b32c459fae79438f20459" exitCode=0 Dec 02 23:03:25 crc kubenswrapper[4696]: I1202 23:03:25.813519 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-tkf2x" event={"ID":"9f4f26b7-e810-4c1e-98b1-57fc3b417e60","Type":"ContainerDied","Data":"17296d467a9aa446ef6f6eb8a02686b348027b45542b32c459fae79438f20459"} Dec 02 23:03:25 crc kubenswrapper[4696]: I1202 23:03:25.823350 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5964f98dd9-7q2kj" event={"ID":"a76ff35f-36d6-48df-94ed-337199547cd5","Type":"ContainerStarted","Data":"947ad0b835511e62883ad7aa36fe920e3f1dffaccc2c02f16f96f25532a9e483"} Dec 02 23:03:25 crc kubenswrapper[4696]: I1202 23:03:25.823441 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5964f98dd9-7q2kj" event={"ID":"a76ff35f-36d6-48df-94ed-337199547cd5","Type":"ContainerStarted","Data":"cb506f907f285b4b2d3d541e24c55dbfe649faf632f7b1569c5d5528a40005c8"} Dec 02 23:03:25 crc kubenswrapper[4696]: I1202 23:03:25.823490 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-eff7-account-create-update-7ks4n" podStartSLOduration=3.823469309 podStartE2EDuration="3.823469309s" podCreationTimestamp="2025-12-02 23:03:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:03:25.815714809 +0000 UTC m=+1268.696394810" watchObservedRunningTime="2025-12-02 
23:03:25.823469309 +0000 UTC m=+1268.704149310" Dec 02 23:03:25 crc kubenswrapper[4696]: I1202 23:03:25.924090 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-5964f98dd9-7q2kj" podStartSLOduration=3.924066045 podStartE2EDuration="3.924066045s" podCreationTimestamp="2025-12-02 23:03:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:03:25.871582835 +0000 UTC m=+1268.752262836" watchObservedRunningTime="2025-12-02 23:03:25.924066045 +0000 UTC m=+1268.804746046" Dec 02 23:03:25 crc kubenswrapper[4696]: I1202 23:03:25.928651 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-111f-account-create-update-j4mlh" podStartSLOduration=3.928643615 podStartE2EDuration="3.928643615s" podCreationTimestamp="2025-12-02 23:03:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:03:25.889938076 +0000 UTC m=+1268.770618067" watchObservedRunningTime="2025-12-02 23:03:25.928643615 +0000 UTC m=+1268.809323616" Dec 02 23:03:26 crc kubenswrapper[4696]: I1202 23:03:26.870574 4696 generic.go:334] "Generic (PLEG): container finished" podID="92dde787-a077-43f6-a220-7862b3b296b1" containerID="804daa95a3aad3872c09836efc0fb67463a6bfdbf3e36ed78514da9e876e450e" exitCode=0 Dec 02 23:03:26 crc kubenswrapper[4696]: I1202 23:03:26.871349 4696 generic.go:334] "Generic (PLEG): container finished" podID="92dde787-a077-43f6-a220-7862b3b296b1" containerID="c16413dcbb0a7708424b716c40c5fdd25b814a6800d0d3ca9c029a003b17da46" exitCode=2 Dec 02 23:03:26 crc kubenswrapper[4696]: I1202 23:03:26.871360 4696 generic.go:334] "Generic (PLEG): container finished" podID="92dde787-a077-43f6-a220-7862b3b296b1" containerID="57355eed1547b03775dcae1050e96bfe01be9a37a75ae3f193337bc37344c75b" exitCode=0 Dec 02 23:03:26 crc 
kubenswrapper[4696]: I1202 23:03:26.871368 4696 generic.go:334] "Generic (PLEG): container finished" podID="92dde787-a077-43f6-a220-7862b3b296b1" containerID="eedf30edd100f822a25c5db9c6414d00352ced217607db5662ea488255d326d8" exitCode=0 Dec 02 23:03:26 crc kubenswrapper[4696]: I1202 23:03:26.871441 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"92dde787-a077-43f6-a220-7862b3b296b1","Type":"ContainerDied","Data":"804daa95a3aad3872c09836efc0fb67463a6bfdbf3e36ed78514da9e876e450e"} Dec 02 23:03:26 crc kubenswrapper[4696]: I1202 23:03:26.871477 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"92dde787-a077-43f6-a220-7862b3b296b1","Type":"ContainerDied","Data":"c16413dcbb0a7708424b716c40c5fdd25b814a6800d0d3ca9c029a003b17da46"} Dec 02 23:03:26 crc kubenswrapper[4696]: I1202 23:03:26.871488 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"92dde787-a077-43f6-a220-7862b3b296b1","Type":"ContainerDied","Data":"57355eed1547b03775dcae1050e96bfe01be9a37a75ae3f193337bc37344c75b"} Dec 02 23:03:26 crc kubenswrapper[4696]: I1202 23:03:26.871499 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"92dde787-a077-43f6-a220-7862b3b296b1","Type":"ContainerDied","Data":"eedf30edd100f822a25c5db9c6414d00352ced217607db5662ea488255d326d8"} Dec 02 23:03:26 crc kubenswrapper[4696]: I1202 23:03:26.873896 4696 generic.go:334] "Generic (PLEG): container finished" podID="b08f301e-8c9b-4d88-9a26-431a6c15a6ca" containerID="60ac838900bf957583ef488ecefb176051ebf222807e7c4ff7aa7356a437b6fb" exitCode=0 Dec 02 23:03:26 crc kubenswrapper[4696]: I1202 23:03:26.873942 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-97757dbdd-59bbj" event={"ID":"b08f301e-8c9b-4d88-9a26-431a6c15a6ca","Type":"ContainerDied","Data":"60ac838900bf957583ef488ecefb176051ebf222807e7c4ff7aa7356a437b6fb"} Dec 02 23:03:26 
crc kubenswrapper[4696]: I1202 23:03:26.880550 4696 generic.go:334] "Generic (PLEG): container finished" podID="a4756827-862e-446f-a149-b3a541a656b5" containerID="8a7494df1d3e76227818c7a36e371393352e7e15f56ad38eb20d721912e42181" exitCode=0 Dec 02 23:03:26 crc kubenswrapper[4696]: I1202 23:03:26.880651 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-eff7-account-create-update-7ks4n" event={"ID":"a4756827-862e-446f-a149-b3a541a656b5","Type":"ContainerDied","Data":"8a7494df1d3e76227818c7a36e371393352e7e15f56ad38eb20d721912e42181"} Dec 02 23:03:26 crc kubenswrapper[4696]: I1202 23:03:26.892560 4696 generic.go:334] "Generic (PLEG): container finished" podID="1172a4a0-f1c5-49f2-b91a-e691b431c471" containerID="4ca47c290ac0c7eaea73a12196ebd3a5d3997faa9cdc63ad37d6e2326b0492a7" exitCode=0 Dec 02 23:03:26 crc kubenswrapper[4696]: I1202 23:03:26.892690 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-111f-account-create-update-j4mlh" event={"ID":"1172a4a0-f1c5-49f2-b91a-e691b431c471","Type":"ContainerDied","Data":"4ca47c290ac0c7eaea73a12196ebd3a5d3997faa9cdc63ad37d6e2326b0492a7"} Dec 02 23:03:26 crc kubenswrapper[4696]: I1202 23:03:26.893445 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-5964f98dd9-7q2kj" Dec 02 23:03:26 crc kubenswrapper[4696]: I1202 23:03:26.893618 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-5964f98dd9-7q2kj" Dec 02 23:03:29 crc kubenswrapper[4696]: I1202 23:03:29.925398 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 02 23:03:32 crc kubenswrapper[4696]: I1202 23:03:32.258712 4696 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7b448778f6-q69jq" podUID="7b25bdab-8c46-43b8-be48-0e3df0f48c57" containerName="horizon" probeResult="failure" output="Get 
\"https://10.217.0.162:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.162:8443: connect: connection refused" Dec 02 23:03:32 crc kubenswrapper[4696]: I1202 23:03:32.259413 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7b448778f6-q69jq" Dec 02 23:03:32 crc kubenswrapper[4696]: I1202 23:03:32.964803 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-svgf5" Dec 02 23:03:32 crc kubenswrapper[4696]: I1202 23:03:32.974617 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 23:03:32 crc kubenswrapper[4696]: I1202 23:03:32.977453 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-97757dbdd-59bbj" Dec 02 23:03:32 crc kubenswrapper[4696]: I1202 23:03:32.985221 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-tkf2x" event={"ID":"9f4f26b7-e810-4c1e-98b1-57fc3b417e60","Type":"ContainerDied","Data":"77b50eae0a351194c4cbdacd72cfab6c11208c35fbe95de06af294c82bed46f4"} Dec 02 23:03:32 crc kubenswrapper[4696]: I1202 23:03:32.985271 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="77b50eae0a351194c4cbdacd72cfab6c11208c35fbe95de06af294c82bed46f4" Dec 02 23:03:32 crc kubenswrapper[4696]: I1202 23:03:32.988911 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-97757dbdd-59bbj" event={"ID":"b08f301e-8c9b-4d88-9a26-431a6c15a6ca","Type":"ContainerDied","Data":"5f5a15cb3050ce6bda3d8a9233fdaaabadff59759dd9cf7b5fda7740281e293c"} Dec 02 23:03:32 crc kubenswrapper[4696]: I1202 23:03:32.988976 4696 scope.go:117] "RemoveContainer" containerID="d6d5cbdfb9eb49771c7911e8f5aea5c7f87ec56679cfdb4bad9be2184b1f616d" Dec 02 23:03:32 crc kubenswrapper[4696]: I1202 23:03:32.989116 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-97757dbdd-59bbj" Dec 02 23:03:32 crc kubenswrapper[4696]: I1202 23:03:32.992440 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-4f32-account-create-update-fg4wn" Dec 02 23:03:32 crc kubenswrapper[4696]: I1202 23:03:32.993531 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-svgf5" event={"ID":"b9d71614-bd9c-4b09-b813-d9d6f01fdc92","Type":"ContainerDied","Data":"7806710b9331138df92161a336e07eb73c9f4d066b625a36204fbdf98c98a6a3"} Dec 02 23:03:32 crc kubenswrapper[4696]: I1202 23:03:32.993575 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7806710b9331138df92161a336e07eb73c9f4d066b625a36204fbdf98c98a6a3" Dec 02 23:03:32 crc kubenswrapper[4696]: I1202 23:03:32.993631 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-svgf5" Dec 02 23:03:32 crc kubenswrapper[4696]: I1202 23:03:32.997301 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-tc9xp" event={"ID":"5d3c7135-a850-4cd0-b5b6-2561b75cd09b","Type":"ContainerDied","Data":"f3e5d778b3f6ad2265481d982eb9e4c5ea04d054bdb645a9c9452c0d71e83717"} Dec 02 23:03:32 crc kubenswrapper[4696]: I1202 23:03:32.997331 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f3e5d778b3f6ad2265481d982eb9e4c5ea04d054bdb645a9c9452c0d71e83717" Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.000756 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-111f-account-create-update-j4mlh" event={"ID":"1172a4a0-f1c5-49f2-b91a-e691b431c471","Type":"ContainerDied","Data":"23891f31fcdf9285d9c19e415a791c9c41cadf93595ae83bfd15f73bb41038d5"} Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.000924 4696 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="23891f31fcdf9285d9c19e415a791c9c41cadf93595ae83bfd15f73bb41038d5" Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.010668 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.010735 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"92dde787-a077-43f6-a220-7862b3b296b1","Type":"ContainerDied","Data":"491b661aa31e58772cb121b7938accf2914b85231a58835d3396d9ff42692ec6"} Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.016915 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-4f32-account-create-update-fg4wn" event={"ID":"ec5ac1a2-65d0-4222-8e6e-cce2d0fe6d5a","Type":"ContainerDied","Data":"dfd1b6ee52239d5bfea1c190a326c8164b23d40799629f95aa87d2b142995da7"} Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.017028 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dfd1b6ee52239d5bfea1c190a326c8164b23d40799629f95aa87d2b142995da7" Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.017140 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-4f32-account-create-update-fg4wn" Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.022549 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-tkf2x" Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.023647 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-eff7-account-create-update-7ks4n" event={"ID":"a4756827-862e-446f-a149-b3a541a656b5","Type":"ContainerDied","Data":"9bb9e953b0d28d012f277a524e07cb809ee2a57c890d2d5ef74e3088ec01e10e"} Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.023681 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9bb9e953b0d28d012f277a524e07cb809ee2a57c890d2d5ef74e3088ec01e10e" Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.052719 4696 scope.go:117] "RemoveContainer" containerID="60ac838900bf957583ef488ecefb176051ebf222807e7c4ff7aa7356a437b6fb" Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.092668 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-eff7-account-create-update-7ks4n" Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.110990 4696 scope.go:117] "RemoveContainer" containerID="804daa95a3aad3872c09836efc0fb67463a6bfdbf3e36ed78514da9e876e450e" Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.128860 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-111f-account-create-update-j4mlh" Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.143196 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-tc9xp" Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.148656 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b08f301e-8c9b-4d88-9a26-431a6c15a6ca-httpd-config\") pod \"b08f301e-8c9b-4d88-9a26-431a6c15a6ca\" (UID: \"b08f301e-8c9b-4d88-9a26-431a6c15a6ca\") " Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.148716 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b08f301e-8c9b-4d88-9a26-431a6c15a6ca-config\") pod \"b08f301e-8c9b-4d88-9a26-431a6c15a6ca\" (UID: \"b08f301e-8c9b-4d88-9a26-431a6c15a6ca\") " Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.149231 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/92dde787-a077-43f6-a220-7862b3b296b1-run-httpd\") pod \"92dde787-a077-43f6-a220-7862b3b296b1\" (UID: \"92dde787-a077-43f6-a220-7862b3b296b1\") " Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.149263 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ec5ac1a2-65d0-4222-8e6e-cce2d0fe6d5a-operator-scripts\") pod \"ec5ac1a2-65d0-4222-8e6e-cce2d0fe6d5a\" (UID: \"ec5ac1a2-65d0-4222-8e6e-cce2d0fe6d5a\") " Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.149894 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b9d71614-bd9c-4b09-b813-d9d6f01fdc92-operator-scripts\") pod \"b9d71614-bd9c-4b09-b813-d9d6f01fdc92\" (UID: \"b9d71614-bd9c-4b09-b813-d9d6f01fdc92\") " Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.149940 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/1172a4a0-f1c5-49f2-b91a-e691b431c471-operator-scripts\") pod \"1172a4a0-f1c5-49f2-b91a-e691b431c471\" (UID: \"1172a4a0-f1c5-49f2-b91a-e691b431c471\") " Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.149974 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fjv9d\" (UniqueName: \"kubernetes.io/projected/5d3c7135-a850-4cd0-b5b6-2561b75cd09b-kube-api-access-fjv9d\") pod \"5d3c7135-a850-4cd0-b5b6-2561b75cd09b\" (UID: \"5d3c7135-a850-4cd0-b5b6-2561b75cd09b\") " Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.150001 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92dde787-a077-43f6-a220-7862b3b296b1-config-data\") pod \"92dde787-a077-43f6-a220-7862b3b296b1\" (UID: \"92dde787-a077-43f6-a220-7862b3b296b1\") " Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.150020 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b08f301e-8c9b-4d88-9a26-431a6c15a6ca-ovndb-tls-certs\") pod \"b08f301e-8c9b-4d88-9a26-431a6c15a6ca\" (UID: \"b08f301e-8c9b-4d88-9a26-431a6c15a6ca\") " Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.150037 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/92dde787-a077-43f6-a220-7862b3b296b1-log-httpd\") pod \"92dde787-a077-43f6-a220-7862b3b296b1\" (UID: \"92dde787-a077-43f6-a220-7862b3b296b1\") " Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.150054 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2gp2q\" (UniqueName: \"kubernetes.io/projected/9f4f26b7-e810-4c1e-98b1-57fc3b417e60-kube-api-access-2gp2q\") pod \"9f4f26b7-e810-4c1e-98b1-57fc3b417e60\" (UID: \"9f4f26b7-e810-4c1e-98b1-57fc3b417e60\") " Dec 02 23:03:33 crc kubenswrapper[4696]: 
I1202 23:03:33.150073 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92dde787-a077-43f6-a220-7862b3b296b1-combined-ca-bundle\") pod \"92dde787-a077-43f6-a220-7862b3b296b1\" (UID: \"92dde787-a077-43f6-a220-7862b3b296b1\") " Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.150121 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fw8zk\" (UniqueName: \"kubernetes.io/projected/b9d71614-bd9c-4b09-b813-d9d6f01fdc92-kube-api-access-fw8zk\") pod \"b9d71614-bd9c-4b09-b813-d9d6f01fdc92\" (UID: \"b9d71614-bd9c-4b09-b813-d9d6f01fdc92\") " Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.150144 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b08f301e-8c9b-4d88-9a26-431a6c15a6ca-combined-ca-bundle\") pod \"b08f301e-8c9b-4d88-9a26-431a6c15a6ca\" (UID: \"b08f301e-8c9b-4d88-9a26-431a6c15a6ca\") " Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.150164 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mkz4p\" (UniqueName: \"kubernetes.io/projected/92dde787-a077-43f6-a220-7862b3b296b1-kube-api-access-mkz4p\") pod \"92dde787-a077-43f6-a220-7862b3b296b1\" (UID: \"92dde787-a077-43f6-a220-7862b3b296b1\") " Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.150188 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/92dde787-a077-43f6-a220-7862b3b296b1-scripts\") pod \"92dde787-a077-43f6-a220-7862b3b296b1\" (UID: \"92dde787-a077-43f6-a220-7862b3b296b1\") " Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.150238 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4s5zv\" (UniqueName: 
\"kubernetes.io/projected/ec5ac1a2-65d0-4222-8e6e-cce2d0fe6d5a-kube-api-access-4s5zv\") pod \"ec5ac1a2-65d0-4222-8e6e-cce2d0fe6d5a\" (UID: \"ec5ac1a2-65d0-4222-8e6e-cce2d0fe6d5a\") " Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.150263 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pq7jx\" (UniqueName: \"kubernetes.io/projected/b08f301e-8c9b-4d88-9a26-431a6c15a6ca-kube-api-access-pq7jx\") pod \"b08f301e-8c9b-4d88-9a26-431a6c15a6ca\" (UID: \"b08f301e-8c9b-4d88-9a26-431a6c15a6ca\") " Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.150287 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a4756827-862e-446f-a149-b3a541a656b5-operator-scripts\") pod \"a4756827-862e-446f-a149-b3a541a656b5\" (UID: \"a4756827-862e-446f-a149-b3a541a656b5\") " Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.150317 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8w9h6\" (UniqueName: \"kubernetes.io/projected/a4756827-862e-446f-a149-b3a541a656b5-kube-api-access-8w9h6\") pod \"a4756827-862e-446f-a149-b3a541a656b5\" (UID: \"a4756827-862e-446f-a149-b3a541a656b5\") " Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.150340 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/92dde787-a077-43f6-a220-7862b3b296b1-sg-core-conf-yaml\") pod \"92dde787-a077-43f6-a220-7862b3b296b1\" (UID: \"92dde787-a077-43f6-a220-7862b3b296b1\") " Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.150367 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9f4f26b7-e810-4c1e-98b1-57fc3b417e60-operator-scripts\") pod \"9f4f26b7-e810-4c1e-98b1-57fc3b417e60\" (UID: \"9f4f26b7-e810-4c1e-98b1-57fc3b417e60\") " Dec 02 
23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.151842 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f4f26b7-e810-4c1e-98b1-57fc3b417e60-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9f4f26b7-e810-4c1e-98b1-57fc3b417e60" (UID: "9f4f26b7-e810-4c1e-98b1-57fc3b417e60"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.153127 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92dde787-a077-43f6-a220-7862b3b296b1-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "92dde787-a077-43f6-a220-7862b3b296b1" (UID: "92dde787-a077-43f6-a220-7862b3b296b1"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.153486 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec5ac1a2-65d0-4222-8e6e-cce2d0fe6d5a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ec5ac1a2-65d0-4222-8e6e-cce2d0fe6d5a" (UID: "ec5ac1a2-65d0-4222-8e6e-cce2d0fe6d5a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.153822 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9d71614-bd9c-4b09-b813-d9d6f01fdc92-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b9d71614-bd9c-4b09-b813-d9d6f01fdc92" (UID: "b9d71614-bd9c-4b09-b813-d9d6f01fdc92"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.154176 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1172a4a0-f1c5-49f2-b91a-e691b431c471-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1172a4a0-f1c5-49f2-b91a-e691b431c471" (UID: "1172a4a0-f1c5-49f2-b91a-e691b431c471"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.154671 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4756827-862e-446f-a149-b3a541a656b5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a4756827-862e-446f-a149-b3a541a656b5" (UID: "a4756827-862e-446f-a149-b3a541a656b5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.158353 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9d71614-bd9c-4b09-b813-d9d6f01fdc92-kube-api-access-fw8zk" (OuterVolumeSpecName: "kube-api-access-fw8zk") pod "b9d71614-bd9c-4b09-b813-d9d6f01fdc92" (UID: "b9d71614-bd9c-4b09-b813-d9d6f01fdc92"). InnerVolumeSpecName "kube-api-access-fw8zk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.160448 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92dde787-a077-43f6-a220-7862b3b296b1-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "92dde787-a077-43f6-a220-7862b3b296b1" (UID: "92dde787-a077-43f6-a220-7862b3b296b1"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.175105 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d3c7135-a850-4cd0-b5b6-2561b75cd09b-kube-api-access-fjv9d" (OuterVolumeSpecName: "kube-api-access-fjv9d") pod "5d3c7135-a850-4cd0-b5b6-2561b75cd09b" (UID: "5d3c7135-a850-4cd0-b5b6-2561b75cd09b"). InnerVolumeSpecName "kube-api-access-fjv9d". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.177169 4696 scope.go:117] "RemoveContainer" containerID="c16413dcbb0a7708424b716c40c5fdd25b814a6800d0d3ca9c029a003b17da46" Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.177231 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec5ac1a2-65d0-4222-8e6e-cce2d0fe6d5a-kube-api-access-4s5zv" (OuterVolumeSpecName: "kube-api-access-4s5zv") pod "ec5ac1a2-65d0-4222-8e6e-cce2d0fe6d5a" (UID: "ec5ac1a2-65d0-4222-8e6e-cce2d0fe6d5a"). InnerVolumeSpecName "kube-api-access-4s5zv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.177407 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-5964f98dd9-7q2kj" Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.182123 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92dde787-a077-43f6-a220-7862b3b296b1-scripts" (OuterVolumeSpecName: "scripts") pod "92dde787-a077-43f6-a220-7862b3b296b1" (UID: "92dde787-a077-43f6-a220-7862b3b296b1"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.184119 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-5964f98dd9-7q2kj" Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.185693 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b08f301e-8c9b-4d88-9a26-431a6c15a6ca-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "b08f301e-8c9b-4d88-9a26-431a6c15a6ca" (UID: "b08f301e-8c9b-4d88-9a26-431a6c15a6ca"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.188384 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4756827-862e-446f-a149-b3a541a656b5-kube-api-access-8w9h6" (OuterVolumeSpecName: "kube-api-access-8w9h6") pod "a4756827-862e-446f-a149-b3a541a656b5" (UID: "a4756827-862e-446f-a149-b3a541a656b5"). InnerVolumeSpecName "kube-api-access-8w9h6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.188541 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92dde787-a077-43f6-a220-7862b3b296b1-kube-api-access-mkz4p" (OuterVolumeSpecName: "kube-api-access-mkz4p") pod "92dde787-a077-43f6-a220-7862b3b296b1" (UID: "92dde787-a077-43f6-a220-7862b3b296b1"). InnerVolumeSpecName "kube-api-access-mkz4p". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.199651 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b08f301e-8c9b-4d88-9a26-431a6c15a6ca-kube-api-access-pq7jx" (OuterVolumeSpecName: "kube-api-access-pq7jx") pod "b08f301e-8c9b-4d88-9a26-431a6c15a6ca" (UID: "b08f301e-8c9b-4d88-9a26-431a6c15a6ca"). InnerVolumeSpecName "kube-api-access-pq7jx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.200120 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f4f26b7-e810-4c1e-98b1-57fc3b417e60-kube-api-access-2gp2q" (OuterVolumeSpecName: "kube-api-access-2gp2q") pod "9f4f26b7-e810-4c1e-98b1-57fc3b417e60" (UID: "9f4f26b7-e810-4c1e-98b1-57fc3b417e60"). InnerVolumeSpecName "kube-api-access-2gp2q". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.252177 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5d3c7135-a850-4cd0-b5b6-2561b75cd09b-operator-scripts\") pod \"5d3c7135-a850-4cd0-b5b6-2561b75cd09b\" (UID: \"5d3c7135-a850-4cd0-b5b6-2561b75cd09b\") " Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.252287 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9hms6\" (UniqueName: \"kubernetes.io/projected/1172a4a0-f1c5-49f2-b91a-e691b431c471-kube-api-access-9hms6\") pod \"1172a4a0-f1c5-49f2-b91a-e691b431c471\" (UID: \"1172a4a0-f1c5-49f2-b91a-e691b431c471\") " Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.253008 4696 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a4756827-862e-446f-a149-b3a541a656b5-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.253026 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8w9h6\" (UniqueName: \"kubernetes.io/projected/a4756827-862e-446f-a149-b3a541a656b5-kube-api-access-8w9h6\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.253038 4696 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/9f4f26b7-e810-4c1e-98b1-57fc3b417e60-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.253049 4696 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b08f301e-8c9b-4d88-9a26-431a6c15a6ca-httpd-config\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.253057 4696 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/92dde787-a077-43f6-a220-7862b3b296b1-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.253065 4696 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ec5ac1a2-65d0-4222-8e6e-cce2d0fe6d5a-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.253073 4696 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b9d71614-bd9c-4b09-b813-d9d6f01fdc92-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.253081 4696 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1172a4a0-f1c5-49f2-b91a-e691b431c471-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.253089 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fjv9d\" (UniqueName: \"kubernetes.io/projected/5d3c7135-a850-4cd0-b5b6-2561b75cd09b-kube-api-access-fjv9d\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.253097 4696 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/92dde787-a077-43f6-a220-7862b3b296b1-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 
23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.253105 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2gp2q\" (UniqueName: \"kubernetes.io/projected/9f4f26b7-e810-4c1e-98b1-57fc3b417e60-kube-api-access-2gp2q\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.253114 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fw8zk\" (UniqueName: \"kubernetes.io/projected/b9d71614-bd9c-4b09-b813-d9d6f01fdc92-kube-api-access-fw8zk\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.253123 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mkz4p\" (UniqueName: \"kubernetes.io/projected/92dde787-a077-43f6-a220-7862b3b296b1-kube-api-access-mkz4p\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.253133 4696 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/92dde787-a077-43f6-a220-7862b3b296b1-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.253142 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4s5zv\" (UniqueName: \"kubernetes.io/projected/ec5ac1a2-65d0-4222-8e6e-cce2d0fe6d5a-kube-api-access-4s5zv\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.253151 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pq7jx\" (UniqueName: \"kubernetes.io/projected/b08f301e-8c9b-4d88-9a26-431a6c15a6ca-kube-api-access-pq7jx\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.253920 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d3c7135-a850-4cd0-b5b6-2561b75cd09b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5d3c7135-a850-4cd0-b5b6-2561b75cd09b" (UID: 
"5d3c7135-a850-4cd0-b5b6-2561b75cd09b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.260348 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b08f301e-8c9b-4d88-9a26-431a6c15a6ca-config" (OuterVolumeSpecName: "config") pod "b08f301e-8c9b-4d88-9a26-431a6c15a6ca" (UID: "b08f301e-8c9b-4d88-9a26-431a6c15a6ca"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.269018 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1172a4a0-f1c5-49f2-b91a-e691b431c471-kube-api-access-9hms6" (OuterVolumeSpecName: "kube-api-access-9hms6") pod "1172a4a0-f1c5-49f2-b91a-e691b431c471" (UID: "1172a4a0-f1c5-49f2-b91a-e691b431c471"). InnerVolumeSpecName "kube-api-access-9hms6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.284893 4696 scope.go:117] "RemoveContainer" containerID="57355eed1547b03775dcae1050e96bfe01be9a37a75ae3f193337bc37344c75b" Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.310881 4696 scope.go:117] "RemoveContainer" containerID="eedf30edd100f822a25c5db9c6414d00352ced217607db5662ea488255d326d8" Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.354989 4696 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5d3c7135-a850-4cd0-b5b6-2561b75cd09b-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.355219 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9hms6\" (UniqueName: \"kubernetes.io/projected/1172a4a0-f1c5-49f2-b91a-e691b431c471-kube-api-access-9hms6\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.355335 4696 
reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/b08f301e-8c9b-4d88-9a26-431a6c15a6ca-config\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.358020 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92dde787-a077-43f6-a220-7862b3b296b1-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "92dde787-a077-43f6-a220-7862b3b296b1" (UID: "92dde787-a077-43f6-a220-7862b3b296b1"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.393691 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b08f301e-8c9b-4d88-9a26-431a6c15a6ca-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b08f301e-8c9b-4d88-9a26-431a6c15a6ca" (UID: "b08f301e-8c9b-4d88-9a26-431a6c15a6ca"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.394122 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b08f301e-8c9b-4d88-9a26-431a6c15a6ca-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "b08f301e-8c9b-4d88-9a26-431a6c15a6ca" (UID: "b08f301e-8c9b-4d88-9a26-431a6c15a6ca"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.422626 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92dde787-a077-43f6-a220-7862b3b296b1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "92dde787-a077-43f6-a220-7862b3b296b1" (UID: "92dde787-a077-43f6-a220-7862b3b296b1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.445991 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92dde787-a077-43f6-a220-7862b3b296b1-config-data" (OuterVolumeSpecName: "config-data") pod "92dde787-a077-43f6-a220-7862b3b296b1" (UID: "92dde787-a077-43f6-a220-7862b3b296b1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.458932 4696 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92dde787-a077-43f6-a220-7862b3b296b1-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.458984 4696 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b08f301e-8c9b-4d88-9a26-431a6c15a6ca-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.459000 4696 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92dde787-a077-43f6-a220-7862b3b296b1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.459016 4696 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b08f301e-8c9b-4d88-9a26-431a6c15a6ca-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.459031 4696 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/92dde787-a077-43f6-a220-7862b3b296b1-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.620723 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-97757dbdd-59bbj"] Dec 02 23:03:33 
crc kubenswrapper[4696]: I1202 23:03:33.630663 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-97757dbdd-59bbj"] Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.640633 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.652711 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.664597 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 02 23:03:33 crc kubenswrapper[4696]: E1202 23:03:33.665132 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92dde787-a077-43f6-a220-7862b3b296b1" containerName="proxy-httpd" Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.665158 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="92dde787-a077-43f6-a220-7862b3b296b1" containerName="proxy-httpd" Dec 02 23:03:33 crc kubenswrapper[4696]: E1202 23:03:33.665182 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92dde787-a077-43f6-a220-7862b3b296b1" containerName="sg-core" Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.665191 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="92dde787-a077-43f6-a220-7862b3b296b1" containerName="sg-core" Dec 02 23:03:33 crc kubenswrapper[4696]: E1202 23:03:33.665215 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec5ac1a2-65d0-4222-8e6e-cce2d0fe6d5a" containerName="mariadb-account-create-update" Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.665221 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec5ac1a2-65d0-4222-8e6e-cce2d0fe6d5a" containerName="mariadb-account-create-update" Dec 02 23:03:33 crc kubenswrapper[4696]: E1202 23:03:33.665232 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f4f26b7-e810-4c1e-98b1-57fc3b417e60" containerName="mariadb-database-create" Dec 02 
23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.665241 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f4f26b7-e810-4c1e-98b1-57fc3b417e60" containerName="mariadb-database-create" Dec 02 23:03:33 crc kubenswrapper[4696]: E1202 23:03:33.665252 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92dde787-a077-43f6-a220-7862b3b296b1" containerName="ceilometer-notification-agent" Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.665258 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="92dde787-a077-43f6-a220-7862b3b296b1" containerName="ceilometer-notification-agent" Dec 02 23:03:33 crc kubenswrapper[4696]: E1202 23:03:33.665272 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b08f301e-8c9b-4d88-9a26-431a6c15a6ca" containerName="neutron-api" Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.665279 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="b08f301e-8c9b-4d88-9a26-431a6c15a6ca" containerName="neutron-api" Dec 02 23:03:33 crc kubenswrapper[4696]: E1202 23:03:33.665292 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d3c7135-a850-4cd0-b5b6-2561b75cd09b" containerName="mariadb-database-create" Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.665299 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d3c7135-a850-4cd0-b5b6-2561b75cd09b" containerName="mariadb-database-create" Dec 02 23:03:33 crc kubenswrapper[4696]: E1202 23:03:33.665309 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4756827-862e-446f-a149-b3a541a656b5" containerName="mariadb-account-create-update" Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.665316 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4756827-862e-446f-a149-b3a541a656b5" containerName="mariadb-account-create-update" Dec 02 23:03:33 crc kubenswrapper[4696]: E1202 23:03:33.665328 4696 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="b08f301e-8c9b-4d88-9a26-431a6c15a6ca" containerName="neutron-httpd" Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.665335 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="b08f301e-8c9b-4d88-9a26-431a6c15a6ca" containerName="neutron-httpd" Dec 02 23:03:33 crc kubenswrapper[4696]: E1202 23:03:33.665345 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92dde787-a077-43f6-a220-7862b3b296b1" containerName="ceilometer-central-agent" Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.665352 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="92dde787-a077-43f6-a220-7862b3b296b1" containerName="ceilometer-central-agent" Dec 02 23:03:33 crc kubenswrapper[4696]: E1202 23:03:33.665364 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d897a4ce-a62c-4bde-889d-c7c82cab0569" containerName="barbican-api" Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.665370 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="d897a4ce-a62c-4bde-889d-c7c82cab0569" containerName="barbican-api" Dec 02 23:03:33 crc kubenswrapper[4696]: E1202 23:03:33.665378 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9d71614-bd9c-4b09-b813-d9d6f01fdc92" containerName="mariadb-database-create" Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.665386 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9d71614-bd9c-4b09-b813-d9d6f01fdc92" containerName="mariadb-database-create" Dec 02 23:03:33 crc kubenswrapper[4696]: E1202 23:03:33.665403 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d897a4ce-a62c-4bde-889d-c7c82cab0569" containerName="barbican-api-log" Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.665409 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="d897a4ce-a62c-4bde-889d-c7c82cab0569" containerName="barbican-api-log" Dec 02 23:03:33 crc kubenswrapper[4696]: E1202 23:03:33.665417 4696 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="1172a4a0-f1c5-49f2-b91a-e691b431c471" containerName="mariadb-account-create-update" Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.665425 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="1172a4a0-f1c5-49f2-b91a-e691b431c471" containerName="mariadb-account-create-update" Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.666203 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4756827-862e-446f-a149-b3a541a656b5" containerName="mariadb-account-create-update" Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.666220 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="b08f301e-8c9b-4d88-9a26-431a6c15a6ca" containerName="neutron-httpd" Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.666230 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec5ac1a2-65d0-4222-8e6e-cce2d0fe6d5a" containerName="mariadb-account-create-update" Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.666243 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="b08f301e-8c9b-4d88-9a26-431a6c15a6ca" containerName="neutron-api" Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.666257 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="1172a4a0-f1c5-49f2-b91a-e691b431c471" containerName="mariadb-account-create-update" Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.666265 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="92dde787-a077-43f6-a220-7862b3b296b1" containerName="proxy-httpd" Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.666273 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d3c7135-a850-4cd0-b5b6-2561b75cd09b" containerName="mariadb-database-create" Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.666285 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f4f26b7-e810-4c1e-98b1-57fc3b417e60" containerName="mariadb-database-create" Dec 02 23:03:33 
crc kubenswrapper[4696]: I1202 23:03:33.666292 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="d897a4ce-a62c-4bde-889d-c7c82cab0569" containerName="barbican-api-log" Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.666303 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="92dde787-a077-43f6-a220-7862b3b296b1" containerName="ceilometer-central-agent" Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.666309 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="92dde787-a077-43f6-a220-7862b3b296b1" containerName="ceilometer-notification-agent" Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.666325 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9d71614-bd9c-4b09-b813-d9d6f01fdc92" containerName="mariadb-database-create" Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.666342 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="d897a4ce-a62c-4bde-889d-c7c82cab0569" containerName="barbican-api" Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.666358 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="92dde787-a077-43f6-a220-7862b3b296b1" containerName="sg-core" Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.668715 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.671708 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.671831 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.683764 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.867234 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0c28b24c-165c-4d9e-acb5-9d5e88869fe8-run-httpd\") pod \"ceilometer-0\" (UID: \"0c28b24c-165c-4d9e-acb5-9d5e88869fe8\") " pod="openstack/ceilometer-0" Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.867648 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c28b24c-165c-4d9e-acb5-9d5e88869fe8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0c28b24c-165c-4d9e-acb5-9d5e88869fe8\") " pod="openstack/ceilometer-0" Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.867686 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0c28b24c-165c-4d9e-acb5-9d5e88869fe8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0c28b24c-165c-4d9e-acb5-9d5e88869fe8\") " pod="openstack/ceilometer-0" Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.867711 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c28b24c-165c-4d9e-acb5-9d5e88869fe8-scripts\") pod \"ceilometer-0\" (UID: \"0c28b24c-165c-4d9e-acb5-9d5e88869fe8\") " 
pod="openstack/ceilometer-0" Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.867751 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c28b24c-165c-4d9e-acb5-9d5e88869fe8-config-data\") pod \"ceilometer-0\" (UID: \"0c28b24c-165c-4d9e-acb5-9d5e88869fe8\") " pod="openstack/ceilometer-0" Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.867780 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0c28b24c-165c-4d9e-acb5-9d5e88869fe8-log-httpd\") pod \"ceilometer-0\" (UID: \"0c28b24c-165c-4d9e-acb5-9d5e88869fe8\") " pod="openstack/ceilometer-0" Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.868897 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqr2k\" (UniqueName: \"kubernetes.io/projected/0c28b24c-165c-4d9e-acb5-9d5e88869fe8-kube-api-access-vqr2k\") pod \"ceilometer-0\" (UID: \"0c28b24c-165c-4d9e-acb5-9d5e88869fe8\") " pod="openstack/ceilometer-0" Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.972119 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0c28b24c-165c-4d9e-acb5-9d5e88869fe8-run-httpd\") pod \"ceilometer-0\" (UID: \"0c28b24c-165c-4d9e-acb5-9d5e88869fe8\") " pod="openstack/ceilometer-0" Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.972566 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c28b24c-165c-4d9e-acb5-9d5e88869fe8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0c28b24c-165c-4d9e-acb5-9d5e88869fe8\") " pod="openstack/ceilometer-0" Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.972698 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0c28b24c-165c-4d9e-acb5-9d5e88869fe8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0c28b24c-165c-4d9e-acb5-9d5e88869fe8\") " pod="openstack/ceilometer-0" Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.972847 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c28b24c-165c-4d9e-acb5-9d5e88869fe8-scripts\") pod \"ceilometer-0\" (UID: \"0c28b24c-165c-4d9e-acb5-9d5e88869fe8\") " pod="openstack/ceilometer-0" Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.972946 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c28b24c-165c-4d9e-acb5-9d5e88869fe8-config-data\") pod \"ceilometer-0\" (UID: \"0c28b24c-165c-4d9e-acb5-9d5e88869fe8\") " pod="openstack/ceilometer-0" Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.973045 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0c28b24c-165c-4d9e-acb5-9d5e88869fe8-log-httpd\") pod \"ceilometer-0\" (UID: \"0c28b24c-165c-4d9e-acb5-9d5e88869fe8\") " pod="openstack/ceilometer-0" Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.973170 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqr2k\" (UniqueName: \"kubernetes.io/projected/0c28b24c-165c-4d9e-acb5-9d5e88869fe8-kube-api-access-vqr2k\") pod \"ceilometer-0\" (UID: \"0c28b24c-165c-4d9e-acb5-9d5e88869fe8\") " pod="openstack/ceilometer-0" Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.973631 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0c28b24c-165c-4d9e-acb5-9d5e88869fe8-log-httpd\") pod \"ceilometer-0\" (UID: \"0c28b24c-165c-4d9e-acb5-9d5e88869fe8\") " pod="openstack/ceilometer-0" Dec 02 23:03:33 crc 
kubenswrapper[4696]: I1202 23:03:33.973934 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0c28b24c-165c-4d9e-acb5-9d5e88869fe8-run-httpd\") pod \"ceilometer-0\" (UID: \"0c28b24c-165c-4d9e-acb5-9d5e88869fe8\") " pod="openstack/ceilometer-0" Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.981984 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c28b24c-165c-4d9e-acb5-9d5e88869fe8-config-data\") pod \"ceilometer-0\" (UID: \"0c28b24c-165c-4d9e-acb5-9d5e88869fe8\") " pod="openstack/ceilometer-0" Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.982826 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c28b24c-165c-4d9e-acb5-9d5e88869fe8-scripts\") pod \"ceilometer-0\" (UID: \"0c28b24c-165c-4d9e-acb5-9d5e88869fe8\") " pod="openstack/ceilometer-0" Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.987234 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0c28b24c-165c-4d9e-acb5-9d5e88869fe8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0c28b24c-165c-4d9e-acb5-9d5e88869fe8\") " pod="openstack/ceilometer-0" Dec 02 23:03:33 crc kubenswrapper[4696]: I1202 23:03:33.992226 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c28b24c-165c-4d9e-acb5-9d5e88869fe8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0c28b24c-165c-4d9e-acb5-9d5e88869fe8\") " pod="openstack/ceilometer-0" Dec 02 23:03:34 crc kubenswrapper[4696]: I1202 23:03:34.018621 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqr2k\" (UniqueName: \"kubernetes.io/projected/0c28b24c-165c-4d9e-acb5-9d5e88869fe8-kube-api-access-vqr2k\") pod \"ceilometer-0\" (UID: 
\"0c28b24c-165c-4d9e-acb5-9d5e88869fe8\") " pod="openstack/ceilometer-0" Dec 02 23:03:34 crc kubenswrapper[4696]: I1202 23:03:34.042405 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-tc9xp" Dec 02 23:03:34 crc kubenswrapper[4696]: I1202 23:03:34.042435 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-tkf2x" Dec 02 23:03:34 crc kubenswrapper[4696]: I1202 23:03:34.042499 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"3e0a050b-c652-4ef2-8f1a-19c8f4732a0c","Type":"ContainerStarted","Data":"6dfff33d8c71d310c2c11b403f62ec7b4ab4d0cef6812ec5d0d7bcc6856fe7ba"} Dec 02 23:03:34 crc kubenswrapper[4696]: I1202 23:03:34.042686 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-eff7-account-create-update-7ks4n" Dec 02 23:03:34 crc kubenswrapper[4696]: I1202 23:03:34.043034 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-111f-account-create-update-j4mlh" Dec 02 23:03:34 crc kubenswrapper[4696]: I1202 23:03:34.068493 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.763842458 podStartE2EDuration="21.068471016s" podCreationTimestamp="2025-12-02 23:03:13 +0000 UTC" firstStartedPulling="2025-12-02 23:03:14.563647073 +0000 UTC m=+1257.444327074" lastFinishedPulling="2025-12-02 23:03:32.868275631 +0000 UTC m=+1275.748955632" observedRunningTime="2025-12-02 23:03:34.065258895 +0000 UTC m=+1276.945938896" watchObservedRunningTime="2025-12-02 23:03:34.068471016 +0000 UTC m=+1276.949151037" Dec 02 23:03:34 crc kubenswrapper[4696]: I1202 23:03:34.297023 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 23:03:34 crc kubenswrapper[4696]: I1202 23:03:34.768155 4696 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 02 23:03:34 crc kubenswrapper[4696]: I1202 23:03:34.771570 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 23:03:35 crc kubenswrapper[4696]: I1202 23:03:35.055126 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0c28b24c-165c-4d9e-acb5-9d5e88869fe8","Type":"ContainerStarted","Data":"0ec9de4819fbab9b6604d25faefbfdf70c4aa9f08d089d6596574c0de7a6d18e"} Dec 02 23:03:35 crc kubenswrapper[4696]: I1202 23:03:35.443596 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92dde787-a077-43f6-a220-7862b3b296b1" path="/var/lib/kubelet/pods/92dde787-a077-43f6-a220-7862b3b296b1/volumes" Dec 02 23:03:35 crc kubenswrapper[4696]: I1202 23:03:35.445458 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b08f301e-8c9b-4d88-9a26-431a6c15a6ca" path="/var/lib/kubelet/pods/b08f301e-8c9b-4d88-9a26-431a6c15a6ca/volumes" Dec 02 23:03:36 crc kubenswrapper[4696]: I1202 23:03:36.069037 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0c28b24c-165c-4d9e-acb5-9d5e88869fe8","Type":"ContainerStarted","Data":"498c09b6eb57209981cd274d5e0d964235b22cc2b4a97cf141d3d2a1832038cc"} Dec 02 23:03:37 crc kubenswrapper[4696]: I1202 23:03:37.090279 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0c28b24c-165c-4d9e-acb5-9d5e88869fe8","Type":"ContainerStarted","Data":"bc829eaaa0830e36bc6ae58b0815fe68bb32d26e81aa41d2189c3a2711035bd1"} Dec 02 23:03:37 crc kubenswrapper[4696]: I1202 23:03:37.091053 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"0c28b24c-165c-4d9e-acb5-9d5e88869fe8","Type":"ContainerStarted","Data":"0856d99b5c6a1150dbc1681b01519ce7bead8b6cee8339f0096b15e25c7ec3ad"} Dec 02 23:03:37 crc kubenswrapper[4696]: I1202 23:03:37.820121 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-89vr8"] Dec 02 23:03:37 crc kubenswrapper[4696]: I1202 23:03:37.821871 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-89vr8" Dec 02 23:03:37 crc kubenswrapper[4696]: I1202 23:03:37.828149 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Dec 02 23:03:37 crc kubenswrapper[4696]: I1202 23:03:37.828433 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-cgfs4" Dec 02 23:03:37 crc kubenswrapper[4696]: I1202 23:03:37.828554 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 02 23:03:37 crc kubenswrapper[4696]: I1202 23:03:37.852308 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-89vr8"] Dec 02 23:03:37 crc kubenswrapper[4696]: I1202 23:03:37.970117 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8677bbf5-15c6-4e35-98ce-ba3b8ccd5a27-config-data\") pod \"nova-cell0-conductor-db-sync-89vr8\" (UID: \"8677bbf5-15c6-4e35-98ce-ba3b8ccd5a27\") " pod="openstack/nova-cell0-conductor-db-sync-89vr8" Dec 02 23:03:37 crc kubenswrapper[4696]: I1202 23:03:37.970976 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbbn9\" (UniqueName: \"kubernetes.io/projected/8677bbf5-15c6-4e35-98ce-ba3b8ccd5a27-kube-api-access-vbbn9\") pod \"nova-cell0-conductor-db-sync-89vr8\" (UID: \"8677bbf5-15c6-4e35-98ce-ba3b8ccd5a27\") " 
pod="openstack/nova-cell0-conductor-db-sync-89vr8" Dec 02 23:03:37 crc kubenswrapper[4696]: I1202 23:03:37.971024 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8677bbf5-15c6-4e35-98ce-ba3b8ccd5a27-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-89vr8\" (UID: \"8677bbf5-15c6-4e35-98ce-ba3b8ccd5a27\") " pod="openstack/nova-cell0-conductor-db-sync-89vr8" Dec 02 23:03:37 crc kubenswrapper[4696]: I1202 23:03:37.971189 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8677bbf5-15c6-4e35-98ce-ba3b8ccd5a27-scripts\") pod \"nova-cell0-conductor-db-sync-89vr8\" (UID: \"8677bbf5-15c6-4e35-98ce-ba3b8ccd5a27\") " pod="openstack/nova-cell0-conductor-db-sync-89vr8" Dec 02 23:03:38 crc kubenswrapper[4696]: I1202 23:03:38.073419 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8677bbf5-15c6-4e35-98ce-ba3b8ccd5a27-scripts\") pod \"nova-cell0-conductor-db-sync-89vr8\" (UID: \"8677bbf5-15c6-4e35-98ce-ba3b8ccd5a27\") " pod="openstack/nova-cell0-conductor-db-sync-89vr8" Dec 02 23:03:38 crc kubenswrapper[4696]: I1202 23:03:38.074665 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8677bbf5-15c6-4e35-98ce-ba3b8ccd5a27-config-data\") pod \"nova-cell0-conductor-db-sync-89vr8\" (UID: \"8677bbf5-15c6-4e35-98ce-ba3b8ccd5a27\") " pod="openstack/nova-cell0-conductor-db-sync-89vr8" Dec 02 23:03:38 crc kubenswrapper[4696]: I1202 23:03:38.074709 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbbn9\" (UniqueName: \"kubernetes.io/projected/8677bbf5-15c6-4e35-98ce-ba3b8ccd5a27-kube-api-access-vbbn9\") pod \"nova-cell0-conductor-db-sync-89vr8\" (UID: 
\"8677bbf5-15c6-4e35-98ce-ba3b8ccd5a27\") " pod="openstack/nova-cell0-conductor-db-sync-89vr8" Dec 02 23:03:38 crc kubenswrapper[4696]: I1202 23:03:38.074758 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8677bbf5-15c6-4e35-98ce-ba3b8ccd5a27-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-89vr8\" (UID: \"8677bbf5-15c6-4e35-98ce-ba3b8ccd5a27\") " pod="openstack/nova-cell0-conductor-db-sync-89vr8" Dec 02 23:03:38 crc kubenswrapper[4696]: I1202 23:03:38.083078 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8677bbf5-15c6-4e35-98ce-ba3b8ccd5a27-scripts\") pod \"nova-cell0-conductor-db-sync-89vr8\" (UID: \"8677bbf5-15c6-4e35-98ce-ba3b8ccd5a27\") " pod="openstack/nova-cell0-conductor-db-sync-89vr8" Dec 02 23:03:38 crc kubenswrapper[4696]: I1202 23:03:38.083161 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8677bbf5-15c6-4e35-98ce-ba3b8ccd5a27-config-data\") pod \"nova-cell0-conductor-db-sync-89vr8\" (UID: \"8677bbf5-15c6-4e35-98ce-ba3b8ccd5a27\") " pod="openstack/nova-cell0-conductor-db-sync-89vr8" Dec 02 23:03:38 crc kubenswrapper[4696]: I1202 23:03:38.087625 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8677bbf5-15c6-4e35-98ce-ba3b8ccd5a27-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-89vr8\" (UID: \"8677bbf5-15c6-4e35-98ce-ba3b8ccd5a27\") " pod="openstack/nova-cell0-conductor-db-sync-89vr8" Dec 02 23:03:38 crc kubenswrapper[4696]: I1202 23:03:38.098076 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbbn9\" (UniqueName: \"kubernetes.io/projected/8677bbf5-15c6-4e35-98ce-ba3b8ccd5a27-kube-api-access-vbbn9\") pod \"nova-cell0-conductor-db-sync-89vr8\" (UID: 
\"8677bbf5-15c6-4e35-98ce-ba3b8ccd5a27\") " pod="openstack/nova-cell0-conductor-db-sync-89vr8" Dec 02 23:03:38 crc kubenswrapper[4696]: I1202 23:03:38.144641 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-89vr8" Dec 02 23:03:38 crc kubenswrapper[4696]: I1202 23:03:38.648712 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-89vr8"] Dec 02 23:03:39 crc kubenswrapper[4696]: I1202 23:03:39.157634 4696 generic.go:334] "Generic (PLEG): container finished" podID="a20e4421-fec8-4f5a-8699-9d17d911f14c" containerID="310dfbca22cf7acbe78369c9d15f3b0816fd2bbde6d845cfdc16889f994d00a0" exitCode=137 Dec 02 23:03:39 crc kubenswrapper[4696]: I1202 23:03:39.157894 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a20e4421-fec8-4f5a-8699-9d17d911f14c","Type":"ContainerDied","Data":"310dfbca22cf7acbe78369c9d15f3b0816fd2bbde6d845cfdc16889f994d00a0"} Dec 02 23:03:39 crc kubenswrapper[4696]: I1202 23:03:39.164302 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0c28b24c-165c-4d9e-acb5-9d5e88869fe8","Type":"ContainerStarted","Data":"ed1b2a1c5a20ef96446e5c56509e5091051ad82432d30d848a9768f410c62c6a"} Dec 02 23:03:39 crc kubenswrapper[4696]: I1202 23:03:39.164615 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 02 23:03:39 crc kubenswrapper[4696]: I1202 23:03:39.169493 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-89vr8" event={"ID":"8677bbf5-15c6-4e35-98ce-ba3b8ccd5a27","Type":"ContainerStarted","Data":"c8df4d8a1e40c8f59719a7463d12508117240c17deef749eb942910177ab27d5"} Dec 02 23:03:39 crc kubenswrapper[4696]: I1202 23:03:39.209274 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.488090618 
podStartE2EDuration="6.209243615s" podCreationTimestamp="2025-12-02 23:03:33 +0000 UTC" firstStartedPulling="2025-12-02 23:03:34.767888237 +0000 UTC m=+1277.648568238" lastFinishedPulling="2025-12-02 23:03:38.489041234 +0000 UTC m=+1281.369721235" observedRunningTime="2025-12-02 23:03:39.199977992 +0000 UTC m=+1282.080658013" watchObservedRunningTime="2025-12-02 23:03:39.209243615 +0000 UTC m=+1282.089923616" Dec 02 23:03:39 crc kubenswrapper[4696]: I1202 23:03:39.606304 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 02 23:03:39 crc kubenswrapper[4696]: I1202 23:03:39.625875 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a20e4421-fec8-4f5a-8699-9d17d911f14c-config-data-custom\") pod \"a20e4421-fec8-4f5a-8699-9d17d911f14c\" (UID: \"a20e4421-fec8-4f5a-8699-9d17d911f14c\") " Dec 02 23:03:39 crc kubenswrapper[4696]: I1202 23:03:39.626016 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p5pdw\" (UniqueName: \"kubernetes.io/projected/a20e4421-fec8-4f5a-8699-9d17d911f14c-kube-api-access-p5pdw\") pod \"a20e4421-fec8-4f5a-8699-9d17d911f14c\" (UID: \"a20e4421-fec8-4f5a-8699-9d17d911f14c\") " Dec 02 23:03:39 crc kubenswrapper[4696]: I1202 23:03:39.626392 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a20e4421-fec8-4f5a-8699-9d17d911f14c-scripts\") pod \"a20e4421-fec8-4f5a-8699-9d17d911f14c\" (UID: \"a20e4421-fec8-4f5a-8699-9d17d911f14c\") " Dec 02 23:03:39 crc kubenswrapper[4696]: I1202 23:03:39.626444 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a20e4421-fec8-4f5a-8699-9d17d911f14c-etc-machine-id\") pod \"a20e4421-fec8-4f5a-8699-9d17d911f14c\" (UID: 
\"a20e4421-fec8-4f5a-8699-9d17d911f14c\") " Dec 02 23:03:39 crc kubenswrapper[4696]: I1202 23:03:39.626554 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a20e4421-fec8-4f5a-8699-9d17d911f14c-config-data\") pod \"a20e4421-fec8-4f5a-8699-9d17d911f14c\" (UID: \"a20e4421-fec8-4f5a-8699-9d17d911f14c\") " Dec 02 23:03:39 crc kubenswrapper[4696]: I1202 23:03:39.626633 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a20e4421-fec8-4f5a-8699-9d17d911f14c-combined-ca-bundle\") pod \"a20e4421-fec8-4f5a-8699-9d17d911f14c\" (UID: \"a20e4421-fec8-4f5a-8699-9d17d911f14c\") " Dec 02 23:03:39 crc kubenswrapper[4696]: I1202 23:03:39.626662 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a20e4421-fec8-4f5a-8699-9d17d911f14c-logs\") pod \"a20e4421-fec8-4f5a-8699-9d17d911f14c\" (UID: \"a20e4421-fec8-4f5a-8699-9d17d911f14c\") " Dec 02 23:03:39 crc kubenswrapper[4696]: I1202 23:03:39.627812 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a20e4421-fec8-4f5a-8699-9d17d911f14c-logs" (OuterVolumeSpecName: "logs") pod "a20e4421-fec8-4f5a-8699-9d17d911f14c" (UID: "a20e4421-fec8-4f5a-8699-9d17d911f14c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:03:39 crc kubenswrapper[4696]: I1202 23:03:39.634922 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a20e4421-fec8-4f5a-8699-9d17d911f14c-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "a20e4421-fec8-4f5a-8699-9d17d911f14c" (UID: "a20e4421-fec8-4f5a-8699-9d17d911f14c"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 23:03:39 crc kubenswrapper[4696]: I1202 23:03:39.641199 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a20e4421-fec8-4f5a-8699-9d17d911f14c-kube-api-access-p5pdw" (OuterVolumeSpecName: "kube-api-access-p5pdw") pod "a20e4421-fec8-4f5a-8699-9d17d911f14c" (UID: "a20e4421-fec8-4f5a-8699-9d17d911f14c"). InnerVolumeSpecName "kube-api-access-p5pdw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:03:39 crc kubenswrapper[4696]: I1202 23:03:39.641283 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a20e4421-fec8-4f5a-8699-9d17d911f14c-scripts" (OuterVolumeSpecName: "scripts") pod "a20e4421-fec8-4f5a-8699-9d17d911f14c" (UID: "a20e4421-fec8-4f5a-8699-9d17d911f14c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:03:39 crc kubenswrapper[4696]: I1202 23:03:39.676878 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a20e4421-fec8-4f5a-8699-9d17d911f14c-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "a20e4421-fec8-4f5a-8699-9d17d911f14c" (UID: "a20e4421-fec8-4f5a-8699-9d17d911f14c"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:03:39 crc kubenswrapper[4696]: I1202 23:03:39.696769 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a20e4421-fec8-4f5a-8699-9d17d911f14c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a20e4421-fec8-4f5a-8699-9d17d911f14c" (UID: "a20e4421-fec8-4f5a-8699-9d17d911f14c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:03:39 crc kubenswrapper[4696]: I1202 23:03:39.722793 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a20e4421-fec8-4f5a-8699-9d17d911f14c-config-data" (OuterVolumeSpecName: "config-data") pod "a20e4421-fec8-4f5a-8699-9d17d911f14c" (UID: "a20e4421-fec8-4f5a-8699-9d17d911f14c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:03:39 crc kubenswrapper[4696]: I1202 23:03:39.730301 4696 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a20e4421-fec8-4f5a-8699-9d17d911f14c-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:39 crc kubenswrapper[4696]: I1202 23:03:39.730339 4696 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a20e4421-fec8-4f5a-8699-9d17d911f14c-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:39 crc kubenswrapper[4696]: I1202 23:03:39.730350 4696 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a20e4421-fec8-4f5a-8699-9d17d911f14c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:39 crc kubenswrapper[4696]: I1202 23:03:39.730363 4696 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a20e4421-fec8-4f5a-8699-9d17d911f14c-logs\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:39 crc kubenswrapper[4696]: I1202 23:03:39.730378 4696 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a20e4421-fec8-4f5a-8699-9d17d911f14c-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:39 crc kubenswrapper[4696]: I1202 23:03:39.730390 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p5pdw\" (UniqueName: 
\"kubernetes.io/projected/a20e4421-fec8-4f5a-8699-9d17d911f14c-kube-api-access-p5pdw\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:39 crc kubenswrapper[4696]: I1202 23:03:39.730405 4696 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a20e4421-fec8-4f5a-8699-9d17d911f14c-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:39 crc kubenswrapper[4696]: I1202 23:03:39.775716 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 23:03:40 crc kubenswrapper[4696]: I1202 23:03:40.182399 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 02 23:03:40 crc kubenswrapper[4696]: I1202 23:03:40.182422 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a20e4421-fec8-4f5a-8699-9d17d911f14c","Type":"ContainerDied","Data":"062edb233c0c07fd1b83c75d55340df53205e72d5e80f61bed85c3f5751b57dd"} Dec 02 23:03:40 crc kubenswrapper[4696]: I1202 23:03:40.182508 4696 scope.go:117] "RemoveContainer" containerID="310dfbca22cf7acbe78369c9d15f3b0816fd2bbde6d845cfdc16889f994d00a0" Dec 02 23:03:40 crc kubenswrapper[4696]: I1202 23:03:40.185408 4696 generic.go:334] "Generic (PLEG): container finished" podID="7b25bdab-8c46-43b8-be48-0e3df0f48c57" containerID="db7c41afc2c4141b03c10301aee7cd5a2b37a58c34385a505278ac8b3e2f3bf7" exitCode=137 Dec 02 23:03:40 crc kubenswrapper[4696]: I1202 23:03:40.186637 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7b448778f6-q69jq" event={"ID":"7b25bdab-8c46-43b8-be48-0e3df0f48c57","Type":"ContainerDied","Data":"db7c41afc2c4141b03c10301aee7cd5a2b37a58c34385a505278ac8b3e2f3bf7"} Dec 02 23:03:40 crc kubenswrapper[4696]: I1202 23:03:40.216410 4696 scope.go:117] "RemoveContainer" containerID="84eb3da8d9726989cf23638271ecf20b06b0c5ee0ddc8acf16c0cc25deccfa4b" Dec 02 23:03:40 crc kubenswrapper[4696]: I1202 23:03:40.231299 4696 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 02 23:03:40 crc kubenswrapper[4696]: I1202 23:03:40.247293 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Dec 02 23:03:40 crc kubenswrapper[4696]: I1202 23:03:40.258497 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 02 23:03:40 crc kubenswrapper[4696]: E1202 23:03:40.259156 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a20e4421-fec8-4f5a-8699-9d17d911f14c" containerName="cinder-api-log" Dec 02 23:03:40 crc kubenswrapper[4696]: I1202 23:03:40.259181 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="a20e4421-fec8-4f5a-8699-9d17d911f14c" containerName="cinder-api-log" Dec 02 23:03:40 crc kubenswrapper[4696]: E1202 23:03:40.259223 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a20e4421-fec8-4f5a-8699-9d17d911f14c" containerName="cinder-api" Dec 02 23:03:40 crc kubenswrapper[4696]: I1202 23:03:40.259233 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="a20e4421-fec8-4f5a-8699-9d17d911f14c" containerName="cinder-api" Dec 02 23:03:40 crc kubenswrapper[4696]: I1202 23:03:40.259506 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="a20e4421-fec8-4f5a-8699-9d17d911f14c" containerName="cinder-api-log" Dec 02 23:03:40 crc kubenswrapper[4696]: I1202 23:03:40.259537 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="a20e4421-fec8-4f5a-8699-9d17d911f14c" containerName="cinder-api" Dec 02 23:03:40 crc kubenswrapper[4696]: I1202 23:03:40.261140 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 02 23:03:40 crc kubenswrapper[4696]: I1202 23:03:40.263868 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Dec 02 23:03:40 crc kubenswrapper[4696]: I1202 23:03:40.264092 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Dec 02 23:03:40 crc kubenswrapper[4696]: I1202 23:03:40.264125 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 02 23:03:40 crc kubenswrapper[4696]: I1202 23:03:40.275140 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 02 23:03:40 crc kubenswrapper[4696]: I1202 23:03:40.343276 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0a0bd09-55c1-4eb0-bed1-76a920e67875-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"e0a0bd09-55c1-4eb0-bed1-76a920e67875\") " pod="openstack/cinder-api-0" Dec 02 23:03:40 crc kubenswrapper[4696]: I1202 23:03:40.343361 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e0a0bd09-55c1-4eb0-bed1-76a920e67875-config-data-custom\") pod \"cinder-api-0\" (UID: \"e0a0bd09-55c1-4eb0-bed1-76a920e67875\") " pod="openstack/cinder-api-0" Dec 02 23:03:40 crc kubenswrapper[4696]: I1202 23:03:40.343390 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kz5gk\" (UniqueName: \"kubernetes.io/projected/e0a0bd09-55c1-4eb0-bed1-76a920e67875-kube-api-access-kz5gk\") pod \"cinder-api-0\" (UID: \"e0a0bd09-55c1-4eb0-bed1-76a920e67875\") " pod="openstack/cinder-api-0" Dec 02 23:03:40 crc kubenswrapper[4696]: I1202 23:03:40.343418 4696 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0a0bd09-55c1-4eb0-bed1-76a920e67875-public-tls-certs\") pod \"cinder-api-0\" (UID: \"e0a0bd09-55c1-4eb0-bed1-76a920e67875\") " pod="openstack/cinder-api-0" Dec 02 23:03:40 crc kubenswrapper[4696]: I1202 23:03:40.343442 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0a0bd09-55c1-4eb0-bed1-76a920e67875-scripts\") pod \"cinder-api-0\" (UID: \"e0a0bd09-55c1-4eb0-bed1-76a920e67875\") " pod="openstack/cinder-api-0" Dec 02 23:03:40 crc kubenswrapper[4696]: I1202 23:03:40.343465 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0a0bd09-55c1-4eb0-bed1-76a920e67875-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"e0a0bd09-55c1-4eb0-bed1-76a920e67875\") " pod="openstack/cinder-api-0" Dec 02 23:03:40 crc kubenswrapper[4696]: I1202 23:03:40.343488 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e0a0bd09-55c1-4eb0-bed1-76a920e67875-etc-machine-id\") pod \"cinder-api-0\" (UID: \"e0a0bd09-55c1-4eb0-bed1-76a920e67875\") " pod="openstack/cinder-api-0" Dec 02 23:03:40 crc kubenswrapper[4696]: I1202 23:03:40.343541 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0a0bd09-55c1-4eb0-bed1-76a920e67875-logs\") pod \"cinder-api-0\" (UID: \"e0a0bd09-55c1-4eb0-bed1-76a920e67875\") " pod="openstack/cinder-api-0" Dec 02 23:03:40 crc kubenswrapper[4696]: I1202 23:03:40.343582 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/e0a0bd09-55c1-4eb0-bed1-76a920e67875-config-data\") pod \"cinder-api-0\" (UID: \"e0a0bd09-55c1-4eb0-bed1-76a920e67875\") " pod="openstack/cinder-api-0" Dec 02 23:03:40 crc kubenswrapper[4696]: I1202 23:03:40.446269 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0a0bd09-55c1-4eb0-bed1-76a920e67875-logs\") pod \"cinder-api-0\" (UID: \"e0a0bd09-55c1-4eb0-bed1-76a920e67875\") " pod="openstack/cinder-api-0" Dec 02 23:03:40 crc kubenswrapper[4696]: I1202 23:03:40.446348 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0a0bd09-55c1-4eb0-bed1-76a920e67875-config-data\") pod \"cinder-api-0\" (UID: \"e0a0bd09-55c1-4eb0-bed1-76a920e67875\") " pod="openstack/cinder-api-0" Dec 02 23:03:40 crc kubenswrapper[4696]: I1202 23:03:40.446451 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0a0bd09-55c1-4eb0-bed1-76a920e67875-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"e0a0bd09-55c1-4eb0-bed1-76a920e67875\") " pod="openstack/cinder-api-0" Dec 02 23:03:40 crc kubenswrapper[4696]: I1202 23:03:40.446492 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e0a0bd09-55c1-4eb0-bed1-76a920e67875-config-data-custom\") pod \"cinder-api-0\" (UID: \"e0a0bd09-55c1-4eb0-bed1-76a920e67875\") " pod="openstack/cinder-api-0" Dec 02 23:03:40 crc kubenswrapper[4696]: I1202 23:03:40.446515 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kz5gk\" (UniqueName: \"kubernetes.io/projected/e0a0bd09-55c1-4eb0-bed1-76a920e67875-kube-api-access-kz5gk\") pod \"cinder-api-0\" (UID: \"e0a0bd09-55c1-4eb0-bed1-76a920e67875\") " pod="openstack/cinder-api-0" Dec 02 23:03:40 crc kubenswrapper[4696]: 
I1202 23:03:40.446536 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0a0bd09-55c1-4eb0-bed1-76a920e67875-public-tls-certs\") pod \"cinder-api-0\" (UID: \"e0a0bd09-55c1-4eb0-bed1-76a920e67875\") " pod="openstack/cinder-api-0" Dec 02 23:03:40 crc kubenswrapper[4696]: I1202 23:03:40.446557 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0a0bd09-55c1-4eb0-bed1-76a920e67875-scripts\") pod \"cinder-api-0\" (UID: \"e0a0bd09-55c1-4eb0-bed1-76a920e67875\") " pod="openstack/cinder-api-0" Dec 02 23:03:40 crc kubenswrapper[4696]: I1202 23:03:40.446585 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0a0bd09-55c1-4eb0-bed1-76a920e67875-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"e0a0bd09-55c1-4eb0-bed1-76a920e67875\") " pod="openstack/cinder-api-0" Dec 02 23:03:40 crc kubenswrapper[4696]: I1202 23:03:40.446613 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e0a0bd09-55c1-4eb0-bed1-76a920e67875-etc-machine-id\") pod \"cinder-api-0\" (UID: \"e0a0bd09-55c1-4eb0-bed1-76a920e67875\") " pod="openstack/cinder-api-0" Dec 02 23:03:40 crc kubenswrapper[4696]: I1202 23:03:40.446716 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e0a0bd09-55c1-4eb0-bed1-76a920e67875-etc-machine-id\") pod \"cinder-api-0\" (UID: \"e0a0bd09-55c1-4eb0-bed1-76a920e67875\") " pod="openstack/cinder-api-0" Dec 02 23:03:40 crc kubenswrapper[4696]: I1202 23:03:40.446790 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0a0bd09-55c1-4eb0-bed1-76a920e67875-logs\") pod \"cinder-api-0\" (UID: 
\"e0a0bd09-55c1-4eb0-bed1-76a920e67875\") " pod="openstack/cinder-api-0" Dec 02 23:03:40 crc kubenswrapper[4696]: I1202 23:03:40.453422 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e0a0bd09-55c1-4eb0-bed1-76a920e67875-config-data-custom\") pod \"cinder-api-0\" (UID: \"e0a0bd09-55c1-4eb0-bed1-76a920e67875\") " pod="openstack/cinder-api-0" Dec 02 23:03:40 crc kubenswrapper[4696]: I1202 23:03:40.454519 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0a0bd09-55c1-4eb0-bed1-76a920e67875-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"e0a0bd09-55c1-4eb0-bed1-76a920e67875\") " pod="openstack/cinder-api-0" Dec 02 23:03:40 crc kubenswrapper[4696]: I1202 23:03:40.454558 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0a0bd09-55c1-4eb0-bed1-76a920e67875-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"e0a0bd09-55c1-4eb0-bed1-76a920e67875\") " pod="openstack/cinder-api-0" Dec 02 23:03:40 crc kubenswrapper[4696]: I1202 23:03:40.455171 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0a0bd09-55c1-4eb0-bed1-76a920e67875-public-tls-certs\") pod \"cinder-api-0\" (UID: \"e0a0bd09-55c1-4eb0-bed1-76a920e67875\") " pod="openstack/cinder-api-0" Dec 02 23:03:40 crc kubenswrapper[4696]: I1202 23:03:40.457350 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0a0bd09-55c1-4eb0-bed1-76a920e67875-scripts\") pod \"cinder-api-0\" (UID: \"e0a0bd09-55c1-4eb0-bed1-76a920e67875\") " pod="openstack/cinder-api-0" Dec 02 23:03:40 crc kubenswrapper[4696]: I1202 23:03:40.463472 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/e0a0bd09-55c1-4eb0-bed1-76a920e67875-config-data\") pod \"cinder-api-0\" (UID: \"e0a0bd09-55c1-4eb0-bed1-76a920e67875\") " pod="openstack/cinder-api-0" Dec 02 23:03:40 crc kubenswrapper[4696]: I1202 23:03:40.476072 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kz5gk\" (UniqueName: \"kubernetes.io/projected/e0a0bd09-55c1-4eb0-bed1-76a920e67875-kube-api-access-kz5gk\") pod \"cinder-api-0\" (UID: \"e0a0bd09-55c1-4eb0-bed1-76a920e67875\") " pod="openstack/cinder-api-0" Dec 02 23:03:40 crc kubenswrapper[4696]: I1202 23:03:40.582987 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 02 23:03:41 crc kubenswrapper[4696]: I1202 23:03:41.204703 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0c28b24c-165c-4d9e-acb5-9d5e88869fe8" containerName="ceilometer-central-agent" containerID="cri-o://498c09b6eb57209981cd274d5e0d964235b22cc2b4a97cf141d3d2a1832038cc" gracePeriod=30 Dec 02 23:03:41 crc kubenswrapper[4696]: I1202 23:03:41.204757 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0c28b24c-165c-4d9e-acb5-9d5e88869fe8" containerName="sg-core" containerID="cri-o://bc829eaaa0830e36bc6ae58b0815fe68bb32d26e81aa41d2189c3a2711035bd1" gracePeriod=30 Dec 02 23:03:41 crc kubenswrapper[4696]: I1202 23:03:41.204773 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0c28b24c-165c-4d9e-acb5-9d5e88869fe8" containerName="proxy-httpd" containerID="cri-o://ed1b2a1c5a20ef96446e5c56509e5091051ad82432d30d848a9768f410c62c6a" gracePeriod=30 Dec 02 23:03:41 crc kubenswrapper[4696]: I1202 23:03:41.204832 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0c28b24c-165c-4d9e-acb5-9d5e88869fe8" 
containerName="ceilometer-notification-agent" containerID="cri-o://0856d99b5c6a1150dbc1681b01519ce7bead8b6cee8339f0096b15e25c7ec3ad" gracePeriod=30 Dec 02 23:03:41 crc kubenswrapper[4696]: I1202 23:03:41.351895 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 02 23:03:41 crc kubenswrapper[4696]: W1202 23:03:41.362675 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode0a0bd09_55c1_4eb0_bed1_76a920e67875.slice/crio-bf13cb1be4374ecd81574a20a417efbfaae0b86c3db62f54b35abcf682106fb5 WatchSource:0}: Error finding container bf13cb1be4374ecd81574a20a417efbfaae0b86c3db62f54b35abcf682106fb5: Status 404 returned error can't find the container with id bf13cb1be4374ecd81574a20a417efbfaae0b86c3db62f54b35abcf682106fb5 Dec 02 23:03:41 crc kubenswrapper[4696]: I1202 23:03:41.452449 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a20e4421-fec8-4f5a-8699-9d17d911f14c" path="/var/lib/kubelet/pods/a20e4421-fec8-4f5a-8699-9d17d911f14c/volumes" Dec 02 23:03:41 crc kubenswrapper[4696]: I1202 23:03:41.760070 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7b448778f6-q69jq" Dec 02 23:03:41 crc kubenswrapper[4696]: I1202 23:03:41.787214 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8qrhg\" (UniqueName: \"kubernetes.io/projected/7b25bdab-8c46-43b8-be48-0e3df0f48c57-kube-api-access-8qrhg\") pod \"7b25bdab-8c46-43b8-be48-0e3df0f48c57\" (UID: \"7b25bdab-8c46-43b8-be48-0e3df0f48c57\") " Dec 02 23:03:41 crc kubenswrapper[4696]: I1202 23:03:41.787315 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7b25bdab-8c46-43b8-be48-0e3df0f48c57-scripts\") pod \"7b25bdab-8c46-43b8-be48-0e3df0f48c57\" (UID: \"7b25bdab-8c46-43b8-be48-0e3df0f48c57\") " Dec 02 23:03:41 crc kubenswrapper[4696]: I1202 23:03:41.787444 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b25bdab-8c46-43b8-be48-0e3df0f48c57-horizon-tls-certs\") pod \"7b25bdab-8c46-43b8-be48-0e3df0f48c57\" (UID: \"7b25bdab-8c46-43b8-be48-0e3df0f48c57\") " Dec 02 23:03:41 crc kubenswrapper[4696]: I1202 23:03:41.788850 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b25bdab-8c46-43b8-be48-0e3df0f48c57-combined-ca-bundle\") pod \"7b25bdab-8c46-43b8-be48-0e3df0f48c57\" (UID: \"7b25bdab-8c46-43b8-be48-0e3df0f48c57\") " Dec 02 23:03:41 crc kubenswrapper[4696]: I1202 23:03:41.788896 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7b25bdab-8c46-43b8-be48-0e3df0f48c57-config-data\") pod \"7b25bdab-8c46-43b8-be48-0e3df0f48c57\" (UID: \"7b25bdab-8c46-43b8-be48-0e3df0f48c57\") " Dec 02 23:03:41 crc kubenswrapper[4696]: I1202 23:03:41.789023 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/7b25bdab-8c46-43b8-be48-0e3df0f48c57-logs\") pod \"7b25bdab-8c46-43b8-be48-0e3df0f48c57\" (UID: \"7b25bdab-8c46-43b8-be48-0e3df0f48c57\") " Dec 02 23:03:41 crc kubenswrapper[4696]: I1202 23:03:41.789192 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/7b25bdab-8c46-43b8-be48-0e3df0f48c57-horizon-secret-key\") pod \"7b25bdab-8c46-43b8-be48-0e3df0f48c57\" (UID: \"7b25bdab-8c46-43b8-be48-0e3df0f48c57\") " Dec 02 23:03:41 crc kubenswrapper[4696]: I1202 23:03:41.789734 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b25bdab-8c46-43b8-be48-0e3df0f48c57-logs" (OuterVolumeSpecName: "logs") pod "7b25bdab-8c46-43b8-be48-0e3df0f48c57" (UID: "7b25bdab-8c46-43b8-be48-0e3df0f48c57"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:03:41 crc kubenswrapper[4696]: I1202 23:03:41.790522 4696 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7b25bdab-8c46-43b8-be48-0e3df0f48c57-logs\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:41 crc kubenswrapper[4696]: I1202 23:03:41.795819 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b25bdab-8c46-43b8-be48-0e3df0f48c57-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "7b25bdab-8c46-43b8-be48-0e3df0f48c57" (UID: "7b25bdab-8c46-43b8-be48-0e3df0f48c57"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:03:41 crc kubenswrapper[4696]: I1202 23:03:41.796472 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b25bdab-8c46-43b8-be48-0e3df0f48c57-kube-api-access-8qrhg" (OuterVolumeSpecName: "kube-api-access-8qrhg") pod "7b25bdab-8c46-43b8-be48-0e3df0f48c57" (UID: "7b25bdab-8c46-43b8-be48-0e3df0f48c57"). InnerVolumeSpecName "kube-api-access-8qrhg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:03:41 crc kubenswrapper[4696]: I1202 23:03:41.826998 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b25bdab-8c46-43b8-be48-0e3df0f48c57-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7b25bdab-8c46-43b8-be48-0e3df0f48c57" (UID: "7b25bdab-8c46-43b8-be48-0e3df0f48c57"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:03:41 crc kubenswrapper[4696]: I1202 23:03:41.829364 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b25bdab-8c46-43b8-be48-0e3df0f48c57-scripts" (OuterVolumeSpecName: "scripts") pod "7b25bdab-8c46-43b8-be48-0e3df0f48c57" (UID: "7b25bdab-8c46-43b8-be48-0e3df0f48c57"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:03:41 crc kubenswrapper[4696]: I1202 23:03:41.846018 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b25bdab-8c46-43b8-be48-0e3df0f48c57-config-data" (OuterVolumeSpecName: "config-data") pod "7b25bdab-8c46-43b8-be48-0e3df0f48c57" (UID: "7b25bdab-8c46-43b8-be48-0e3df0f48c57"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:03:41 crc kubenswrapper[4696]: I1202 23:03:41.892402 4696 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/7b25bdab-8c46-43b8-be48-0e3df0f48c57-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:41 crc kubenswrapper[4696]: I1202 23:03:41.892439 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8qrhg\" (UniqueName: \"kubernetes.io/projected/7b25bdab-8c46-43b8-be48-0e3df0f48c57-kube-api-access-8qrhg\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:41 crc kubenswrapper[4696]: I1202 23:03:41.892452 4696 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7b25bdab-8c46-43b8-be48-0e3df0f48c57-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:41 crc kubenswrapper[4696]: I1202 23:03:41.892461 4696 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b25bdab-8c46-43b8-be48-0e3df0f48c57-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:41 crc kubenswrapper[4696]: I1202 23:03:41.892473 4696 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7b25bdab-8c46-43b8-be48-0e3df0f48c57-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:41 crc kubenswrapper[4696]: I1202 23:03:41.894847 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b25bdab-8c46-43b8-be48-0e3df0f48c57-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "7b25bdab-8c46-43b8-be48-0e3df0f48c57" (UID: "7b25bdab-8c46-43b8-be48-0e3df0f48c57"). InnerVolumeSpecName "horizon-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:03:41 crc kubenswrapper[4696]: I1202 23:03:41.994407 4696 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b25bdab-8c46-43b8-be48-0e3df0f48c57-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:42 crc kubenswrapper[4696]: I1202 23:03:42.241397 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7b448778f6-q69jq" event={"ID":"7b25bdab-8c46-43b8-be48-0e3df0f48c57","Type":"ContainerDied","Data":"b31e42b5e1860bbba4d28ace4da0d4572089dba8db721bab7b4bedbc5618e9b7"} Dec 02 23:03:42 crc kubenswrapper[4696]: I1202 23:03:42.241450 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7b448778f6-q69jq" Dec 02 23:03:42 crc kubenswrapper[4696]: I1202 23:03:42.241983 4696 scope.go:117] "RemoveContainer" containerID="d15abc85ab3659d4083f5707c2acb2b3368eb8c46e42748f7c5227e701ce835e" Dec 02 23:03:42 crc kubenswrapper[4696]: I1202 23:03:42.255135 4696 generic.go:334] "Generic (PLEG): container finished" podID="0c28b24c-165c-4d9e-acb5-9d5e88869fe8" containerID="ed1b2a1c5a20ef96446e5c56509e5091051ad82432d30d848a9768f410c62c6a" exitCode=0 Dec 02 23:03:42 crc kubenswrapper[4696]: I1202 23:03:42.255176 4696 generic.go:334] "Generic (PLEG): container finished" podID="0c28b24c-165c-4d9e-acb5-9d5e88869fe8" containerID="bc829eaaa0830e36bc6ae58b0815fe68bb32d26e81aa41d2189c3a2711035bd1" exitCode=2 Dec 02 23:03:42 crc kubenswrapper[4696]: I1202 23:03:42.255188 4696 generic.go:334] "Generic (PLEG): container finished" podID="0c28b24c-165c-4d9e-acb5-9d5e88869fe8" containerID="0856d99b5c6a1150dbc1681b01519ce7bead8b6cee8339f0096b15e25c7ec3ad" exitCode=0 Dec 02 23:03:42 crc kubenswrapper[4696]: I1202 23:03:42.255269 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"0c28b24c-165c-4d9e-acb5-9d5e88869fe8","Type":"ContainerDied","Data":"ed1b2a1c5a20ef96446e5c56509e5091051ad82432d30d848a9768f410c62c6a"} Dec 02 23:03:42 crc kubenswrapper[4696]: I1202 23:03:42.255341 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0c28b24c-165c-4d9e-acb5-9d5e88869fe8","Type":"ContainerDied","Data":"bc829eaaa0830e36bc6ae58b0815fe68bb32d26e81aa41d2189c3a2711035bd1"} Dec 02 23:03:42 crc kubenswrapper[4696]: I1202 23:03:42.255352 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0c28b24c-165c-4d9e-acb5-9d5e88869fe8","Type":"ContainerDied","Data":"0856d99b5c6a1150dbc1681b01519ce7bead8b6cee8339f0096b15e25c7ec3ad"} Dec 02 23:03:42 crc kubenswrapper[4696]: I1202 23:03:42.261833 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e0a0bd09-55c1-4eb0-bed1-76a920e67875","Type":"ContainerStarted","Data":"66c5ff3997eb0b8d16931af27ee1584ae03a3f60262b071767248ee7e097c065"} Dec 02 23:03:42 crc kubenswrapper[4696]: I1202 23:03:42.261905 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e0a0bd09-55c1-4eb0-bed1-76a920e67875","Type":"ContainerStarted","Data":"bf13cb1be4374ecd81574a20a417efbfaae0b86c3db62f54b35abcf682106fb5"} Dec 02 23:03:42 crc kubenswrapper[4696]: I1202 23:03:42.298087 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7b448778f6-q69jq"] Dec 02 23:03:42 crc kubenswrapper[4696]: I1202 23:03:42.311566 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7b448778f6-q69jq"] Dec 02 23:03:42 crc kubenswrapper[4696]: I1202 23:03:42.560499 4696 scope.go:117] "RemoveContainer" containerID="db7c41afc2c4141b03c10301aee7cd5a2b37a58c34385a505278ac8b3e2f3bf7" Dec 02 23:03:42 crc kubenswrapper[4696]: I1202 23:03:42.712902 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 23:03:42 crc kubenswrapper[4696]: I1202 23:03:42.821698 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0c28b24c-165c-4d9e-acb5-9d5e88869fe8-log-httpd\") pod \"0c28b24c-165c-4d9e-acb5-9d5e88869fe8\" (UID: \"0c28b24c-165c-4d9e-acb5-9d5e88869fe8\") " Dec 02 23:03:42 crc kubenswrapper[4696]: I1202 23:03:42.821805 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0c28b24c-165c-4d9e-acb5-9d5e88869fe8-run-httpd\") pod \"0c28b24c-165c-4d9e-acb5-9d5e88869fe8\" (UID: \"0c28b24c-165c-4d9e-acb5-9d5e88869fe8\") " Dec 02 23:03:42 crc kubenswrapper[4696]: I1202 23:03:42.821868 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0c28b24c-165c-4d9e-acb5-9d5e88869fe8-sg-core-conf-yaml\") pod \"0c28b24c-165c-4d9e-acb5-9d5e88869fe8\" (UID: \"0c28b24c-165c-4d9e-acb5-9d5e88869fe8\") " Dec 02 23:03:42 crc kubenswrapper[4696]: I1202 23:03:42.821909 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c28b24c-165c-4d9e-acb5-9d5e88869fe8-combined-ca-bundle\") pod \"0c28b24c-165c-4d9e-acb5-9d5e88869fe8\" (UID: \"0c28b24c-165c-4d9e-acb5-9d5e88869fe8\") " Dec 02 23:03:42 crc kubenswrapper[4696]: I1202 23:03:42.821948 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c28b24c-165c-4d9e-acb5-9d5e88869fe8-config-data\") pod \"0c28b24c-165c-4d9e-acb5-9d5e88869fe8\" (UID: \"0c28b24c-165c-4d9e-acb5-9d5e88869fe8\") " Dec 02 23:03:42 crc kubenswrapper[4696]: I1202 23:03:42.821963 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vqr2k\" (UniqueName: 
\"kubernetes.io/projected/0c28b24c-165c-4d9e-acb5-9d5e88869fe8-kube-api-access-vqr2k\") pod \"0c28b24c-165c-4d9e-acb5-9d5e88869fe8\" (UID: \"0c28b24c-165c-4d9e-acb5-9d5e88869fe8\") " Dec 02 23:03:42 crc kubenswrapper[4696]: I1202 23:03:42.822060 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c28b24c-165c-4d9e-acb5-9d5e88869fe8-scripts\") pod \"0c28b24c-165c-4d9e-acb5-9d5e88869fe8\" (UID: \"0c28b24c-165c-4d9e-acb5-9d5e88869fe8\") " Dec 02 23:03:42 crc kubenswrapper[4696]: I1202 23:03:42.824563 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c28b24c-165c-4d9e-acb5-9d5e88869fe8-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "0c28b24c-165c-4d9e-acb5-9d5e88869fe8" (UID: "0c28b24c-165c-4d9e-acb5-9d5e88869fe8"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:03:42 crc kubenswrapper[4696]: I1202 23:03:42.825042 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c28b24c-165c-4d9e-acb5-9d5e88869fe8-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "0c28b24c-165c-4d9e-acb5-9d5e88869fe8" (UID: "0c28b24c-165c-4d9e-acb5-9d5e88869fe8"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:03:42 crc kubenswrapper[4696]: I1202 23:03:42.834540 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c28b24c-165c-4d9e-acb5-9d5e88869fe8-kube-api-access-vqr2k" (OuterVolumeSpecName: "kube-api-access-vqr2k") pod "0c28b24c-165c-4d9e-acb5-9d5e88869fe8" (UID: "0c28b24c-165c-4d9e-acb5-9d5e88869fe8"). InnerVolumeSpecName "kube-api-access-vqr2k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:03:42 crc kubenswrapper[4696]: I1202 23:03:42.856853 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c28b24c-165c-4d9e-acb5-9d5e88869fe8-scripts" (OuterVolumeSpecName: "scripts") pod "0c28b24c-165c-4d9e-acb5-9d5e88869fe8" (UID: "0c28b24c-165c-4d9e-acb5-9d5e88869fe8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:03:42 crc kubenswrapper[4696]: I1202 23:03:42.877981 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c28b24c-165c-4d9e-acb5-9d5e88869fe8-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "0c28b24c-165c-4d9e-acb5-9d5e88869fe8" (UID: "0c28b24c-165c-4d9e-acb5-9d5e88869fe8"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:03:42 crc kubenswrapper[4696]: I1202 23:03:42.926078 4696 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0c28b24c-165c-4d9e-acb5-9d5e88869fe8-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:42 crc kubenswrapper[4696]: I1202 23:03:42.926663 4696 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0c28b24c-165c-4d9e-acb5-9d5e88869fe8-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:42 crc kubenswrapper[4696]: I1202 23:03:42.926733 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vqr2k\" (UniqueName: \"kubernetes.io/projected/0c28b24c-165c-4d9e-acb5-9d5e88869fe8-kube-api-access-vqr2k\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:42 crc kubenswrapper[4696]: I1202 23:03:42.926836 4696 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c28b24c-165c-4d9e-acb5-9d5e88869fe8-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:42 crc 
kubenswrapper[4696]: I1202 23:03:42.926907 4696 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0c28b24c-165c-4d9e-acb5-9d5e88869fe8-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:42 crc kubenswrapper[4696]: I1202 23:03:42.971837 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c28b24c-165c-4d9e-acb5-9d5e88869fe8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0c28b24c-165c-4d9e-acb5-9d5e88869fe8" (UID: "0c28b24c-165c-4d9e-acb5-9d5e88869fe8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:03:42 crc kubenswrapper[4696]: I1202 23:03:42.980120 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c28b24c-165c-4d9e-acb5-9d5e88869fe8-config-data" (OuterVolumeSpecName: "config-data") pod "0c28b24c-165c-4d9e-acb5-9d5e88869fe8" (UID: "0c28b24c-165c-4d9e-acb5-9d5e88869fe8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:03:43 crc kubenswrapper[4696]: I1202 23:03:43.028611 4696 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c28b24c-165c-4d9e-acb5-9d5e88869fe8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:43 crc kubenswrapper[4696]: I1202 23:03:43.028654 4696 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c28b24c-165c-4d9e-acb5-9d5e88869fe8-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:43 crc kubenswrapper[4696]: I1202 23:03:43.275639 4696 generic.go:334] "Generic (PLEG): container finished" podID="0c28b24c-165c-4d9e-acb5-9d5e88869fe8" containerID="498c09b6eb57209981cd274d5e0d964235b22cc2b4a97cf141d3d2a1832038cc" exitCode=0 Dec 02 23:03:43 crc kubenswrapper[4696]: I1202 23:03:43.275726 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0c28b24c-165c-4d9e-acb5-9d5e88869fe8","Type":"ContainerDied","Data":"498c09b6eb57209981cd274d5e0d964235b22cc2b4a97cf141d3d2a1832038cc"} Dec 02 23:03:43 crc kubenswrapper[4696]: I1202 23:03:43.275788 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0c28b24c-165c-4d9e-acb5-9d5e88869fe8","Type":"ContainerDied","Data":"0ec9de4819fbab9b6604d25faefbfdf70c4aa9f08d089d6596574c0de7a6d18e"} Dec 02 23:03:43 crc kubenswrapper[4696]: I1202 23:03:43.275812 4696 scope.go:117] "RemoveContainer" containerID="ed1b2a1c5a20ef96446e5c56509e5091051ad82432d30d848a9768f410c62c6a" Dec 02 23:03:43 crc kubenswrapper[4696]: I1202 23:03:43.275945 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 23:03:43 crc kubenswrapper[4696]: I1202 23:03:43.278719 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e0a0bd09-55c1-4eb0-bed1-76a920e67875","Type":"ContainerStarted","Data":"5e507d1ae80c7a68d5d7f2a722d64b2c0f85a80386cd7e49bdb6e6301e4b9d74"} Dec 02 23:03:43 crc kubenswrapper[4696]: I1202 23:03:43.279072 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 02 23:03:43 crc kubenswrapper[4696]: I1202 23:03:43.309855 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.309828752 podStartE2EDuration="3.309828752s" podCreationTimestamp="2025-12-02 23:03:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:03:43.301930257 +0000 UTC m=+1286.182610278" watchObservedRunningTime="2025-12-02 23:03:43.309828752 +0000 UTC m=+1286.190508753" Dec 02 23:03:43 crc kubenswrapper[4696]: I1202 23:03:43.316562 4696 scope.go:117] "RemoveContainer" containerID="bc829eaaa0830e36bc6ae58b0815fe68bb32d26e81aa41d2189c3a2711035bd1" Dec 02 23:03:43 crc kubenswrapper[4696]: I1202 23:03:43.330346 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 23:03:43 crc kubenswrapper[4696]: I1202 23:03:43.348903 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 02 23:03:43 crc kubenswrapper[4696]: I1202 23:03:43.364300 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 02 23:03:43 crc kubenswrapper[4696]: E1202 23:03:43.365091 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c28b24c-165c-4d9e-acb5-9d5e88869fe8" containerName="ceilometer-notification-agent" Dec 02 23:03:43 crc kubenswrapper[4696]: I1202 23:03:43.365123 4696 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="0c28b24c-165c-4d9e-acb5-9d5e88869fe8" containerName="ceilometer-notification-agent" Dec 02 23:03:43 crc kubenswrapper[4696]: E1202 23:03:43.365164 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c28b24c-165c-4d9e-acb5-9d5e88869fe8" containerName="ceilometer-central-agent" Dec 02 23:03:43 crc kubenswrapper[4696]: I1202 23:03:43.365179 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c28b24c-165c-4d9e-acb5-9d5e88869fe8" containerName="ceilometer-central-agent" Dec 02 23:03:43 crc kubenswrapper[4696]: E1202 23:03:43.365190 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c28b24c-165c-4d9e-acb5-9d5e88869fe8" containerName="sg-core" Dec 02 23:03:43 crc kubenswrapper[4696]: I1202 23:03:43.365199 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c28b24c-165c-4d9e-acb5-9d5e88869fe8" containerName="sg-core" Dec 02 23:03:43 crc kubenswrapper[4696]: E1202 23:03:43.365224 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b25bdab-8c46-43b8-be48-0e3df0f48c57" containerName="horizon" Dec 02 23:03:43 crc kubenswrapper[4696]: I1202 23:03:43.365232 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b25bdab-8c46-43b8-be48-0e3df0f48c57" containerName="horizon" Dec 02 23:03:43 crc kubenswrapper[4696]: E1202 23:03:43.365251 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b25bdab-8c46-43b8-be48-0e3df0f48c57" containerName="horizon-log" Dec 02 23:03:43 crc kubenswrapper[4696]: I1202 23:03:43.365263 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b25bdab-8c46-43b8-be48-0e3df0f48c57" containerName="horizon-log" Dec 02 23:03:43 crc kubenswrapper[4696]: E1202 23:03:43.365292 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c28b24c-165c-4d9e-acb5-9d5e88869fe8" containerName="proxy-httpd" Dec 02 23:03:43 crc kubenswrapper[4696]: I1202 23:03:43.365300 4696 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="0c28b24c-165c-4d9e-acb5-9d5e88869fe8" containerName="proxy-httpd" Dec 02 23:03:43 crc kubenswrapper[4696]: I1202 23:03:43.365569 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b25bdab-8c46-43b8-be48-0e3df0f48c57" containerName="horizon-log" Dec 02 23:03:43 crc kubenswrapper[4696]: I1202 23:03:43.365591 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c28b24c-165c-4d9e-acb5-9d5e88869fe8" containerName="proxy-httpd" Dec 02 23:03:43 crc kubenswrapper[4696]: I1202 23:03:43.365605 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c28b24c-165c-4d9e-acb5-9d5e88869fe8" containerName="sg-core" Dec 02 23:03:43 crc kubenswrapper[4696]: I1202 23:03:43.365623 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c28b24c-165c-4d9e-acb5-9d5e88869fe8" containerName="ceilometer-central-agent" Dec 02 23:03:43 crc kubenswrapper[4696]: I1202 23:03:43.365639 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b25bdab-8c46-43b8-be48-0e3df0f48c57" containerName="horizon" Dec 02 23:03:43 crc kubenswrapper[4696]: I1202 23:03:43.365656 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c28b24c-165c-4d9e-acb5-9d5e88869fe8" containerName="ceilometer-notification-agent" Dec 02 23:03:43 crc kubenswrapper[4696]: I1202 23:03:43.368678 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 23:03:43 crc kubenswrapper[4696]: I1202 23:03:43.374774 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 23:03:43 crc kubenswrapper[4696]: I1202 23:03:43.374946 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 02 23:03:43 crc kubenswrapper[4696]: I1202 23:03:43.377066 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 02 23:03:43 crc kubenswrapper[4696]: I1202 23:03:43.399298 4696 scope.go:117] "RemoveContainer" containerID="0856d99b5c6a1150dbc1681b01519ce7bead8b6cee8339f0096b15e25c7ec3ad" Dec 02 23:03:43 crc kubenswrapper[4696]: I1202 23:03:43.433816 4696 scope.go:117] "RemoveContainer" containerID="498c09b6eb57209981cd274d5e0d964235b22cc2b4a97cf141d3d2a1832038cc" Dec 02 23:03:43 crc kubenswrapper[4696]: I1202 23:03:43.436314 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e90434a-fd8d-4ee1-8d14-72e82088a882-config-data\") pod \"ceilometer-0\" (UID: \"6e90434a-fd8d-4ee1-8d14-72e82088a882\") " pod="openstack/ceilometer-0" Dec 02 23:03:43 crc kubenswrapper[4696]: I1202 23:03:43.436383 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e90434a-fd8d-4ee1-8d14-72e82088a882-run-httpd\") pod \"ceilometer-0\" (UID: \"6e90434a-fd8d-4ee1-8d14-72e82088a882\") " pod="openstack/ceilometer-0" Dec 02 23:03:43 crc kubenswrapper[4696]: I1202 23:03:43.436531 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tr8dc\" (UniqueName: \"kubernetes.io/projected/6e90434a-fd8d-4ee1-8d14-72e82088a882-kube-api-access-tr8dc\") pod \"ceilometer-0\" (UID: \"6e90434a-fd8d-4ee1-8d14-72e82088a882\") " 
pod="openstack/ceilometer-0" Dec 02 23:03:43 crc kubenswrapper[4696]: I1202 23:03:43.436771 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e90434a-fd8d-4ee1-8d14-72e82088a882-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6e90434a-fd8d-4ee1-8d14-72e82088a882\") " pod="openstack/ceilometer-0" Dec 02 23:03:43 crc kubenswrapper[4696]: I1202 23:03:43.436828 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e90434a-fd8d-4ee1-8d14-72e82088a882-log-httpd\") pod \"ceilometer-0\" (UID: \"6e90434a-fd8d-4ee1-8d14-72e82088a882\") " pod="openstack/ceilometer-0" Dec 02 23:03:43 crc kubenswrapper[4696]: I1202 23:03:43.437022 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6e90434a-fd8d-4ee1-8d14-72e82088a882-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6e90434a-fd8d-4ee1-8d14-72e82088a882\") " pod="openstack/ceilometer-0" Dec 02 23:03:43 crc kubenswrapper[4696]: I1202 23:03:43.437090 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e90434a-fd8d-4ee1-8d14-72e82088a882-scripts\") pod \"ceilometer-0\" (UID: \"6e90434a-fd8d-4ee1-8d14-72e82088a882\") " pod="openstack/ceilometer-0" Dec 02 23:03:43 crc kubenswrapper[4696]: I1202 23:03:43.445513 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c28b24c-165c-4d9e-acb5-9d5e88869fe8" path="/var/lib/kubelet/pods/0c28b24c-165c-4d9e-acb5-9d5e88869fe8/volumes" Dec 02 23:03:43 crc kubenswrapper[4696]: I1202 23:03:43.446990 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b25bdab-8c46-43b8-be48-0e3df0f48c57" 
path="/var/lib/kubelet/pods/7b25bdab-8c46-43b8-be48-0e3df0f48c57/volumes" Dec 02 23:03:43 crc kubenswrapper[4696]: I1202 23:03:43.462561 4696 scope.go:117] "RemoveContainer" containerID="ed1b2a1c5a20ef96446e5c56509e5091051ad82432d30d848a9768f410c62c6a" Dec 02 23:03:43 crc kubenswrapper[4696]: E1202 23:03:43.463085 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed1b2a1c5a20ef96446e5c56509e5091051ad82432d30d848a9768f410c62c6a\": container with ID starting with ed1b2a1c5a20ef96446e5c56509e5091051ad82432d30d848a9768f410c62c6a not found: ID does not exist" containerID="ed1b2a1c5a20ef96446e5c56509e5091051ad82432d30d848a9768f410c62c6a" Dec 02 23:03:43 crc kubenswrapper[4696]: I1202 23:03:43.463135 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed1b2a1c5a20ef96446e5c56509e5091051ad82432d30d848a9768f410c62c6a"} err="failed to get container status \"ed1b2a1c5a20ef96446e5c56509e5091051ad82432d30d848a9768f410c62c6a\": rpc error: code = NotFound desc = could not find container \"ed1b2a1c5a20ef96446e5c56509e5091051ad82432d30d848a9768f410c62c6a\": container with ID starting with ed1b2a1c5a20ef96446e5c56509e5091051ad82432d30d848a9768f410c62c6a not found: ID does not exist" Dec 02 23:03:43 crc kubenswrapper[4696]: I1202 23:03:43.463180 4696 scope.go:117] "RemoveContainer" containerID="bc829eaaa0830e36bc6ae58b0815fe68bb32d26e81aa41d2189c3a2711035bd1" Dec 02 23:03:43 crc kubenswrapper[4696]: E1202 23:03:43.463620 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc829eaaa0830e36bc6ae58b0815fe68bb32d26e81aa41d2189c3a2711035bd1\": container with ID starting with bc829eaaa0830e36bc6ae58b0815fe68bb32d26e81aa41d2189c3a2711035bd1 not found: ID does not exist" containerID="bc829eaaa0830e36bc6ae58b0815fe68bb32d26e81aa41d2189c3a2711035bd1" Dec 02 23:03:43 crc kubenswrapper[4696]: 
I1202 23:03:43.463643 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc829eaaa0830e36bc6ae58b0815fe68bb32d26e81aa41d2189c3a2711035bd1"} err="failed to get container status \"bc829eaaa0830e36bc6ae58b0815fe68bb32d26e81aa41d2189c3a2711035bd1\": rpc error: code = NotFound desc = could not find container \"bc829eaaa0830e36bc6ae58b0815fe68bb32d26e81aa41d2189c3a2711035bd1\": container with ID starting with bc829eaaa0830e36bc6ae58b0815fe68bb32d26e81aa41d2189c3a2711035bd1 not found: ID does not exist" Dec 02 23:03:43 crc kubenswrapper[4696]: I1202 23:03:43.463656 4696 scope.go:117] "RemoveContainer" containerID="0856d99b5c6a1150dbc1681b01519ce7bead8b6cee8339f0096b15e25c7ec3ad" Dec 02 23:03:43 crc kubenswrapper[4696]: E1202 23:03:43.464136 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0856d99b5c6a1150dbc1681b01519ce7bead8b6cee8339f0096b15e25c7ec3ad\": container with ID starting with 0856d99b5c6a1150dbc1681b01519ce7bead8b6cee8339f0096b15e25c7ec3ad not found: ID does not exist" containerID="0856d99b5c6a1150dbc1681b01519ce7bead8b6cee8339f0096b15e25c7ec3ad" Dec 02 23:03:43 crc kubenswrapper[4696]: I1202 23:03:43.464180 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0856d99b5c6a1150dbc1681b01519ce7bead8b6cee8339f0096b15e25c7ec3ad"} err="failed to get container status \"0856d99b5c6a1150dbc1681b01519ce7bead8b6cee8339f0096b15e25c7ec3ad\": rpc error: code = NotFound desc = could not find container \"0856d99b5c6a1150dbc1681b01519ce7bead8b6cee8339f0096b15e25c7ec3ad\": container with ID starting with 0856d99b5c6a1150dbc1681b01519ce7bead8b6cee8339f0096b15e25c7ec3ad not found: ID does not exist" Dec 02 23:03:43 crc kubenswrapper[4696]: I1202 23:03:43.464197 4696 scope.go:117] "RemoveContainer" containerID="498c09b6eb57209981cd274d5e0d964235b22cc2b4a97cf141d3d2a1832038cc" Dec 02 23:03:43 crc 
kubenswrapper[4696]: E1202 23:03:43.464619 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"498c09b6eb57209981cd274d5e0d964235b22cc2b4a97cf141d3d2a1832038cc\": container with ID starting with 498c09b6eb57209981cd274d5e0d964235b22cc2b4a97cf141d3d2a1832038cc not found: ID does not exist" containerID="498c09b6eb57209981cd274d5e0d964235b22cc2b4a97cf141d3d2a1832038cc" Dec 02 23:03:43 crc kubenswrapper[4696]: I1202 23:03:43.464663 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"498c09b6eb57209981cd274d5e0d964235b22cc2b4a97cf141d3d2a1832038cc"} err="failed to get container status \"498c09b6eb57209981cd274d5e0d964235b22cc2b4a97cf141d3d2a1832038cc\": rpc error: code = NotFound desc = could not find container \"498c09b6eb57209981cd274d5e0d964235b22cc2b4a97cf141d3d2a1832038cc\": container with ID starting with 498c09b6eb57209981cd274d5e0d964235b22cc2b4a97cf141d3d2a1832038cc not found: ID does not exist" Dec 02 23:03:43 crc kubenswrapper[4696]: I1202 23:03:43.539927 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6e90434a-fd8d-4ee1-8d14-72e82088a882-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6e90434a-fd8d-4ee1-8d14-72e82088a882\") " pod="openstack/ceilometer-0" Dec 02 23:03:43 crc kubenswrapper[4696]: I1202 23:03:43.539985 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e90434a-fd8d-4ee1-8d14-72e82088a882-scripts\") pod \"ceilometer-0\" (UID: \"6e90434a-fd8d-4ee1-8d14-72e82088a882\") " pod="openstack/ceilometer-0" Dec 02 23:03:43 crc kubenswrapper[4696]: I1202 23:03:43.540053 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e90434a-fd8d-4ee1-8d14-72e82088a882-config-data\") pod 
\"ceilometer-0\" (UID: \"6e90434a-fd8d-4ee1-8d14-72e82088a882\") " pod="openstack/ceilometer-0" Dec 02 23:03:43 crc kubenswrapper[4696]: I1202 23:03:43.540809 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e90434a-fd8d-4ee1-8d14-72e82088a882-run-httpd\") pod \"ceilometer-0\" (UID: \"6e90434a-fd8d-4ee1-8d14-72e82088a882\") " pod="openstack/ceilometer-0" Dec 02 23:03:43 crc kubenswrapper[4696]: I1202 23:03:43.540915 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e90434a-fd8d-4ee1-8d14-72e82088a882-run-httpd\") pod \"ceilometer-0\" (UID: \"6e90434a-fd8d-4ee1-8d14-72e82088a882\") " pod="openstack/ceilometer-0" Dec 02 23:03:43 crc kubenswrapper[4696]: I1202 23:03:43.540970 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tr8dc\" (UniqueName: \"kubernetes.io/projected/6e90434a-fd8d-4ee1-8d14-72e82088a882-kube-api-access-tr8dc\") pod \"ceilometer-0\" (UID: \"6e90434a-fd8d-4ee1-8d14-72e82088a882\") " pod="openstack/ceilometer-0" Dec 02 23:03:43 crc kubenswrapper[4696]: I1202 23:03:43.541053 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e90434a-fd8d-4ee1-8d14-72e82088a882-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6e90434a-fd8d-4ee1-8d14-72e82088a882\") " pod="openstack/ceilometer-0" Dec 02 23:03:43 crc kubenswrapper[4696]: I1202 23:03:43.541090 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e90434a-fd8d-4ee1-8d14-72e82088a882-log-httpd\") pod \"ceilometer-0\" (UID: \"6e90434a-fd8d-4ee1-8d14-72e82088a882\") " pod="openstack/ceilometer-0" Dec 02 23:03:43 crc kubenswrapper[4696]: I1202 23:03:43.541663 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e90434a-fd8d-4ee1-8d14-72e82088a882-log-httpd\") pod \"ceilometer-0\" (UID: \"6e90434a-fd8d-4ee1-8d14-72e82088a882\") " pod="openstack/ceilometer-0" Dec 02 23:03:43 crc kubenswrapper[4696]: I1202 23:03:43.547389 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e90434a-fd8d-4ee1-8d14-72e82088a882-config-data\") pod \"ceilometer-0\" (UID: \"6e90434a-fd8d-4ee1-8d14-72e82088a882\") " pod="openstack/ceilometer-0" Dec 02 23:03:43 crc kubenswrapper[4696]: I1202 23:03:43.549534 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e90434a-fd8d-4ee1-8d14-72e82088a882-scripts\") pod \"ceilometer-0\" (UID: \"6e90434a-fd8d-4ee1-8d14-72e82088a882\") " pod="openstack/ceilometer-0" Dec 02 23:03:43 crc kubenswrapper[4696]: I1202 23:03:43.549584 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e90434a-fd8d-4ee1-8d14-72e82088a882-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6e90434a-fd8d-4ee1-8d14-72e82088a882\") " pod="openstack/ceilometer-0" Dec 02 23:03:43 crc kubenswrapper[4696]: I1202 23:03:43.550816 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6e90434a-fd8d-4ee1-8d14-72e82088a882-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6e90434a-fd8d-4ee1-8d14-72e82088a882\") " pod="openstack/ceilometer-0" Dec 02 23:03:43 crc kubenswrapper[4696]: I1202 23:03:43.562353 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tr8dc\" (UniqueName: \"kubernetes.io/projected/6e90434a-fd8d-4ee1-8d14-72e82088a882-kube-api-access-tr8dc\") pod \"ceilometer-0\" (UID: \"6e90434a-fd8d-4ee1-8d14-72e82088a882\") " pod="openstack/ceilometer-0" Dec 02 23:03:43 crc kubenswrapper[4696]: I1202 
23:03:43.709959 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 23:03:43 crc kubenswrapper[4696]: I1202 23:03:43.735037 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 23:03:44 crc kubenswrapper[4696]: I1202 23:03:44.215987 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 23:03:44 crc kubenswrapper[4696]: W1202 23:03:44.221472 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6e90434a_fd8d_4ee1_8d14_72e82088a882.slice/crio-e30fec97cbd44fa78f1c9cfb004a61438cfda878e395358d172467ab662c4f26 WatchSource:0}: Error finding container e30fec97cbd44fa78f1c9cfb004a61438cfda878e395358d172467ab662c4f26: Status 404 returned error can't find the container with id e30fec97cbd44fa78f1c9cfb004a61438cfda878e395358d172467ab662c4f26 Dec 02 23:03:44 crc kubenswrapper[4696]: I1202 23:03:44.302478 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e90434a-fd8d-4ee1-8d14-72e82088a882","Type":"ContainerStarted","Data":"e30fec97cbd44fa78f1c9cfb004a61438cfda878e395358d172467ab662c4f26"} Dec 02 23:03:44 crc kubenswrapper[4696]: I1202 23:03:44.453618 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-decision-engine-0"] Dec 02 23:03:44 crc kubenswrapper[4696]: I1202 23:03:44.453961 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-decision-engine-0" podUID="3ac42795-270d-403e-8622-d7592294ddff" containerName="watcher-decision-engine" containerID="cri-o://7388711da6840e9ba77036b562f6a290654e7e1a0679fa5fa6f2d37c4d7f8816" gracePeriod=30 Dec 02 23:03:46 crc kubenswrapper[4696]: I1202 23:03:46.113765 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 23:03:46 crc kubenswrapper[4696]: I1202 
23:03:46.114515 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="3f23c971-6815-49d3-b1aa-7eb9e23b0b83" containerName="glance-log" containerID="cri-o://3fc870a0e795fd1c83f2b237de124f02e7d049e74c303b3b058fbcf17f5d2c90" gracePeriod=30 Dec 02 23:03:46 crc kubenswrapper[4696]: I1202 23:03:46.114547 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="3f23c971-6815-49d3-b1aa-7eb9e23b0b83" containerName="glance-httpd" containerID="cri-o://fb059ff59a354b1671b46b5170a0aefda44680c6afc9df05c28ba6ecd892cfbc" gracePeriod=30 Dec 02 23:03:46 crc kubenswrapper[4696]: I1202 23:03:46.322862 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3f23c971-6815-49d3-b1aa-7eb9e23b0b83","Type":"ContainerDied","Data":"3fc870a0e795fd1c83f2b237de124f02e7d049e74c303b3b058fbcf17f5d2c90"} Dec 02 23:03:46 crc kubenswrapper[4696]: I1202 23:03:46.322907 4696 generic.go:334] "Generic (PLEG): container finished" podID="3f23c971-6815-49d3-b1aa-7eb9e23b0b83" containerID="3fc870a0e795fd1c83f2b237de124f02e7d049e74c303b3b058fbcf17f5d2c90" exitCode=143 Dec 02 23:03:49 crc kubenswrapper[4696]: I1202 23:03:49.369980 4696 generic.go:334] "Generic (PLEG): container finished" podID="3f23c971-6815-49d3-b1aa-7eb9e23b0b83" containerID="fb059ff59a354b1671b46b5170a0aefda44680c6afc9df05c28ba6ecd892cfbc" exitCode=0 Dec 02 23:03:49 crc kubenswrapper[4696]: I1202 23:03:49.370466 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3f23c971-6815-49d3-b1aa-7eb9e23b0b83","Type":"ContainerDied","Data":"fb059ff59a354b1671b46b5170a0aefda44680c6afc9df05c28ba6ecd892cfbc"} Dec 02 23:03:50 crc kubenswrapper[4696]: I1202 23:03:50.394352 4696 generic.go:334] "Generic (PLEG): container finished" podID="3ac42795-270d-403e-8622-d7592294ddff" 
containerID="7388711da6840e9ba77036b562f6a290654e7e1a0679fa5fa6f2d37c4d7f8816" exitCode=0 Dec 02 23:03:50 crc kubenswrapper[4696]: I1202 23:03:50.394622 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"3ac42795-270d-403e-8622-d7592294ddff","Type":"ContainerDied","Data":"7388711da6840e9ba77036b562f6a290654e7e1a0679fa5fa6f2d37c4d7f8816"} Dec 02 23:03:50 crc kubenswrapper[4696]: I1202 23:03:50.459168 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Dec 02 23:03:50 crc kubenswrapper[4696]: I1202 23:03:50.523054 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-657n6\" (UniqueName: \"kubernetes.io/projected/3ac42795-270d-403e-8622-d7592294ddff-kube-api-access-657n6\") pod \"3ac42795-270d-403e-8622-d7592294ddff\" (UID: \"3ac42795-270d-403e-8622-d7592294ddff\") " Dec 02 23:03:50 crc kubenswrapper[4696]: I1202 23:03:50.523257 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/3ac42795-270d-403e-8622-d7592294ddff-custom-prometheus-ca\") pod \"3ac42795-270d-403e-8622-d7592294ddff\" (UID: \"3ac42795-270d-403e-8622-d7592294ddff\") " Dec 02 23:03:50 crc kubenswrapper[4696]: I1202 23:03:50.523324 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3ac42795-270d-403e-8622-d7592294ddff-logs\") pod \"3ac42795-270d-403e-8622-d7592294ddff\" (UID: \"3ac42795-270d-403e-8622-d7592294ddff\") " Dec 02 23:03:50 crc kubenswrapper[4696]: I1202 23:03:50.523353 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ac42795-270d-403e-8622-d7592294ddff-combined-ca-bundle\") pod \"3ac42795-270d-403e-8622-d7592294ddff\" (UID: 
\"3ac42795-270d-403e-8622-d7592294ddff\") " Dec 02 23:03:50 crc kubenswrapper[4696]: I1202 23:03:50.523377 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ac42795-270d-403e-8622-d7592294ddff-config-data\") pod \"3ac42795-270d-403e-8622-d7592294ddff\" (UID: \"3ac42795-270d-403e-8622-d7592294ddff\") " Dec 02 23:03:50 crc kubenswrapper[4696]: I1202 23:03:50.532895 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ac42795-270d-403e-8622-d7592294ddff-logs" (OuterVolumeSpecName: "logs") pod "3ac42795-270d-403e-8622-d7592294ddff" (UID: "3ac42795-270d-403e-8622-d7592294ddff"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:03:50 crc kubenswrapper[4696]: I1202 23:03:50.541063 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ac42795-270d-403e-8622-d7592294ddff-kube-api-access-657n6" (OuterVolumeSpecName: "kube-api-access-657n6") pod "3ac42795-270d-403e-8622-d7592294ddff" (UID: "3ac42795-270d-403e-8622-d7592294ddff"). InnerVolumeSpecName "kube-api-access-657n6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:03:50 crc kubenswrapper[4696]: I1202 23:03:50.567339 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 02 23:03:50 crc kubenswrapper[4696]: I1202 23:03:50.581659 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ac42795-270d-403e-8622-d7592294ddff-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "3ac42795-270d-403e-8622-d7592294ddff" (UID: "3ac42795-270d-403e-8622-d7592294ddff"). InnerVolumeSpecName "custom-prometheus-ca". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:03:50 crc kubenswrapper[4696]: I1202 23:03:50.625465 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f23c971-6815-49d3-b1aa-7eb9e23b0b83-scripts\") pod \"3f23c971-6815-49d3-b1aa-7eb9e23b0b83\" (UID: \"3f23c971-6815-49d3-b1aa-7eb9e23b0b83\") " Dec 02 23:03:50 crc kubenswrapper[4696]: I1202 23:03:50.625548 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f23c971-6815-49d3-b1aa-7eb9e23b0b83-combined-ca-bundle\") pod \"3f23c971-6815-49d3-b1aa-7eb9e23b0b83\" (UID: \"3f23c971-6815-49d3-b1aa-7eb9e23b0b83\") " Dec 02 23:03:50 crc kubenswrapper[4696]: I1202 23:03:50.625612 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3f23c971-6815-49d3-b1aa-7eb9e23b0b83-httpd-run\") pod \"3f23c971-6815-49d3-b1aa-7eb9e23b0b83\" (UID: \"3f23c971-6815-49d3-b1aa-7eb9e23b0b83\") " Dec 02 23:03:50 crc kubenswrapper[4696]: I1202 23:03:50.625765 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pp2z4\" (UniqueName: \"kubernetes.io/projected/3f23c971-6815-49d3-b1aa-7eb9e23b0b83-kube-api-access-pp2z4\") pod \"3f23c971-6815-49d3-b1aa-7eb9e23b0b83\" (UID: \"3f23c971-6815-49d3-b1aa-7eb9e23b0b83\") " Dec 02 23:03:50 crc kubenswrapper[4696]: I1202 23:03:50.625796 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f23c971-6815-49d3-b1aa-7eb9e23b0b83-logs\") pod \"3f23c971-6815-49d3-b1aa-7eb9e23b0b83\" (UID: \"3f23c971-6815-49d3-b1aa-7eb9e23b0b83\") " Dec 02 23:03:50 crc kubenswrapper[4696]: I1202 23:03:50.625847 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/3f23c971-6815-49d3-b1aa-7eb9e23b0b83-config-data\") pod \"3f23c971-6815-49d3-b1aa-7eb9e23b0b83\" (UID: \"3f23c971-6815-49d3-b1aa-7eb9e23b0b83\") " Dec 02 23:03:50 crc kubenswrapper[4696]: I1202 23:03:50.625967 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"3f23c971-6815-49d3-b1aa-7eb9e23b0b83\" (UID: \"3f23c971-6815-49d3-b1aa-7eb9e23b0b83\") " Dec 02 23:03:50 crc kubenswrapper[4696]: I1202 23:03:50.626047 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f23c971-6815-49d3-b1aa-7eb9e23b0b83-internal-tls-certs\") pod \"3f23c971-6815-49d3-b1aa-7eb9e23b0b83\" (UID: \"3f23c971-6815-49d3-b1aa-7eb9e23b0b83\") " Dec 02 23:03:50 crc kubenswrapper[4696]: I1202 23:03:50.626526 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-657n6\" (UniqueName: \"kubernetes.io/projected/3ac42795-270d-403e-8622-d7592294ddff-kube-api-access-657n6\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:50 crc kubenswrapper[4696]: I1202 23:03:50.626546 4696 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/3ac42795-270d-403e-8622-d7592294ddff-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:50 crc kubenswrapper[4696]: I1202 23:03:50.626556 4696 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3ac42795-270d-403e-8622-d7592294ddff-logs\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:50 crc kubenswrapper[4696]: I1202 23:03:50.627506 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f23c971-6815-49d3-b1aa-7eb9e23b0b83-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "3f23c971-6815-49d3-b1aa-7eb9e23b0b83" (UID: "3f23c971-6815-49d3-b1aa-7eb9e23b0b83"). 
InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:03:50 crc kubenswrapper[4696]: I1202 23:03:50.629060 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f23c971-6815-49d3-b1aa-7eb9e23b0b83-logs" (OuterVolumeSpecName: "logs") pod "3f23c971-6815-49d3-b1aa-7eb9e23b0b83" (UID: "3f23c971-6815-49d3-b1aa-7eb9e23b0b83"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:03:50 crc kubenswrapper[4696]: I1202 23:03:50.631853 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ac42795-270d-403e-8622-d7592294ddff-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3ac42795-270d-403e-8622-d7592294ddff" (UID: "3ac42795-270d-403e-8622-d7592294ddff"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:03:50 crc kubenswrapper[4696]: I1202 23:03:50.639268 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "3f23c971-6815-49d3-b1aa-7eb9e23b0b83" (UID: "3f23c971-6815-49d3-b1aa-7eb9e23b0b83"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 02 23:03:50 crc kubenswrapper[4696]: I1202 23:03:50.640945 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f23c971-6815-49d3-b1aa-7eb9e23b0b83-scripts" (OuterVolumeSpecName: "scripts") pod "3f23c971-6815-49d3-b1aa-7eb9e23b0b83" (UID: "3f23c971-6815-49d3-b1aa-7eb9e23b0b83"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:03:50 crc kubenswrapper[4696]: I1202 23:03:50.646928 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f23c971-6815-49d3-b1aa-7eb9e23b0b83-kube-api-access-pp2z4" (OuterVolumeSpecName: "kube-api-access-pp2z4") pod "3f23c971-6815-49d3-b1aa-7eb9e23b0b83" (UID: "3f23c971-6815-49d3-b1aa-7eb9e23b0b83"). InnerVolumeSpecName "kube-api-access-pp2z4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:03:50 crc kubenswrapper[4696]: I1202 23:03:50.686223 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ac42795-270d-403e-8622-d7592294ddff-config-data" (OuterVolumeSpecName: "config-data") pod "3ac42795-270d-403e-8622-d7592294ddff" (UID: "3ac42795-270d-403e-8622-d7592294ddff"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:03:50 crc kubenswrapper[4696]: I1202 23:03:50.731803 4696 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Dec 02 23:03:50 crc kubenswrapper[4696]: I1202 23:03:50.732121 4696 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f23c971-6815-49d3-b1aa-7eb9e23b0b83-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:50 crc kubenswrapper[4696]: I1202 23:03:50.732217 4696 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3f23c971-6815-49d3-b1aa-7eb9e23b0b83-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:50 crc kubenswrapper[4696]: I1202 23:03:50.732337 4696 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ac42795-270d-403e-8622-d7592294ddff-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:50 crc 
kubenswrapper[4696]: I1202 23:03:50.732423 4696 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ac42795-270d-403e-8622-d7592294ddff-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:50 crc kubenswrapper[4696]: I1202 23:03:50.732499 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pp2z4\" (UniqueName: \"kubernetes.io/projected/3f23c971-6815-49d3-b1aa-7eb9e23b0b83-kube-api-access-pp2z4\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:50 crc kubenswrapper[4696]: I1202 23:03:50.732579 4696 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f23c971-6815-49d3-b1aa-7eb9e23b0b83-logs\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:50 crc kubenswrapper[4696]: I1202 23:03:50.735398 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f23c971-6815-49d3-b1aa-7eb9e23b0b83-config-data" (OuterVolumeSpecName: "config-data") pod "3f23c971-6815-49d3-b1aa-7eb9e23b0b83" (UID: "3f23c971-6815-49d3-b1aa-7eb9e23b0b83"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:03:50 crc kubenswrapper[4696]: I1202 23:03:50.740454 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f23c971-6815-49d3-b1aa-7eb9e23b0b83-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3f23c971-6815-49d3-b1aa-7eb9e23b0b83" (UID: "3f23c971-6815-49d3-b1aa-7eb9e23b0b83"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:03:50 crc kubenswrapper[4696]: I1202 23:03:50.774532 4696 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Dec 02 23:03:50 crc kubenswrapper[4696]: I1202 23:03:50.785470 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f23c971-6815-49d3-b1aa-7eb9e23b0b83-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "3f23c971-6815-49d3-b1aa-7eb9e23b0b83" (UID: "3f23c971-6815-49d3-b1aa-7eb9e23b0b83"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:03:50 crc kubenswrapper[4696]: I1202 23:03:50.835025 4696 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:50 crc kubenswrapper[4696]: I1202 23:03:50.835087 4696 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f23c971-6815-49d3-b1aa-7eb9e23b0b83-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:50 crc kubenswrapper[4696]: I1202 23:03:50.835098 4696 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f23c971-6815-49d3-b1aa-7eb9e23b0b83-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:50 crc kubenswrapper[4696]: I1202 23:03:50.835113 4696 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f23c971-6815-49d3-b1aa-7eb9e23b0b83-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:51 crc kubenswrapper[4696]: I1202 23:03:51.409408 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"3f23c971-6815-49d3-b1aa-7eb9e23b0b83","Type":"ContainerDied","Data":"c122c41870303f1e7fbefb18bf35dcdb71c1c717ec6a3131a8e6730b60d07a3b"} Dec 02 23:03:51 crc kubenswrapper[4696]: I1202 23:03:51.409922 4696 scope.go:117] "RemoveContainer" containerID="fb059ff59a354b1671b46b5170a0aefda44680c6afc9df05c28ba6ecd892cfbc" Dec 02 23:03:51 crc kubenswrapper[4696]: I1202 23:03:51.409470 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 02 23:03:51 crc kubenswrapper[4696]: I1202 23:03:51.412560 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-89vr8" event={"ID":"8677bbf5-15c6-4e35-98ce-ba3b8ccd5a27","Type":"ContainerStarted","Data":"83c9852211162805e83ff9321f542ffdd13012ef0a1e8f10ee979a73dc9fa17e"} Dec 02 23:03:51 crc kubenswrapper[4696]: I1202 23:03:51.416329 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"3ac42795-270d-403e-8622-d7592294ddff","Type":"ContainerDied","Data":"e7bed1f3c2c5569f7f5929fd065d999067d2c74c2134ef9f04800446c905ffcd"} Dec 02 23:03:51 crc kubenswrapper[4696]: I1202 23:03:51.416342 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-decision-engine-0" Dec 02 23:03:51 crc kubenswrapper[4696]: I1202 23:03:51.419577 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e90434a-fd8d-4ee1-8d14-72e82088a882","Type":"ContainerStarted","Data":"e611452139f2b7a21a201c54d74e4a891a0ea1668f57ad1b506741fa930b5441"} Dec 02 23:03:51 crc kubenswrapper[4696]: I1202 23:03:51.419624 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e90434a-fd8d-4ee1-8d14-72e82088a882","Type":"ContainerStarted","Data":"29a240238e5d3c2838168cfdfab98f076907c8679a425eca5fa3f49d70da1584"} Dec 02 23:03:51 crc kubenswrapper[4696]: I1202 23:03:51.444000 4696 scope.go:117] "RemoveContainer" containerID="3fc870a0e795fd1c83f2b237de124f02e7d049e74c303b3b058fbcf17f5d2c90" Dec 02 23:03:51 crc kubenswrapper[4696]: I1202 23:03:51.466282 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-89vr8" podStartSLOduration=2.950789243 podStartE2EDuration="14.466260524s" podCreationTimestamp="2025-12-02 23:03:37 +0000 UTC" firstStartedPulling="2025-12-02 23:03:38.655399126 +0000 UTC m=+1281.536079127" lastFinishedPulling="2025-12-02 23:03:50.170870407 +0000 UTC m=+1293.051550408" observedRunningTime="2025-12-02 23:03:51.439661639 +0000 UTC m=+1294.320341650" watchObservedRunningTime="2025-12-02 23:03:51.466260524 +0000 UTC m=+1294.346940515" Dec 02 23:03:51 crc kubenswrapper[4696]: I1202 23:03:51.483706 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 23:03:51 crc kubenswrapper[4696]: I1202 23:03:51.483897 4696 scope.go:117] "RemoveContainer" containerID="7388711da6840e9ba77036b562f6a290654e7e1a0679fa5fa6f2d37c4d7f8816" Dec 02 23:03:51 crc kubenswrapper[4696]: I1202 23:03:51.546083 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 
23:03:51 crc kubenswrapper[4696]: I1202 23:03:51.570469 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 23:03:51 crc kubenswrapper[4696]: E1202 23:03:51.572122 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f23c971-6815-49d3-b1aa-7eb9e23b0b83" containerName="glance-httpd" Dec 02 23:03:51 crc kubenswrapper[4696]: I1202 23:03:51.572266 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f23c971-6815-49d3-b1aa-7eb9e23b0b83" containerName="glance-httpd" Dec 02 23:03:51 crc kubenswrapper[4696]: E1202 23:03:51.572361 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f23c971-6815-49d3-b1aa-7eb9e23b0b83" containerName="glance-log" Dec 02 23:03:51 crc kubenswrapper[4696]: I1202 23:03:51.572424 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f23c971-6815-49d3-b1aa-7eb9e23b0b83" containerName="glance-log" Dec 02 23:03:51 crc kubenswrapper[4696]: E1202 23:03:51.572526 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ac42795-270d-403e-8622-d7592294ddff" containerName="watcher-decision-engine" Dec 02 23:03:51 crc kubenswrapper[4696]: I1202 23:03:51.572589 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ac42795-270d-403e-8622-d7592294ddff" containerName="watcher-decision-engine" Dec 02 23:03:51 crc kubenswrapper[4696]: I1202 23:03:51.573571 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f23c971-6815-49d3-b1aa-7eb9e23b0b83" containerName="glance-log" Dec 02 23:03:51 crc kubenswrapper[4696]: I1202 23:03:51.573656 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f23c971-6815-49d3-b1aa-7eb9e23b0b83" containerName="glance-httpd" Dec 02 23:03:51 crc kubenswrapper[4696]: I1202 23:03:51.573730 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ac42795-270d-403e-8622-d7592294ddff" containerName="watcher-decision-engine" Dec 02 23:03:51 crc kubenswrapper[4696]: 
I1202 23:03:51.577318 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-decision-engine-0"] Dec 02 23:03:51 crc kubenswrapper[4696]: I1202 23:03:51.577649 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 02 23:03:51 crc kubenswrapper[4696]: I1202 23:03:51.586726 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 02 23:03:51 crc kubenswrapper[4696]: I1202 23:03:51.587120 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 02 23:03:51 crc kubenswrapper[4696]: I1202 23:03:51.612451 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-decision-engine-0"] Dec 02 23:03:51 crc kubenswrapper[4696]: I1202 23:03:51.633907 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 23:03:51 crc kubenswrapper[4696]: I1202 23:03:51.648799 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-decision-engine-0"] Dec 02 23:03:51 crc kubenswrapper[4696]: I1202 23:03:51.650298 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-decision-engine-0" Dec 02 23:03:51 crc kubenswrapper[4696]: I1202 23:03:51.654921 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-decision-engine-config-data" Dec 02 23:03:51 crc kubenswrapper[4696]: I1202 23:03:51.663622 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Dec 02 23:03:51 crc kubenswrapper[4696]: I1202 23:03:51.678083 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"7d71c8ce-55a6-4bbc-a450-128443762f36\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:03:51 crc kubenswrapper[4696]: I1202 23:03:51.678152 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d71c8ce-55a6-4bbc-a450-128443762f36-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7d71c8ce-55a6-4bbc-a450-128443762f36\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:03:51 crc kubenswrapper[4696]: I1202 23:03:51.678181 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d71c8ce-55a6-4bbc-a450-128443762f36-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"7d71c8ce-55a6-4bbc-a450-128443762f36\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:03:51 crc kubenswrapper[4696]: I1202 23:03:51.678200 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d71c8ce-55a6-4bbc-a450-128443762f36-logs\") pod \"glance-default-internal-api-0\" (UID: \"7d71c8ce-55a6-4bbc-a450-128443762f36\") " pod="openstack/glance-default-internal-api-0" Dec 02 
23:03:51 crc kubenswrapper[4696]: I1202 23:03:51.678238 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d71c8ce-55a6-4bbc-a450-128443762f36-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7d71c8ce-55a6-4bbc-a450-128443762f36\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:03:51 crc kubenswrapper[4696]: I1202 23:03:51.678330 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7d71c8ce-55a6-4bbc-a450-128443762f36-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7d71c8ce-55a6-4bbc-a450-128443762f36\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:03:51 crc kubenswrapper[4696]: I1202 23:03:51.678355 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d71c8ce-55a6-4bbc-a450-128443762f36-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7d71c8ce-55a6-4bbc-a450-128443762f36\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:03:51 crc kubenswrapper[4696]: I1202 23:03:51.678389 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2bnb\" (UniqueName: \"kubernetes.io/projected/7d71c8ce-55a6-4bbc-a450-128443762f36-kube-api-access-k2bnb\") pod \"glance-default-internal-api-0\" (UID: \"7d71c8ce-55a6-4bbc-a450-128443762f36\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:03:51 crc kubenswrapper[4696]: I1202 23:03:51.779957 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7d71c8ce-55a6-4bbc-a450-128443762f36-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7d71c8ce-55a6-4bbc-a450-128443762f36\") " 
pod="openstack/glance-default-internal-api-0" Dec 02 23:03:51 crc kubenswrapper[4696]: I1202 23:03:51.780029 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d71c8ce-55a6-4bbc-a450-128443762f36-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7d71c8ce-55a6-4bbc-a450-128443762f36\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:03:51 crc kubenswrapper[4696]: I1202 23:03:51.780082 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2bnb\" (UniqueName: \"kubernetes.io/projected/7d71c8ce-55a6-4bbc-a450-128443762f36-kube-api-access-k2bnb\") pod \"glance-default-internal-api-0\" (UID: \"7d71c8ce-55a6-4bbc-a450-128443762f36\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:03:51 crc kubenswrapper[4696]: I1202 23:03:51.780124 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9703e3e9-39e6-4c7f-a1ea-324a4f26c18a-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"9703e3e9-39e6-4c7f-a1ea-324a4f26c18a\") " pod="openstack/watcher-decision-engine-0" Dec 02 23:03:51 crc kubenswrapper[4696]: I1202 23:03:51.780171 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"7d71c8ce-55a6-4bbc-a450-128443762f36\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:03:51 crc kubenswrapper[4696]: I1202 23:03:51.780210 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5mn5\" (UniqueName: \"kubernetes.io/projected/9703e3e9-39e6-4c7f-a1ea-324a4f26c18a-kube-api-access-l5mn5\") pod \"watcher-decision-engine-0\" (UID: \"9703e3e9-39e6-4c7f-a1ea-324a4f26c18a\") " 
pod="openstack/watcher-decision-engine-0" Dec 02 23:03:51 crc kubenswrapper[4696]: I1202 23:03:51.780377 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9703e3e9-39e6-4c7f-a1ea-324a4f26c18a-logs\") pod \"watcher-decision-engine-0\" (UID: \"9703e3e9-39e6-4c7f-a1ea-324a4f26c18a\") " pod="openstack/watcher-decision-engine-0" Dec 02 23:03:51 crc kubenswrapper[4696]: I1202 23:03:51.780463 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d71c8ce-55a6-4bbc-a450-128443762f36-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7d71c8ce-55a6-4bbc-a450-128443762f36\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:03:51 crc kubenswrapper[4696]: I1202 23:03:51.780501 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d71c8ce-55a6-4bbc-a450-128443762f36-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"7d71c8ce-55a6-4bbc-a450-128443762f36\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:03:51 crc kubenswrapper[4696]: I1202 23:03:51.780532 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d71c8ce-55a6-4bbc-a450-128443762f36-logs\") pod \"glance-default-internal-api-0\" (UID: \"7d71c8ce-55a6-4bbc-a450-128443762f36\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:03:51 crc kubenswrapper[4696]: I1202 23:03:51.780556 4696 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"7d71c8ce-55a6-4bbc-a450-128443762f36\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/glance-default-internal-api-0" Dec 02 23:03:51 crc 
kubenswrapper[4696]: I1202 23:03:51.780568 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9703e3e9-39e6-4c7f-a1ea-324a4f26c18a-config-data\") pod \"watcher-decision-engine-0\" (UID: \"9703e3e9-39e6-4c7f-a1ea-324a4f26c18a\") " pod="openstack/watcher-decision-engine-0" Dec 02 23:03:51 crc kubenswrapper[4696]: I1202 23:03:51.780586 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7d71c8ce-55a6-4bbc-a450-128443762f36-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7d71c8ce-55a6-4bbc-a450-128443762f36\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:03:51 crc kubenswrapper[4696]: I1202 23:03:51.780696 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d71c8ce-55a6-4bbc-a450-128443762f36-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7d71c8ce-55a6-4bbc-a450-128443762f36\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:03:51 crc kubenswrapper[4696]: I1202 23:03:51.780725 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/9703e3e9-39e6-4c7f-a1ea-324a4f26c18a-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"9703e3e9-39e6-4c7f-a1ea-324a4f26c18a\") " pod="openstack/watcher-decision-engine-0" Dec 02 23:03:51 crc kubenswrapper[4696]: I1202 23:03:51.780903 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d71c8ce-55a6-4bbc-a450-128443762f36-logs\") pod \"glance-default-internal-api-0\" (UID: \"7d71c8ce-55a6-4bbc-a450-128443762f36\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:03:51 crc kubenswrapper[4696]: I1202 23:03:51.789978 4696 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d71c8ce-55a6-4bbc-a450-128443762f36-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7d71c8ce-55a6-4bbc-a450-128443762f36\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:03:51 crc kubenswrapper[4696]: I1202 23:03:51.790080 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d71c8ce-55a6-4bbc-a450-128443762f36-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7d71c8ce-55a6-4bbc-a450-128443762f36\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:03:51 crc kubenswrapper[4696]: I1202 23:03:51.790386 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d71c8ce-55a6-4bbc-a450-128443762f36-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7d71c8ce-55a6-4bbc-a450-128443762f36\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:03:51 crc kubenswrapper[4696]: I1202 23:03:51.791372 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d71c8ce-55a6-4bbc-a450-128443762f36-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"7d71c8ce-55a6-4bbc-a450-128443762f36\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:03:51 crc kubenswrapper[4696]: I1202 23:03:51.807476 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2bnb\" (UniqueName: \"kubernetes.io/projected/7d71c8ce-55a6-4bbc-a450-128443762f36-kube-api-access-k2bnb\") pod \"glance-default-internal-api-0\" (UID: \"7d71c8ce-55a6-4bbc-a450-128443762f36\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:03:51 crc kubenswrapper[4696]: I1202 23:03:51.825036 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"7d71c8ce-55a6-4bbc-a450-128443762f36\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:03:51 crc kubenswrapper[4696]: I1202 23:03:51.883105 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9703e3e9-39e6-4c7f-a1ea-324a4f26c18a-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"9703e3e9-39e6-4c7f-a1ea-324a4f26c18a\") " pod="openstack/watcher-decision-engine-0" Dec 02 23:03:51 crc kubenswrapper[4696]: I1202 23:03:51.883168 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5mn5\" (UniqueName: \"kubernetes.io/projected/9703e3e9-39e6-4c7f-a1ea-324a4f26c18a-kube-api-access-l5mn5\") pod \"watcher-decision-engine-0\" (UID: \"9703e3e9-39e6-4c7f-a1ea-324a4f26c18a\") " pod="openstack/watcher-decision-engine-0" Dec 02 23:03:51 crc kubenswrapper[4696]: I1202 23:03:51.883199 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9703e3e9-39e6-4c7f-a1ea-324a4f26c18a-logs\") pod \"watcher-decision-engine-0\" (UID: \"9703e3e9-39e6-4c7f-a1ea-324a4f26c18a\") " pod="openstack/watcher-decision-engine-0" Dec 02 23:03:51 crc kubenswrapper[4696]: I1202 23:03:51.883232 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9703e3e9-39e6-4c7f-a1ea-324a4f26c18a-config-data\") pod \"watcher-decision-engine-0\" (UID: \"9703e3e9-39e6-4c7f-a1ea-324a4f26c18a\") " pod="openstack/watcher-decision-engine-0" Dec 02 23:03:51 crc kubenswrapper[4696]: I1202 23:03:51.883268 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: 
\"kubernetes.io/secret/9703e3e9-39e6-4c7f-a1ea-324a4f26c18a-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"9703e3e9-39e6-4c7f-a1ea-324a4f26c18a\") " pod="openstack/watcher-decision-engine-0" Dec 02 23:03:51 crc kubenswrapper[4696]: I1202 23:03:51.884197 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9703e3e9-39e6-4c7f-a1ea-324a4f26c18a-logs\") pod \"watcher-decision-engine-0\" (UID: \"9703e3e9-39e6-4c7f-a1ea-324a4f26c18a\") " pod="openstack/watcher-decision-engine-0" Dec 02 23:03:51 crc kubenswrapper[4696]: I1202 23:03:51.888067 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9703e3e9-39e6-4c7f-a1ea-324a4f26c18a-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"9703e3e9-39e6-4c7f-a1ea-324a4f26c18a\") " pod="openstack/watcher-decision-engine-0" Dec 02 23:03:51 crc kubenswrapper[4696]: I1202 23:03:51.888463 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/9703e3e9-39e6-4c7f-a1ea-324a4f26c18a-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"9703e3e9-39e6-4c7f-a1ea-324a4f26c18a\") " pod="openstack/watcher-decision-engine-0" Dec 02 23:03:51 crc kubenswrapper[4696]: I1202 23:03:51.888675 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9703e3e9-39e6-4c7f-a1ea-324a4f26c18a-config-data\") pod \"watcher-decision-engine-0\" (UID: \"9703e3e9-39e6-4c7f-a1ea-324a4f26c18a\") " pod="openstack/watcher-decision-engine-0" Dec 02 23:03:51 crc kubenswrapper[4696]: I1202 23:03:51.902887 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5mn5\" (UniqueName: \"kubernetes.io/projected/9703e3e9-39e6-4c7f-a1ea-324a4f26c18a-kube-api-access-l5mn5\") pod \"watcher-decision-engine-0\" (UID: 
\"9703e3e9-39e6-4c7f-a1ea-324a4f26c18a\") " pod="openstack/watcher-decision-engine-0" Dec 02 23:03:51 crc kubenswrapper[4696]: I1202 23:03:51.921446 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 02 23:03:51 crc kubenswrapper[4696]: I1202 23:03:51.967467 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Dec 02 23:03:52 crc kubenswrapper[4696]: I1202 23:03:52.464864 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e90434a-fd8d-4ee1-8d14-72e82088a882","Type":"ContainerStarted","Data":"c0c896373acc60716fa08df85b43ec19fa090f24d6d53ea3a8160c6059e91ab3"} Dec 02 23:03:52 crc kubenswrapper[4696]: I1202 23:03:52.610588 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 23:03:52 crc kubenswrapper[4696]: I1202 23:03:52.660060 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Dec 02 23:03:53 crc kubenswrapper[4696]: I1202 23:03:53.200504 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Dec 02 23:03:53 crc kubenswrapper[4696]: I1202 23:03:53.458361 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ac42795-270d-403e-8622-d7592294ddff" path="/var/lib/kubelet/pods/3ac42795-270d-403e-8622-d7592294ddff/volumes" Dec 02 23:03:53 crc kubenswrapper[4696]: I1202 23:03:53.459666 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f23c971-6815-49d3-b1aa-7eb9e23b0b83" path="/var/lib/kubelet/pods/3f23c971-6815-49d3-b1aa-7eb9e23b0b83/volumes" Dec 02 23:03:53 crc kubenswrapper[4696]: I1202 23:03:53.489625 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" 
event={"ID":"9703e3e9-39e6-4c7f-a1ea-324a4f26c18a","Type":"ContainerStarted","Data":"6b932f7776dcefa38a582609cb39709984f26c19deb0d99732470c51466d57de"} Dec 02 23:03:53 crc kubenswrapper[4696]: I1202 23:03:53.489695 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"9703e3e9-39e6-4c7f-a1ea-324a4f26c18a","Type":"ContainerStarted","Data":"d708f6374340113b7600007410418a6dc1ba8e3b1db6e844c3ab68fb66f8ba34"} Dec 02 23:03:53 crc kubenswrapper[4696]: I1202 23:03:53.497148 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7d71c8ce-55a6-4bbc-a450-128443762f36","Type":"ContainerStarted","Data":"d1b65834dde286987d61998245aa41f7aadf314f4b19ec713e005314e893cc8a"} Dec 02 23:03:53 crc kubenswrapper[4696]: I1202 23:03:53.497202 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7d71c8ce-55a6-4bbc-a450-128443762f36","Type":"ContainerStarted","Data":"9583127775c05c292f0bef24ba4be61df0ee68c29a81be19fc2edcbdbdb022d4"} Dec 02 23:03:53 crc kubenswrapper[4696]: I1202 23:03:53.510484 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-decision-engine-0" podStartSLOduration=2.510467394 podStartE2EDuration="2.510467394s" podCreationTimestamp="2025-12-02 23:03:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:03:53.509119276 +0000 UTC m=+1296.389799297" watchObservedRunningTime="2025-12-02 23:03:53.510467394 +0000 UTC m=+1296.391147395" Dec 02 23:03:54 crc kubenswrapper[4696]: I1202 23:03:54.533389 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e90434a-fd8d-4ee1-8d14-72e82088a882","Type":"ContainerStarted","Data":"fad1e494d742d1c5832ec533e4e210f54027401b127b62d73bc4e20a440d1b29"} Dec 02 23:03:54 crc 
kubenswrapper[4696]: I1202 23:03:54.534152 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 02 23:03:54 crc kubenswrapper[4696]: I1202 23:03:54.533713 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6e90434a-fd8d-4ee1-8d14-72e82088a882" containerName="ceilometer-central-agent" containerID="cri-o://29a240238e5d3c2838168cfdfab98f076907c8679a425eca5fa3f49d70da1584" gracePeriod=30 Dec 02 23:03:54 crc kubenswrapper[4696]: I1202 23:03:54.534472 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6e90434a-fd8d-4ee1-8d14-72e82088a882" containerName="sg-core" containerID="cri-o://c0c896373acc60716fa08df85b43ec19fa090f24d6d53ea3a8160c6059e91ab3" gracePeriod=30 Dec 02 23:03:54 crc kubenswrapper[4696]: I1202 23:03:54.534632 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6e90434a-fd8d-4ee1-8d14-72e82088a882" containerName="proxy-httpd" containerID="cri-o://fad1e494d742d1c5832ec533e4e210f54027401b127b62d73bc4e20a440d1b29" gracePeriod=30 Dec 02 23:03:54 crc kubenswrapper[4696]: I1202 23:03:54.534720 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6e90434a-fd8d-4ee1-8d14-72e82088a882" containerName="ceilometer-notification-agent" containerID="cri-o://e611452139f2b7a21a201c54d74e4a891a0ea1668f57ad1b506741fa930b5441" gracePeriod=30 Dec 02 23:03:54 crc kubenswrapper[4696]: I1202 23:03:54.536919 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7d71c8ce-55a6-4bbc-a450-128443762f36","Type":"ContainerStarted","Data":"ce005d3b3e670f3c849d4b040a1d2a384aa4e876554865308264f674049117d1"} Dec 02 23:03:54 crc kubenswrapper[4696]: I1202 23:03:54.592147 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/ceilometer-0" podStartSLOduration=1.9471384330000001 podStartE2EDuration="11.592121985s" podCreationTimestamp="2025-12-02 23:03:43 +0000 UTC" firstStartedPulling="2025-12-02 23:03:44.224911944 +0000 UTC m=+1287.105591945" lastFinishedPulling="2025-12-02 23:03:53.869895496 +0000 UTC m=+1296.750575497" observedRunningTime="2025-12-02 23:03:54.56341475 +0000 UTC m=+1297.444094751" watchObservedRunningTime="2025-12-02 23:03:54.592121985 +0000 UTC m=+1297.472801986" Dec 02 23:03:54 crc kubenswrapper[4696]: I1202 23:03:54.614068 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.614034717 podStartE2EDuration="3.614034717s" podCreationTimestamp="2025-12-02 23:03:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:03:54.603195619 +0000 UTC m=+1297.483875620" watchObservedRunningTime="2025-12-02 23:03:54.614034717 +0000 UTC m=+1297.494714718" Dec 02 23:03:55 crc kubenswrapper[4696]: I1202 23:03:55.551899 4696 generic.go:334] "Generic (PLEG): container finished" podID="6e90434a-fd8d-4ee1-8d14-72e82088a882" containerID="fad1e494d742d1c5832ec533e4e210f54027401b127b62d73bc4e20a440d1b29" exitCode=0 Dec 02 23:03:55 crc kubenswrapper[4696]: I1202 23:03:55.552400 4696 generic.go:334] "Generic (PLEG): container finished" podID="6e90434a-fd8d-4ee1-8d14-72e82088a882" containerID="c0c896373acc60716fa08df85b43ec19fa090f24d6d53ea3a8160c6059e91ab3" exitCode=2 Dec 02 23:03:55 crc kubenswrapper[4696]: I1202 23:03:55.551982 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e90434a-fd8d-4ee1-8d14-72e82088a882","Type":"ContainerDied","Data":"fad1e494d742d1c5832ec533e4e210f54027401b127b62d73bc4e20a440d1b29"} Dec 02 23:03:55 crc kubenswrapper[4696]: I1202 23:03:55.552458 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"6e90434a-fd8d-4ee1-8d14-72e82088a882","Type":"ContainerDied","Data":"c0c896373acc60716fa08df85b43ec19fa090f24d6d53ea3a8160c6059e91ab3"} Dec 02 23:03:55 crc kubenswrapper[4696]: I1202 23:03:55.552475 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e90434a-fd8d-4ee1-8d14-72e82088a882","Type":"ContainerDied","Data":"e611452139f2b7a21a201c54d74e4a891a0ea1668f57ad1b506741fa930b5441"} Dec 02 23:03:55 crc kubenswrapper[4696]: I1202 23:03:55.552412 4696 generic.go:334] "Generic (PLEG): container finished" podID="6e90434a-fd8d-4ee1-8d14-72e82088a882" containerID="e611452139f2b7a21a201c54d74e4a891a0ea1668f57ad1b506741fa930b5441" exitCode=0 Dec 02 23:03:55 crc kubenswrapper[4696]: I1202 23:03:55.589779 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 23:03:55 crc kubenswrapper[4696]: I1202 23:03:55.590117 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="6ef01fc5-d6fd-48bd-aa11-4cf840a2d6e3" containerName="glance-log" containerID="cri-o://b33900e709c3fcbee9e423fd94b361b7475a1131ec2686c3cb3aa18f42bb5e1a" gracePeriod=30 Dec 02 23:03:55 crc kubenswrapper[4696]: I1202 23:03:55.590181 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="6ef01fc5-d6fd-48bd-aa11-4cf840a2d6e3" containerName="glance-httpd" containerID="cri-o://22473735429ac296019a3825fa9930c23f12a28fb1006c66c509b5c9e3458ae3" gracePeriod=30 Dec 02 23:03:56 crc kubenswrapper[4696]: I1202 23:03:56.566998 4696 generic.go:334] "Generic (PLEG): container finished" podID="6ef01fc5-d6fd-48bd-aa11-4cf840a2d6e3" containerID="b33900e709c3fcbee9e423fd94b361b7475a1131ec2686c3cb3aa18f42bb5e1a" exitCode=143 Dec 02 23:03:56 crc kubenswrapper[4696]: I1202 23:03:56.567095 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-external-api-0" event={"ID":"6ef01fc5-d6fd-48bd-aa11-4cf840a2d6e3","Type":"ContainerDied","Data":"b33900e709c3fcbee9e423fd94b361b7475a1131ec2686c3cb3aa18f42bb5e1a"} Dec 02 23:03:59 crc kubenswrapper[4696]: I1202 23:03:59.500928 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 02 23:03:59 crc kubenswrapper[4696]: I1202 23:03:59.612930 4696 generic.go:334] "Generic (PLEG): container finished" podID="6ef01fc5-d6fd-48bd-aa11-4cf840a2d6e3" containerID="22473735429ac296019a3825fa9930c23f12a28fb1006c66c509b5c9e3458ae3" exitCode=0 Dec 02 23:03:59 crc kubenswrapper[4696]: I1202 23:03:59.612990 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6ef01fc5-d6fd-48bd-aa11-4cf840a2d6e3","Type":"ContainerDied","Data":"22473735429ac296019a3825fa9930c23f12a28fb1006c66c509b5c9e3458ae3"} Dec 02 23:03:59 crc kubenswrapper[4696]: I1202 23:03:59.613023 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6ef01fc5-d6fd-48bd-aa11-4cf840a2d6e3","Type":"ContainerDied","Data":"35e9f26aa4e995677cef3022a22a657b76e7e90d09c9d964b9672882b0e5c3fe"} Dec 02 23:03:59 crc kubenswrapper[4696]: I1202 23:03:59.613045 4696 scope.go:117] "RemoveContainer" containerID="22473735429ac296019a3825fa9930c23f12a28fb1006c66c509b5c9e3458ae3" Dec 02 23:03:59 crc kubenswrapper[4696]: I1202 23:03:59.613249 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 02 23:03:59 crc kubenswrapper[4696]: I1202 23:03:59.647721 4696 scope.go:117] "RemoveContainer" containerID="b33900e709c3fcbee9e423fd94b361b7475a1131ec2686c3cb3aa18f42bb5e1a" Dec 02 23:03:59 crc kubenswrapper[4696]: I1202 23:03:59.664410 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ef01fc5-d6fd-48bd-aa11-4cf840a2d6e3-config-data\") pod \"6ef01fc5-d6fd-48bd-aa11-4cf840a2d6e3\" (UID: \"6ef01fc5-d6fd-48bd-aa11-4cf840a2d6e3\") " Dec 02 23:03:59 crc kubenswrapper[4696]: I1202 23:03:59.664654 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ef01fc5-d6fd-48bd-aa11-4cf840a2d6e3-logs\") pod \"6ef01fc5-d6fd-48bd-aa11-4cf840a2d6e3\" (UID: \"6ef01fc5-d6fd-48bd-aa11-4cf840a2d6e3\") " Dec 02 23:03:59 crc kubenswrapper[4696]: I1202 23:03:59.664707 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ef01fc5-d6fd-48bd-aa11-4cf840a2d6e3-public-tls-certs\") pod \"6ef01fc5-d6fd-48bd-aa11-4cf840a2d6e3\" (UID: \"6ef01fc5-d6fd-48bd-aa11-4cf840a2d6e3\") " Dec 02 23:03:59 crc kubenswrapper[4696]: I1202 23:03:59.664778 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6ef01fc5-d6fd-48bd-aa11-4cf840a2d6e3-httpd-run\") pod \"6ef01fc5-d6fd-48bd-aa11-4cf840a2d6e3\" (UID: \"6ef01fc5-d6fd-48bd-aa11-4cf840a2d6e3\") " Dec 02 23:03:59 crc kubenswrapper[4696]: I1202 23:03:59.664796 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ef01fc5-d6fd-48bd-aa11-4cf840a2d6e3-scripts\") pod \"6ef01fc5-d6fd-48bd-aa11-4cf840a2d6e3\" (UID: \"6ef01fc5-d6fd-48bd-aa11-4cf840a2d6e3\") " Dec 02 23:03:59 crc 
kubenswrapper[4696]: I1202 23:03:59.664882 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ef01fc5-d6fd-48bd-aa11-4cf840a2d6e3-combined-ca-bundle\") pod \"6ef01fc5-d6fd-48bd-aa11-4cf840a2d6e3\" (UID: \"6ef01fc5-d6fd-48bd-aa11-4cf840a2d6e3\") " Dec 02 23:03:59 crc kubenswrapper[4696]: I1202 23:03:59.664906 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"6ef01fc5-d6fd-48bd-aa11-4cf840a2d6e3\" (UID: \"6ef01fc5-d6fd-48bd-aa11-4cf840a2d6e3\") " Dec 02 23:03:59 crc kubenswrapper[4696]: I1202 23:03:59.664939 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjkgl\" (UniqueName: \"kubernetes.io/projected/6ef01fc5-d6fd-48bd-aa11-4cf840a2d6e3-kube-api-access-pjkgl\") pod \"6ef01fc5-d6fd-48bd-aa11-4cf840a2d6e3\" (UID: \"6ef01fc5-d6fd-48bd-aa11-4cf840a2d6e3\") " Dec 02 23:03:59 crc kubenswrapper[4696]: I1202 23:03:59.665706 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ef01fc5-d6fd-48bd-aa11-4cf840a2d6e3-logs" (OuterVolumeSpecName: "logs") pod "6ef01fc5-d6fd-48bd-aa11-4cf840a2d6e3" (UID: "6ef01fc5-d6fd-48bd-aa11-4cf840a2d6e3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:03:59 crc kubenswrapper[4696]: I1202 23:03:59.666112 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ef01fc5-d6fd-48bd-aa11-4cf840a2d6e3-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "6ef01fc5-d6fd-48bd-aa11-4cf840a2d6e3" (UID: "6ef01fc5-d6fd-48bd-aa11-4cf840a2d6e3"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:03:59 crc kubenswrapper[4696]: I1202 23:03:59.672453 4696 scope.go:117] "RemoveContainer" containerID="22473735429ac296019a3825fa9930c23f12a28fb1006c66c509b5c9e3458ae3" Dec 02 23:03:59 crc kubenswrapper[4696]: I1202 23:03:59.672552 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ef01fc5-d6fd-48bd-aa11-4cf840a2d6e3-scripts" (OuterVolumeSpecName: "scripts") pod "6ef01fc5-d6fd-48bd-aa11-4cf840a2d6e3" (UID: "6ef01fc5-d6fd-48bd-aa11-4cf840a2d6e3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:03:59 crc kubenswrapper[4696]: E1202 23:03:59.673120 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22473735429ac296019a3825fa9930c23f12a28fb1006c66c509b5c9e3458ae3\": container with ID starting with 22473735429ac296019a3825fa9930c23f12a28fb1006c66c509b5c9e3458ae3 not found: ID does not exist" containerID="22473735429ac296019a3825fa9930c23f12a28fb1006c66c509b5c9e3458ae3" Dec 02 23:03:59 crc kubenswrapper[4696]: I1202 23:03:59.673166 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ef01fc5-d6fd-48bd-aa11-4cf840a2d6e3-kube-api-access-pjkgl" (OuterVolumeSpecName: "kube-api-access-pjkgl") pod "6ef01fc5-d6fd-48bd-aa11-4cf840a2d6e3" (UID: "6ef01fc5-d6fd-48bd-aa11-4cf840a2d6e3"). InnerVolumeSpecName "kube-api-access-pjkgl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:03:59 crc kubenswrapper[4696]: I1202 23:03:59.673171 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22473735429ac296019a3825fa9930c23f12a28fb1006c66c509b5c9e3458ae3"} err="failed to get container status \"22473735429ac296019a3825fa9930c23f12a28fb1006c66c509b5c9e3458ae3\": rpc error: code = NotFound desc = could not find container \"22473735429ac296019a3825fa9930c23f12a28fb1006c66c509b5c9e3458ae3\": container with ID starting with 22473735429ac296019a3825fa9930c23f12a28fb1006c66c509b5c9e3458ae3 not found: ID does not exist" Dec 02 23:03:59 crc kubenswrapper[4696]: I1202 23:03:59.673215 4696 scope.go:117] "RemoveContainer" containerID="b33900e709c3fcbee9e423fd94b361b7475a1131ec2686c3cb3aa18f42bb5e1a" Dec 02 23:03:59 crc kubenswrapper[4696]: E1202 23:03:59.676125 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b33900e709c3fcbee9e423fd94b361b7475a1131ec2686c3cb3aa18f42bb5e1a\": container with ID starting with b33900e709c3fcbee9e423fd94b361b7475a1131ec2686c3cb3aa18f42bb5e1a not found: ID does not exist" containerID="b33900e709c3fcbee9e423fd94b361b7475a1131ec2686c3cb3aa18f42bb5e1a" Dec 02 23:03:59 crc kubenswrapper[4696]: I1202 23:03:59.676197 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b33900e709c3fcbee9e423fd94b361b7475a1131ec2686c3cb3aa18f42bb5e1a"} err="failed to get container status \"b33900e709c3fcbee9e423fd94b361b7475a1131ec2686c3cb3aa18f42bb5e1a\": rpc error: code = NotFound desc = could not find container \"b33900e709c3fcbee9e423fd94b361b7475a1131ec2686c3cb3aa18f42bb5e1a\": container with ID starting with b33900e709c3fcbee9e423fd94b361b7475a1131ec2686c3cb3aa18f42bb5e1a not found: ID does not exist" Dec 02 23:03:59 crc kubenswrapper[4696]: I1202 23:03:59.676385 4696 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "6ef01fc5-d6fd-48bd-aa11-4cf840a2d6e3" (UID: "6ef01fc5-d6fd-48bd-aa11-4cf840a2d6e3"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 02 23:03:59 crc kubenswrapper[4696]: I1202 23:03:59.704572 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ef01fc5-d6fd-48bd-aa11-4cf840a2d6e3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6ef01fc5-d6fd-48bd-aa11-4cf840a2d6e3" (UID: "6ef01fc5-d6fd-48bd-aa11-4cf840a2d6e3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:03:59 crc kubenswrapper[4696]: I1202 23:03:59.737010 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ef01fc5-d6fd-48bd-aa11-4cf840a2d6e3-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "6ef01fc5-d6fd-48bd-aa11-4cf840a2d6e3" (UID: "6ef01fc5-d6fd-48bd-aa11-4cf840a2d6e3"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:03:59 crc kubenswrapper[4696]: I1202 23:03:59.759666 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ef01fc5-d6fd-48bd-aa11-4cf840a2d6e3-config-data" (OuterVolumeSpecName: "config-data") pod "6ef01fc5-d6fd-48bd-aa11-4cf840a2d6e3" (UID: "6ef01fc5-d6fd-48bd-aa11-4cf840a2d6e3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:03:59 crc kubenswrapper[4696]: I1202 23:03:59.767350 4696 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6ef01fc5-d6fd-48bd-aa11-4cf840a2d6e3-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:59 crc kubenswrapper[4696]: I1202 23:03:59.767389 4696 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ef01fc5-d6fd-48bd-aa11-4cf840a2d6e3-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:59 crc kubenswrapper[4696]: I1202 23:03:59.767404 4696 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ef01fc5-d6fd-48bd-aa11-4cf840a2d6e3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:59 crc kubenswrapper[4696]: I1202 23:03:59.767456 4696 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Dec 02 23:03:59 crc kubenswrapper[4696]: I1202 23:03:59.767481 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjkgl\" (UniqueName: \"kubernetes.io/projected/6ef01fc5-d6fd-48bd-aa11-4cf840a2d6e3-kube-api-access-pjkgl\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:59 crc kubenswrapper[4696]: I1202 23:03:59.767494 4696 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ef01fc5-d6fd-48bd-aa11-4cf840a2d6e3-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:59 crc kubenswrapper[4696]: I1202 23:03:59.767504 4696 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ef01fc5-d6fd-48bd-aa11-4cf840a2d6e3-logs\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:59 crc kubenswrapper[4696]: I1202 23:03:59.767513 4696 reconciler_common.go:293] "Volume detached for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ef01fc5-d6fd-48bd-aa11-4cf840a2d6e3-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:59 crc kubenswrapper[4696]: I1202 23:03:59.788984 4696 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Dec 02 23:03:59 crc kubenswrapper[4696]: I1202 23:03:59.869258 4696 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:59 crc kubenswrapper[4696]: I1202 23:03:59.952662 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 23:03:59 crc kubenswrapper[4696]: I1202 23:03:59.962601 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 23:03:59 crc kubenswrapper[4696]: I1202 23:03:59.985903 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 23:03:59 crc kubenswrapper[4696]: E1202 23:03:59.986524 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ef01fc5-d6fd-48bd-aa11-4cf840a2d6e3" containerName="glance-httpd" Dec 02 23:03:59 crc kubenswrapper[4696]: I1202 23:03:59.986544 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ef01fc5-d6fd-48bd-aa11-4cf840a2d6e3" containerName="glance-httpd" Dec 02 23:03:59 crc kubenswrapper[4696]: E1202 23:03:59.986581 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ef01fc5-d6fd-48bd-aa11-4cf840a2d6e3" containerName="glance-log" Dec 02 23:03:59 crc kubenswrapper[4696]: I1202 23:03:59.986589 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ef01fc5-d6fd-48bd-aa11-4cf840a2d6e3" containerName="glance-log" Dec 02 23:03:59 crc kubenswrapper[4696]: I1202 23:03:59.986822 4696 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="6ef01fc5-d6fd-48bd-aa11-4cf840a2d6e3" containerName="glance-httpd" Dec 02 23:03:59 crc kubenswrapper[4696]: I1202 23:03:59.986845 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ef01fc5-d6fd-48bd-aa11-4cf840a2d6e3" containerName="glance-log" Dec 02 23:03:59 crc kubenswrapper[4696]: I1202 23:03:59.988068 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 02 23:03:59 crc kubenswrapper[4696]: I1202 23:03:59.990903 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 02 23:03:59 crc kubenswrapper[4696]: I1202 23:03:59.991368 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 02 23:03:59 crc kubenswrapper[4696]: I1202 23:03:59.997545 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 23:04:00 crc kubenswrapper[4696]: E1202 23:04:00.075282 4696 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6ef01fc5_d6fd_48bd_aa11_4cf840a2d6e3.slice/crio-35e9f26aa4e995677cef3022a22a657b76e7e90d09c9d964b9672882b0e5c3fe\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6ef01fc5_d6fd_48bd_aa11_4cf840a2d6e3.slice\": RecentStats: unable to find data in memory cache]" Dec 02 23:04:00 crc kubenswrapper[4696]: I1202 23:04:00.177643 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3e14f14-0774-4eb2-aff3-231d72e6136f-config-data\") pod \"glance-default-external-api-0\" (UID: \"f3e14f14-0774-4eb2-aff3-231d72e6136f\") " pod="openstack/glance-default-external-api-0" Dec 02 23:04:00 crc 
kubenswrapper[4696]: I1202 23:04:00.177722 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3e14f14-0774-4eb2-aff3-231d72e6136f-logs\") pod \"glance-default-external-api-0\" (UID: \"f3e14f14-0774-4eb2-aff3-231d72e6136f\") " pod="openstack/glance-default-external-api-0" Dec 02 23:04:00 crc kubenswrapper[4696]: I1202 23:04:00.177783 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f3e14f14-0774-4eb2-aff3-231d72e6136f-scripts\") pod \"glance-default-external-api-0\" (UID: \"f3e14f14-0774-4eb2-aff3-231d72e6136f\") " pod="openstack/glance-default-external-api-0" Dec 02 23:04:00 crc kubenswrapper[4696]: I1202 23:04:00.177832 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f3e14f14-0774-4eb2-aff3-231d72e6136f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f3e14f14-0774-4eb2-aff3-231d72e6136f\") " pod="openstack/glance-default-external-api-0" Dec 02 23:04:00 crc kubenswrapper[4696]: I1202 23:04:00.177879 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3e14f14-0774-4eb2-aff3-231d72e6136f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f3e14f14-0774-4eb2-aff3-231d72e6136f\") " pod="openstack/glance-default-external-api-0" Dec 02 23:04:00 crc kubenswrapper[4696]: I1202 23:04:00.177940 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"f3e14f14-0774-4eb2-aff3-231d72e6136f\") " pod="openstack/glance-default-external-api-0" Dec 02 23:04:00 crc 
kubenswrapper[4696]: I1202 23:04:00.178312 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3e14f14-0774-4eb2-aff3-231d72e6136f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"f3e14f14-0774-4eb2-aff3-231d72e6136f\") " pod="openstack/glance-default-external-api-0" Dec 02 23:04:00 crc kubenswrapper[4696]: I1202 23:04:00.178437 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bt4n\" (UniqueName: \"kubernetes.io/projected/f3e14f14-0774-4eb2-aff3-231d72e6136f-kube-api-access-8bt4n\") pod \"glance-default-external-api-0\" (UID: \"f3e14f14-0774-4eb2-aff3-231d72e6136f\") " pod="openstack/glance-default-external-api-0" Dec 02 23:04:00 crc kubenswrapper[4696]: I1202 23:04:00.284455 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3e14f14-0774-4eb2-aff3-231d72e6136f-logs\") pod \"glance-default-external-api-0\" (UID: \"f3e14f14-0774-4eb2-aff3-231d72e6136f\") " pod="openstack/glance-default-external-api-0" Dec 02 23:04:00 crc kubenswrapper[4696]: I1202 23:04:00.284537 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f3e14f14-0774-4eb2-aff3-231d72e6136f-scripts\") pod \"glance-default-external-api-0\" (UID: \"f3e14f14-0774-4eb2-aff3-231d72e6136f\") " pod="openstack/glance-default-external-api-0" Dec 02 23:04:00 crc kubenswrapper[4696]: I1202 23:04:00.284637 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f3e14f14-0774-4eb2-aff3-231d72e6136f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f3e14f14-0774-4eb2-aff3-231d72e6136f\") " pod="openstack/glance-default-external-api-0" Dec 02 23:04:00 crc kubenswrapper[4696]: I1202 
23:04:00.284685 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3e14f14-0774-4eb2-aff3-231d72e6136f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f3e14f14-0774-4eb2-aff3-231d72e6136f\") " pod="openstack/glance-default-external-api-0" Dec 02 23:04:00 crc kubenswrapper[4696]: I1202 23:04:00.284792 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"f3e14f14-0774-4eb2-aff3-231d72e6136f\") " pod="openstack/glance-default-external-api-0" Dec 02 23:04:00 crc kubenswrapper[4696]: I1202 23:04:00.284983 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3e14f14-0774-4eb2-aff3-231d72e6136f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"f3e14f14-0774-4eb2-aff3-231d72e6136f\") " pod="openstack/glance-default-external-api-0" Dec 02 23:04:00 crc kubenswrapper[4696]: I1202 23:04:00.285043 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bt4n\" (UniqueName: \"kubernetes.io/projected/f3e14f14-0774-4eb2-aff3-231d72e6136f-kube-api-access-8bt4n\") pod \"glance-default-external-api-0\" (UID: \"f3e14f14-0774-4eb2-aff3-231d72e6136f\") " pod="openstack/glance-default-external-api-0" Dec 02 23:04:00 crc kubenswrapper[4696]: I1202 23:04:00.285131 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3e14f14-0774-4eb2-aff3-231d72e6136f-config-data\") pod \"glance-default-external-api-0\" (UID: \"f3e14f14-0774-4eb2-aff3-231d72e6136f\") " pod="openstack/glance-default-external-api-0" Dec 02 23:04:00 crc kubenswrapper[4696]: I1202 23:04:00.285371 4696 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f3e14f14-0774-4eb2-aff3-231d72e6136f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f3e14f14-0774-4eb2-aff3-231d72e6136f\") " pod="openstack/glance-default-external-api-0" Dec 02 23:04:00 crc kubenswrapper[4696]: I1202 23:04:00.285366 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3e14f14-0774-4eb2-aff3-231d72e6136f-logs\") pod \"glance-default-external-api-0\" (UID: \"f3e14f14-0774-4eb2-aff3-231d72e6136f\") " pod="openstack/glance-default-external-api-0" Dec 02 23:04:00 crc kubenswrapper[4696]: I1202 23:04:00.285585 4696 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"f3e14f14-0774-4eb2-aff3-231d72e6136f\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-external-api-0" Dec 02 23:04:00 crc kubenswrapper[4696]: I1202 23:04:00.293434 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f3e14f14-0774-4eb2-aff3-231d72e6136f-scripts\") pod \"glance-default-external-api-0\" (UID: \"f3e14f14-0774-4eb2-aff3-231d72e6136f\") " pod="openstack/glance-default-external-api-0" Dec 02 23:04:00 crc kubenswrapper[4696]: I1202 23:04:00.293588 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3e14f14-0774-4eb2-aff3-231d72e6136f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f3e14f14-0774-4eb2-aff3-231d72e6136f\") " pod="openstack/glance-default-external-api-0" Dec 02 23:04:00 crc kubenswrapper[4696]: I1202 23:04:00.294670 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/f3e14f14-0774-4eb2-aff3-231d72e6136f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"f3e14f14-0774-4eb2-aff3-231d72e6136f\") " pod="openstack/glance-default-external-api-0" Dec 02 23:04:00 crc kubenswrapper[4696]: I1202 23:04:00.294845 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3e14f14-0774-4eb2-aff3-231d72e6136f-config-data\") pod \"glance-default-external-api-0\" (UID: \"f3e14f14-0774-4eb2-aff3-231d72e6136f\") " pod="openstack/glance-default-external-api-0" Dec 02 23:04:00 crc kubenswrapper[4696]: I1202 23:04:00.305780 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bt4n\" (UniqueName: \"kubernetes.io/projected/f3e14f14-0774-4eb2-aff3-231d72e6136f-kube-api-access-8bt4n\") pod \"glance-default-external-api-0\" (UID: \"f3e14f14-0774-4eb2-aff3-231d72e6136f\") " pod="openstack/glance-default-external-api-0" Dec 02 23:04:00 crc kubenswrapper[4696]: I1202 23:04:00.320053 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"f3e14f14-0774-4eb2-aff3-231d72e6136f\") " pod="openstack/glance-default-external-api-0" Dec 02 23:04:00 crc kubenswrapper[4696]: I1202 23:04:00.354954 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 02 23:04:00 crc kubenswrapper[4696]: I1202 23:04:00.938350 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 23:04:00 crc kubenswrapper[4696]: W1202 23:04:00.942331 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf3e14f14_0774_4eb2_aff3_231d72e6136f.slice/crio-0b0030a9175e253e4b5d90f3c2ebd3b1b3fb9ecb8903ed9fb02fc0317072657d WatchSource:0}: Error finding container 0b0030a9175e253e4b5d90f3c2ebd3b1b3fb9ecb8903ed9fb02fc0317072657d: Status 404 returned error can't find the container with id 0b0030a9175e253e4b5d90f3c2ebd3b1b3fb9ecb8903ed9fb02fc0317072657d Dec 02 23:04:01 crc kubenswrapper[4696]: I1202 23:04:01.444135 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ef01fc5-d6fd-48bd-aa11-4cf840a2d6e3" path="/var/lib/kubelet/pods/6ef01fc5-d6fd-48bd-aa11-4cf840a2d6e3/volumes" Dec 02 23:04:01 crc kubenswrapper[4696]: I1202 23:04:01.639865 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f3e14f14-0774-4eb2-aff3-231d72e6136f","Type":"ContainerStarted","Data":"768b0103bf5a2cb8163b065538b9983fcb3e66ced410d13db72af332df166c7f"} Dec 02 23:04:01 crc kubenswrapper[4696]: I1202 23:04:01.639925 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f3e14f14-0774-4eb2-aff3-231d72e6136f","Type":"ContainerStarted","Data":"0b0030a9175e253e4b5d90f3c2ebd3b1b3fb9ecb8903ed9fb02fc0317072657d"} Dec 02 23:04:01 crc kubenswrapper[4696]: I1202 23:04:01.922178 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 02 23:04:01 crc kubenswrapper[4696]: I1202 23:04:01.922256 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/glance-default-internal-api-0" Dec 02 23:04:01 crc kubenswrapper[4696]: I1202 23:04:01.968335 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Dec 02 23:04:01 crc kubenswrapper[4696]: I1202 23:04:01.988954 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 02 23:04:01 crc kubenswrapper[4696]: I1202 23:04:01.992609 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 02 23:04:02 crc kubenswrapper[4696]: I1202 23:04:02.008887 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-decision-engine-0" Dec 02 23:04:02 crc kubenswrapper[4696]: I1202 23:04:02.654714 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f3e14f14-0774-4eb2-aff3-231d72e6136f","Type":"ContainerStarted","Data":"9200610a9337c7cbecf5558ec8993968031195383f713674811c40d1703dd015"} Dec 02 23:04:02 crc kubenswrapper[4696]: I1202 23:04:02.657147 4696 generic.go:334] "Generic (PLEG): container finished" podID="8677bbf5-15c6-4e35-98ce-ba3b8ccd5a27" containerID="83c9852211162805e83ff9321f542ffdd13012ef0a1e8f10ee979a73dc9fa17e" exitCode=0 Dec 02 23:04:02 crc kubenswrapper[4696]: I1202 23:04:02.657231 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-89vr8" event={"ID":"8677bbf5-15c6-4e35-98ce-ba3b8ccd5a27","Type":"ContainerDied","Data":"83c9852211162805e83ff9321f542ffdd13012ef0a1e8f10ee979a73dc9fa17e"} Dec 02 23:04:02 crc kubenswrapper[4696]: I1202 23:04:02.658629 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Dec 02 23:04:02 crc kubenswrapper[4696]: I1202 23:04:02.661047 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/glance-default-internal-api-0" Dec 02 23:04:02 crc kubenswrapper[4696]: I1202 23:04:02.661123 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 02 23:04:02 crc kubenswrapper[4696]: I1202 23:04:02.690777 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.690755484 podStartE2EDuration="3.690755484s" podCreationTimestamp="2025-12-02 23:03:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:04:02.687498722 +0000 UTC m=+1305.568178723" watchObservedRunningTime="2025-12-02 23:04:02.690755484 +0000 UTC m=+1305.571435485" Dec 02 23:04:02 crc kubenswrapper[4696]: I1202 23:04:02.713332 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-decision-engine-0" Dec 02 23:04:03 crc kubenswrapper[4696]: I1202 23:04:03.675875 4696 generic.go:334] "Generic (PLEG): container finished" podID="6e90434a-fd8d-4ee1-8d14-72e82088a882" containerID="29a240238e5d3c2838168cfdfab98f076907c8679a425eca5fa3f49d70da1584" exitCode=0 Dec 02 23:04:03 crc kubenswrapper[4696]: I1202 23:04:03.676360 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e90434a-fd8d-4ee1-8d14-72e82088a882","Type":"ContainerDied","Data":"29a240238e5d3c2838168cfdfab98f076907c8679a425eca5fa3f49d70da1584"} Dec 02 23:04:04 crc kubenswrapper[4696]: I1202 23:04:04.013428 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 23:04:04 crc kubenswrapper[4696]: I1202 23:04:04.088595 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e90434a-fd8d-4ee1-8d14-72e82088a882-config-data\") pod \"6e90434a-fd8d-4ee1-8d14-72e82088a882\" (UID: \"6e90434a-fd8d-4ee1-8d14-72e82088a882\") " Dec 02 23:04:04 crc kubenswrapper[4696]: I1202 23:04:04.088755 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tr8dc\" (UniqueName: \"kubernetes.io/projected/6e90434a-fd8d-4ee1-8d14-72e82088a882-kube-api-access-tr8dc\") pod \"6e90434a-fd8d-4ee1-8d14-72e82088a882\" (UID: \"6e90434a-fd8d-4ee1-8d14-72e82088a882\") " Dec 02 23:04:04 crc kubenswrapper[4696]: I1202 23:04:04.088856 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e90434a-fd8d-4ee1-8d14-72e82088a882-combined-ca-bundle\") pod \"6e90434a-fd8d-4ee1-8d14-72e82088a882\" (UID: \"6e90434a-fd8d-4ee1-8d14-72e82088a882\") " Dec 02 23:04:04 crc kubenswrapper[4696]: I1202 23:04:04.088894 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6e90434a-fd8d-4ee1-8d14-72e82088a882-sg-core-conf-yaml\") pod \"6e90434a-fd8d-4ee1-8d14-72e82088a882\" (UID: \"6e90434a-fd8d-4ee1-8d14-72e82088a882\") " Dec 02 23:04:04 crc kubenswrapper[4696]: I1202 23:04:04.088919 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e90434a-fd8d-4ee1-8d14-72e82088a882-log-httpd\") pod \"6e90434a-fd8d-4ee1-8d14-72e82088a882\" (UID: \"6e90434a-fd8d-4ee1-8d14-72e82088a882\") " Dec 02 23:04:04 crc kubenswrapper[4696]: I1202 23:04:04.088954 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/6e90434a-fd8d-4ee1-8d14-72e82088a882-scripts\") pod \"6e90434a-fd8d-4ee1-8d14-72e82088a882\" (UID: \"6e90434a-fd8d-4ee1-8d14-72e82088a882\") " Dec 02 23:04:04 crc kubenswrapper[4696]: I1202 23:04:04.089308 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e90434a-fd8d-4ee1-8d14-72e82088a882-run-httpd\") pod \"6e90434a-fd8d-4ee1-8d14-72e82088a882\" (UID: \"6e90434a-fd8d-4ee1-8d14-72e82088a882\") " Dec 02 23:04:04 crc kubenswrapper[4696]: I1202 23:04:04.090221 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e90434a-fd8d-4ee1-8d14-72e82088a882-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "6e90434a-fd8d-4ee1-8d14-72e82088a882" (UID: "6e90434a-fd8d-4ee1-8d14-72e82088a882"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:04:04 crc kubenswrapper[4696]: I1202 23:04:04.090604 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e90434a-fd8d-4ee1-8d14-72e82088a882-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "6e90434a-fd8d-4ee1-8d14-72e82088a882" (UID: "6e90434a-fd8d-4ee1-8d14-72e82088a882"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:04:04 crc kubenswrapper[4696]: I1202 23:04:04.104057 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e90434a-fd8d-4ee1-8d14-72e82088a882-kube-api-access-tr8dc" (OuterVolumeSpecName: "kube-api-access-tr8dc") pod "6e90434a-fd8d-4ee1-8d14-72e82088a882" (UID: "6e90434a-fd8d-4ee1-8d14-72e82088a882"). InnerVolumeSpecName "kube-api-access-tr8dc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:04:04 crc kubenswrapper[4696]: I1202 23:04:04.104071 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e90434a-fd8d-4ee1-8d14-72e82088a882-scripts" (OuterVolumeSpecName: "scripts") pod "6e90434a-fd8d-4ee1-8d14-72e82088a882" (UID: "6e90434a-fd8d-4ee1-8d14-72e82088a882"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:04:04 crc kubenswrapper[4696]: I1202 23:04:04.142268 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e90434a-fd8d-4ee1-8d14-72e82088a882-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "6e90434a-fd8d-4ee1-8d14-72e82088a882" (UID: "6e90434a-fd8d-4ee1-8d14-72e82088a882"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:04:04 crc kubenswrapper[4696]: I1202 23:04:04.164146 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-89vr8" Dec 02 23:04:04 crc kubenswrapper[4696]: I1202 23:04:04.192645 4696 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6e90434a-fd8d-4ee1-8d14-72e82088a882-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 02 23:04:04 crc kubenswrapper[4696]: I1202 23:04:04.192686 4696 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e90434a-fd8d-4ee1-8d14-72e82088a882-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 23:04:04 crc kubenswrapper[4696]: I1202 23:04:04.192694 4696 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e90434a-fd8d-4ee1-8d14-72e82088a882-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 23:04:04 crc kubenswrapper[4696]: I1202 23:04:04.192702 4696 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e90434a-fd8d-4ee1-8d14-72e82088a882-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 23:04:04 crc kubenswrapper[4696]: I1202 23:04:04.192713 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tr8dc\" (UniqueName: \"kubernetes.io/projected/6e90434a-fd8d-4ee1-8d14-72e82088a882-kube-api-access-tr8dc\") on node \"crc\" DevicePath \"\"" Dec 02 23:04:04 crc kubenswrapper[4696]: I1202 23:04:04.194143 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e90434a-fd8d-4ee1-8d14-72e82088a882-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6e90434a-fd8d-4ee1-8d14-72e82088a882" (UID: "6e90434a-fd8d-4ee1-8d14-72e82088a882"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:04:04 crc kubenswrapper[4696]: I1202 23:04:04.218655 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e90434a-fd8d-4ee1-8d14-72e82088a882-config-data" (OuterVolumeSpecName: "config-data") pod "6e90434a-fd8d-4ee1-8d14-72e82088a882" (UID: "6e90434a-fd8d-4ee1-8d14-72e82088a882"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:04:04 crc kubenswrapper[4696]: I1202 23:04:04.294658 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vbbn9\" (UniqueName: \"kubernetes.io/projected/8677bbf5-15c6-4e35-98ce-ba3b8ccd5a27-kube-api-access-vbbn9\") pod \"8677bbf5-15c6-4e35-98ce-ba3b8ccd5a27\" (UID: \"8677bbf5-15c6-4e35-98ce-ba3b8ccd5a27\") " Dec 02 23:04:04 crc kubenswrapper[4696]: I1202 23:04:04.294812 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8677bbf5-15c6-4e35-98ce-ba3b8ccd5a27-combined-ca-bundle\") pod \"8677bbf5-15c6-4e35-98ce-ba3b8ccd5a27\" (UID: \"8677bbf5-15c6-4e35-98ce-ba3b8ccd5a27\") " Dec 02 23:04:04 crc kubenswrapper[4696]: I1202 23:04:04.294883 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8677bbf5-15c6-4e35-98ce-ba3b8ccd5a27-scripts\") pod \"8677bbf5-15c6-4e35-98ce-ba3b8ccd5a27\" (UID: \"8677bbf5-15c6-4e35-98ce-ba3b8ccd5a27\") " Dec 02 23:04:04 crc kubenswrapper[4696]: I1202 23:04:04.294934 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8677bbf5-15c6-4e35-98ce-ba3b8ccd5a27-config-data\") pod \"8677bbf5-15c6-4e35-98ce-ba3b8ccd5a27\" (UID: \"8677bbf5-15c6-4e35-98ce-ba3b8ccd5a27\") " Dec 02 23:04:04 crc kubenswrapper[4696]: I1202 23:04:04.295607 4696 reconciler_common.go:293] "Volume detached 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e90434a-fd8d-4ee1-8d14-72e82088a882-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 23:04:04 crc kubenswrapper[4696]: I1202 23:04:04.295631 4696 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e90434a-fd8d-4ee1-8d14-72e82088a882-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 23:04:04 crc kubenswrapper[4696]: I1202 23:04:04.298822 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8677bbf5-15c6-4e35-98ce-ba3b8ccd5a27-kube-api-access-vbbn9" (OuterVolumeSpecName: "kube-api-access-vbbn9") pod "8677bbf5-15c6-4e35-98ce-ba3b8ccd5a27" (UID: "8677bbf5-15c6-4e35-98ce-ba3b8ccd5a27"). InnerVolumeSpecName "kube-api-access-vbbn9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:04:04 crc kubenswrapper[4696]: I1202 23:04:04.300034 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8677bbf5-15c6-4e35-98ce-ba3b8ccd5a27-scripts" (OuterVolumeSpecName: "scripts") pod "8677bbf5-15c6-4e35-98ce-ba3b8ccd5a27" (UID: "8677bbf5-15c6-4e35-98ce-ba3b8ccd5a27"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:04:04 crc kubenswrapper[4696]: I1202 23:04:04.323760 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8677bbf5-15c6-4e35-98ce-ba3b8ccd5a27-config-data" (OuterVolumeSpecName: "config-data") pod "8677bbf5-15c6-4e35-98ce-ba3b8ccd5a27" (UID: "8677bbf5-15c6-4e35-98ce-ba3b8ccd5a27"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:04:04 crc kubenswrapper[4696]: I1202 23:04:04.326445 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8677bbf5-15c6-4e35-98ce-ba3b8ccd5a27-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8677bbf5-15c6-4e35-98ce-ba3b8ccd5a27" (UID: "8677bbf5-15c6-4e35-98ce-ba3b8ccd5a27"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:04:04 crc kubenswrapper[4696]: I1202 23:04:04.397644 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vbbn9\" (UniqueName: \"kubernetes.io/projected/8677bbf5-15c6-4e35-98ce-ba3b8ccd5a27-kube-api-access-vbbn9\") on node \"crc\" DevicePath \"\"" Dec 02 23:04:04 crc kubenswrapper[4696]: I1202 23:04:04.398020 4696 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8677bbf5-15c6-4e35-98ce-ba3b8ccd5a27-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 23:04:04 crc kubenswrapper[4696]: I1202 23:04:04.398083 4696 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8677bbf5-15c6-4e35-98ce-ba3b8ccd5a27-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 23:04:04 crc kubenswrapper[4696]: I1202 23:04:04.398166 4696 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8677bbf5-15c6-4e35-98ce-ba3b8ccd5a27-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 23:04:04 crc kubenswrapper[4696]: I1202 23:04:04.690960 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e90434a-fd8d-4ee1-8d14-72e82088a882","Type":"ContainerDied","Data":"e30fec97cbd44fa78f1c9cfb004a61438cfda878e395358d172467ab662c4f26"} Dec 02 23:04:04 crc kubenswrapper[4696]: I1202 23:04:04.691493 4696 scope.go:117] "RemoveContainer" 
containerID="fad1e494d742d1c5832ec533e4e210f54027401b127b62d73bc4e20a440d1b29" Dec 02 23:04:04 crc kubenswrapper[4696]: I1202 23:04:04.690988 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 23:04:04 crc kubenswrapper[4696]: I1202 23:04:04.696782 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-89vr8" event={"ID":"8677bbf5-15c6-4e35-98ce-ba3b8ccd5a27","Type":"ContainerDied","Data":"c8df4d8a1e40c8f59719a7463d12508117240c17deef749eb942910177ab27d5"} Dec 02 23:04:04 crc kubenswrapper[4696]: I1202 23:04:04.697019 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c8df4d8a1e40c8f59719a7463d12508117240c17deef749eb942910177ab27d5" Dec 02 23:04:04 crc kubenswrapper[4696]: I1202 23:04:04.697717 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-89vr8" Dec 02 23:04:04 crc kubenswrapper[4696]: I1202 23:04:04.700907 4696 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 02 23:04:04 crc kubenswrapper[4696]: I1202 23:04:04.700974 4696 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 02 23:04:04 crc kubenswrapper[4696]: I1202 23:04:04.738948 4696 scope.go:117] "RemoveContainer" containerID="c0c896373acc60716fa08df85b43ec19fa090f24d6d53ea3a8160c6059e91ab3" Dec 02 23:04:04 crc kubenswrapper[4696]: I1202 23:04:04.739991 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 23:04:04 crc kubenswrapper[4696]: I1202 23:04:04.751102 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 02 23:04:04 crc kubenswrapper[4696]: I1202 23:04:04.771353 4696 scope.go:117] "RemoveContainer" containerID="e611452139f2b7a21a201c54d74e4a891a0ea1668f57ad1b506741fa930b5441" Dec 02 23:04:04 crc kubenswrapper[4696]: I1202 23:04:04.815037 4696 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 02 23:04:04 crc kubenswrapper[4696]: E1202 23:04:04.815582 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e90434a-fd8d-4ee1-8d14-72e82088a882" containerName="ceilometer-notification-agent" Dec 02 23:04:04 crc kubenswrapper[4696]: I1202 23:04:04.815601 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e90434a-fd8d-4ee1-8d14-72e82088a882" containerName="ceilometer-notification-agent" Dec 02 23:04:04 crc kubenswrapper[4696]: E1202 23:04:04.815615 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8677bbf5-15c6-4e35-98ce-ba3b8ccd5a27" containerName="nova-cell0-conductor-db-sync" Dec 02 23:04:04 crc kubenswrapper[4696]: I1202 23:04:04.815622 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="8677bbf5-15c6-4e35-98ce-ba3b8ccd5a27" containerName="nova-cell0-conductor-db-sync" Dec 02 23:04:04 crc kubenswrapper[4696]: E1202 23:04:04.815632 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e90434a-fd8d-4ee1-8d14-72e82088a882" containerName="ceilometer-central-agent" Dec 02 23:04:04 crc kubenswrapper[4696]: I1202 23:04:04.815640 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e90434a-fd8d-4ee1-8d14-72e82088a882" containerName="ceilometer-central-agent" Dec 02 23:04:04 crc kubenswrapper[4696]: E1202 23:04:04.815650 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e90434a-fd8d-4ee1-8d14-72e82088a882" containerName="proxy-httpd" Dec 02 23:04:04 crc kubenswrapper[4696]: I1202 23:04:04.815660 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e90434a-fd8d-4ee1-8d14-72e82088a882" containerName="proxy-httpd" Dec 02 23:04:04 crc kubenswrapper[4696]: E1202 23:04:04.815674 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e90434a-fd8d-4ee1-8d14-72e82088a882" containerName="sg-core" Dec 02 23:04:04 crc kubenswrapper[4696]: I1202 23:04:04.815679 4696 
state_mem.go:107] "Deleted CPUSet assignment" podUID="6e90434a-fd8d-4ee1-8d14-72e82088a882" containerName="sg-core" Dec 02 23:04:04 crc kubenswrapper[4696]: I1202 23:04:04.815883 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e90434a-fd8d-4ee1-8d14-72e82088a882" containerName="proxy-httpd" Dec 02 23:04:04 crc kubenswrapper[4696]: I1202 23:04:04.815904 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e90434a-fd8d-4ee1-8d14-72e82088a882" containerName="sg-core" Dec 02 23:04:04 crc kubenswrapper[4696]: I1202 23:04:04.815912 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e90434a-fd8d-4ee1-8d14-72e82088a882" containerName="ceilometer-notification-agent" Dec 02 23:04:04 crc kubenswrapper[4696]: I1202 23:04:04.815928 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e90434a-fd8d-4ee1-8d14-72e82088a882" containerName="ceilometer-central-agent" Dec 02 23:04:04 crc kubenswrapper[4696]: I1202 23:04:04.815937 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="8677bbf5-15c6-4e35-98ce-ba3b8ccd5a27" containerName="nova-cell0-conductor-db-sync" Dec 02 23:04:04 crc kubenswrapper[4696]: I1202 23:04:04.817849 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 23:04:04 crc kubenswrapper[4696]: I1202 23:04:04.826317 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 23:04:04 crc kubenswrapper[4696]: I1202 23:04:04.829363 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 02 23:04:04 crc kubenswrapper[4696]: I1202 23:04:04.829606 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 02 23:04:04 crc kubenswrapper[4696]: I1202 23:04:04.834108 4696 scope.go:117] "RemoveContainer" containerID="29a240238e5d3c2838168cfdfab98f076907c8679a425eca5fa3f49d70da1584" Dec 02 23:04:04 crc kubenswrapper[4696]: I1202 23:04:04.888286 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 02 23:04:04 crc kubenswrapper[4696]: I1202 23:04:04.890502 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 02 23:04:04 crc kubenswrapper[4696]: I1202 23:04:04.892415 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 02 23:04:04 crc kubenswrapper[4696]: I1202 23:04:04.895365 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-cgfs4" Dec 02 23:04:04 crc kubenswrapper[4696]: I1202 23:04:04.895710 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 02 23:04:04 crc kubenswrapper[4696]: I1202 23:04:04.912663 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82zbb\" (UniqueName: \"kubernetes.io/projected/755cc301-5dfc-49ee-9e12-e98be185d281-kube-api-access-82zbb\") pod \"ceilometer-0\" (UID: \"755cc301-5dfc-49ee-9e12-e98be185d281\") " pod="openstack/ceilometer-0" Dec 02 23:04:04 crc kubenswrapper[4696]: I1202 23:04:04.912823 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/755cc301-5dfc-49ee-9e12-e98be185d281-run-httpd\") pod \"ceilometer-0\" (UID: \"755cc301-5dfc-49ee-9e12-e98be185d281\") " pod="openstack/ceilometer-0" Dec 02 23:04:04 crc kubenswrapper[4696]: I1202 23:04:04.912939 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/755cc301-5dfc-49ee-9e12-e98be185d281-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"755cc301-5dfc-49ee-9e12-e98be185d281\") " pod="openstack/ceilometer-0" Dec 02 23:04:04 crc kubenswrapper[4696]: I1202 23:04:04.912980 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/755cc301-5dfc-49ee-9e12-e98be185d281-log-httpd\") pod \"ceilometer-0\" (UID: \"755cc301-5dfc-49ee-9e12-e98be185d281\") " pod="openstack/ceilometer-0" Dec 02 23:04:04 crc kubenswrapper[4696]: I1202 23:04:04.913152 4696 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/755cc301-5dfc-49ee-9e12-e98be185d281-scripts\") pod \"ceilometer-0\" (UID: \"755cc301-5dfc-49ee-9e12-e98be185d281\") " pod="openstack/ceilometer-0" Dec 02 23:04:04 crc kubenswrapper[4696]: I1202 23:04:04.913221 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/755cc301-5dfc-49ee-9e12-e98be185d281-config-data\") pod \"ceilometer-0\" (UID: \"755cc301-5dfc-49ee-9e12-e98be185d281\") " pod="openstack/ceilometer-0" Dec 02 23:04:04 crc kubenswrapper[4696]: I1202 23:04:04.913255 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/755cc301-5dfc-49ee-9e12-e98be185d281-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"755cc301-5dfc-49ee-9e12-e98be185d281\") " pod="openstack/ceilometer-0" Dec 02 23:04:04 crc kubenswrapper[4696]: I1202 23:04:04.918869 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 02 23:04:04 crc kubenswrapper[4696]: I1202 23:04:04.930497 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 02 23:04:04 crc kubenswrapper[4696]: E1202 23:04:04.931819 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle config-data kube-api-access-7qhvq], unattached volumes=[], failed to process volumes=[combined-ca-bundle config-data kube-api-access-7qhvq]: context canceled" pod="openstack/nova-cell0-conductor-0" podUID="867ad5bc-c92f-45bd-8b7e-10a78c234d89" Dec 02 23:04:05 crc kubenswrapper[4696]: I1202 23:04:05.012238 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 02 23:04:05 crc kubenswrapper[4696]: I1202 
23:04:05.015068 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/755cc301-5dfc-49ee-9e12-e98be185d281-config-data\") pod \"ceilometer-0\" (UID: \"755cc301-5dfc-49ee-9e12-e98be185d281\") " pod="openstack/ceilometer-0" Dec 02 23:04:05 crc kubenswrapper[4696]: I1202 23:04:05.015120 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/755cc301-5dfc-49ee-9e12-e98be185d281-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"755cc301-5dfc-49ee-9e12-e98be185d281\") " pod="openstack/ceilometer-0" Dec 02 23:04:05 crc kubenswrapper[4696]: I1202 23:04:05.015188 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82zbb\" (UniqueName: \"kubernetes.io/projected/755cc301-5dfc-49ee-9e12-e98be185d281-kube-api-access-82zbb\") pod \"ceilometer-0\" (UID: \"755cc301-5dfc-49ee-9e12-e98be185d281\") " pod="openstack/ceilometer-0" Dec 02 23:04:05 crc kubenswrapper[4696]: I1202 23:04:05.015241 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qhvq\" (UniqueName: \"kubernetes.io/projected/867ad5bc-c92f-45bd-8b7e-10a78c234d89-kube-api-access-7qhvq\") pod \"nova-cell0-conductor-0\" (UID: \"867ad5bc-c92f-45bd-8b7e-10a78c234d89\") " pod="openstack/nova-cell0-conductor-0" Dec 02 23:04:05 crc kubenswrapper[4696]: I1202 23:04:05.015274 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/755cc301-5dfc-49ee-9e12-e98be185d281-run-httpd\") pod \"ceilometer-0\" (UID: \"755cc301-5dfc-49ee-9e12-e98be185d281\") " pod="openstack/ceilometer-0" Dec 02 23:04:05 crc kubenswrapper[4696]: I1202 23:04:05.015316 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/755cc301-5dfc-49ee-9e12-e98be185d281-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"755cc301-5dfc-49ee-9e12-e98be185d281\") " pod="openstack/ceilometer-0" Dec 02 23:04:05 crc kubenswrapper[4696]: I1202 23:04:05.015343 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/755cc301-5dfc-49ee-9e12-e98be185d281-log-httpd\") pod \"ceilometer-0\" (UID: \"755cc301-5dfc-49ee-9e12-e98be185d281\") " pod="openstack/ceilometer-0" Dec 02 23:04:05 crc kubenswrapper[4696]: I1202 23:04:05.015366 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/867ad5bc-c92f-45bd-8b7e-10a78c234d89-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"867ad5bc-c92f-45bd-8b7e-10a78c234d89\") " pod="openstack/nova-cell0-conductor-0" Dec 02 23:04:05 crc kubenswrapper[4696]: I1202 23:04:05.015456 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/867ad5bc-c92f-45bd-8b7e-10a78c234d89-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"867ad5bc-c92f-45bd-8b7e-10a78c234d89\") " pod="openstack/nova-cell0-conductor-0" Dec 02 23:04:05 crc kubenswrapper[4696]: I1202 23:04:05.015515 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/755cc301-5dfc-49ee-9e12-e98be185d281-scripts\") pod \"ceilometer-0\" (UID: \"755cc301-5dfc-49ee-9e12-e98be185d281\") " pod="openstack/ceilometer-0" Dec 02 23:04:05 crc kubenswrapper[4696]: I1202 23:04:05.016891 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/755cc301-5dfc-49ee-9e12-e98be185d281-run-httpd\") pod \"ceilometer-0\" (UID: \"755cc301-5dfc-49ee-9e12-e98be185d281\") " pod="openstack/ceilometer-0" Dec 
02 23:04:05 crc kubenswrapper[4696]: I1202 23:04:05.017151 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/755cc301-5dfc-49ee-9e12-e98be185d281-log-httpd\") pod \"ceilometer-0\" (UID: \"755cc301-5dfc-49ee-9e12-e98be185d281\") " pod="openstack/ceilometer-0" Dec 02 23:04:05 crc kubenswrapper[4696]: I1202 23:04:05.022993 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/755cc301-5dfc-49ee-9e12-e98be185d281-config-data\") pod \"ceilometer-0\" (UID: \"755cc301-5dfc-49ee-9e12-e98be185d281\") " pod="openstack/ceilometer-0" Dec 02 23:04:05 crc kubenswrapper[4696]: I1202 23:04:05.023143 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/755cc301-5dfc-49ee-9e12-e98be185d281-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"755cc301-5dfc-49ee-9e12-e98be185d281\") " pod="openstack/ceilometer-0" Dec 02 23:04:05 crc kubenswrapper[4696]: I1202 23:04:05.024851 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/755cc301-5dfc-49ee-9e12-e98be185d281-scripts\") pod \"ceilometer-0\" (UID: \"755cc301-5dfc-49ee-9e12-e98be185d281\") " pod="openstack/ceilometer-0" Dec 02 23:04:05 crc kubenswrapper[4696]: I1202 23:04:05.025174 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/755cc301-5dfc-49ee-9e12-e98be185d281-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"755cc301-5dfc-49ee-9e12-e98be185d281\") " pod="openstack/ceilometer-0" Dec 02 23:04:05 crc kubenswrapper[4696]: I1202 23:04:05.045866 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82zbb\" (UniqueName: \"kubernetes.io/projected/755cc301-5dfc-49ee-9e12-e98be185d281-kube-api-access-82zbb\") pod \"ceilometer-0\" 
(UID: \"755cc301-5dfc-49ee-9e12-e98be185d281\") " pod="openstack/ceilometer-0" Dec 02 23:04:05 crc kubenswrapper[4696]: I1202 23:04:05.117191 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qhvq\" (UniqueName: \"kubernetes.io/projected/867ad5bc-c92f-45bd-8b7e-10a78c234d89-kube-api-access-7qhvq\") pod \"nova-cell0-conductor-0\" (UID: \"867ad5bc-c92f-45bd-8b7e-10a78c234d89\") " pod="openstack/nova-cell0-conductor-0" Dec 02 23:04:05 crc kubenswrapper[4696]: I1202 23:04:05.117285 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/867ad5bc-c92f-45bd-8b7e-10a78c234d89-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"867ad5bc-c92f-45bd-8b7e-10a78c234d89\") " pod="openstack/nova-cell0-conductor-0" Dec 02 23:04:05 crc kubenswrapper[4696]: I1202 23:04:05.117335 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/867ad5bc-c92f-45bd-8b7e-10a78c234d89-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"867ad5bc-c92f-45bd-8b7e-10a78c234d89\") " pod="openstack/nova-cell0-conductor-0" Dec 02 23:04:05 crc kubenswrapper[4696]: I1202 23:04:05.124353 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/867ad5bc-c92f-45bd-8b7e-10a78c234d89-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"867ad5bc-c92f-45bd-8b7e-10a78c234d89\") " pod="openstack/nova-cell0-conductor-0" Dec 02 23:04:05 crc kubenswrapper[4696]: I1202 23:04:05.124581 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/867ad5bc-c92f-45bd-8b7e-10a78c234d89-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"867ad5bc-c92f-45bd-8b7e-10a78c234d89\") " pod="openstack/nova-cell0-conductor-0" Dec 02 23:04:05 crc kubenswrapper[4696]: 
I1202 23:04:05.143899 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qhvq\" (UniqueName: \"kubernetes.io/projected/867ad5bc-c92f-45bd-8b7e-10a78c234d89-kube-api-access-7qhvq\") pod \"nova-cell0-conductor-0\" (UID: \"867ad5bc-c92f-45bd-8b7e-10a78c234d89\") " pod="openstack/nova-cell0-conductor-0" Dec 02 23:04:05 crc kubenswrapper[4696]: I1202 23:04:05.170697 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 23:04:05 crc kubenswrapper[4696]: I1202 23:04:05.451169 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e90434a-fd8d-4ee1-8d14-72e82088a882" path="/var/lib/kubelet/pods/6e90434a-fd8d-4ee1-8d14-72e82088a882/volumes" Dec 02 23:04:05 crc kubenswrapper[4696]: I1202 23:04:05.715236 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 23:04:05 crc kubenswrapper[4696]: I1202 23:04:05.723012 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 02 23:04:05 crc kubenswrapper[4696]: I1202 23:04:05.762336 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 02 23:04:05 crc kubenswrapper[4696]: I1202 23:04:05.834660 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/867ad5bc-c92f-45bd-8b7e-10a78c234d89-config-data\") pod \"867ad5bc-c92f-45bd-8b7e-10a78c234d89\" (UID: \"867ad5bc-c92f-45bd-8b7e-10a78c234d89\") " Dec 02 23:04:05 crc kubenswrapper[4696]: I1202 23:04:05.834827 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/867ad5bc-c92f-45bd-8b7e-10a78c234d89-combined-ca-bundle\") pod \"867ad5bc-c92f-45bd-8b7e-10a78c234d89\" (UID: \"867ad5bc-c92f-45bd-8b7e-10a78c234d89\") " Dec 02 23:04:05 crc kubenswrapper[4696]: I1202 23:04:05.835014 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7qhvq\" (UniqueName: \"kubernetes.io/projected/867ad5bc-c92f-45bd-8b7e-10a78c234d89-kube-api-access-7qhvq\") pod \"867ad5bc-c92f-45bd-8b7e-10a78c234d89\" (UID: \"867ad5bc-c92f-45bd-8b7e-10a78c234d89\") " Dec 02 23:04:05 crc kubenswrapper[4696]: I1202 23:04:05.854664 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/867ad5bc-c92f-45bd-8b7e-10a78c234d89-kube-api-access-7qhvq" (OuterVolumeSpecName: "kube-api-access-7qhvq") pod "867ad5bc-c92f-45bd-8b7e-10a78c234d89" (UID: "867ad5bc-c92f-45bd-8b7e-10a78c234d89"). InnerVolumeSpecName "kube-api-access-7qhvq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:04:05 crc kubenswrapper[4696]: I1202 23:04:05.863650 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/867ad5bc-c92f-45bd-8b7e-10a78c234d89-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "867ad5bc-c92f-45bd-8b7e-10a78c234d89" (UID: "867ad5bc-c92f-45bd-8b7e-10a78c234d89"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:04:05 crc kubenswrapper[4696]: I1202 23:04:05.864038 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/867ad5bc-c92f-45bd-8b7e-10a78c234d89-config-data" (OuterVolumeSpecName: "config-data") pod "867ad5bc-c92f-45bd-8b7e-10a78c234d89" (UID: "867ad5bc-c92f-45bd-8b7e-10a78c234d89"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:04:05 crc kubenswrapper[4696]: I1202 23:04:05.938268 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7qhvq\" (UniqueName: \"kubernetes.io/projected/867ad5bc-c92f-45bd-8b7e-10a78c234d89-kube-api-access-7qhvq\") on node \"crc\" DevicePath \"\"" Dec 02 23:04:05 crc kubenswrapper[4696]: I1202 23:04:05.938420 4696 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/867ad5bc-c92f-45bd-8b7e-10a78c234d89-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 23:04:05 crc kubenswrapper[4696]: I1202 23:04:05.938436 4696 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/867ad5bc-c92f-45bd-8b7e-10a78c234d89-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 23:04:06 crc kubenswrapper[4696]: I1202 23:04:06.734612 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"755cc301-5dfc-49ee-9e12-e98be185d281","Type":"ContainerStarted","Data":"40a6ba9e4f2ff6e18c76f10f150b058d7c307b26d71ac488de3ed8d6ce074eb6"} Dec 02 23:04:06 crc kubenswrapper[4696]: I1202 23:04:06.735171 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"755cc301-5dfc-49ee-9e12-e98be185d281","Type":"ContainerStarted","Data":"eee993b4fa2133891272428d411d6af7d744e9dcbb0a144ddc25d1df2f4ee9be"} Dec 02 23:04:06 crc kubenswrapper[4696]: I1202 23:04:06.734988 4696 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 02 23:04:06 crc kubenswrapper[4696]: I1202 23:04:06.813258 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 02 23:04:06 crc kubenswrapper[4696]: I1202 23:04:06.839980 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 02 23:04:06 crc kubenswrapper[4696]: I1202 23:04:06.850315 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 02 23:04:06 crc kubenswrapper[4696]: I1202 23:04:06.851933 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 02 23:04:06 crc kubenswrapper[4696]: I1202 23:04:06.856970 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-cgfs4" Dec 02 23:04:06 crc kubenswrapper[4696]: I1202 23:04:06.858057 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 02 23:04:06 crc kubenswrapper[4696]: I1202 23:04:06.859429 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 02 23:04:06 crc kubenswrapper[4696]: I1202 23:04:06.967082 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5mzl\" (UniqueName: \"kubernetes.io/projected/5f458554-1460-4379-95af-2313d4df2320-kube-api-access-l5mzl\") pod \"nova-cell0-conductor-0\" (UID: \"5f458554-1460-4379-95af-2313d4df2320\") " pod="openstack/nova-cell0-conductor-0" Dec 02 23:04:06 crc kubenswrapper[4696]: I1202 23:04:06.967183 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f458554-1460-4379-95af-2313d4df2320-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"5f458554-1460-4379-95af-2313d4df2320\") " 
pod="openstack/nova-cell0-conductor-0" Dec 02 23:04:06 crc kubenswrapper[4696]: I1202 23:04:06.967228 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f458554-1460-4379-95af-2313d4df2320-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"5f458554-1460-4379-95af-2313d4df2320\") " pod="openstack/nova-cell0-conductor-0" Dec 02 23:04:07 crc kubenswrapper[4696]: I1202 23:04:07.050719 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 23:04:07 crc kubenswrapper[4696]: I1202 23:04:07.068783 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5mzl\" (UniqueName: \"kubernetes.io/projected/5f458554-1460-4379-95af-2313d4df2320-kube-api-access-l5mzl\") pod \"nova-cell0-conductor-0\" (UID: \"5f458554-1460-4379-95af-2313d4df2320\") " pod="openstack/nova-cell0-conductor-0" Dec 02 23:04:07 crc kubenswrapper[4696]: I1202 23:04:07.068880 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f458554-1460-4379-95af-2313d4df2320-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"5f458554-1460-4379-95af-2313d4df2320\") " pod="openstack/nova-cell0-conductor-0" Dec 02 23:04:07 crc kubenswrapper[4696]: I1202 23:04:07.068915 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f458554-1460-4379-95af-2313d4df2320-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"5f458554-1460-4379-95af-2313d4df2320\") " pod="openstack/nova-cell0-conductor-0" Dec 02 23:04:07 crc kubenswrapper[4696]: I1202 23:04:07.074135 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f458554-1460-4379-95af-2313d4df2320-config-data\") pod \"nova-cell0-conductor-0\" (UID: 
\"5f458554-1460-4379-95af-2313d4df2320\") " pod="openstack/nova-cell0-conductor-0" Dec 02 23:04:07 crc kubenswrapper[4696]: I1202 23:04:07.075473 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f458554-1460-4379-95af-2313d4df2320-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"5f458554-1460-4379-95af-2313d4df2320\") " pod="openstack/nova-cell0-conductor-0" Dec 02 23:04:07 crc kubenswrapper[4696]: I1202 23:04:07.090241 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5mzl\" (UniqueName: \"kubernetes.io/projected/5f458554-1460-4379-95af-2313d4df2320-kube-api-access-l5mzl\") pod \"nova-cell0-conductor-0\" (UID: \"5f458554-1460-4379-95af-2313d4df2320\") " pod="openstack/nova-cell0-conductor-0" Dec 02 23:04:07 crc kubenswrapper[4696]: I1202 23:04:07.180588 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 02 23:04:07 crc kubenswrapper[4696]: I1202 23:04:07.452909 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="867ad5bc-c92f-45bd-8b7e-10a78c234d89" path="/var/lib/kubelet/pods/867ad5bc-c92f-45bd-8b7e-10a78c234d89/volumes" Dec 02 23:04:07 crc kubenswrapper[4696]: I1202 23:04:07.713493 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 02 23:04:07 crc kubenswrapper[4696]: W1202 23:04:07.730036 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5f458554_1460_4379_95af_2313d4df2320.slice/crio-2928d639f6416fdf44ab94df98335f03f492c262b88b21130a8bb2b3ddb7e561 WatchSource:0}: Error finding container 2928d639f6416fdf44ab94df98335f03f492c262b88b21130a8bb2b3ddb7e561: Status 404 returned error can't find the container with id 2928d639f6416fdf44ab94df98335f03f492c262b88b21130a8bb2b3ddb7e561 Dec 02 23:04:07 crc 
kubenswrapper[4696]: I1202 23:04:07.745919 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"5f458554-1460-4379-95af-2313d4df2320","Type":"ContainerStarted","Data":"2928d639f6416fdf44ab94df98335f03f492c262b88b21130a8bb2b3ddb7e561"} Dec 02 23:04:07 crc kubenswrapper[4696]: I1202 23:04:07.752448 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"755cc301-5dfc-49ee-9e12-e98be185d281","Type":"ContainerStarted","Data":"00c5154ccf67e9ff6ab23c50e54bd235997d7ea7b3efbd7f3de63d8742639f75"} Dec 02 23:04:08 crc kubenswrapper[4696]: I1202 23:04:08.765886 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"5f458554-1460-4379-95af-2313d4df2320","Type":"ContainerStarted","Data":"38c32adc4bc2b07be5131a3bf2509616ed97bbf76241ed5be9b075960ba8e3e6"} Dec 02 23:04:08 crc kubenswrapper[4696]: I1202 23:04:08.766776 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Dec 02 23:04:08 crc kubenswrapper[4696]: I1202 23:04:08.768875 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"755cc301-5dfc-49ee-9e12-e98be185d281","Type":"ContainerStarted","Data":"a212426c6ab38cfc14b34f732447e79a4eaddf537b9613b412d9358958dff5c0"} Dec 02 23:04:10 crc kubenswrapper[4696]: I1202 23:04:10.355298 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 02 23:04:10 crc kubenswrapper[4696]: I1202 23:04:10.359123 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 02 23:04:10 crc kubenswrapper[4696]: I1202 23:04:10.395229 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 02 23:04:10 crc kubenswrapper[4696]: I1202 23:04:10.400713 4696 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 02 23:04:10 crc kubenswrapper[4696]: I1202 23:04:10.429604 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=4.429580787 podStartE2EDuration="4.429580787s" podCreationTimestamp="2025-12-02 23:04:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:04:08.793244035 +0000 UTC m=+1311.673924036" watchObservedRunningTime="2025-12-02 23:04:10.429580787 +0000 UTC m=+1313.310260788" Dec 02 23:04:10 crc kubenswrapper[4696]: I1202 23:04:10.795587 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"755cc301-5dfc-49ee-9e12-e98be185d281","Type":"ContainerStarted","Data":"77865a9873cb439b8707070df483057f367e80b16daa6fc022efd5b0f40ad78b"} Dec 02 23:04:10 crc kubenswrapper[4696]: I1202 23:04:10.795963 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 02 23:04:10 crc kubenswrapper[4696]: I1202 23:04:10.796012 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 02 23:04:10 crc kubenswrapper[4696]: I1202 23:04:10.796071 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="755cc301-5dfc-49ee-9e12-e98be185d281" containerName="sg-core" containerID="cri-o://a212426c6ab38cfc14b34f732447e79a4eaddf537b9613b412d9358958dff5c0" gracePeriod=30 Dec 02 23:04:10 crc kubenswrapper[4696]: I1202 23:04:10.796069 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="755cc301-5dfc-49ee-9e12-e98be185d281" containerName="ceilometer-notification-agent" 
containerID="cri-o://00c5154ccf67e9ff6ab23c50e54bd235997d7ea7b3efbd7f3de63d8742639f75" gracePeriod=30 Dec 02 23:04:10 crc kubenswrapper[4696]: I1202 23:04:10.796191 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="755cc301-5dfc-49ee-9e12-e98be185d281" containerName="proxy-httpd" containerID="cri-o://77865a9873cb439b8707070df483057f367e80b16daa6fc022efd5b0f40ad78b" gracePeriod=30 Dec 02 23:04:10 crc kubenswrapper[4696]: I1202 23:04:10.796377 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="755cc301-5dfc-49ee-9e12-e98be185d281" containerName="ceilometer-central-agent" containerID="cri-o://40a6ba9e4f2ff6e18c76f10f150b058d7c307b26d71ac488de3ed8d6ce074eb6" gracePeriod=30 Dec 02 23:04:10 crc kubenswrapper[4696]: I1202 23:04:10.831990 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.941950725 podStartE2EDuration="6.831959232s" podCreationTimestamp="2025-12-02 23:04:04 +0000 UTC" firstStartedPulling="2025-12-02 23:04:05.731652781 +0000 UTC m=+1308.612332782" lastFinishedPulling="2025-12-02 23:04:09.621661298 +0000 UTC m=+1312.502341289" observedRunningTime="2025-12-02 23:04:10.822196464 +0000 UTC m=+1313.702876465" watchObservedRunningTime="2025-12-02 23:04:10.831959232 +0000 UTC m=+1313.712639223" Dec 02 23:04:11 crc kubenswrapper[4696]: I1202 23:04:11.811394 4696 generic.go:334] "Generic (PLEG): container finished" podID="755cc301-5dfc-49ee-9e12-e98be185d281" containerID="77865a9873cb439b8707070df483057f367e80b16daa6fc022efd5b0f40ad78b" exitCode=0 Dec 02 23:04:11 crc kubenswrapper[4696]: I1202 23:04:11.811506 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"755cc301-5dfc-49ee-9e12-e98be185d281","Type":"ContainerDied","Data":"77865a9873cb439b8707070df483057f367e80b16daa6fc022efd5b0f40ad78b"} Dec 02 23:04:12 crc 
kubenswrapper[4696]: I1202 23:04:12.744452 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 02 23:04:12 crc kubenswrapper[4696]: I1202 23:04:12.760169 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 02 23:04:12 crc kubenswrapper[4696]: I1202 23:04:12.844037 4696 generic.go:334] "Generic (PLEG): container finished" podID="755cc301-5dfc-49ee-9e12-e98be185d281" containerID="a212426c6ab38cfc14b34f732447e79a4eaddf537b9613b412d9358958dff5c0" exitCode=2 Dec 02 23:04:12 crc kubenswrapper[4696]: I1202 23:04:12.844090 4696 generic.go:334] "Generic (PLEG): container finished" podID="755cc301-5dfc-49ee-9e12-e98be185d281" containerID="00c5154ccf67e9ff6ab23c50e54bd235997d7ea7b3efbd7f3de63d8742639f75" exitCode=0 Dec 02 23:04:12 crc kubenswrapper[4696]: I1202 23:04:12.844124 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"755cc301-5dfc-49ee-9e12-e98be185d281","Type":"ContainerDied","Data":"a212426c6ab38cfc14b34f732447e79a4eaddf537b9613b412d9358958dff5c0"} Dec 02 23:04:12 crc kubenswrapper[4696]: I1202 23:04:12.844195 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"755cc301-5dfc-49ee-9e12-e98be185d281","Type":"ContainerDied","Data":"00c5154ccf67e9ff6ab23c50e54bd235997d7ea7b3efbd7f3de63d8742639f75"} Dec 02 23:04:13 crc kubenswrapper[4696]: I1202 23:04:13.873648 4696 generic.go:334] "Generic (PLEG): container finished" podID="755cc301-5dfc-49ee-9e12-e98be185d281" containerID="40a6ba9e4f2ff6e18c76f10f150b058d7c307b26d71ac488de3ed8d6ce074eb6" exitCode=0 Dec 02 23:04:13 crc kubenswrapper[4696]: I1202 23:04:13.873763 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"755cc301-5dfc-49ee-9e12-e98be185d281","Type":"ContainerDied","Data":"40a6ba9e4f2ff6e18c76f10f150b058d7c307b26d71ac488de3ed8d6ce074eb6"} Dec 02 23:04:14 crc kubenswrapper[4696]: I1202 23:04:14.211385 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 23:04:14 crc kubenswrapper[4696]: I1202 23:04:14.340728 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-82zbb\" (UniqueName: \"kubernetes.io/projected/755cc301-5dfc-49ee-9e12-e98be185d281-kube-api-access-82zbb\") pod \"755cc301-5dfc-49ee-9e12-e98be185d281\" (UID: \"755cc301-5dfc-49ee-9e12-e98be185d281\") " Dec 02 23:04:14 crc kubenswrapper[4696]: I1202 23:04:14.341031 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/755cc301-5dfc-49ee-9e12-e98be185d281-config-data\") pod \"755cc301-5dfc-49ee-9e12-e98be185d281\" (UID: \"755cc301-5dfc-49ee-9e12-e98be185d281\") " Dec 02 23:04:14 crc kubenswrapper[4696]: I1202 23:04:14.341090 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/755cc301-5dfc-49ee-9e12-e98be185d281-sg-core-conf-yaml\") pod \"755cc301-5dfc-49ee-9e12-e98be185d281\" (UID: \"755cc301-5dfc-49ee-9e12-e98be185d281\") " Dec 02 23:04:14 crc kubenswrapper[4696]: I1202 23:04:14.341136 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/755cc301-5dfc-49ee-9e12-e98be185d281-run-httpd\") pod \"755cc301-5dfc-49ee-9e12-e98be185d281\" (UID: \"755cc301-5dfc-49ee-9e12-e98be185d281\") " Dec 02 23:04:14 crc kubenswrapper[4696]: I1202 23:04:14.341170 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/755cc301-5dfc-49ee-9e12-e98be185d281-log-httpd\") pod 
\"755cc301-5dfc-49ee-9e12-e98be185d281\" (UID: \"755cc301-5dfc-49ee-9e12-e98be185d281\") " Dec 02 23:04:14 crc kubenswrapper[4696]: I1202 23:04:14.341293 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/755cc301-5dfc-49ee-9e12-e98be185d281-scripts\") pod \"755cc301-5dfc-49ee-9e12-e98be185d281\" (UID: \"755cc301-5dfc-49ee-9e12-e98be185d281\") " Dec 02 23:04:14 crc kubenswrapper[4696]: I1202 23:04:14.341338 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/755cc301-5dfc-49ee-9e12-e98be185d281-combined-ca-bundle\") pod \"755cc301-5dfc-49ee-9e12-e98be185d281\" (UID: \"755cc301-5dfc-49ee-9e12-e98be185d281\") " Dec 02 23:04:14 crc kubenswrapper[4696]: I1202 23:04:14.341608 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/755cc301-5dfc-49ee-9e12-e98be185d281-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "755cc301-5dfc-49ee-9e12-e98be185d281" (UID: "755cc301-5dfc-49ee-9e12-e98be185d281"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:04:14 crc kubenswrapper[4696]: I1202 23:04:14.341645 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/755cc301-5dfc-49ee-9e12-e98be185d281-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "755cc301-5dfc-49ee-9e12-e98be185d281" (UID: "755cc301-5dfc-49ee-9e12-e98be185d281"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:04:14 crc kubenswrapper[4696]: I1202 23:04:14.343501 4696 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/755cc301-5dfc-49ee-9e12-e98be185d281-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 23:04:14 crc kubenswrapper[4696]: I1202 23:04:14.343698 4696 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/755cc301-5dfc-49ee-9e12-e98be185d281-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 23:04:14 crc kubenswrapper[4696]: I1202 23:04:14.352031 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/755cc301-5dfc-49ee-9e12-e98be185d281-kube-api-access-82zbb" (OuterVolumeSpecName: "kube-api-access-82zbb") pod "755cc301-5dfc-49ee-9e12-e98be185d281" (UID: "755cc301-5dfc-49ee-9e12-e98be185d281"). InnerVolumeSpecName "kube-api-access-82zbb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:04:14 crc kubenswrapper[4696]: I1202 23:04:14.358100 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/755cc301-5dfc-49ee-9e12-e98be185d281-scripts" (OuterVolumeSpecName: "scripts") pod "755cc301-5dfc-49ee-9e12-e98be185d281" (UID: "755cc301-5dfc-49ee-9e12-e98be185d281"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:04:14 crc kubenswrapper[4696]: I1202 23:04:14.386842 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/755cc301-5dfc-49ee-9e12-e98be185d281-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "755cc301-5dfc-49ee-9e12-e98be185d281" (UID: "755cc301-5dfc-49ee-9e12-e98be185d281"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:04:14 crc kubenswrapper[4696]: I1202 23:04:14.442071 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/755cc301-5dfc-49ee-9e12-e98be185d281-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "755cc301-5dfc-49ee-9e12-e98be185d281" (UID: "755cc301-5dfc-49ee-9e12-e98be185d281"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:04:14 crc kubenswrapper[4696]: I1202 23:04:14.447809 4696 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/755cc301-5dfc-49ee-9e12-e98be185d281-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 23:04:14 crc kubenswrapper[4696]: I1202 23:04:14.448106 4696 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/755cc301-5dfc-49ee-9e12-e98be185d281-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 23:04:14 crc kubenswrapper[4696]: I1202 23:04:14.448255 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-82zbb\" (UniqueName: \"kubernetes.io/projected/755cc301-5dfc-49ee-9e12-e98be185d281-kube-api-access-82zbb\") on node \"crc\" DevicePath \"\"" Dec 02 23:04:14 crc kubenswrapper[4696]: I1202 23:04:14.448392 4696 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/755cc301-5dfc-49ee-9e12-e98be185d281-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 02 23:04:14 crc kubenswrapper[4696]: I1202 23:04:14.472201 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/755cc301-5dfc-49ee-9e12-e98be185d281-config-data" (OuterVolumeSpecName: "config-data") pod "755cc301-5dfc-49ee-9e12-e98be185d281" (UID: "755cc301-5dfc-49ee-9e12-e98be185d281"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:04:14 crc kubenswrapper[4696]: I1202 23:04:14.550543 4696 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/755cc301-5dfc-49ee-9e12-e98be185d281-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 23:04:14 crc kubenswrapper[4696]: I1202 23:04:14.893163 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"755cc301-5dfc-49ee-9e12-e98be185d281","Type":"ContainerDied","Data":"eee993b4fa2133891272428d411d6af7d744e9dcbb0a144ddc25d1df2f4ee9be"} Dec 02 23:04:14 crc kubenswrapper[4696]: I1202 23:04:14.893255 4696 scope.go:117] "RemoveContainer" containerID="77865a9873cb439b8707070df483057f367e80b16daa6fc022efd5b0f40ad78b" Dec 02 23:04:14 crc kubenswrapper[4696]: I1202 23:04:14.893279 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 23:04:14 crc kubenswrapper[4696]: I1202 23:04:14.964908 4696 scope.go:117] "RemoveContainer" containerID="a212426c6ab38cfc14b34f732447e79a4eaddf537b9613b412d9358958dff5c0" Dec 02 23:04:14 crc kubenswrapper[4696]: I1202 23:04:14.971223 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 23:04:14 crc kubenswrapper[4696]: I1202 23:04:14.980736 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 02 23:04:14 crc kubenswrapper[4696]: I1202 23:04:14.996518 4696 scope.go:117] "RemoveContainer" containerID="00c5154ccf67e9ff6ab23c50e54bd235997d7ea7b3efbd7f3de63d8742639f75" Dec 02 23:04:14 crc kubenswrapper[4696]: I1202 23:04:14.997629 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 02 23:04:14 crc kubenswrapper[4696]: E1202 23:04:14.998117 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="755cc301-5dfc-49ee-9e12-e98be185d281" containerName="proxy-httpd" Dec 02 23:04:14 crc 
kubenswrapper[4696]: I1202 23:04:14.998137 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="755cc301-5dfc-49ee-9e12-e98be185d281" containerName="proxy-httpd" Dec 02 23:04:14 crc kubenswrapper[4696]: E1202 23:04:14.998150 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="755cc301-5dfc-49ee-9e12-e98be185d281" containerName="ceilometer-notification-agent" Dec 02 23:04:14 crc kubenswrapper[4696]: I1202 23:04:14.998158 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="755cc301-5dfc-49ee-9e12-e98be185d281" containerName="ceilometer-notification-agent" Dec 02 23:04:14 crc kubenswrapper[4696]: E1202 23:04:14.998187 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="755cc301-5dfc-49ee-9e12-e98be185d281" containerName="sg-core" Dec 02 23:04:14 crc kubenswrapper[4696]: I1202 23:04:14.998194 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="755cc301-5dfc-49ee-9e12-e98be185d281" containerName="sg-core" Dec 02 23:04:14 crc kubenswrapper[4696]: E1202 23:04:14.998204 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="755cc301-5dfc-49ee-9e12-e98be185d281" containerName="ceilometer-central-agent" Dec 02 23:04:14 crc kubenswrapper[4696]: I1202 23:04:14.998210 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="755cc301-5dfc-49ee-9e12-e98be185d281" containerName="ceilometer-central-agent" Dec 02 23:04:14 crc kubenswrapper[4696]: I1202 23:04:14.998416 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="755cc301-5dfc-49ee-9e12-e98be185d281" containerName="sg-core" Dec 02 23:04:14 crc kubenswrapper[4696]: I1202 23:04:14.998437 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="755cc301-5dfc-49ee-9e12-e98be185d281" containerName="ceilometer-central-agent" Dec 02 23:04:14 crc kubenswrapper[4696]: I1202 23:04:14.998446 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="755cc301-5dfc-49ee-9e12-e98be185d281" 
containerName="ceilometer-notification-agent" Dec 02 23:04:14 crc kubenswrapper[4696]: I1202 23:04:14.998455 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="755cc301-5dfc-49ee-9e12-e98be185d281" containerName="proxy-httpd" Dec 02 23:04:15 crc kubenswrapper[4696]: I1202 23:04:15.001144 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 23:04:15 crc kubenswrapper[4696]: I1202 23:04:15.005498 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 02 23:04:15 crc kubenswrapper[4696]: I1202 23:04:15.005755 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 02 23:04:15 crc kubenswrapper[4696]: I1202 23:04:15.018852 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 23:04:15 crc kubenswrapper[4696]: I1202 23:04:15.043836 4696 scope.go:117] "RemoveContainer" containerID="40a6ba9e4f2ff6e18c76f10f150b058d7c307b26d71ac488de3ed8d6ce074eb6" Dec 02 23:04:15 crc kubenswrapper[4696]: I1202 23:04:15.062847 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4nqp\" (UniqueName: \"kubernetes.io/projected/30aba68b-e85d-42d1-819a-2b8cb0886659-kube-api-access-z4nqp\") pod \"ceilometer-0\" (UID: \"30aba68b-e85d-42d1-819a-2b8cb0886659\") " pod="openstack/ceilometer-0" Dec 02 23:04:15 crc kubenswrapper[4696]: I1202 23:04:15.062905 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/30aba68b-e85d-42d1-819a-2b8cb0886659-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"30aba68b-e85d-42d1-819a-2b8cb0886659\") " pod="openstack/ceilometer-0" Dec 02 23:04:15 crc kubenswrapper[4696]: I1202 23:04:15.063107 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/30aba68b-e85d-42d1-819a-2b8cb0886659-log-httpd\") pod \"ceilometer-0\" (UID: \"30aba68b-e85d-42d1-819a-2b8cb0886659\") " pod="openstack/ceilometer-0" Dec 02 23:04:15 crc kubenswrapper[4696]: I1202 23:04:15.063190 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/30aba68b-e85d-42d1-819a-2b8cb0886659-run-httpd\") pod \"ceilometer-0\" (UID: \"30aba68b-e85d-42d1-819a-2b8cb0886659\") " pod="openstack/ceilometer-0" Dec 02 23:04:15 crc kubenswrapper[4696]: I1202 23:04:15.063222 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30aba68b-e85d-42d1-819a-2b8cb0886659-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"30aba68b-e85d-42d1-819a-2b8cb0886659\") " pod="openstack/ceilometer-0" Dec 02 23:04:15 crc kubenswrapper[4696]: I1202 23:04:15.063289 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30aba68b-e85d-42d1-819a-2b8cb0886659-scripts\") pod \"ceilometer-0\" (UID: \"30aba68b-e85d-42d1-819a-2b8cb0886659\") " pod="openstack/ceilometer-0" Dec 02 23:04:15 crc kubenswrapper[4696]: I1202 23:04:15.063366 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30aba68b-e85d-42d1-819a-2b8cb0886659-config-data\") pod \"ceilometer-0\" (UID: \"30aba68b-e85d-42d1-819a-2b8cb0886659\") " pod="openstack/ceilometer-0" Dec 02 23:04:15 crc kubenswrapper[4696]: I1202 23:04:15.165869 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4nqp\" (UniqueName: \"kubernetes.io/projected/30aba68b-e85d-42d1-819a-2b8cb0886659-kube-api-access-z4nqp\") pod \"ceilometer-0\" (UID: 
\"30aba68b-e85d-42d1-819a-2b8cb0886659\") " pod="openstack/ceilometer-0" Dec 02 23:04:15 crc kubenswrapper[4696]: I1202 23:04:15.166477 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/30aba68b-e85d-42d1-819a-2b8cb0886659-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"30aba68b-e85d-42d1-819a-2b8cb0886659\") " pod="openstack/ceilometer-0" Dec 02 23:04:15 crc kubenswrapper[4696]: I1202 23:04:15.166640 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/30aba68b-e85d-42d1-819a-2b8cb0886659-log-httpd\") pod \"ceilometer-0\" (UID: \"30aba68b-e85d-42d1-819a-2b8cb0886659\") " pod="openstack/ceilometer-0" Dec 02 23:04:15 crc kubenswrapper[4696]: I1202 23:04:15.166699 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/30aba68b-e85d-42d1-819a-2b8cb0886659-run-httpd\") pod \"ceilometer-0\" (UID: \"30aba68b-e85d-42d1-819a-2b8cb0886659\") " pod="openstack/ceilometer-0" Dec 02 23:04:15 crc kubenswrapper[4696]: I1202 23:04:15.166734 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30aba68b-e85d-42d1-819a-2b8cb0886659-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"30aba68b-e85d-42d1-819a-2b8cb0886659\") " pod="openstack/ceilometer-0" Dec 02 23:04:15 crc kubenswrapper[4696]: I1202 23:04:15.166836 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30aba68b-e85d-42d1-819a-2b8cb0886659-scripts\") pod \"ceilometer-0\" (UID: \"30aba68b-e85d-42d1-819a-2b8cb0886659\") " pod="openstack/ceilometer-0" Dec 02 23:04:15 crc kubenswrapper[4696]: I1202 23:04:15.166909 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/30aba68b-e85d-42d1-819a-2b8cb0886659-config-data\") pod \"ceilometer-0\" (UID: \"30aba68b-e85d-42d1-819a-2b8cb0886659\") " pod="openstack/ceilometer-0" Dec 02 23:04:15 crc kubenswrapper[4696]: I1202 23:04:15.167272 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/30aba68b-e85d-42d1-819a-2b8cb0886659-log-httpd\") pod \"ceilometer-0\" (UID: \"30aba68b-e85d-42d1-819a-2b8cb0886659\") " pod="openstack/ceilometer-0" Dec 02 23:04:15 crc kubenswrapper[4696]: I1202 23:04:15.167772 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/30aba68b-e85d-42d1-819a-2b8cb0886659-run-httpd\") pod \"ceilometer-0\" (UID: \"30aba68b-e85d-42d1-819a-2b8cb0886659\") " pod="openstack/ceilometer-0" Dec 02 23:04:15 crc kubenswrapper[4696]: I1202 23:04:15.172524 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/30aba68b-e85d-42d1-819a-2b8cb0886659-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"30aba68b-e85d-42d1-819a-2b8cb0886659\") " pod="openstack/ceilometer-0" Dec 02 23:04:15 crc kubenswrapper[4696]: I1202 23:04:15.180026 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30aba68b-e85d-42d1-819a-2b8cb0886659-config-data\") pod \"ceilometer-0\" (UID: \"30aba68b-e85d-42d1-819a-2b8cb0886659\") " pod="openstack/ceilometer-0" Dec 02 23:04:15 crc kubenswrapper[4696]: I1202 23:04:15.183939 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30aba68b-e85d-42d1-819a-2b8cb0886659-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"30aba68b-e85d-42d1-819a-2b8cb0886659\") " pod="openstack/ceilometer-0" Dec 02 23:04:15 crc kubenswrapper[4696]: I1202 23:04:15.184207 4696 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30aba68b-e85d-42d1-819a-2b8cb0886659-scripts\") pod \"ceilometer-0\" (UID: \"30aba68b-e85d-42d1-819a-2b8cb0886659\") " pod="openstack/ceilometer-0" Dec 02 23:04:15 crc kubenswrapper[4696]: I1202 23:04:15.186851 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4nqp\" (UniqueName: \"kubernetes.io/projected/30aba68b-e85d-42d1-819a-2b8cb0886659-kube-api-access-z4nqp\") pod \"ceilometer-0\" (UID: \"30aba68b-e85d-42d1-819a-2b8cb0886659\") " pod="openstack/ceilometer-0" Dec 02 23:04:15 crc kubenswrapper[4696]: I1202 23:04:15.329894 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 23:04:15 crc kubenswrapper[4696]: I1202 23:04:15.454055 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="755cc301-5dfc-49ee-9e12-e98be185d281" path="/var/lib/kubelet/pods/755cc301-5dfc-49ee-9e12-e98be185d281/volumes" Dec 02 23:04:15 crc kubenswrapper[4696]: I1202 23:04:15.799516 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 23:04:15 crc kubenswrapper[4696]: W1202 23:04:15.803703 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod30aba68b_e85d_42d1_819a_2b8cb0886659.slice/crio-9ff4a1e8e3a4b226aa43cba6435d5ed07dcfaf0b2d04b37fa3787e4e1ff1af77 WatchSource:0}: Error finding container 9ff4a1e8e3a4b226aa43cba6435d5ed07dcfaf0b2d04b37fa3787e4e1ff1af77: Status 404 returned error can't find the container with id 9ff4a1e8e3a4b226aa43cba6435d5ed07dcfaf0b2d04b37fa3787e4e1ff1af77 Dec 02 23:04:15 crc kubenswrapper[4696]: I1202 23:04:15.915932 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"30aba68b-e85d-42d1-819a-2b8cb0886659","Type":"ContainerStarted","Data":"9ff4a1e8e3a4b226aa43cba6435d5ed07dcfaf0b2d04b37fa3787e4e1ff1af77"} Dec 02 23:04:16 crc kubenswrapper[4696]: I1202 23:04:16.929212 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"30aba68b-e85d-42d1-819a-2b8cb0886659","Type":"ContainerStarted","Data":"ef4edf254ee03442f99cbf1467dfd884e0f982edb02e4346a92264c5c65dd8c9"} Dec 02 23:04:17 crc kubenswrapper[4696]: I1202 23:04:17.231720 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Dec 02 23:04:17 crc kubenswrapper[4696]: I1202 23:04:17.731778 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-gfgfp"] Dec 02 23:04:17 crc kubenswrapper[4696]: I1202 23:04:17.733796 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-gfgfp" Dec 02 23:04:17 crc kubenswrapper[4696]: I1202 23:04:17.740158 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Dec 02 23:04:17 crc kubenswrapper[4696]: I1202 23:04:17.741034 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Dec 02 23:04:17 crc kubenswrapper[4696]: I1202 23:04:17.742841 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-gfgfp"] Dec 02 23:04:17 crc kubenswrapper[4696]: I1202 23:04:17.833919 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c71f81a-6ed2-41fa-9600-f5afbeee2653-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-gfgfp\" (UID: \"3c71f81a-6ed2-41fa-9600-f5afbeee2653\") " pod="openstack/nova-cell0-cell-mapping-gfgfp" Dec 02 23:04:17 crc kubenswrapper[4696]: I1202 23:04:17.834400 4696 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c71f81a-6ed2-41fa-9600-f5afbeee2653-scripts\") pod \"nova-cell0-cell-mapping-gfgfp\" (UID: \"3c71f81a-6ed2-41fa-9600-f5afbeee2653\") " pod="openstack/nova-cell0-cell-mapping-gfgfp" Dec 02 23:04:17 crc kubenswrapper[4696]: I1202 23:04:17.834463 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7kt6l\" (UniqueName: \"kubernetes.io/projected/3c71f81a-6ed2-41fa-9600-f5afbeee2653-kube-api-access-7kt6l\") pod \"nova-cell0-cell-mapping-gfgfp\" (UID: \"3c71f81a-6ed2-41fa-9600-f5afbeee2653\") " pod="openstack/nova-cell0-cell-mapping-gfgfp" Dec 02 23:04:17 crc kubenswrapper[4696]: I1202 23:04:17.834538 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c71f81a-6ed2-41fa-9600-f5afbeee2653-config-data\") pod \"nova-cell0-cell-mapping-gfgfp\" (UID: \"3c71f81a-6ed2-41fa-9600-f5afbeee2653\") " pod="openstack/nova-cell0-cell-mapping-gfgfp" Dec 02 23:04:17 crc kubenswrapper[4696]: I1202 23:04:17.909944 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 02 23:04:17 crc kubenswrapper[4696]: I1202 23:04:17.912074 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 02 23:04:17 crc kubenswrapper[4696]: I1202 23:04:17.916805 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 02 23:04:17 crc kubenswrapper[4696]: I1202 23:04:17.930266 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 02 23:04:17 crc kubenswrapper[4696]: I1202 23:04:17.939376 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c71f81a-6ed2-41fa-9600-f5afbeee2653-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-gfgfp\" (UID: \"3c71f81a-6ed2-41fa-9600-f5afbeee2653\") " pod="openstack/nova-cell0-cell-mapping-gfgfp" Dec 02 23:04:17 crc kubenswrapper[4696]: I1202 23:04:17.939444 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c71f81a-6ed2-41fa-9600-f5afbeee2653-scripts\") pod \"nova-cell0-cell-mapping-gfgfp\" (UID: \"3c71f81a-6ed2-41fa-9600-f5afbeee2653\") " pod="openstack/nova-cell0-cell-mapping-gfgfp" Dec 02 23:04:17 crc kubenswrapper[4696]: I1202 23:04:17.939500 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7kt6l\" (UniqueName: \"kubernetes.io/projected/3c71f81a-6ed2-41fa-9600-f5afbeee2653-kube-api-access-7kt6l\") pod \"nova-cell0-cell-mapping-gfgfp\" (UID: \"3c71f81a-6ed2-41fa-9600-f5afbeee2653\") " pod="openstack/nova-cell0-cell-mapping-gfgfp" Dec 02 23:04:17 crc kubenswrapper[4696]: I1202 23:04:17.939553 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c71f81a-6ed2-41fa-9600-f5afbeee2653-config-data\") pod \"nova-cell0-cell-mapping-gfgfp\" (UID: \"3c71f81a-6ed2-41fa-9600-f5afbeee2653\") " pod="openstack/nova-cell0-cell-mapping-gfgfp" Dec 02 23:04:17 crc kubenswrapper[4696]: I1202 23:04:17.955597 4696 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c71f81a-6ed2-41fa-9600-f5afbeee2653-scripts\") pod \"nova-cell0-cell-mapping-gfgfp\" (UID: \"3c71f81a-6ed2-41fa-9600-f5afbeee2653\") " pod="openstack/nova-cell0-cell-mapping-gfgfp" Dec 02 23:04:17 crc kubenswrapper[4696]: I1202 23:04:17.970716 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c71f81a-6ed2-41fa-9600-f5afbeee2653-config-data\") pod \"nova-cell0-cell-mapping-gfgfp\" (UID: \"3c71f81a-6ed2-41fa-9600-f5afbeee2653\") " pod="openstack/nova-cell0-cell-mapping-gfgfp" Dec 02 23:04:18 crc kubenswrapper[4696]: I1202 23:04:18.023265 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c71f81a-6ed2-41fa-9600-f5afbeee2653-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-gfgfp\" (UID: \"3c71f81a-6ed2-41fa-9600-f5afbeee2653\") " pod="openstack/nova-cell0-cell-mapping-gfgfp" Dec 02 23:04:18 crc kubenswrapper[4696]: I1202 23:04:18.169119 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"30aba68b-e85d-42d1-819a-2b8cb0886659","Type":"ContainerStarted","Data":"e7b7779ebf9f81366c8b85d425c4b48d4b5c7513fbedfd3340efc3063e33f002"} Dec 02 23:04:18 crc kubenswrapper[4696]: I1202 23:04:18.250934 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d013870f-cb7b-4854-82d2-80c8e35701e9-logs\") pod \"nova-api-0\" (UID: \"d013870f-cb7b-4854-82d2-80c8e35701e9\") " pod="openstack/nova-api-0" Dec 02 23:04:18 crc kubenswrapper[4696]: I1202 23:04:18.253947 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7kt6l\" (UniqueName: \"kubernetes.io/projected/3c71f81a-6ed2-41fa-9600-f5afbeee2653-kube-api-access-7kt6l\") pod 
\"nova-cell0-cell-mapping-gfgfp\" (UID: \"3c71f81a-6ed2-41fa-9600-f5afbeee2653\") " pod="openstack/nova-cell0-cell-mapping-gfgfp" Dec 02 23:04:18 crc kubenswrapper[4696]: I1202 23:04:18.259066 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkjtb\" (UniqueName: \"kubernetes.io/projected/d013870f-cb7b-4854-82d2-80c8e35701e9-kube-api-access-xkjtb\") pod \"nova-api-0\" (UID: \"d013870f-cb7b-4854-82d2-80c8e35701e9\") " pod="openstack/nova-api-0" Dec 02 23:04:18 crc kubenswrapper[4696]: I1202 23:04:18.259235 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d013870f-cb7b-4854-82d2-80c8e35701e9-config-data\") pod \"nova-api-0\" (UID: \"d013870f-cb7b-4854-82d2-80c8e35701e9\") " pod="openstack/nova-api-0" Dec 02 23:04:18 crc kubenswrapper[4696]: I1202 23:04:18.259311 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d013870f-cb7b-4854-82d2-80c8e35701e9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d013870f-cb7b-4854-82d2-80c8e35701e9\") " pod="openstack/nova-api-0" Dec 02 23:04:18 crc kubenswrapper[4696]: I1202 23:04:18.298921 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 02 23:04:18 crc kubenswrapper[4696]: I1202 23:04:18.300752 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 02 23:04:18 crc kubenswrapper[4696]: I1202 23:04:18.317732 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 02 23:04:18 crc kubenswrapper[4696]: I1202 23:04:18.329725 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 02 23:04:18 crc kubenswrapper[4696]: I1202 23:04:18.331981 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 02 23:04:18 crc kubenswrapper[4696]: I1202 23:04:18.341305 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 02 23:04:18 crc kubenswrapper[4696]: I1202 23:04:18.356207 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 23:04:18 crc kubenswrapper[4696]: I1202 23:04:18.371682 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d013870f-cb7b-4854-82d2-80c8e35701e9-config-data\") pod \"nova-api-0\" (UID: \"d013870f-cb7b-4854-82d2-80c8e35701e9\") " pod="openstack/nova-api-0" Dec 02 23:04:18 crc kubenswrapper[4696]: I1202 23:04:18.371758 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d013870f-cb7b-4854-82d2-80c8e35701e9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d013870f-cb7b-4854-82d2-80c8e35701e9\") " pod="openstack/nova-api-0" Dec 02 23:04:18 crc kubenswrapper[4696]: I1202 23:04:18.371816 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-gfgfp" Dec 02 23:04:18 crc kubenswrapper[4696]: I1202 23:04:18.371857 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d013870f-cb7b-4854-82d2-80c8e35701e9-logs\") pod \"nova-api-0\" (UID: \"d013870f-cb7b-4854-82d2-80c8e35701e9\") " pod="openstack/nova-api-0" Dec 02 23:04:18 crc kubenswrapper[4696]: I1202 23:04:18.371883 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkjtb\" (UniqueName: \"kubernetes.io/projected/d013870f-cb7b-4854-82d2-80c8e35701e9-kube-api-access-xkjtb\") pod \"nova-api-0\" (UID: \"d013870f-cb7b-4854-82d2-80c8e35701e9\") " pod="openstack/nova-api-0" Dec 02 23:04:18 crc kubenswrapper[4696]: I1202 23:04:18.376589 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d013870f-cb7b-4854-82d2-80c8e35701e9-logs\") pod \"nova-api-0\" (UID: \"d013870f-cb7b-4854-82d2-80c8e35701e9\") " pod="openstack/nova-api-0" Dec 02 23:04:18 crc kubenswrapper[4696]: I1202 23:04:18.377554 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 02 23:04:18 crc kubenswrapper[4696]: I1202 23:04:18.384935 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d013870f-cb7b-4854-82d2-80c8e35701e9-config-data\") pod \"nova-api-0\" (UID: \"d013870f-cb7b-4854-82d2-80c8e35701e9\") " pod="openstack/nova-api-0" Dec 02 23:04:18 crc kubenswrapper[4696]: I1202 23:04:18.399212 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d013870f-cb7b-4854-82d2-80c8e35701e9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d013870f-cb7b-4854-82d2-80c8e35701e9\") " pod="openstack/nova-api-0" Dec 02 23:04:18 crc kubenswrapper[4696]: I1202 
23:04:18.413416 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkjtb\" (UniqueName: \"kubernetes.io/projected/d013870f-cb7b-4854-82d2-80c8e35701e9-kube-api-access-xkjtb\") pod \"nova-api-0\" (UID: \"d013870f-cb7b-4854-82d2-80c8e35701e9\") " pod="openstack/nova-api-0" Dec 02 23:04:18 crc kubenswrapper[4696]: I1202 23:04:18.431216 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-spkxd"] Dec 02 23:04:18 crc kubenswrapper[4696]: I1202 23:04:18.439858 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-spkxd" Dec 02 23:04:18 crc kubenswrapper[4696]: I1202 23:04:18.461815 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-spkxd"] Dec 02 23:04:18 crc kubenswrapper[4696]: I1202 23:04:18.473682 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f96aa59-4401-4d63-97df-15e687cfeae7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3f96aa59-4401-4d63-97df-15e687cfeae7\") " pod="openstack/nova-metadata-0" Dec 02 23:04:18 crc kubenswrapper[4696]: I1202 23:04:18.473794 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4m5k9\" (UniqueName: \"kubernetes.io/projected/c7cf934c-9238-4692-90db-19e7faaf7bfd-kube-api-access-4m5k9\") pod \"nova-cell1-novncproxy-0\" (UID: \"c7cf934c-9238-4692-90db-19e7faaf7bfd\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 23:04:18 crc kubenswrapper[4696]: I1202 23:04:18.473827 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6s6l\" (UniqueName: \"kubernetes.io/projected/3f96aa59-4401-4d63-97df-15e687cfeae7-kube-api-access-q6s6l\") pod \"nova-metadata-0\" (UID: \"3f96aa59-4401-4d63-97df-15e687cfeae7\") " 
pod="openstack/nova-metadata-0" Dec 02 23:04:18 crc kubenswrapper[4696]: I1202 23:04:18.473864 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f96aa59-4401-4d63-97df-15e687cfeae7-logs\") pod \"nova-metadata-0\" (UID: \"3f96aa59-4401-4d63-97df-15e687cfeae7\") " pod="openstack/nova-metadata-0" Dec 02 23:04:18 crc kubenswrapper[4696]: I1202 23:04:18.473881 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7cf934c-9238-4692-90db-19e7faaf7bfd-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"c7cf934c-9238-4692-90db-19e7faaf7bfd\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 23:04:18 crc kubenswrapper[4696]: I1202 23:04:18.474380 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f96aa59-4401-4d63-97df-15e687cfeae7-config-data\") pod \"nova-metadata-0\" (UID: \"3f96aa59-4401-4d63-97df-15e687cfeae7\") " pod="openstack/nova-metadata-0" Dec 02 23:04:18 crc kubenswrapper[4696]: I1202 23:04:18.474401 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7cf934c-9238-4692-90db-19e7faaf7bfd-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"c7cf934c-9238-4692-90db-19e7faaf7bfd\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 23:04:18 crc kubenswrapper[4696]: I1202 23:04:18.499891 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 23:04:18 crc kubenswrapper[4696]: I1202 23:04:18.501165 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 02 23:04:18 crc kubenswrapper[4696]: I1202 23:04:18.524996 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 23:04:18 crc kubenswrapper[4696]: I1202 23:04:18.526098 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 02 23:04:18 crc kubenswrapper[4696]: I1202 23:04:18.569493 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 02 23:04:18 crc kubenswrapper[4696]: I1202 23:04:18.579325 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5zb6\" (UniqueName: \"kubernetes.io/projected/a097c9e9-bde7-443d-b08b-74859c58d517-kube-api-access-k5zb6\") pod \"dnsmasq-dns-bccf8f775-spkxd\" (UID: \"a097c9e9-bde7-443d-b08b-74859c58d517\") " pod="openstack/dnsmasq-dns-bccf8f775-spkxd" Dec 02 23:04:18 crc kubenswrapper[4696]: I1202 23:04:18.579402 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4m5k9\" (UniqueName: \"kubernetes.io/projected/c7cf934c-9238-4692-90db-19e7faaf7bfd-kube-api-access-4m5k9\") pod \"nova-cell1-novncproxy-0\" (UID: \"c7cf934c-9238-4692-90db-19e7faaf7bfd\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 23:04:18 crc kubenswrapper[4696]: I1202 23:04:18.579442 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6s6l\" (UniqueName: \"kubernetes.io/projected/3f96aa59-4401-4d63-97df-15e687cfeae7-kube-api-access-q6s6l\") pod \"nova-metadata-0\" (UID: \"3f96aa59-4401-4d63-97df-15e687cfeae7\") " pod="openstack/nova-metadata-0" Dec 02 23:04:18 crc kubenswrapper[4696]: I1202 23:04:18.579480 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f96aa59-4401-4d63-97df-15e687cfeae7-logs\") pod \"nova-metadata-0\" 
(UID: \"3f96aa59-4401-4d63-97df-15e687cfeae7\") " pod="openstack/nova-metadata-0" Dec 02 23:04:18 crc kubenswrapper[4696]: I1202 23:04:18.579506 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7cf934c-9238-4692-90db-19e7faaf7bfd-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"c7cf934c-9238-4692-90db-19e7faaf7bfd\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 23:04:18 crc kubenswrapper[4696]: I1202 23:04:18.579554 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a097c9e9-bde7-443d-b08b-74859c58d517-dns-swift-storage-0\") pod \"dnsmasq-dns-bccf8f775-spkxd\" (UID: \"a097c9e9-bde7-443d-b08b-74859c58d517\") " pod="openstack/dnsmasq-dns-bccf8f775-spkxd" Dec 02 23:04:18 crc kubenswrapper[4696]: I1202 23:04:18.579574 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a097c9e9-bde7-443d-b08b-74859c58d517-ovsdbserver-nb\") pod \"dnsmasq-dns-bccf8f775-spkxd\" (UID: \"a097c9e9-bde7-443d-b08b-74859c58d517\") " pod="openstack/dnsmasq-dns-bccf8f775-spkxd" Dec 02 23:04:18 crc kubenswrapper[4696]: I1202 23:04:18.579592 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f96aa59-4401-4d63-97df-15e687cfeae7-config-data\") pod \"nova-metadata-0\" (UID: \"3f96aa59-4401-4d63-97df-15e687cfeae7\") " pod="openstack/nova-metadata-0" Dec 02 23:04:18 crc kubenswrapper[4696]: I1202 23:04:18.579612 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7cf934c-9238-4692-90db-19e7faaf7bfd-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"c7cf934c-9238-4692-90db-19e7faaf7bfd\") " 
pod="openstack/nova-cell1-novncproxy-0" Dec 02 23:04:18 crc kubenswrapper[4696]: I1202 23:04:18.579673 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f96aa59-4401-4d63-97df-15e687cfeae7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3f96aa59-4401-4d63-97df-15e687cfeae7\") " pod="openstack/nova-metadata-0" Dec 02 23:04:18 crc kubenswrapper[4696]: I1202 23:04:18.579701 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a097c9e9-bde7-443d-b08b-74859c58d517-dns-svc\") pod \"dnsmasq-dns-bccf8f775-spkxd\" (UID: \"a097c9e9-bde7-443d-b08b-74859c58d517\") " pod="openstack/dnsmasq-dns-bccf8f775-spkxd" Dec 02 23:04:18 crc kubenswrapper[4696]: I1202 23:04:18.579722 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a097c9e9-bde7-443d-b08b-74859c58d517-config\") pod \"dnsmasq-dns-bccf8f775-spkxd\" (UID: \"a097c9e9-bde7-443d-b08b-74859c58d517\") " pod="openstack/dnsmasq-dns-bccf8f775-spkxd" Dec 02 23:04:18 crc kubenswrapper[4696]: I1202 23:04:18.579755 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a097c9e9-bde7-443d-b08b-74859c58d517-ovsdbserver-sb\") pod \"dnsmasq-dns-bccf8f775-spkxd\" (UID: \"a097c9e9-bde7-443d-b08b-74859c58d517\") " pod="openstack/dnsmasq-dns-bccf8f775-spkxd" Dec 02 23:04:18 crc kubenswrapper[4696]: I1202 23:04:18.581631 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f96aa59-4401-4d63-97df-15e687cfeae7-logs\") pod \"nova-metadata-0\" (UID: \"3f96aa59-4401-4d63-97df-15e687cfeae7\") " pod="openstack/nova-metadata-0" Dec 02 23:04:18 crc kubenswrapper[4696]: I1202 23:04:18.592146 4696 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f96aa59-4401-4d63-97df-15e687cfeae7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3f96aa59-4401-4d63-97df-15e687cfeae7\") " pod="openstack/nova-metadata-0" Dec 02 23:04:18 crc kubenswrapper[4696]: I1202 23:04:18.595657 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f96aa59-4401-4d63-97df-15e687cfeae7-config-data\") pod \"nova-metadata-0\" (UID: \"3f96aa59-4401-4d63-97df-15e687cfeae7\") " pod="openstack/nova-metadata-0" Dec 02 23:04:18 crc kubenswrapper[4696]: I1202 23:04:18.596355 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7cf934c-9238-4692-90db-19e7faaf7bfd-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"c7cf934c-9238-4692-90db-19e7faaf7bfd\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 23:04:18 crc kubenswrapper[4696]: I1202 23:04:18.599883 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7cf934c-9238-4692-90db-19e7faaf7bfd-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"c7cf934c-9238-4692-90db-19e7faaf7bfd\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 23:04:18 crc kubenswrapper[4696]: I1202 23:04:18.615083 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4m5k9\" (UniqueName: \"kubernetes.io/projected/c7cf934c-9238-4692-90db-19e7faaf7bfd-kube-api-access-4m5k9\") pod \"nova-cell1-novncproxy-0\" (UID: \"c7cf934c-9238-4692-90db-19e7faaf7bfd\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 23:04:18 crc kubenswrapper[4696]: I1202 23:04:18.622466 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6s6l\" (UniqueName: 
\"kubernetes.io/projected/3f96aa59-4401-4d63-97df-15e687cfeae7-kube-api-access-q6s6l\") pod \"nova-metadata-0\" (UID: \"3f96aa59-4401-4d63-97df-15e687cfeae7\") " pod="openstack/nova-metadata-0" Dec 02 23:04:18 crc kubenswrapper[4696]: I1202 23:04:18.634452 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 02 23:04:18 crc kubenswrapper[4696]: I1202 23:04:18.684825 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k5zb6\" (UniqueName: \"kubernetes.io/projected/a097c9e9-bde7-443d-b08b-74859c58d517-kube-api-access-k5zb6\") pod \"dnsmasq-dns-bccf8f775-spkxd\" (UID: \"a097c9e9-bde7-443d-b08b-74859c58d517\") " pod="openstack/dnsmasq-dns-bccf8f775-spkxd" Dec 02 23:04:18 crc kubenswrapper[4696]: I1202 23:04:18.685524 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/456d1feb-dedd-4045-8440-d45ca71d3f46-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"456d1feb-dedd-4045-8440-d45ca71d3f46\") " pod="openstack/nova-scheduler-0" Dec 02 23:04:18 crc kubenswrapper[4696]: I1202 23:04:18.685706 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/456d1feb-dedd-4045-8440-d45ca71d3f46-config-data\") pod \"nova-scheduler-0\" (UID: \"456d1feb-dedd-4045-8440-d45ca71d3f46\") " pod="openstack/nova-scheduler-0" Dec 02 23:04:18 crc kubenswrapper[4696]: I1202 23:04:18.685849 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a097c9e9-bde7-443d-b08b-74859c58d517-dns-swift-storage-0\") pod \"dnsmasq-dns-bccf8f775-spkxd\" (UID: \"a097c9e9-bde7-443d-b08b-74859c58d517\") " pod="openstack/dnsmasq-dns-bccf8f775-spkxd" Dec 02 23:04:18 crc kubenswrapper[4696]: I1202 23:04:18.685890 4696 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a097c9e9-bde7-443d-b08b-74859c58d517-ovsdbserver-nb\") pod \"dnsmasq-dns-bccf8f775-spkxd\" (UID: \"a097c9e9-bde7-443d-b08b-74859c58d517\") " pod="openstack/dnsmasq-dns-bccf8f775-spkxd" Dec 02 23:04:18 crc kubenswrapper[4696]: I1202 23:04:18.686254 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a097c9e9-bde7-443d-b08b-74859c58d517-dns-svc\") pod \"dnsmasq-dns-bccf8f775-spkxd\" (UID: \"a097c9e9-bde7-443d-b08b-74859c58d517\") " pod="openstack/dnsmasq-dns-bccf8f775-spkxd" Dec 02 23:04:18 crc kubenswrapper[4696]: I1202 23:04:18.686287 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzthx\" (UniqueName: \"kubernetes.io/projected/456d1feb-dedd-4045-8440-d45ca71d3f46-kube-api-access-hzthx\") pod \"nova-scheduler-0\" (UID: \"456d1feb-dedd-4045-8440-d45ca71d3f46\") " pod="openstack/nova-scheduler-0" Dec 02 23:04:18 crc kubenswrapper[4696]: I1202 23:04:18.686331 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a097c9e9-bde7-443d-b08b-74859c58d517-config\") pod \"dnsmasq-dns-bccf8f775-spkxd\" (UID: \"a097c9e9-bde7-443d-b08b-74859c58d517\") " pod="openstack/dnsmasq-dns-bccf8f775-spkxd" Dec 02 23:04:18 crc kubenswrapper[4696]: I1202 23:04:18.686363 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a097c9e9-bde7-443d-b08b-74859c58d517-ovsdbserver-sb\") pod \"dnsmasq-dns-bccf8f775-spkxd\" (UID: \"a097c9e9-bde7-443d-b08b-74859c58d517\") " pod="openstack/dnsmasq-dns-bccf8f775-spkxd" Dec 02 23:04:18 crc kubenswrapper[4696]: I1202 23:04:18.689483 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a097c9e9-bde7-443d-b08b-74859c58d517-ovsdbserver-sb\") pod \"dnsmasq-dns-bccf8f775-spkxd\" (UID: \"a097c9e9-bde7-443d-b08b-74859c58d517\") " pod="openstack/dnsmasq-dns-bccf8f775-spkxd" Dec 02 23:04:18 crc kubenswrapper[4696]: I1202 23:04:18.690155 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a097c9e9-bde7-443d-b08b-74859c58d517-ovsdbserver-nb\") pod \"dnsmasq-dns-bccf8f775-spkxd\" (UID: \"a097c9e9-bde7-443d-b08b-74859c58d517\") " pod="openstack/dnsmasq-dns-bccf8f775-spkxd" Dec 02 23:04:18 crc kubenswrapper[4696]: I1202 23:04:18.690857 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a097c9e9-bde7-443d-b08b-74859c58d517-dns-svc\") pod \"dnsmasq-dns-bccf8f775-spkxd\" (UID: \"a097c9e9-bde7-443d-b08b-74859c58d517\") " pod="openstack/dnsmasq-dns-bccf8f775-spkxd" Dec 02 23:04:18 crc kubenswrapper[4696]: I1202 23:04:18.692710 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a097c9e9-bde7-443d-b08b-74859c58d517-dns-swift-storage-0\") pod \"dnsmasq-dns-bccf8f775-spkxd\" (UID: \"a097c9e9-bde7-443d-b08b-74859c58d517\") " pod="openstack/dnsmasq-dns-bccf8f775-spkxd" Dec 02 23:04:18 crc kubenswrapper[4696]: I1202 23:04:18.697043 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a097c9e9-bde7-443d-b08b-74859c58d517-config\") pod \"dnsmasq-dns-bccf8f775-spkxd\" (UID: \"a097c9e9-bde7-443d-b08b-74859c58d517\") " pod="openstack/dnsmasq-dns-bccf8f775-spkxd" Dec 02 23:04:18 crc kubenswrapper[4696]: I1202 23:04:18.729268 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5zb6\" (UniqueName: 
\"kubernetes.io/projected/a097c9e9-bde7-443d-b08b-74859c58d517-kube-api-access-k5zb6\") pod \"dnsmasq-dns-bccf8f775-spkxd\" (UID: \"a097c9e9-bde7-443d-b08b-74859c58d517\") " pod="openstack/dnsmasq-dns-bccf8f775-spkxd" Dec 02 23:04:18 crc kubenswrapper[4696]: I1202 23:04:18.787219 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/456d1feb-dedd-4045-8440-d45ca71d3f46-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"456d1feb-dedd-4045-8440-d45ca71d3f46\") " pod="openstack/nova-scheduler-0" Dec 02 23:04:18 crc kubenswrapper[4696]: I1202 23:04:18.787322 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/456d1feb-dedd-4045-8440-d45ca71d3f46-config-data\") pod \"nova-scheduler-0\" (UID: \"456d1feb-dedd-4045-8440-d45ca71d3f46\") " pod="openstack/nova-scheduler-0" Dec 02 23:04:18 crc kubenswrapper[4696]: I1202 23:04:18.787421 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hzthx\" (UniqueName: \"kubernetes.io/projected/456d1feb-dedd-4045-8440-d45ca71d3f46-kube-api-access-hzthx\") pod \"nova-scheduler-0\" (UID: \"456d1feb-dedd-4045-8440-d45ca71d3f46\") " pod="openstack/nova-scheduler-0" Dec 02 23:04:18 crc kubenswrapper[4696]: I1202 23:04:18.796965 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/456d1feb-dedd-4045-8440-d45ca71d3f46-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"456d1feb-dedd-4045-8440-d45ca71d3f46\") " pod="openstack/nova-scheduler-0" Dec 02 23:04:18 crc kubenswrapper[4696]: I1202 23:04:18.809379 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/456d1feb-dedd-4045-8440-d45ca71d3f46-config-data\") pod \"nova-scheduler-0\" (UID: \"456d1feb-dedd-4045-8440-d45ca71d3f46\") " 
pod="openstack/nova-scheduler-0" Dec 02 23:04:18 crc kubenswrapper[4696]: I1202 23:04:18.829528 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzthx\" (UniqueName: \"kubernetes.io/projected/456d1feb-dedd-4045-8440-d45ca71d3f46-kube-api-access-hzthx\") pod \"nova-scheduler-0\" (UID: \"456d1feb-dedd-4045-8440-d45ca71d3f46\") " pod="openstack/nova-scheduler-0" Dec 02 23:04:18 crc kubenswrapper[4696]: I1202 23:04:18.904379 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 02 23:04:18 crc kubenswrapper[4696]: I1202 23:04:18.968462 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-spkxd" Dec 02 23:04:18 crc kubenswrapper[4696]: I1202 23:04:18.986676 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 02 23:04:19 crc kubenswrapper[4696]: I1202 23:04:19.073838 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-gfgfp"] Dec 02 23:04:19 crc kubenswrapper[4696]: I1202 23:04:19.263168 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"30aba68b-e85d-42d1-819a-2b8cb0886659","Type":"ContainerStarted","Data":"f302876a7ba26ca764a9dfa0d77841be81b62953da5f57027c5846495af15650"} Dec 02 23:04:19 crc kubenswrapper[4696]: I1202 23:04:19.313467 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-gfgfp" event={"ID":"3c71f81a-6ed2-41fa-9600-f5afbeee2653","Type":"ContainerStarted","Data":"c760e012431cafc5e5b6ee4fae0bbb4640b5adc401c6515b872608c465446bc3"} Dec 02 23:04:19 crc kubenswrapper[4696]: I1202 23:04:19.515828 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 02 23:04:19 crc kubenswrapper[4696]: W1202 23:04:19.564875 4696 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd013870f_cb7b_4854_82d2_80c8e35701e9.slice/crio-13e2704974db79ec4a77e6b618cab1100d4039ff5fd306143e020c58a547042d WatchSource:0}: Error finding container 13e2704974db79ec4a77e6b618cab1100d4039ff5fd306143e020c58a547042d: Status 404 returned error can't find the container with id 13e2704974db79ec4a77e6b618cab1100d4039ff5fd306143e020c58a547042d Dec 02 23:04:19 crc kubenswrapper[4696]: I1202 23:04:19.656236 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 23:04:19 crc kubenswrapper[4696]: I1202 23:04:19.831364 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 02 23:04:20 crc kubenswrapper[4696]: I1202 23:04:20.024862 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 23:04:20 crc kubenswrapper[4696]: I1202 23:04:20.037914 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-7fk6v"] Dec 02 23:04:20 crc kubenswrapper[4696]: I1202 23:04:20.045220 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-7fk6v" Dec 02 23:04:20 crc kubenswrapper[4696]: I1202 23:04:20.069323 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Dec 02 23:04:20 crc kubenswrapper[4696]: I1202 23:04:20.089544 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 02 23:04:20 crc kubenswrapper[4696]: I1202 23:04:20.112880 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-spkxd"] Dec 02 23:04:20 crc kubenswrapper[4696]: I1202 23:04:20.146620 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-7fk6v"] Dec 02 23:04:20 crc kubenswrapper[4696]: I1202 23:04:20.174699 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95634dd9-11ed-4c9d-b0e3-b7240ff94ac0-scripts\") pod \"nova-cell1-conductor-db-sync-7fk6v\" (UID: \"95634dd9-11ed-4c9d-b0e3-b7240ff94ac0\") " pod="openstack/nova-cell1-conductor-db-sync-7fk6v" Dec 02 23:04:20 crc kubenswrapper[4696]: I1202 23:04:20.174834 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95634dd9-11ed-4c9d-b0e3-b7240ff94ac0-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-7fk6v\" (UID: \"95634dd9-11ed-4c9d-b0e3-b7240ff94ac0\") " pod="openstack/nova-cell1-conductor-db-sync-7fk6v" Dec 02 23:04:20 crc kubenswrapper[4696]: I1202 23:04:20.174882 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbwsd\" (UniqueName: \"kubernetes.io/projected/95634dd9-11ed-4c9d-b0e3-b7240ff94ac0-kube-api-access-wbwsd\") pod \"nova-cell1-conductor-db-sync-7fk6v\" (UID: \"95634dd9-11ed-4c9d-b0e3-b7240ff94ac0\") " 
pod="openstack/nova-cell1-conductor-db-sync-7fk6v" Dec 02 23:04:20 crc kubenswrapper[4696]: I1202 23:04:20.174929 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95634dd9-11ed-4c9d-b0e3-b7240ff94ac0-config-data\") pod \"nova-cell1-conductor-db-sync-7fk6v\" (UID: \"95634dd9-11ed-4c9d-b0e3-b7240ff94ac0\") " pod="openstack/nova-cell1-conductor-db-sync-7fk6v" Dec 02 23:04:20 crc kubenswrapper[4696]: I1202 23:04:20.276707 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95634dd9-11ed-4c9d-b0e3-b7240ff94ac0-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-7fk6v\" (UID: \"95634dd9-11ed-4c9d-b0e3-b7240ff94ac0\") " pod="openstack/nova-cell1-conductor-db-sync-7fk6v" Dec 02 23:04:20 crc kubenswrapper[4696]: I1202 23:04:20.277530 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbwsd\" (UniqueName: \"kubernetes.io/projected/95634dd9-11ed-4c9d-b0e3-b7240ff94ac0-kube-api-access-wbwsd\") pod \"nova-cell1-conductor-db-sync-7fk6v\" (UID: \"95634dd9-11ed-4c9d-b0e3-b7240ff94ac0\") " pod="openstack/nova-cell1-conductor-db-sync-7fk6v" Dec 02 23:04:20 crc kubenswrapper[4696]: I1202 23:04:20.277704 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95634dd9-11ed-4c9d-b0e3-b7240ff94ac0-config-data\") pod \"nova-cell1-conductor-db-sync-7fk6v\" (UID: \"95634dd9-11ed-4c9d-b0e3-b7240ff94ac0\") " pod="openstack/nova-cell1-conductor-db-sync-7fk6v" Dec 02 23:04:20 crc kubenswrapper[4696]: I1202 23:04:20.278014 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95634dd9-11ed-4c9d-b0e3-b7240ff94ac0-scripts\") pod \"nova-cell1-conductor-db-sync-7fk6v\" (UID: 
\"95634dd9-11ed-4c9d-b0e3-b7240ff94ac0\") " pod="openstack/nova-cell1-conductor-db-sync-7fk6v" Dec 02 23:04:20 crc kubenswrapper[4696]: I1202 23:04:20.296578 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95634dd9-11ed-4c9d-b0e3-b7240ff94ac0-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-7fk6v\" (UID: \"95634dd9-11ed-4c9d-b0e3-b7240ff94ac0\") " pod="openstack/nova-cell1-conductor-db-sync-7fk6v" Dec 02 23:04:20 crc kubenswrapper[4696]: I1202 23:04:20.296815 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95634dd9-11ed-4c9d-b0e3-b7240ff94ac0-config-data\") pod \"nova-cell1-conductor-db-sync-7fk6v\" (UID: \"95634dd9-11ed-4c9d-b0e3-b7240ff94ac0\") " pod="openstack/nova-cell1-conductor-db-sync-7fk6v" Dec 02 23:04:20 crc kubenswrapper[4696]: I1202 23:04:20.300542 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbwsd\" (UniqueName: \"kubernetes.io/projected/95634dd9-11ed-4c9d-b0e3-b7240ff94ac0-kube-api-access-wbwsd\") pod \"nova-cell1-conductor-db-sync-7fk6v\" (UID: \"95634dd9-11ed-4c9d-b0e3-b7240ff94ac0\") " pod="openstack/nova-cell1-conductor-db-sync-7fk6v" Dec 02 23:04:20 crc kubenswrapper[4696]: I1202 23:04:20.303506 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95634dd9-11ed-4c9d-b0e3-b7240ff94ac0-scripts\") pod \"nova-cell1-conductor-db-sync-7fk6v\" (UID: \"95634dd9-11ed-4c9d-b0e3-b7240ff94ac0\") " pod="openstack/nova-cell1-conductor-db-sync-7fk6v" Dec 02 23:04:20 crc kubenswrapper[4696]: I1202 23:04:20.332626 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"456d1feb-dedd-4045-8440-d45ca71d3f46","Type":"ContainerStarted","Data":"e5f1a7e9b454af6062cc30464eb242d649df3eac48a99040706476d1eed94d90"} Dec 02 23:04:20 crc 
kubenswrapper[4696]: I1202 23:04:20.335880 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-spkxd" event={"ID":"a097c9e9-bde7-443d-b08b-74859c58d517","Type":"ContainerStarted","Data":"f562ff40eba809e37e487f839fdcbe2e783e112f508b3b965d814c5f953141bd"} Dec 02 23:04:20 crc kubenswrapper[4696]: I1202 23:04:20.341847 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-gfgfp" event={"ID":"3c71f81a-6ed2-41fa-9600-f5afbeee2653","Type":"ContainerStarted","Data":"ebbd9aa7d3c954796178286243dd7c13228fce4d1dddc62cbd72b83aa1198d04"} Dec 02 23:04:20 crc kubenswrapper[4696]: I1202 23:04:20.352207 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d013870f-cb7b-4854-82d2-80c8e35701e9","Type":"ContainerStarted","Data":"13e2704974db79ec4a77e6b618cab1100d4039ff5fd306143e020c58a547042d"} Dec 02 23:04:20 crc kubenswrapper[4696]: I1202 23:04:20.355407 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"c7cf934c-9238-4692-90db-19e7faaf7bfd","Type":"ContainerStarted","Data":"de5811578f1e94753f11beee6f50b3390860c0b9c3c440fc4ff2a74ebd927d52"} Dec 02 23:04:20 crc kubenswrapper[4696]: I1202 23:04:20.367923 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3f96aa59-4401-4d63-97df-15e687cfeae7","Type":"ContainerStarted","Data":"b54b9d73042df800328f885d040140e74760466c6a541020dc7e8b11aac3d5a7"} Dec 02 23:04:20 crc kubenswrapper[4696]: I1202 23:04:20.387077 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-gfgfp" podStartSLOduration=3.3870416309999998 podStartE2EDuration="3.387041631s" podCreationTimestamp="2025-12-02 23:04:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:04:20.373681341 +0000 UTC 
m=+1323.254361342" watchObservedRunningTime="2025-12-02 23:04:20.387041631 +0000 UTC m=+1323.267721632" Dec 02 23:04:20 crc kubenswrapper[4696]: I1202 23:04:20.451835 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-7fk6v" Dec 02 23:04:20 crc kubenswrapper[4696]: E1202 23:04:20.760276 4696 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda097c9e9_bde7_443d_b08b_74859c58d517.slice/crio-ac989ba9a60cf21c24afdb5261960e7a4b32551b8a726082a866e112ba9d62fd.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda097c9e9_bde7_443d_b08b_74859c58d517.slice/crio-conmon-ac989ba9a60cf21c24afdb5261960e7a4b32551b8a726082a866e112ba9d62fd.scope\": RecentStats: unable to find data in memory cache]" Dec 02 23:04:21 crc kubenswrapper[4696]: I1202 23:04:21.128801 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-7fk6v"] Dec 02 23:04:21 crc kubenswrapper[4696]: I1202 23:04:21.438173 4696 generic.go:334] "Generic (PLEG): container finished" podID="a097c9e9-bde7-443d-b08b-74859c58d517" containerID="ac989ba9a60cf21c24afdb5261960e7a4b32551b8a726082a866e112ba9d62fd" exitCode=0 Dec 02 23:04:21 crc kubenswrapper[4696]: I1202 23:04:21.479049 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-spkxd" event={"ID":"a097c9e9-bde7-443d-b08b-74859c58d517","Type":"ContainerDied","Data":"ac989ba9a60cf21c24afdb5261960e7a4b32551b8a726082a866e112ba9d62fd"} Dec 02 23:04:21 crc kubenswrapper[4696]: I1202 23:04:21.479557 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-7fk6v" 
event={"ID":"95634dd9-11ed-4c9d-b0e3-b7240ff94ac0","Type":"ContainerStarted","Data":"a963633294fa23e272f486a763271ee76fb354cf73898e6675d5ca6c365badfa"} Dec 02 23:04:21 crc kubenswrapper[4696]: I1202 23:04:21.509991 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"30aba68b-e85d-42d1-819a-2b8cb0886659","Type":"ContainerStarted","Data":"74180b5dcde2ced35d927219adfa4797a5e58878a6cf9289c0951faee983021e"} Dec 02 23:04:21 crc kubenswrapper[4696]: I1202 23:04:21.510038 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 02 23:04:21 crc kubenswrapper[4696]: I1202 23:04:21.615729 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.3579089570000002 podStartE2EDuration="7.615705146s" podCreationTimestamp="2025-12-02 23:04:14 +0000 UTC" firstStartedPulling="2025-12-02 23:04:15.806658454 +0000 UTC m=+1318.687338495" lastFinishedPulling="2025-12-02 23:04:20.064454683 +0000 UTC m=+1322.945134684" observedRunningTime="2025-12-02 23:04:21.578270272 +0000 UTC m=+1324.458950273" watchObservedRunningTime="2025-12-02 23:04:21.615705146 +0000 UTC m=+1324.496385147" Dec 02 23:04:22 crc kubenswrapper[4696]: I1202 23:04:22.527967 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 23:04:22 crc kubenswrapper[4696]: I1202 23:04:22.534431 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-7fk6v" event={"ID":"95634dd9-11ed-4c9d-b0e3-b7240ff94ac0","Type":"ContainerStarted","Data":"e3925fad130fa205634297b3447a41f37b659cd7947b3e48e6dd4fd618123680"} Dec 02 23:04:22 crc kubenswrapper[4696]: I1202 23:04:22.562632 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-spkxd" 
event={"ID":"a097c9e9-bde7-443d-b08b-74859c58d517","Type":"ContainerStarted","Data":"0c8f9b2aa71f3bd103a176c588f4efab692d5c78dd1a8959c2a9963d73ec4d34"} Dec 02 23:04:22 crc kubenswrapper[4696]: I1202 23:04:22.564031 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 02 23:04:22 crc kubenswrapper[4696]: I1202 23:04:22.567326 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-7fk6v" podStartSLOduration=2.567307189 podStartE2EDuration="2.567307189s" podCreationTimestamp="2025-12-02 23:04:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:04:22.561976698 +0000 UTC m=+1325.442656699" watchObservedRunningTime="2025-12-02 23:04:22.567307189 +0000 UTC m=+1325.447987190" Dec 02 23:04:22 crc kubenswrapper[4696]: I1202 23:04:22.647077 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-bccf8f775-spkxd" podStartSLOduration=4.647046825 podStartE2EDuration="4.647046825s" podCreationTimestamp="2025-12-02 23:04:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:04:22.628831258 +0000 UTC m=+1325.509511259" watchObservedRunningTime="2025-12-02 23:04:22.647046825 +0000 UTC m=+1325.527726826" Dec 02 23:04:23 crc kubenswrapper[4696]: I1202 23:04:23.578101 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-bccf8f775-spkxd" Dec 02 23:04:26 crc kubenswrapper[4696]: I1202 23:04:26.615197 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3f96aa59-4401-4d63-97df-15e687cfeae7","Type":"ContainerStarted","Data":"ccee82c86e63cc037064872d30a9ee15180aadae2599ec3cd8ca020b152388cd"} Dec 02 23:04:26 crc kubenswrapper[4696]: I1202 23:04:26.615955 
4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3f96aa59-4401-4d63-97df-15e687cfeae7","Type":"ContainerStarted","Data":"380b67a91a75e7f812c8b1a49c182b7df4737a2c75c2d2659cec92945ade436f"} Dec 02 23:04:26 crc kubenswrapper[4696]: I1202 23:04:26.615630 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="3f96aa59-4401-4d63-97df-15e687cfeae7" containerName="nova-metadata-metadata" containerID="cri-o://ccee82c86e63cc037064872d30a9ee15180aadae2599ec3cd8ca020b152388cd" gracePeriod=30 Dec 02 23:04:26 crc kubenswrapper[4696]: I1202 23:04:26.615397 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="3f96aa59-4401-4d63-97df-15e687cfeae7" containerName="nova-metadata-log" containerID="cri-o://380b67a91a75e7f812c8b1a49c182b7df4737a2c75c2d2659cec92945ade436f" gracePeriod=30 Dec 02 23:04:26 crc kubenswrapper[4696]: I1202 23:04:26.622223 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"456d1feb-dedd-4045-8440-d45ca71d3f46","Type":"ContainerStarted","Data":"7be7fcb711e4bfb6f71f88c2ff267f7c66c9e2d741a648379982ede50631d4af"} Dec 02 23:04:26 crc kubenswrapper[4696]: I1202 23:04:26.626831 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d013870f-cb7b-4854-82d2-80c8e35701e9","Type":"ContainerStarted","Data":"275f8d8fdd44d31fb543ad73f136b6c17d5a47b1f19bbc8e7ad891b3ebbea091"} Dec 02 23:04:26 crc kubenswrapper[4696]: I1202 23:04:26.626898 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d013870f-cb7b-4854-82d2-80c8e35701e9","Type":"ContainerStarted","Data":"a8d0db8098805d176af7979d70cace4249c42ae4cb311d63688762491e73e5c9"} Dec 02 23:04:26 crc kubenswrapper[4696]: I1202 23:04:26.632056 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell1-novncproxy-0" event={"ID":"c7cf934c-9238-4692-90db-19e7faaf7bfd","Type":"ContainerStarted","Data":"a430826aabb5ed2b42223d8e21ce2e5431e66f139d24f0bf3f66024819591f93"} Dec 02 23:04:26 crc kubenswrapper[4696]: I1202 23:04:26.632187 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="c7cf934c-9238-4692-90db-19e7faaf7bfd" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://a430826aabb5ed2b42223d8e21ce2e5431e66f139d24f0bf3f66024819591f93" gracePeriod=30 Dec 02 23:04:26 crc kubenswrapper[4696]: I1202 23:04:26.666522 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.867171935 podStartE2EDuration="8.666492931s" podCreationTimestamp="2025-12-02 23:04:18 +0000 UTC" firstStartedPulling="2025-12-02 23:04:19.687976214 +0000 UTC m=+1322.568656215" lastFinishedPulling="2025-12-02 23:04:25.48729722 +0000 UTC m=+1328.367977211" observedRunningTime="2025-12-02 23:04:26.653377258 +0000 UTC m=+1329.534057299" watchObservedRunningTime="2025-12-02 23:04:26.666492931 +0000 UTC m=+1329.547172932" Dec 02 23:04:26 crc kubenswrapper[4696]: I1202 23:04:26.697983 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.26177647 podStartE2EDuration="8.697945465s" podCreationTimestamp="2025-12-02 23:04:18 +0000 UTC" firstStartedPulling="2025-12-02 23:04:20.044841106 +0000 UTC m=+1322.925521107" lastFinishedPulling="2025-12-02 23:04:25.481010091 +0000 UTC m=+1328.361690102" observedRunningTime="2025-12-02 23:04:26.680477168 +0000 UTC m=+1329.561157169" watchObservedRunningTime="2025-12-02 23:04:26.697945465 +0000 UTC m=+1329.578625466" Dec 02 23:04:26 crc kubenswrapper[4696]: I1202 23:04:26.761042 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.123341615 
podStartE2EDuration="8.761013987s" podCreationTimestamp="2025-12-02 23:04:18 +0000 UTC" firstStartedPulling="2025-12-02 23:04:19.842048913 +0000 UTC m=+1322.722728914" lastFinishedPulling="2025-12-02 23:04:25.479721285 +0000 UTC m=+1328.360401286" observedRunningTime="2025-12-02 23:04:26.714011991 +0000 UTC m=+1329.594692002" watchObservedRunningTime="2025-12-02 23:04:26.761013987 +0000 UTC m=+1329.641693978" Dec 02 23:04:26 crc kubenswrapper[4696]: I1202 23:04:26.782620 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.870563681 podStartE2EDuration="9.78259159s" podCreationTimestamp="2025-12-02 23:04:17 +0000 UTC" firstStartedPulling="2025-12-02 23:04:19.569002103 +0000 UTC m=+1322.449682114" lastFinishedPulling="2025-12-02 23:04:25.481030022 +0000 UTC m=+1328.361710023" observedRunningTime="2025-12-02 23:04:26.750855118 +0000 UTC m=+1329.631535119" watchObservedRunningTime="2025-12-02 23:04:26.78259159 +0000 UTC m=+1329.663271591" Dec 02 23:04:27 crc kubenswrapper[4696]: I1202 23:04:27.249762 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 02 23:04:27 crc kubenswrapper[4696]: I1202 23:04:27.389434 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q6s6l\" (UniqueName: \"kubernetes.io/projected/3f96aa59-4401-4d63-97df-15e687cfeae7-kube-api-access-q6s6l\") pod \"3f96aa59-4401-4d63-97df-15e687cfeae7\" (UID: \"3f96aa59-4401-4d63-97df-15e687cfeae7\") " Dec 02 23:04:27 crc kubenswrapper[4696]: I1202 23:04:27.389711 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f96aa59-4401-4d63-97df-15e687cfeae7-combined-ca-bundle\") pod \"3f96aa59-4401-4d63-97df-15e687cfeae7\" (UID: \"3f96aa59-4401-4d63-97df-15e687cfeae7\") " Dec 02 23:04:27 crc kubenswrapper[4696]: I1202 23:04:27.389817 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f96aa59-4401-4d63-97df-15e687cfeae7-config-data\") pod \"3f96aa59-4401-4d63-97df-15e687cfeae7\" (UID: \"3f96aa59-4401-4d63-97df-15e687cfeae7\") " Dec 02 23:04:27 crc kubenswrapper[4696]: I1202 23:04:27.389864 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f96aa59-4401-4d63-97df-15e687cfeae7-logs\") pod \"3f96aa59-4401-4d63-97df-15e687cfeae7\" (UID: \"3f96aa59-4401-4d63-97df-15e687cfeae7\") " Dec 02 23:04:27 crc kubenswrapper[4696]: I1202 23:04:27.390223 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f96aa59-4401-4d63-97df-15e687cfeae7-logs" (OuterVolumeSpecName: "logs") pod "3f96aa59-4401-4d63-97df-15e687cfeae7" (UID: "3f96aa59-4401-4d63-97df-15e687cfeae7"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:04:27 crc kubenswrapper[4696]: I1202 23:04:27.390383 4696 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f96aa59-4401-4d63-97df-15e687cfeae7-logs\") on node \"crc\" DevicePath \"\"" Dec 02 23:04:27 crc kubenswrapper[4696]: I1202 23:04:27.395561 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f96aa59-4401-4d63-97df-15e687cfeae7-kube-api-access-q6s6l" (OuterVolumeSpecName: "kube-api-access-q6s6l") pod "3f96aa59-4401-4d63-97df-15e687cfeae7" (UID: "3f96aa59-4401-4d63-97df-15e687cfeae7"). InnerVolumeSpecName "kube-api-access-q6s6l". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:04:27 crc kubenswrapper[4696]: I1202 23:04:27.421370 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f96aa59-4401-4d63-97df-15e687cfeae7-config-data" (OuterVolumeSpecName: "config-data") pod "3f96aa59-4401-4d63-97df-15e687cfeae7" (UID: "3f96aa59-4401-4d63-97df-15e687cfeae7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:04:27 crc kubenswrapper[4696]: I1202 23:04:27.428689 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f96aa59-4401-4d63-97df-15e687cfeae7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3f96aa59-4401-4d63-97df-15e687cfeae7" (UID: "3f96aa59-4401-4d63-97df-15e687cfeae7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:04:27 crc kubenswrapper[4696]: I1202 23:04:27.492813 4696 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f96aa59-4401-4d63-97df-15e687cfeae7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 23:04:27 crc kubenswrapper[4696]: I1202 23:04:27.492855 4696 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f96aa59-4401-4d63-97df-15e687cfeae7-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 23:04:27 crc kubenswrapper[4696]: I1202 23:04:27.492864 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q6s6l\" (UniqueName: \"kubernetes.io/projected/3f96aa59-4401-4d63-97df-15e687cfeae7-kube-api-access-q6s6l\") on node \"crc\" DevicePath \"\"" Dec 02 23:04:27 crc kubenswrapper[4696]: I1202 23:04:27.648507 4696 generic.go:334] "Generic (PLEG): container finished" podID="3f96aa59-4401-4d63-97df-15e687cfeae7" containerID="ccee82c86e63cc037064872d30a9ee15180aadae2599ec3cd8ca020b152388cd" exitCode=0 Dec 02 23:04:27 crc kubenswrapper[4696]: I1202 23:04:27.648556 4696 generic.go:334] "Generic (PLEG): container finished" podID="3f96aa59-4401-4d63-97df-15e687cfeae7" containerID="380b67a91a75e7f812c8b1a49c182b7df4737a2c75c2d2659cec92945ade436f" exitCode=143 Dec 02 23:04:27 crc kubenswrapper[4696]: I1202 23:04:27.648571 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 02 23:04:27 crc kubenswrapper[4696]: I1202 23:04:27.648593 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3f96aa59-4401-4d63-97df-15e687cfeae7","Type":"ContainerDied","Data":"ccee82c86e63cc037064872d30a9ee15180aadae2599ec3cd8ca020b152388cd"} Dec 02 23:04:27 crc kubenswrapper[4696]: I1202 23:04:27.648674 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3f96aa59-4401-4d63-97df-15e687cfeae7","Type":"ContainerDied","Data":"380b67a91a75e7f812c8b1a49c182b7df4737a2c75c2d2659cec92945ade436f"} Dec 02 23:04:27 crc kubenswrapper[4696]: I1202 23:04:27.648697 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3f96aa59-4401-4d63-97df-15e687cfeae7","Type":"ContainerDied","Data":"b54b9d73042df800328f885d040140e74760466c6a541020dc7e8b11aac3d5a7"} Dec 02 23:04:27 crc kubenswrapper[4696]: I1202 23:04:27.648728 4696 scope.go:117] "RemoveContainer" containerID="ccee82c86e63cc037064872d30a9ee15180aadae2599ec3cd8ca020b152388cd" Dec 02 23:04:27 crc kubenswrapper[4696]: I1202 23:04:27.685334 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 23:04:27 crc kubenswrapper[4696]: I1202 23:04:27.694319 4696 scope.go:117] "RemoveContainer" containerID="380b67a91a75e7f812c8b1a49c182b7df4737a2c75c2d2659cec92945ade436f" Dec 02 23:04:27 crc kubenswrapper[4696]: I1202 23:04:27.710537 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 23:04:27 crc kubenswrapper[4696]: I1202 23:04:27.720029 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 02 23:04:27 crc kubenswrapper[4696]: E1202 23:04:27.720584 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f96aa59-4401-4d63-97df-15e687cfeae7" containerName="nova-metadata-metadata" Dec 02 23:04:27 crc 
kubenswrapper[4696]: I1202 23:04:27.720603 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f96aa59-4401-4d63-97df-15e687cfeae7" containerName="nova-metadata-metadata" Dec 02 23:04:27 crc kubenswrapper[4696]: E1202 23:04:27.720624 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f96aa59-4401-4d63-97df-15e687cfeae7" containerName="nova-metadata-log" Dec 02 23:04:27 crc kubenswrapper[4696]: I1202 23:04:27.720630 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f96aa59-4401-4d63-97df-15e687cfeae7" containerName="nova-metadata-log" Dec 02 23:04:27 crc kubenswrapper[4696]: I1202 23:04:27.720859 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f96aa59-4401-4d63-97df-15e687cfeae7" containerName="nova-metadata-metadata" Dec 02 23:04:27 crc kubenswrapper[4696]: I1202 23:04:27.720872 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f96aa59-4401-4d63-97df-15e687cfeae7" containerName="nova-metadata-log" Dec 02 23:04:27 crc kubenswrapper[4696]: I1202 23:04:27.722600 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 02 23:04:27 crc kubenswrapper[4696]: I1202 23:04:27.726421 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 02 23:04:27 crc kubenswrapper[4696]: I1202 23:04:27.726605 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 02 23:04:27 crc kubenswrapper[4696]: I1202 23:04:27.746422 4696 scope.go:117] "RemoveContainer" containerID="ccee82c86e63cc037064872d30a9ee15180aadae2599ec3cd8ca020b152388cd" Dec 02 23:04:27 crc kubenswrapper[4696]: E1202 23:04:27.749594 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ccee82c86e63cc037064872d30a9ee15180aadae2599ec3cd8ca020b152388cd\": container with ID starting with ccee82c86e63cc037064872d30a9ee15180aadae2599ec3cd8ca020b152388cd not found: ID does not exist" containerID="ccee82c86e63cc037064872d30a9ee15180aadae2599ec3cd8ca020b152388cd" Dec 02 23:04:27 crc kubenswrapper[4696]: I1202 23:04:27.749643 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ccee82c86e63cc037064872d30a9ee15180aadae2599ec3cd8ca020b152388cd"} err="failed to get container status \"ccee82c86e63cc037064872d30a9ee15180aadae2599ec3cd8ca020b152388cd\": rpc error: code = NotFound desc = could not find container \"ccee82c86e63cc037064872d30a9ee15180aadae2599ec3cd8ca020b152388cd\": container with ID starting with ccee82c86e63cc037064872d30a9ee15180aadae2599ec3cd8ca020b152388cd not found: ID does not exist" Dec 02 23:04:27 crc kubenswrapper[4696]: I1202 23:04:27.749675 4696 scope.go:117] "RemoveContainer" containerID="380b67a91a75e7f812c8b1a49c182b7df4737a2c75c2d2659cec92945ade436f" Dec 02 23:04:27 crc kubenswrapper[4696]: E1202 23:04:27.751774 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"380b67a91a75e7f812c8b1a49c182b7df4737a2c75c2d2659cec92945ade436f\": container with ID starting with 380b67a91a75e7f812c8b1a49c182b7df4737a2c75c2d2659cec92945ade436f not found: ID does not exist" containerID="380b67a91a75e7f812c8b1a49c182b7df4737a2c75c2d2659cec92945ade436f" Dec 02 23:04:27 crc kubenswrapper[4696]: I1202 23:04:27.751804 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"380b67a91a75e7f812c8b1a49c182b7df4737a2c75c2d2659cec92945ade436f"} err="failed to get container status \"380b67a91a75e7f812c8b1a49c182b7df4737a2c75c2d2659cec92945ade436f\": rpc error: code = NotFound desc = could not find container \"380b67a91a75e7f812c8b1a49c182b7df4737a2c75c2d2659cec92945ade436f\": container with ID starting with 380b67a91a75e7f812c8b1a49c182b7df4737a2c75c2d2659cec92945ade436f not found: ID does not exist" Dec 02 23:04:27 crc kubenswrapper[4696]: I1202 23:04:27.751822 4696 scope.go:117] "RemoveContainer" containerID="ccee82c86e63cc037064872d30a9ee15180aadae2599ec3cd8ca020b152388cd" Dec 02 23:04:27 crc kubenswrapper[4696]: I1202 23:04:27.755550 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ccee82c86e63cc037064872d30a9ee15180aadae2599ec3cd8ca020b152388cd"} err="failed to get container status \"ccee82c86e63cc037064872d30a9ee15180aadae2599ec3cd8ca020b152388cd\": rpc error: code = NotFound desc = could not find container \"ccee82c86e63cc037064872d30a9ee15180aadae2599ec3cd8ca020b152388cd\": container with ID starting with ccee82c86e63cc037064872d30a9ee15180aadae2599ec3cd8ca020b152388cd not found: ID does not exist" Dec 02 23:04:27 crc kubenswrapper[4696]: I1202 23:04:27.755578 4696 scope.go:117] "RemoveContainer" containerID="380b67a91a75e7f812c8b1a49c182b7df4737a2c75c2d2659cec92945ade436f" Dec 02 23:04:27 crc kubenswrapper[4696]: I1202 23:04:27.758015 4696 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"380b67a91a75e7f812c8b1a49c182b7df4737a2c75c2d2659cec92945ade436f"} err="failed to get container status \"380b67a91a75e7f812c8b1a49c182b7df4737a2c75c2d2659cec92945ade436f\": rpc error: code = NotFound desc = could not find container \"380b67a91a75e7f812c8b1a49c182b7df4737a2c75c2d2659cec92945ade436f\": container with ID starting with 380b67a91a75e7f812c8b1a49c182b7df4737a2c75c2d2659cec92945ade436f not found: ID does not exist" Dec 02 23:04:27 crc kubenswrapper[4696]: I1202 23:04:27.774072 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 23:04:27 crc kubenswrapper[4696]: I1202 23:04:27.904561 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6fd0cdbf-1cc8-470f-bc09-437144b1638f-config-data\") pod \"nova-metadata-0\" (UID: \"6fd0cdbf-1cc8-470f-bc09-437144b1638f\") " pod="openstack/nova-metadata-0" Dec 02 23:04:27 crc kubenswrapper[4696]: I1202 23:04:27.904777 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6fd0cdbf-1cc8-470f-bc09-437144b1638f-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"6fd0cdbf-1cc8-470f-bc09-437144b1638f\") " pod="openstack/nova-metadata-0" Dec 02 23:04:27 crc kubenswrapper[4696]: I1202 23:04:27.904868 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7mss\" (UniqueName: \"kubernetes.io/projected/6fd0cdbf-1cc8-470f-bc09-437144b1638f-kube-api-access-d7mss\") pod \"nova-metadata-0\" (UID: \"6fd0cdbf-1cc8-470f-bc09-437144b1638f\") " pod="openstack/nova-metadata-0" Dec 02 23:04:27 crc kubenswrapper[4696]: I1202 23:04:27.904952 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6fd0cdbf-1cc8-470f-bc09-437144b1638f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6fd0cdbf-1cc8-470f-bc09-437144b1638f\") " pod="openstack/nova-metadata-0" Dec 02 23:04:27 crc kubenswrapper[4696]: I1202 23:04:27.905016 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6fd0cdbf-1cc8-470f-bc09-437144b1638f-logs\") pod \"nova-metadata-0\" (UID: \"6fd0cdbf-1cc8-470f-bc09-437144b1638f\") " pod="openstack/nova-metadata-0" Dec 02 23:04:28 crc kubenswrapper[4696]: I1202 23:04:28.007999 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fd0cdbf-1cc8-470f-bc09-437144b1638f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6fd0cdbf-1cc8-470f-bc09-437144b1638f\") " pod="openstack/nova-metadata-0" Dec 02 23:04:28 crc kubenswrapper[4696]: I1202 23:04:28.008070 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6fd0cdbf-1cc8-470f-bc09-437144b1638f-logs\") pod \"nova-metadata-0\" (UID: \"6fd0cdbf-1cc8-470f-bc09-437144b1638f\") " pod="openstack/nova-metadata-0" Dec 02 23:04:28 crc kubenswrapper[4696]: I1202 23:04:28.008156 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6fd0cdbf-1cc8-470f-bc09-437144b1638f-config-data\") pod \"nova-metadata-0\" (UID: \"6fd0cdbf-1cc8-470f-bc09-437144b1638f\") " pod="openstack/nova-metadata-0" Dec 02 23:04:28 crc kubenswrapper[4696]: I1202 23:04:28.008177 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6fd0cdbf-1cc8-470f-bc09-437144b1638f-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"6fd0cdbf-1cc8-470f-bc09-437144b1638f\") " pod="openstack/nova-metadata-0" Dec 02 
23:04:28 crc kubenswrapper[4696]: I1202 23:04:28.008242 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7mss\" (UniqueName: \"kubernetes.io/projected/6fd0cdbf-1cc8-470f-bc09-437144b1638f-kube-api-access-d7mss\") pod \"nova-metadata-0\" (UID: \"6fd0cdbf-1cc8-470f-bc09-437144b1638f\") " pod="openstack/nova-metadata-0" Dec 02 23:04:28 crc kubenswrapper[4696]: I1202 23:04:28.008688 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6fd0cdbf-1cc8-470f-bc09-437144b1638f-logs\") pod \"nova-metadata-0\" (UID: \"6fd0cdbf-1cc8-470f-bc09-437144b1638f\") " pod="openstack/nova-metadata-0" Dec 02 23:04:28 crc kubenswrapper[4696]: I1202 23:04:28.014277 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6fd0cdbf-1cc8-470f-bc09-437144b1638f-config-data\") pod \"nova-metadata-0\" (UID: \"6fd0cdbf-1cc8-470f-bc09-437144b1638f\") " pod="openstack/nova-metadata-0" Dec 02 23:04:28 crc kubenswrapper[4696]: I1202 23:04:28.016104 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fd0cdbf-1cc8-470f-bc09-437144b1638f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6fd0cdbf-1cc8-470f-bc09-437144b1638f\") " pod="openstack/nova-metadata-0" Dec 02 23:04:28 crc kubenswrapper[4696]: I1202 23:04:28.020643 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6fd0cdbf-1cc8-470f-bc09-437144b1638f-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"6fd0cdbf-1cc8-470f-bc09-437144b1638f\") " pod="openstack/nova-metadata-0" Dec 02 23:04:28 crc kubenswrapper[4696]: I1202 23:04:28.041525 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7mss\" (UniqueName: 
\"kubernetes.io/projected/6fd0cdbf-1cc8-470f-bc09-437144b1638f-kube-api-access-d7mss\") pod \"nova-metadata-0\" (UID: \"6fd0cdbf-1cc8-470f-bc09-437144b1638f\") " pod="openstack/nova-metadata-0" Dec 02 23:04:28 crc kubenswrapper[4696]: I1202 23:04:28.061632 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 02 23:04:28 crc kubenswrapper[4696]: I1202 23:04:28.570581 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 02 23:04:28 crc kubenswrapper[4696]: I1202 23:04:28.571062 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 02 23:04:28 crc kubenswrapper[4696]: I1202 23:04:28.608940 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 23:04:28 crc kubenswrapper[4696]: I1202 23:04:28.670039 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6fd0cdbf-1cc8-470f-bc09-437144b1638f","Type":"ContainerStarted","Data":"1dc0afb0b4c6f085c04e33a00f5d36fd71b0b04940bcfd7293763622d2243f9e"} Dec 02 23:04:28 crc kubenswrapper[4696]: I1202 23:04:28.905884 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 02 23:04:28 crc kubenswrapper[4696]: I1202 23:04:28.970946 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-bccf8f775-spkxd" Dec 02 23:04:28 crc kubenswrapper[4696]: I1202 23:04:28.990099 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 02 23:04:28 crc kubenswrapper[4696]: I1202 23:04:28.990439 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 02 23:04:29 crc kubenswrapper[4696]: I1202 23:04:29.067462 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/nova-scheduler-0" Dec 02 23:04:29 crc kubenswrapper[4696]: I1202 23:04:29.078420 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-9m9jd"] Dec 02 23:04:29 crc kubenswrapper[4696]: I1202 23:04:29.078979 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6578955fd5-9m9jd" podUID="61f3e755-0ed7-4e18-aa16-11e0ebc89957" containerName="dnsmasq-dns" containerID="cri-o://fd28de3f291c164d4ca345a48403ee4d226db9bcaf74f6bbdebe048067e455e1" gracePeriod=10 Dec 02 23:04:29 crc kubenswrapper[4696]: I1202 23:04:29.508856 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f96aa59-4401-4d63-97df-15e687cfeae7" path="/var/lib/kubelet/pods/3f96aa59-4401-4d63-97df-15e687cfeae7/volumes" Dec 02 23:04:29 crc kubenswrapper[4696]: I1202 23:04:29.539809 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-9m9jd" Dec 02 23:04:29 crc kubenswrapper[4696]: I1202 23:04:29.653125 4696 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d013870f-cb7b-4854-82d2-80c8e35701e9" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.207:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 02 23:04:29 crc kubenswrapper[4696]: I1202 23:04:29.653649 4696 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d013870f-cb7b-4854-82d2-80c8e35701e9" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.207:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 02 23:04:29 crc kubenswrapper[4696]: I1202 23:04:29.670557 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/61f3e755-0ed7-4e18-aa16-11e0ebc89957-ovsdbserver-sb\") pod 
\"61f3e755-0ed7-4e18-aa16-11e0ebc89957\" (UID: \"61f3e755-0ed7-4e18-aa16-11e0ebc89957\") " Dec 02 23:04:29 crc kubenswrapper[4696]: I1202 23:04:29.670700 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6rm2k\" (UniqueName: \"kubernetes.io/projected/61f3e755-0ed7-4e18-aa16-11e0ebc89957-kube-api-access-6rm2k\") pod \"61f3e755-0ed7-4e18-aa16-11e0ebc89957\" (UID: \"61f3e755-0ed7-4e18-aa16-11e0ebc89957\") " Dec 02 23:04:29 crc kubenswrapper[4696]: I1202 23:04:29.670808 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/61f3e755-0ed7-4e18-aa16-11e0ebc89957-dns-swift-storage-0\") pod \"61f3e755-0ed7-4e18-aa16-11e0ebc89957\" (UID: \"61f3e755-0ed7-4e18-aa16-11e0ebc89957\") " Dec 02 23:04:29 crc kubenswrapper[4696]: I1202 23:04:29.670967 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/61f3e755-0ed7-4e18-aa16-11e0ebc89957-ovsdbserver-nb\") pod \"61f3e755-0ed7-4e18-aa16-11e0ebc89957\" (UID: \"61f3e755-0ed7-4e18-aa16-11e0ebc89957\") " Dec 02 23:04:29 crc kubenswrapper[4696]: I1202 23:04:29.671055 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61f3e755-0ed7-4e18-aa16-11e0ebc89957-config\") pod \"61f3e755-0ed7-4e18-aa16-11e0ebc89957\" (UID: \"61f3e755-0ed7-4e18-aa16-11e0ebc89957\") " Dec 02 23:04:29 crc kubenswrapper[4696]: I1202 23:04:29.671300 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/61f3e755-0ed7-4e18-aa16-11e0ebc89957-dns-svc\") pod \"61f3e755-0ed7-4e18-aa16-11e0ebc89957\" (UID: \"61f3e755-0ed7-4e18-aa16-11e0ebc89957\") " Dec 02 23:04:29 crc kubenswrapper[4696]: I1202 23:04:29.719894 4696 generic.go:334] "Generic (PLEG): container finished" 
podID="3c71f81a-6ed2-41fa-9600-f5afbeee2653" containerID="ebbd9aa7d3c954796178286243dd7c13228fce4d1dddc62cbd72b83aa1198d04" exitCode=0 Dec 02 23:04:29 crc kubenswrapper[4696]: I1202 23:04:29.719962 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-gfgfp" event={"ID":"3c71f81a-6ed2-41fa-9600-f5afbeee2653","Type":"ContainerDied","Data":"ebbd9aa7d3c954796178286243dd7c13228fce4d1dddc62cbd72b83aa1198d04"} Dec 02 23:04:29 crc kubenswrapper[4696]: I1202 23:04:29.721880 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6fd0cdbf-1cc8-470f-bc09-437144b1638f","Type":"ContainerStarted","Data":"55f0fb9d829e754e87d0ef6523497e4f0c40eb2544574ecbc6504cfe894b6e14"} Dec 02 23:04:29 crc kubenswrapper[4696]: I1202 23:04:29.721904 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6fd0cdbf-1cc8-470f-bc09-437144b1638f","Type":"ContainerStarted","Data":"4035160e8a1f9bc09e09e55642a9ddc4099838e31825f4998c6296dfe5fc9378"} Dec 02 23:04:29 crc kubenswrapper[4696]: I1202 23:04:29.724910 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61f3e755-0ed7-4e18-aa16-11e0ebc89957-kube-api-access-6rm2k" (OuterVolumeSpecName: "kube-api-access-6rm2k") pod "61f3e755-0ed7-4e18-aa16-11e0ebc89957" (UID: "61f3e755-0ed7-4e18-aa16-11e0ebc89957"). InnerVolumeSpecName "kube-api-access-6rm2k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:04:29 crc kubenswrapper[4696]: I1202 23:04:29.725065 4696 generic.go:334] "Generic (PLEG): container finished" podID="61f3e755-0ed7-4e18-aa16-11e0ebc89957" containerID="fd28de3f291c164d4ca345a48403ee4d226db9bcaf74f6bbdebe048067e455e1" exitCode=0 Dec 02 23:04:29 crc kubenswrapper[4696]: I1202 23:04:29.725117 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-9m9jd" event={"ID":"61f3e755-0ed7-4e18-aa16-11e0ebc89957","Type":"ContainerDied","Data":"fd28de3f291c164d4ca345a48403ee4d226db9bcaf74f6bbdebe048067e455e1"} Dec 02 23:04:29 crc kubenswrapper[4696]: I1202 23:04:29.725220 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-9m9jd" event={"ID":"61f3e755-0ed7-4e18-aa16-11e0ebc89957","Type":"ContainerDied","Data":"8ac3e712e59d18648562672a4f91ae51c75e5a20916f0449a3bd9ff26ee29bf0"} Dec 02 23:04:29 crc kubenswrapper[4696]: I1202 23:04:29.725246 4696 scope.go:117] "RemoveContainer" containerID="fd28de3f291c164d4ca345a48403ee4d226db9bcaf74f6bbdebe048067e455e1" Dec 02 23:04:29 crc kubenswrapper[4696]: I1202 23:04:29.725541 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-9m9jd" Dec 02 23:04:29 crc kubenswrapper[4696]: I1202 23:04:29.774215 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6rm2k\" (UniqueName: \"kubernetes.io/projected/61f3e755-0ed7-4e18-aa16-11e0ebc89957-kube-api-access-6rm2k\") on node \"crc\" DevicePath \"\"" Dec 02 23:04:29 crc kubenswrapper[4696]: I1202 23:04:29.775592 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/61f3e755-0ed7-4e18-aa16-11e0ebc89957-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "61f3e755-0ed7-4e18-aa16-11e0ebc89957" (UID: "61f3e755-0ed7-4e18-aa16-11e0ebc89957"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:04:29 crc kubenswrapper[4696]: I1202 23:04:29.789354 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/61f3e755-0ed7-4e18-aa16-11e0ebc89957-config" (OuterVolumeSpecName: "config") pod "61f3e755-0ed7-4e18-aa16-11e0ebc89957" (UID: "61f3e755-0ed7-4e18-aa16-11e0ebc89957"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:04:29 crc kubenswrapper[4696]: I1202 23:04:29.792977 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.792955239 podStartE2EDuration="2.792955239s" podCreationTimestamp="2025-12-02 23:04:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:04:29.778706084 +0000 UTC m=+1332.659386085" watchObservedRunningTime="2025-12-02 23:04:29.792955239 +0000 UTC m=+1332.673635240" Dec 02 23:04:29 crc kubenswrapper[4696]: I1202 23:04:29.795440 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 02 23:04:29 crc kubenswrapper[4696]: I1202 23:04:29.810758 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/61f3e755-0ed7-4e18-aa16-11e0ebc89957-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "61f3e755-0ed7-4e18-aa16-11e0ebc89957" (UID: "61f3e755-0ed7-4e18-aa16-11e0ebc89957"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:04:29 crc kubenswrapper[4696]: I1202 23:04:29.817333 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/61f3e755-0ed7-4e18-aa16-11e0ebc89957-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "61f3e755-0ed7-4e18-aa16-11e0ebc89957" (UID: "61f3e755-0ed7-4e18-aa16-11e0ebc89957"). 
InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:04:29 crc kubenswrapper[4696]: I1202 23:04:29.874298 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/61f3e755-0ed7-4e18-aa16-11e0ebc89957-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "61f3e755-0ed7-4e18-aa16-11e0ebc89957" (UID: "61f3e755-0ed7-4e18-aa16-11e0ebc89957"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:04:29 crc kubenswrapper[4696]: I1202 23:04:29.876855 4696 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/61f3e755-0ed7-4e18-aa16-11e0ebc89957-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 23:04:29 crc kubenswrapper[4696]: I1202 23:04:29.879568 4696 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/61f3e755-0ed7-4e18-aa16-11e0ebc89957-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 02 23:04:29 crc kubenswrapper[4696]: I1202 23:04:29.879629 4696 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/61f3e755-0ed7-4e18-aa16-11e0ebc89957-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 02 23:04:29 crc kubenswrapper[4696]: I1202 23:04:29.879644 4696 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/61f3e755-0ed7-4e18-aa16-11e0ebc89957-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 02 23:04:29 crc kubenswrapper[4696]: I1202 23:04:29.879661 4696 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61f3e755-0ed7-4e18-aa16-11e0ebc89957-config\") on node \"crc\" DevicePath \"\"" Dec 02 23:04:29 crc kubenswrapper[4696]: I1202 23:04:29.945576 4696 scope.go:117] "RemoveContainer" 
containerID="28cbb32f93941bbfbf01804d2c5281defe14b3fe2e5cdd8e75e81fdb27bf40b2" Dec 02 23:04:29 crc kubenswrapper[4696]: I1202 23:04:29.976213 4696 scope.go:117] "RemoveContainer" containerID="fd28de3f291c164d4ca345a48403ee4d226db9bcaf74f6bbdebe048067e455e1" Dec 02 23:04:29 crc kubenswrapper[4696]: E1202 23:04:29.977062 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd28de3f291c164d4ca345a48403ee4d226db9bcaf74f6bbdebe048067e455e1\": container with ID starting with fd28de3f291c164d4ca345a48403ee4d226db9bcaf74f6bbdebe048067e455e1 not found: ID does not exist" containerID="fd28de3f291c164d4ca345a48403ee4d226db9bcaf74f6bbdebe048067e455e1" Dec 02 23:04:29 crc kubenswrapper[4696]: I1202 23:04:29.977158 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd28de3f291c164d4ca345a48403ee4d226db9bcaf74f6bbdebe048067e455e1"} err="failed to get container status \"fd28de3f291c164d4ca345a48403ee4d226db9bcaf74f6bbdebe048067e455e1\": rpc error: code = NotFound desc = could not find container \"fd28de3f291c164d4ca345a48403ee4d226db9bcaf74f6bbdebe048067e455e1\": container with ID starting with fd28de3f291c164d4ca345a48403ee4d226db9bcaf74f6bbdebe048067e455e1 not found: ID does not exist" Dec 02 23:04:29 crc kubenswrapper[4696]: I1202 23:04:29.977205 4696 scope.go:117] "RemoveContainer" containerID="28cbb32f93941bbfbf01804d2c5281defe14b3fe2e5cdd8e75e81fdb27bf40b2" Dec 02 23:04:29 crc kubenswrapper[4696]: E1202 23:04:29.977725 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28cbb32f93941bbfbf01804d2c5281defe14b3fe2e5cdd8e75e81fdb27bf40b2\": container with ID starting with 28cbb32f93941bbfbf01804d2c5281defe14b3fe2e5cdd8e75e81fdb27bf40b2 not found: ID does not exist" containerID="28cbb32f93941bbfbf01804d2c5281defe14b3fe2e5cdd8e75e81fdb27bf40b2" Dec 02 23:04:29 crc 
kubenswrapper[4696]: I1202 23:04:29.977808 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28cbb32f93941bbfbf01804d2c5281defe14b3fe2e5cdd8e75e81fdb27bf40b2"} err="failed to get container status \"28cbb32f93941bbfbf01804d2c5281defe14b3fe2e5cdd8e75e81fdb27bf40b2\": rpc error: code = NotFound desc = could not find container \"28cbb32f93941bbfbf01804d2c5281defe14b3fe2e5cdd8e75e81fdb27bf40b2\": container with ID starting with 28cbb32f93941bbfbf01804d2c5281defe14b3fe2e5cdd8e75e81fdb27bf40b2 not found: ID does not exist" Dec 02 23:04:30 crc kubenswrapper[4696]: I1202 23:04:30.072978 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-9m9jd"] Dec 02 23:04:30 crc kubenswrapper[4696]: I1202 23:04:30.094325 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-9m9jd"] Dec 02 23:04:31 crc kubenswrapper[4696]: I1202 23:04:31.216336 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-gfgfp" Dec 02 23:04:31 crc kubenswrapper[4696]: I1202 23:04:31.311156 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c71f81a-6ed2-41fa-9600-f5afbeee2653-combined-ca-bundle\") pod \"3c71f81a-6ed2-41fa-9600-f5afbeee2653\" (UID: \"3c71f81a-6ed2-41fa-9600-f5afbeee2653\") " Dec 02 23:04:31 crc kubenswrapper[4696]: I1202 23:04:31.311313 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c71f81a-6ed2-41fa-9600-f5afbeee2653-scripts\") pod \"3c71f81a-6ed2-41fa-9600-f5afbeee2653\" (UID: \"3c71f81a-6ed2-41fa-9600-f5afbeee2653\") " Dec 02 23:04:31 crc kubenswrapper[4696]: I1202 23:04:31.311369 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7kt6l\" (UniqueName: \"kubernetes.io/projected/3c71f81a-6ed2-41fa-9600-f5afbeee2653-kube-api-access-7kt6l\") pod \"3c71f81a-6ed2-41fa-9600-f5afbeee2653\" (UID: \"3c71f81a-6ed2-41fa-9600-f5afbeee2653\") " Dec 02 23:04:31 crc kubenswrapper[4696]: I1202 23:04:31.311617 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c71f81a-6ed2-41fa-9600-f5afbeee2653-config-data\") pod \"3c71f81a-6ed2-41fa-9600-f5afbeee2653\" (UID: \"3c71f81a-6ed2-41fa-9600-f5afbeee2653\") " Dec 02 23:04:31 crc kubenswrapper[4696]: I1202 23:04:31.660496 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c71f81a-6ed2-41fa-9600-f5afbeee2653-kube-api-access-7kt6l" (OuterVolumeSpecName: "kube-api-access-7kt6l") pod "3c71f81a-6ed2-41fa-9600-f5afbeee2653" (UID: "3c71f81a-6ed2-41fa-9600-f5afbeee2653"). InnerVolumeSpecName "kube-api-access-7kt6l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:04:31 crc kubenswrapper[4696]: I1202 23:04:31.663964 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c71f81a-6ed2-41fa-9600-f5afbeee2653-scripts" (OuterVolumeSpecName: "scripts") pod "3c71f81a-6ed2-41fa-9600-f5afbeee2653" (UID: "3c71f81a-6ed2-41fa-9600-f5afbeee2653"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:04:31 crc kubenswrapper[4696]: E1202 23:04:31.664128 4696 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c71f81a-6ed2-41fa-9600-f5afbeee2653-config-data podName:3c71f81a-6ed2-41fa-9600-f5afbeee2653 nodeName:}" failed. No retries permitted until 2025-12-02 23:04:32.164093483 +0000 UTC m=+1335.044773494 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "config-data" (UniqueName: "kubernetes.io/secret/3c71f81a-6ed2-41fa-9600-f5afbeee2653-config-data") pod "3c71f81a-6ed2-41fa-9600-f5afbeee2653" (UID: "3c71f81a-6ed2-41fa-9600-f5afbeee2653") : error deleting /var/lib/kubelet/pods/3c71f81a-6ed2-41fa-9600-f5afbeee2653/volume-subpaths: remove /var/lib/kubelet/pods/3c71f81a-6ed2-41fa-9600-f5afbeee2653/volume-subpaths: no such file or directory Dec 02 23:04:31 crc kubenswrapper[4696]: I1202 23:04:31.671902 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c71f81a-6ed2-41fa-9600-f5afbeee2653-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3c71f81a-6ed2-41fa-9600-f5afbeee2653" (UID: "3c71f81a-6ed2-41fa-9600-f5afbeee2653"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:04:31 crc kubenswrapper[4696]: I1202 23:04:31.681480 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61f3e755-0ed7-4e18-aa16-11e0ebc89957" path="/var/lib/kubelet/pods/61f3e755-0ed7-4e18-aa16-11e0ebc89957/volumes" Dec 02 23:04:31 crc kubenswrapper[4696]: I1202 23:04:31.754325 4696 generic.go:334] "Generic (PLEG): container finished" podID="95634dd9-11ed-4c9d-b0e3-b7240ff94ac0" containerID="e3925fad130fa205634297b3447a41f37b659cd7947b3e48e6dd4fd618123680" exitCode=0 Dec 02 23:04:31 crc kubenswrapper[4696]: I1202 23:04:31.755948 4696 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c71f81a-6ed2-41fa-9600-f5afbeee2653-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 23:04:31 crc kubenswrapper[4696]: I1202 23:04:31.755975 4696 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c71f81a-6ed2-41fa-9600-f5afbeee2653-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 23:04:31 crc kubenswrapper[4696]: I1202 23:04:31.755992 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7kt6l\" (UniqueName: \"kubernetes.io/projected/3c71f81a-6ed2-41fa-9600-f5afbeee2653-kube-api-access-7kt6l\") on node \"crc\" DevicePath \"\"" Dec 02 23:04:31 crc kubenswrapper[4696]: I1202 23:04:31.762073 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-gfgfp" Dec 02 23:04:31 crc kubenswrapper[4696]: I1202 23:04:31.768400 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-7fk6v" event={"ID":"95634dd9-11ed-4c9d-b0e3-b7240ff94ac0","Type":"ContainerDied","Data":"e3925fad130fa205634297b3447a41f37b659cd7947b3e48e6dd4fd618123680"} Dec 02 23:04:31 crc kubenswrapper[4696]: I1202 23:04:31.768448 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-gfgfp" event={"ID":"3c71f81a-6ed2-41fa-9600-f5afbeee2653","Type":"ContainerDied","Data":"c760e012431cafc5e5b6ee4fae0bbb4640b5adc401c6515b872608c465446bc3"} Dec 02 23:04:31 crc kubenswrapper[4696]: I1202 23:04:31.768469 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c760e012431cafc5e5b6ee4fae0bbb4640b5adc401c6515b872608c465446bc3" Dec 02 23:04:31 crc kubenswrapper[4696]: I1202 23:04:31.991830 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 23:04:32 crc kubenswrapper[4696]: I1202 23:04:32.008919 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 02 23:04:32 crc kubenswrapper[4696]: I1202 23:04:32.009551 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="d013870f-cb7b-4854-82d2-80c8e35701e9" containerName="nova-api-log" containerID="cri-o://a8d0db8098805d176af7979d70cace4249c42ae4cb311d63688762491e73e5c9" gracePeriod=30 Dec 02 23:04:32 crc kubenswrapper[4696]: I1202 23:04:32.009899 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="d013870f-cb7b-4854-82d2-80c8e35701e9" containerName="nova-api-api" containerID="cri-o://275f8d8fdd44d31fb543ad73f136b6c17d5a47b1f19bbc8e7ad891b3ebbea091" gracePeriod=30 Dec 02 23:04:32 crc kubenswrapper[4696]: I1202 23:04:32.023285 4696 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 23:04:32 crc kubenswrapper[4696]: I1202 23:04:32.023573 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="6fd0cdbf-1cc8-470f-bc09-437144b1638f" containerName="nova-metadata-log" containerID="cri-o://4035160e8a1f9bc09e09e55642a9ddc4099838e31825f4998c6296dfe5fc9378" gracePeriod=30 Dec 02 23:04:32 crc kubenswrapper[4696]: I1202 23:04:32.023999 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="6fd0cdbf-1cc8-470f-bc09-437144b1638f" containerName="nova-metadata-metadata" containerID="cri-o://55f0fb9d829e754e87d0ef6523497e4f0c40eb2544574ecbc6504cfe894b6e14" gracePeriod=30 Dec 02 23:04:32 crc kubenswrapper[4696]: I1202 23:04:32.175940 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c71f81a-6ed2-41fa-9600-f5afbeee2653-config-data\") pod \"3c71f81a-6ed2-41fa-9600-f5afbeee2653\" (UID: \"3c71f81a-6ed2-41fa-9600-f5afbeee2653\") " Dec 02 23:04:32 crc kubenswrapper[4696]: I1202 23:04:32.185184 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c71f81a-6ed2-41fa-9600-f5afbeee2653-config-data" (OuterVolumeSpecName: "config-data") pod "3c71f81a-6ed2-41fa-9600-f5afbeee2653" (UID: "3c71f81a-6ed2-41fa-9600-f5afbeee2653"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:04:32 crc kubenswrapper[4696]: I1202 23:04:32.279250 4696 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c71f81a-6ed2-41fa-9600-f5afbeee2653-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 23:04:32 crc kubenswrapper[4696]: I1202 23:04:32.566465 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 02 23:04:32 crc kubenswrapper[4696]: I1202 23:04:32.690295 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6fd0cdbf-1cc8-470f-bc09-437144b1638f-logs\") pod \"6fd0cdbf-1cc8-470f-bc09-437144b1638f\" (UID: \"6fd0cdbf-1cc8-470f-bc09-437144b1638f\") " Dec 02 23:04:32 crc kubenswrapper[4696]: I1202 23:04:32.690370 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6fd0cdbf-1cc8-470f-bc09-437144b1638f-nova-metadata-tls-certs\") pod \"6fd0cdbf-1cc8-470f-bc09-437144b1638f\" (UID: \"6fd0cdbf-1cc8-470f-bc09-437144b1638f\") " Dec 02 23:04:32 crc kubenswrapper[4696]: I1202 23:04:32.690430 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fd0cdbf-1cc8-470f-bc09-437144b1638f-combined-ca-bundle\") pod \"6fd0cdbf-1cc8-470f-bc09-437144b1638f\" (UID: \"6fd0cdbf-1cc8-470f-bc09-437144b1638f\") " Dec 02 23:04:32 crc kubenswrapper[4696]: I1202 23:04:32.690489 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6fd0cdbf-1cc8-470f-bc09-437144b1638f-config-data\") pod \"6fd0cdbf-1cc8-470f-bc09-437144b1638f\" (UID: \"6fd0cdbf-1cc8-470f-bc09-437144b1638f\") " Dec 02 23:04:32 crc kubenswrapper[4696]: I1202 23:04:32.690663 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d7mss\" (UniqueName: \"kubernetes.io/projected/6fd0cdbf-1cc8-470f-bc09-437144b1638f-kube-api-access-d7mss\") pod \"6fd0cdbf-1cc8-470f-bc09-437144b1638f\" (UID: \"6fd0cdbf-1cc8-470f-bc09-437144b1638f\") " Dec 02 23:04:32 crc kubenswrapper[4696]: I1202 23:04:32.690845 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/6fd0cdbf-1cc8-470f-bc09-437144b1638f-logs" (OuterVolumeSpecName: "logs") pod "6fd0cdbf-1cc8-470f-bc09-437144b1638f" (UID: "6fd0cdbf-1cc8-470f-bc09-437144b1638f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:04:32 crc kubenswrapper[4696]: I1202 23:04:32.691474 4696 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6fd0cdbf-1cc8-470f-bc09-437144b1638f-logs\") on node \"crc\" DevicePath \"\"" Dec 02 23:04:32 crc kubenswrapper[4696]: I1202 23:04:32.699919 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6fd0cdbf-1cc8-470f-bc09-437144b1638f-kube-api-access-d7mss" (OuterVolumeSpecName: "kube-api-access-d7mss") pod "6fd0cdbf-1cc8-470f-bc09-437144b1638f" (UID: "6fd0cdbf-1cc8-470f-bc09-437144b1638f"). InnerVolumeSpecName "kube-api-access-d7mss". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:04:32 crc kubenswrapper[4696]: I1202 23:04:32.729862 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fd0cdbf-1cc8-470f-bc09-437144b1638f-config-data" (OuterVolumeSpecName: "config-data") pod "6fd0cdbf-1cc8-470f-bc09-437144b1638f" (UID: "6fd0cdbf-1cc8-470f-bc09-437144b1638f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:04:32 crc kubenswrapper[4696]: I1202 23:04:32.734990 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fd0cdbf-1cc8-470f-bc09-437144b1638f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6fd0cdbf-1cc8-470f-bc09-437144b1638f" (UID: "6fd0cdbf-1cc8-470f-bc09-437144b1638f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:04:32 crc kubenswrapper[4696]: I1202 23:04:32.762939 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fd0cdbf-1cc8-470f-bc09-437144b1638f-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "6fd0cdbf-1cc8-470f-bc09-437144b1638f" (UID: "6fd0cdbf-1cc8-470f-bc09-437144b1638f"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:04:32 crc kubenswrapper[4696]: I1202 23:04:32.783675 4696 generic.go:334] "Generic (PLEG): container finished" podID="d013870f-cb7b-4854-82d2-80c8e35701e9" containerID="a8d0db8098805d176af7979d70cace4249c42ae4cb311d63688762491e73e5c9" exitCode=143 Dec 02 23:04:32 crc kubenswrapper[4696]: I1202 23:04:32.783765 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d013870f-cb7b-4854-82d2-80c8e35701e9","Type":"ContainerDied","Data":"a8d0db8098805d176af7979d70cace4249c42ae4cb311d63688762491e73e5c9"} Dec 02 23:04:32 crc kubenswrapper[4696]: I1202 23:04:32.785990 4696 generic.go:334] "Generic (PLEG): container finished" podID="6fd0cdbf-1cc8-470f-bc09-437144b1638f" containerID="55f0fb9d829e754e87d0ef6523497e4f0c40eb2544574ecbc6504cfe894b6e14" exitCode=0 Dec 02 23:04:32 crc kubenswrapper[4696]: I1202 23:04:32.786010 4696 generic.go:334] "Generic (PLEG): container finished" podID="6fd0cdbf-1cc8-470f-bc09-437144b1638f" containerID="4035160e8a1f9bc09e09e55642a9ddc4099838e31825f4998c6296dfe5fc9378" exitCode=143 Dec 02 23:04:32 crc kubenswrapper[4696]: I1202 23:04:32.786098 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 02 23:04:32 crc kubenswrapper[4696]: I1202 23:04:32.786155 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6fd0cdbf-1cc8-470f-bc09-437144b1638f","Type":"ContainerDied","Data":"55f0fb9d829e754e87d0ef6523497e4f0c40eb2544574ecbc6504cfe894b6e14"} Dec 02 23:04:32 crc kubenswrapper[4696]: I1202 23:04:32.786200 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="456d1feb-dedd-4045-8440-d45ca71d3f46" containerName="nova-scheduler-scheduler" containerID="cri-o://7be7fcb711e4bfb6f71f88c2ff267f7c66c9e2d741a648379982ede50631d4af" gracePeriod=30 Dec 02 23:04:32 crc kubenswrapper[4696]: I1202 23:04:32.786222 4696 scope.go:117] "RemoveContainer" containerID="55f0fb9d829e754e87d0ef6523497e4f0c40eb2544574ecbc6504cfe894b6e14" Dec 02 23:04:32 crc kubenswrapper[4696]: I1202 23:04:32.786207 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6fd0cdbf-1cc8-470f-bc09-437144b1638f","Type":"ContainerDied","Data":"4035160e8a1f9bc09e09e55642a9ddc4099838e31825f4998c6296dfe5fc9378"} Dec 02 23:04:32 crc kubenswrapper[4696]: I1202 23:04:32.788812 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6fd0cdbf-1cc8-470f-bc09-437144b1638f","Type":"ContainerDied","Data":"1dc0afb0b4c6f085c04e33a00f5d36fd71b0b04940bcfd7293763622d2243f9e"} Dec 02 23:04:32 crc kubenswrapper[4696]: I1202 23:04:32.793439 4696 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6fd0cdbf-1cc8-470f-bc09-437144b1638f-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 02 23:04:32 crc kubenswrapper[4696]: I1202 23:04:32.793462 4696 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6fd0cdbf-1cc8-470f-bc09-437144b1638f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 23:04:32 crc kubenswrapper[4696]: I1202 23:04:32.793472 4696 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6fd0cdbf-1cc8-470f-bc09-437144b1638f-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 23:04:32 crc kubenswrapper[4696]: I1202 23:04:32.793482 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d7mss\" (UniqueName: \"kubernetes.io/projected/6fd0cdbf-1cc8-470f-bc09-437144b1638f-kube-api-access-d7mss\") on node \"crc\" DevicePath \"\"" Dec 02 23:04:32 crc kubenswrapper[4696]: I1202 23:04:32.870572 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 23:04:32 crc kubenswrapper[4696]: I1202 23:04:32.877994 4696 scope.go:117] "RemoveContainer" containerID="4035160e8a1f9bc09e09e55642a9ddc4099838e31825f4998c6296dfe5fc9378" Dec 02 23:04:32 crc kubenswrapper[4696]: I1202 23:04:32.891529 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 23:04:32 crc kubenswrapper[4696]: I1202 23:04:32.917877 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 02 23:04:32 crc kubenswrapper[4696]: E1202 23:04:32.918434 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fd0cdbf-1cc8-470f-bc09-437144b1638f" containerName="nova-metadata-log" Dec 02 23:04:32 crc kubenswrapper[4696]: I1202 23:04:32.918451 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fd0cdbf-1cc8-470f-bc09-437144b1638f" containerName="nova-metadata-log" Dec 02 23:04:32 crc kubenswrapper[4696]: E1202 23:04:32.918480 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61f3e755-0ed7-4e18-aa16-11e0ebc89957" containerName="dnsmasq-dns" Dec 02 23:04:32 crc kubenswrapper[4696]: I1202 23:04:32.918487 4696 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="61f3e755-0ed7-4e18-aa16-11e0ebc89957" containerName="dnsmasq-dns" Dec 02 23:04:32 crc kubenswrapper[4696]: E1202 23:04:32.918498 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c71f81a-6ed2-41fa-9600-f5afbeee2653" containerName="nova-manage" Dec 02 23:04:32 crc kubenswrapper[4696]: I1202 23:04:32.918504 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c71f81a-6ed2-41fa-9600-f5afbeee2653" containerName="nova-manage" Dec 02 23:04:32 crc kubenswrapper[4696]: E1202 23:04:32.918536 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61f3e755-0ed7-4e18-aa16-11e0ebc89957" containerName="init" Dec 02 23:04:32 crc kubenswrapper[4696]: I1202 23:04:32.918544 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="61f3e755-0ed7-4e18-aa16-11e0ebc89957" containerName="init" Dec 02 23:04:32 crc kubenswrapper[4696]: E1202 23:04:32.918554 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fd0cdbf-1cc8-470f-bc09-437144b1638f" containerName="nova-metadata-metadata" Dec 02 23:04:32 crc kubenswrapper[4696]: I1202 23:04:32.918560 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fd0cdbf-1cc8-470f-bc09-437144b1638f" containerName="nova-metadata-metadata" Dec 02 23:04:32 crc kubenswrapper[4696]: I1202 23:04:32.918773 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="61f3e755-0ed7-4e18-aa16-11e0ebc89957" containerName="dnsmasq-dns" Dec 02 23:04:32 crc kubenswrapper[4696]: I1202 23:04:32.918794 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fd0cdbf-1cc8-470f-bc09-437144b1638f" containerName="nova-metadata-metadata" Dec 02 23:04:32 crc kubenswrapper[4696]: I1202 23:04:32.918826 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c71f81a-6ed2-41fa-9600-f5afbeee2653" containerName="nova-manage" Dec 02 23:04:32 crc kubenswrapper[4696]: I1202 23:04:32.918841 4696 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="6fd0cdbf-1cc8-470f-bc09-437144b1638f" containerName="nova-metadata-log" Dec 02 23:04:32 crc kubenswrapper[4696]: I1202 23:04:32.920368 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 02 23:04:32 crc kubenswrapper[4696]: I1202 23:04:32.924999 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 02 23:04:32 crc kubenswrapper[4696]: I1202 23:04:32.925205 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 02 23:04:32 crc kubenswrapper[4696]: I1202 23:04:32.933144 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 23:04:32 crc kubenswrapper[4696]: I1202 23:04:32.947529 4696 scope.go:117] "RemoveContainer" containerID="55f0fb9d829e754e87d0ef6523497e4f0c40eb2544574ecbc6504cfe894b6e14" Dec 02 23:04:32 crc kubenswrapper[4696]: E1202 23:04:32.950707 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55f0fb9d829e754e87d0ef6523497e4f0c40eb2544574ecbc6504cfe894b6e14\": container with ID starting with 55f0fb9d829e754e87d0ef6523497e4f0c40eb2544574ecbc6504cfe894b6e14 not found: ID does not exist" containerID="55f0fb9d829e754e87d0ef6523497e4f0c40eb2544574ecbc6504cfe894b6e14" Dec 02 23:04:32 crc kubenswrapper[4696]: I1202 23:04:32.950815 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55f0fb9d829e754e87d0ef6523497e4f0c40eb2544574ecbc6504cfe894b6e14"} err="failed to get container status \"55f0fb9d829e754e87d0ef6523497e4f0c40eb2544574ecbc6504cfe894b6e14\": rpc error: code = NotFound desc = could not find container \"55f0fb9d829e754e87d0ef6523497e4f0c40eb2544574ecbc6504cfe894b6e14\": container with ID starting with 55f0fb9d829e754e87d0ef6523497e4f0c40eb2544574ecbc6504cfe894b6e14 not found: ID does not exist" Dec 02 
23:04:32 crc kubenswrapper[4696]: I1202 23:04:32.950851 4696 scope.go:117] "RemoveContainer" containerID="4035160e8a1f9bc09e09e55642a9ddc4099838e31825f4998c6296dfe5fc9378" Dec 02 23:04:32 crc kubenswrapper[4696]: E1202 23:04:32.951810 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4035160e8a1f9bc09e09e55642a9ddc4099838e31825f4998c6296dfe5fc9378\": container with ID starting with 4035160e8a1f9bc09e09e55642a9ddc4099838e31825f4998c6296dfe5fc9378 not found: ID does not exist" containerID="4035160e8a1f9bc09e09e55642a9ddc4099838e31825f4998c6296dfe5fc9378" Dec 02 23:04:32 crc kubenswrapper[4696]: I1202 23:04:32.951840 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4035160e8a1f9bc09e09e55642a9ddc4099838e31825f4998c6296dfe5fc9378"} err="failed to get container status \"4035160e8a1f9bc09e09e55642a9ddc4099838e31825f4998c6296dfe5fc9378\": rpc error: code = NotFound desc = could not find container \"4035160e8a1f9bc09e09e55642a9ddc4099838e31825f4998c6296dfe5fc9378\": container with ID starting with 4035160e8a1f9bc09e09e55642a9ddc4099838e31825f4998c6296dfe5fc9378 not found: ID does not exist" Dec 02 23:04:32 crc kubenswrapper[4696]: I1202 23:04:32.951857 4696 scope.go:117] "RemoveContainer" containerID="55f0fb9d829e754e87d0ef6523497e4f0c40eb2544574ecbc6504cfe894b6e14" Dec 02 23:04:32 crc kubenswrapper[4696]: I1202 23:04:32.952241 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55f0fb9d829e754e87d0ef6523497e4f0c40eb2544574ecbc6504cfe894b6e14"} err="failed to get container status \"55f0fb9d829e754e87d0ef6523497e4f0c40eb2544574ecbc6504cfe894b6e14\": rpc error: code = NotFound desc = could not find container \"55f0fb9d829e754e87d0ef6523497e4f0c40eb2544574ecbc6504cfe894b6e14\": container with ID starting with 55f0fb9d829e754e87d0ef6523497e4f0c40eb2544574ecbc6504cfe894b6e14 not found: ID does not 
exist" Dec 02 23:04:32 crc kubenswrapper[4696]: I1202 23:04:32.952265 4696 scope.go:117] "RemoveContainer" containerID="4035160e8a1f9bc09e09e55642a9ddc4099838e31825f4998c6296dfe5fc9378" Dec 02 23:04:32 crc kubenswrapper[4696]: I1202 23:04:32.953163 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4035160e8a1f9bc09e09e55642a9ddc4099838e31825f4998c6296dfe5fc9378"} err="failed to get container status \"4035160e8a1f9bc09e09e55642a9ddc4099838e31825f4998c6296dfe5fc9378\": rpc error: code = NotFound desc = could not find container \"4035160e8a1f9bc09e09e55642a9ddc4099838e31825f4998c6296dfe5fc9378\": container with ID starting with 4035160e8a1f9bc09e09e55642a9ddc4099838e31825f4998c6296dfe5fc9378 not found: ID does not exist" Dec 02 23:04:33 crc kubenswrapper[4696]: I1202 23:04:33.106872 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8441536b-8b39-4650-8fa9-3573038ffa49-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8441536b-8b39-4650-8fa9-3573038ffa49\") " pod="openstack/nova-metadata-0" Dec 02 23:04:33 crc kubenswrapper[4696]: I1202 23:04:33.107499 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ngfp\" (UniqueName: \"kubernetes.io/projected/8441536b-8b39-4650-8fa9-3573038ffa49-kube-api-access-9ngfp\") pod \"nova-metadata-0\" (UID: \"8441536b-8b39-4650-8fa9-3573038ffa49\") " pod="openstack/nova-metadata-0" Dec 02 23:04:33 crc kubenswrapper[4696]: I1202 23:04:33.107689 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8441536b-8b39-4650-8fa9-3573038ffa49-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"8441536b-8b39-4650-8fa9-3573038ffa49\") " pod="openstack/nova-metadata-0" Dec 02 23:04:33 crc kubenswrapper[4696]: 
I1202 23:04:33.107779 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8441536b-8b39-4650-8fa9-3573038ffa49-logs\") pod \"nova-metadata-0\" (UID: \"8441536b-8b39-4650-8fa9-3573038ffa49\") " pod="openstack/nova-metadata-0" Dec 02 23:04:33 crc kubenswrapper[4696]: I1202 23:04:33.107865 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8441536b-8b39-4650-8fa9-3573038ffa49-config-data\") pod \"nova-metadata-0\" (UID: \"8441536b-8b39-4650-8fa9-3573038ffa49\") " pod="openstack/nova-metadata-0" Dec 02 23:04:33 crc kubenswrapper[4696]: I1202 23:04:33.186654 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-7fk6v" Dec 02 23:04:33 crc kubenswrapper[4696]: I1202 23:04:33.211594 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95634dd9-11ed-4c9d-b0e3-b7240ff94ac0-combined-ca-bundle\") pod \"95634dd9-11ed-4c9d-b0e3-b7240ff94ac0\" (UID: \"95634dd9-11ed-4c9d-b0e3-b7240ff94ac0\") " Dec 02 23:04:33 crc kubenswrapper[4696]: I1202 23:04:33.212129 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95634dd9-11ed-4c9d-b0e3-b7240ff94ac0-scripts\") pod \"95634dd9-11ed-4c9d-b0e3-b7240ff94ac0\" (UID: \"95634dd9-11ed-4c9d-b0e3-b7240ff94ac0\") " Dec 02 23:04:33 crc kubenswrapper[4696]: I1202 23:04:33.212531 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wbwsd\" (UniqueName: \"kubernetes.io/projected/95634dd9-11ed-4c9d-b0e3-b7240ff94ac0-kube-api-access-wbwsd\") pod \"95634dd9-11ed-4c9d-b0e3-b7240ff94ac0\" (UID: \"95634dd9-11ed-4c9d-b0e3-b7240ff94ac0\") " Dec 02 23:04:33 crc kubenswrapper[4696]: 
I1202 23:04:33.212678 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95634dd9-11ed-4c9d-b0e3-b7240ff94ac0-config-data\") pod \"95634dd9-11ed-4c9d-b0e3-b7240ff94ac0\" (UID: \"95634dd9-11ed-4c9d-b0e3-b7240ff94ac0\") " Dec 02 23:04:33 crc kubenswrapper[4696]: I1202 23:04:33.213150 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8441536b-8b39-4650-8fa9-3573038ffa49-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"8441536b-8b39-4650-8fa9-3573038ffa49\") " pod="openstack/nova-metadata-0" Dec 02 23:04:33 crc kubenswrapper[4696]: I1202 23:04:33.213201 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8441536b-8b39-4650-8fa9-3573038ffa49-logs\") pod \"nova-metadata-0\" (UID: \"8441536b-8b39-4650-8fa9-3573038ffa49\") " pod="openstack/nova-metadata-0" Dec 02 23:04:33 crc kubenswrapper[4696]: I1202 23:04:33.213258 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8441536b-8b39-4650-8fa9-3573038ffa49-config-data\") pod \"nova-metadata-0\" (UID: \"8441536b-8b39-4650-8fa9-3573038ffa49\") " pod="openstack/nova-metadata-0" Dec 02 23:04:33 crc kubenswrapper[4696]: I1202 23:04:33.214561 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8441536b-8b39-4650-8fa9-3573038ffa49-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8441536b-8b39-4650-8fa9-3573038ffa49\") " pod="openstack/nova-metadata-0" Dec 02 23:04:33 crc kubenswrapper[4696]: I1202 23:04:33.214633 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9ngfp\" (UniqueName: 
\"kubernetes.io/projected/8441536b-8b39-4650-8fa9-3573038ffa49-kube-api-access-9ngfp\") pod \"nova-metadata-0\" (UID: \"8441536b-8b39-4650-8fa9-3573038ffa49\") " pod="openstack/nova-metadata-0" Dec 02 23:04:33 crc kubenswrapper[4696]: I1202 23:04:33.215557 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8441536b-8b39-4650-8fa9-3573038ffa49-logs\") pod \"nova-metadata-0\" (UID: \"8441536b-8b39-4650-8fa9-3573038ffa49\") " pod="openstack/nova-metadata-0" Dec 02 23:04:33 crc kubenswrapper[4696]: I1202 23:04:33.220164 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8441536b-8b39-4650-8fa9-3573038ffa49-config-data\") pod \"nova-metadata-0\" (UID: \"8441536b-8b39-4650-8fa9-3573038ffa49\") " pod="openstack/nova-metadata-0" Dec 02 23:04:33 crc kubenswrapper[4696]: I1202 23:04:33.221196 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95634dd9-11ed-4c9d-b0e3-b7240ff94ac0-kube-api-access-wbwsd" (OuterVolumeSpecName: "kube-api-access-wbwsd") pod "95634dd9-11ed-4c9d-b0e3-b7240ff94ac0" (UID: "95634dd9-11ed-4c9d-b0e3-b7240ff94ac0"). InnerVolumeSpecName "kube-api-access-wbwsd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:04:33 crc kubenswrapper[4696]: I1202 23:04:33.223038 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8441536b-8b39-4650-8fa9-3573038ffa49-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8441536b-8b39-4650-8fa9-3573038ffa49\") " pod="openstack/nova-metadata-0" Dec 02 23:04:33 crc kubenswrapper[4696]: I1202 23:04:33.223272 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95634dd9-11ed-4c9d-b0e3-b7240ff94ac0-scripts" (OuterVolumeSpecName: "scripts") pod "95634dd9-11ed-4c9d-b0e3-b7240ff94ac0" (UID: "95634dd9-11ed-4c9d-b0e3-b7240ff94ac0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:04:33 crc kubenswrapper[4696]: I1202 23:04:33.225200 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8441536b-8b39-4650-8fa9-3573038ffa49-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"8441536b-8b39-4650-8fa9-3573038ffa49\") " pod="openstack/nova-metadata-0" Dec 02 23:04:33 crc kubenswrapper[4696]: I1202 23:04:33.238944 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ngfp\" (UniqueName: \"kubernetes.io/projected/8441536b-8b39-4650-8fa9-3573038ffa49-kube-api-access-9ngfp\") pod \"nova-metadata-0\" (UID: \"8441536b-8b39-4650-8fa9-3573038ffa49\") " pod="openstack/nova-metadata-0" Dec 02 23:04:33 crc kubenswrapper[4696]: I1202 23:04:33.251099 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 02 23:04:33 crc kubenswrapper[4696]: I1202 23:04:33.251684 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95634dd9-11ed-4c9d-b0e3-b7240ff94ac0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "95634dd9-11ed-4c9d-b0e3-b7240ff94ac0" (UID: "95634dd9-11ed-4c9d-b0e3-b7240ff94ac0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:04:33 crc kubenswrapper[4696]: I1202 23:04:33.262326 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95634dd9-11ed-4c9d-b0e3-b7240ff94ac0-config-data" (OuterVolumeSpecName: "config-data") pod "95634dd9-11ed-4c9d-b0e3-b7240ff94ac0" (UID: "95634dd9-11ed-4c9d-b0e3-b7240ff94ac0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:04:33 crc kubenswrapper[4696]: I1202 23:04:33.316508 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wbwsd\" (UniqueName: \"kubernetes.io/projected/95634dd9-11ed-4c9d-b0e3-b7240ff94ac0-kube-api-access-wbwsd\") on node \"crc\" DevicePath \"\"" Dec 02 23:04:33 crc kubenswrapper[4696]: I1202 23:04:33.316552 4696 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95634dd9-11ed-4c9d-b0e3-b7240ff94ac0-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 23:04:33 crc kubenswrapper[4696]: I1202 23:04:33.316567 4696 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95634dd9-11ed-4c9d-b0e3-b7240ff94ac0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 23:04:33 crc kubenswrapper[4696]: I1202 23:04:33.316577 4696 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95634dd9-11ed-4c9d-b0e3-b7240ff94ac0-scripts\") on node \"crc\" DevicePath \"\"" 
Dec 02 23:04:33 crc kubenswrapper[4696]: I1202 23:04:33.445935 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6fd0cdbf-1cc8-470f-bc09-437144b1638f" path="/var/lib/kubelet/pods/6fd0cdbf-1cc8-470f-bc09-437144b1638f/volumes" Dec 02 23:04:33 crc kubenswrapper[4696]: I1202 23:04:33.776640 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 23:04:33 crc kubenswrapper[4696]: I1202 23:04:33.802101 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-7fk6v" Dec 02 23:04:33 crc kubenswrapper[4696]: I1202 23:04:33.802145 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-7fk6v" event={"ID":"95634dd9-11ed-4c9d-b0e3-b7240ff94ac0","Type":"ContainerDied","Data":"a963633294fa23e272f486a763271ee76fb354cf73898e6675d5ca6c365badfa"} Dec 02 23:04:33 crc kubenswrapper[4696]: I1202 23:04:33.802205 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a963633294fa23e272f486a763271ee76fb354cf73898e6675d5ca6c365badfa" Dec 02 23:04:33 crc kubenswrapper[4696]: I1202 23:04:33.804249 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8441536b-8b39-4650-8fa9-3573038ffa49","Type":"ContainerStarted","Data":"545cabcad2ae396cc2a6bcfba0ea06ade06cba66a556d9b3eea64f328b075afd"} Dec 02 23:04:33 crc kubenswrapper[4696]: I1202 23:04:33.886896 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 02 23:04:33 crc kubenswrapper[4696]: E1202 23:04:33.887389 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95634dd9-11ed-4c9d-b0e3-b7240ff94ac0" containerName="nova-cell1-conductor-db-sync" Dec 02 23:04:33 crc kubenswrapper[4696]: I1202 23:04:33.887408 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="95634dd9-11ed-4c9d-b0e3-b7240ff94ac0" 
containerName="nova-cell1-conductor-db-sync" Dec 02 23:04:33 crc kubenswrapper[4696]: I1202 23:04:33.887636 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="95634dd9-11ed-4c9d-b0e3-b7240ff94ac0" containerName="nova-cell1-conductor-db-sync" Dec 02 23:04:33 crc kubenswrapper[4696]: I1202 23:04:33.888390 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 02 23:04:33 crc kubenswrapper[4696]: I1202 23:04:33.890547 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 02 23:04:33 crc kubenswrapper[4696]: I1202 23:04:33.920419 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 02 23:04:33 crc kubenswrapper[4696]: E1202 23:04:33.991022 4696 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7be7fcb711e4bfb6f71f88c2ff267f7c66c9e2d741a648379982ede50631d4af is running failed: container process not found" containerID="7be7fcb711e4bfb6f71f88c2ff267f7c66c9e2d741a648379982ede50631d4af" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 02 23:04:33 crc kubenswrapper[4696]: E1202 23:04:33.992987 4696 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7be7fcb711e4bfb6f71f88c2ff267f7c66c9e2d741a648379982ede50631d4af is running failed: container process not found" containerID="7be7fcb711e4bfb6f71f88c2ff267f7c66c9e2d741a648379982ede50631d4af" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 02 23:04:33 crc kubenswrapper[4696]: E1202 23:04:33.993503 4696 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7be7fcb711e4bfb6f71f88c2ff267f7c66c9e2d741a648379982ede50631d4af is running failed: 
container process not found" containerID="7be7fcb711e4bfb6f71f88c2ff267f7c66c9e2d741a648379982ede50631d4af" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 02 23:04:33 crc kubenswrapper[4696]: E1202 23:04:33.993535 4696 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7be7fcb711e4bfb6f71f88c2ff267f7c66c9e2d741a648379982ede50631d4af is running failed: container process not found" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="456d1feb-dedd-4045-8440-d45ca71d3f46" containerName="nova-scheduler-scheduler" Dec 02 23:04:34 crc kubenswrapper[4696]: I1202 23:04:34.029635 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c7be3d4-52ad-4671-8b48-8cc19cf98b4c-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"4c7be3d4-52ad-4671-8b48-8cc19cf98b4c\") " pod="openstack/nova-cell1-conductor-0" Dec 02 23:04:34 crc kubenswrapper[4696]: I1202 23:04:34.029722 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c7be3d4-52ad-4671-8b48-8cc19cf98b4c-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"4c7be3d4-52ad-4671-8b48-8cc19cf98b4c\") " pod="openstack/nova-cell1-conductor-0" Dec 02 23:04:34 crc kubenswrapper[4696]: I1202 23:04:34.029924 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxd48\" (UniqueName: \"kubernetes.io/projected/4c7be3d4-52ad-4671-8b48-8cc19cf98b4c-kube-api-access-kxd48\") pod \"nova-cell1-conductor-0\" (UID: \"4c7be3d4-52ad-4671-8b48-8cc19cf98b4c\") " pod="openstack/nova-cell1-conductor-0" Dec 02 23:04:34 crc kubenswrapper[4696]: I1202 23:04:34.139828 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxd48\" (UniqueName: 
\"kubernetes.io/projected/4c7be3d4-52ad-4671-8b48-8cc19cf98b4c-kube-api-access-kxd48\") pod \"nova-cell1-conductor-0\" (UID: \"4c7be3d4-52ad-4671-8b48-8cc19cf98b4c\") " pod="openstack/nova-cell1-conductor-0" Dec 02 23:04:34 crc kubenswrapper[4696]: I1202 23:04:34.140004 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c7be3d4-52ad-4671-8b48-8cc19cf98b4c-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"4c7be3d4-52ad-4671-8b48-8cc19cf98b4c\") " pod="openstack/nova-cell1-conductor-0" Dec 02 23:04:34 crc kubenswrapper[4696]: I1202 23:04:34.140038 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c7be3d4-52ad-4671-8b48-8cc19cf98b4c-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"4c7be3d4-52ad-4671-8b48-8cc19cf98b4c\") " pod="openstack/nova-cell1-conductor-0" Dec 02 23:04:34 crc kubenswrapper[4696]: I1202 23:04:34.150448 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c7be3d4-52ad-4671-8b48-8cc19cf98b4c-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"4c7be3d4-52ad-4671-8b48-8cc19cf98b4c\") " pod="openstack/nova-cell1-conductor-0" Dec 02 23:04:34 crc kubenswrapper[4696]: I1202 23:04:34.150849 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c7be3d4-52ad-4671-8b48-8cc19cf98b4c-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"4c7be3d4-52ad-4671-8b48-8cc19cf98b4c\") " pod="openstack/nova-cell1-conductor-0" Dec 02 23:04:34 crc kubenswrapper[4696]: I1202 23:04:34.170568 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxd48\" (UniqueName: \"kubernetes.io/projected/4c7be3d4-52ad-4671-8b48-8cc19cf98b4c-kube-api-access-kxd48\") pod \"nova-cell1-conductor-0\" (UID: 
\"4c7be3d4-52ad-4671-8b48-8cc19cf98b4c\") " pod="openstack/nova-cell1-conductor-0" Dec 02 23:04:34 crc kubenswrapper[4696]: I1202 23:04:34.271017 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 02 23:04:34 crc kubenswrapper[4696]: I1202 23:04:34.344842 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hzthx\" (UniqueName: \"kubernetes.io/projected/456d1feb-dedd-4045-8440-d45ca71d3f46-kube-api-access-hzthx\") pod \"456d1feb-dedd-4045-8440-d45ca71d3f46\" (UID: \"456d1feb-dedd-4045-8440-d45ca71d3f46\") " Dec 02 23:04:34 crc kubenswrapper[4696]: I1202 23:04:34.344919 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/456d1feb-dedd-4045-8440-d45ca71d3f46-config-data\") pod \"456d1feb-dedd-4045-8440-d45ca71d3f46\" (UID: \"456d1feb-dedd-4045-8440-d45ca71d3f46\") " Dec 02 23:04:34 crc kubenswrapper[4696]: I1202 23:04:34.345151 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/456d1feb-dedd-4045-8440-d45ca71d3f46-combined-ca-bundle\") pod \"456d1feb-dedd-4045-8440-d45ca71d3f46\" (UID: \"456d1feb-dedd-4045-8440-d45ca71d3f46\") " Dec 02 23:04:34 crc kubenswrapper[4696]: I1202 23:04:34.351131 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/456d1feb-dedd-4045-8440-d45ca71d3f46-kube-api-access-hzthx" (OuterVolumeSpecName: "kube-api-access-hzthx") pod "456d1feb-dedd-4045-8440-d45ca71d3f46" (UID: "456d1feb-dedd-4045-8440-d45ca71d3f46"). InnerVolumeSpecName "kube-api-access-hzthx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:04:34 crc kubenswrapper[4696]: I1202 23:04:34.384148 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/456d1feb-dedd-4045-8440-d45ca71d3f46-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "456d1feb-dedd-4045-8440-d45ca71d3f46" (UID: "456d1feb-dedd-4045-8440-d45ca71d3f46"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:04:34 crc kubenswrapper[4696]: I1202 23:04:34.387038 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/456d1feb-dedd-4045-8440-d45ca71d3f46-config-data" (OuterVolumeSpecName: "config-data") pod "456d1feb-dedd-4045-8440-d45ca71d3f46" (UID: "456d1feb-dedd-4045-8440-d45ca71d3f46"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:04:34 crc kubenswrapper[4696]: I1202 23:04:34.427642 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 02 23:04:34 crc kubenswrapper[4696]: I1202 23:04:34.449145 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hzthx\" (UniqueName: \"kubernetes.io/projected/456d1feb-dedd-4045-8440-d45ca71d3f46-kube-api-access-hzthx\") on node \"crc\" DevicePath \"\"" Dec 02 23:04:34 crc kubenswrapper[4696]: I1202 23:04:34.449194 4696 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/456d1feb-dedd-4045-8440-d45ca71d3f46-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 23:04:34 crc kubenswrapper[4696]: I1202 23:04:34.449205 4696 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/456d1feb-dedd-4045-8440-d45ca71d3f46-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 23:04:34 crc kubenswrapper[4696]: I1202 23:04:34.815108 4696 generic.go:334] "Generic (PLEG): container finished" podID="456d1feb-dedd-4045-8440-d45ca71d3f46" containerID="7be7fcb711e4bfb6f71f88c2ff267f7c66c9e2d741a648379982ede50631d4af" exitCode=0 Dec 02 23:04:34 crc kubenswrapper[4696]: I1202 23:04:34.815311 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"456d1feb-dedd-4045-8440-d45ca71d3f46","Type":"ContainerDied","Data":"7be7fcb711e4bfb6f71f88c2ff267f7c66c9e2d741a648379982ede50631d4af"} Dec 02 23:04:34 crc kubenswrapper[4696]: I1202 23:04:34.815451 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 02 23:04:34 crc kubenswrapper[4696]: I1202 23:04:34.815618 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"456d1feb-dedd-4045-8440-d45ca71d3f46","Type":"ContainerDied","Data":"e5f1a7e9b454af6062cc30464eb242d649df3eac48a99040706476d1eed94d90"} Dec 02 23:04:34 crc kubenswrapper[4696]: I1202 23:04:34.815654 4696 scope.go:117] "RemoveContainer" containerID="7be7fcb711e4bfb6f71f88c2ff267f7c66c9e2d741a648379982ede50631d4af" Dec 02 23:04:34 crc kubenswrapper[4696]: I1202 23:04:34.818677 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8441536b-8b39-4650-8fa9-3573038ffa49","Type":"ContainerStarted","Data":"84a1b63e79991ac85f68c5e9773e8a54b0e4b733fec2233174160919474f0302"} Dec 02 23:04:34 crc kubenswrapper[4696]: I1202 23:04:34.818703 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8441536b-8b39-4650-8fa9-3573038ffa49","Type":"ContainerStarted","Data":"808853f58cebf2446af333cdab24858f4a06ed675db45a6905cfe420059d4033"} Dec 02 23:04:34 crc kubenswrapper[4696]: I1202 23:04:34.858577 4696 scope.go:117] "RemoveContainer" containerID="7be7fcb711e4bfb6f71f88c2ff267f7c66c9e2d741a648379982ede50631d4af" Dec 02 23:04:34 crc kubenswrapper[4696]: E1202 23:04:34.862327 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7be7fcb711e4bfb6f71f88c2ff267f7c66c9e2d741a648379982ede50631d4af\": container with ID starting with 7be7fcb711e4bfb6f71f88c2ff267f7c66c9e2d741a648379982ede50631d4af not found: ID does not exist" containerID="7be7fcb711e4bfb6f71f88c2ff267f7c66c9e2d741a648379982ede50631d4af" Dec 02 23:04:34 crc kubenswrapper[4696]: I1202 23:04:34.862387 4696 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"7be7fcb711e4bfb6f71f88c2ff267f7c66c9e2d741a648379982ede50631d4af"} err="failed to get container status \"7be7fcb711e4bfb6f71f88c2ff267f7c66c9e2d741a648379982ede50631d4af\": rpc error: code = NotFound desc = could not find container \"7be7fcb711e4bfb6f71f88c2ff267f7c66c9e2d741a648379982ede50631d4af\": container with ID starting with 7be7fcb711e4bfb6f71f88c2ff267f7c66c9e2d741a648379982ede50631d4af not found: ID does not exist" Dec 02 23:04:34 crc kubenswrapper[4696]: I1202 23:04:34.872212 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.8721856519999998 podStartE2EDuration="2.872185652s" podCreationTimestamp="2025-12-02 23:04:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:04:34.861475108 +0000 UTC m=+1337.742155099" watchObservedRunningTime="2025-12-02 23:04:34.872185652 +0000 UTC m=+1337.752865653" Dec 02 23:04:34 crc kubenswrapper[4696]: I1202 23:04:34.920087 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 23:04:34 crc kubenswrapper[4696]: I1202 23:04:34.938788 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 23:04:34 crc kubenswrapper[4696]: I1202 23:04:34.974064 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 02 23:04:35 crc kubenswrapper[4696]: I1202 23:04:35.016548 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 23:04:35 crc kubenswrapper[4696]: E1202 23:04:35.020507 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="456d1feb-dedd-4045-8440-d45ca71d3f46" containerName="nova-scheduler-scheduler" Dec 02 23:04:35 crc kubenswrapper[4696]: I1202 23:04:35.020537 4696 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="456d1feb-dedd-4045-8440-d45ca71d3f46" containerName="nova-scheduler-scheduler" Dec 02 23:04:35 crc kubenswrapper[4696]: I1202 23:04:35.030377 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="456d1feb-dedd-4045-8440-d45ca71d3f46" containerName="nova-scheduler-scheduler" Dec 02 23:04:35 crc kubenswrapper[4696]: I1202 23:04:35.031356 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 02 23:04:35 crc kubenswrapper[4696]: I1202 23:04:35.039196 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 02 23:04:35 crc kubenswrapper[4696]: I1202 23:04:35.051985 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 23:04:35 crc kubenswrapper[4696]: I1202 23:04:35.066497 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1e7bee2-7394-4980-9f24-e50448cda21a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c1e7bee2-7394-4980-9f24-e50448cda21a\") " pod="openstack/nova-scheduler-0" Dec 02 23:04:35 crc kubenswrapper[4696]: I1202 23:04:35.066567 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qq2ph\" (UniqueName: \"kubernetes.io/projected/c1e7bee2-7394-4980-9f24-e50448cda21a-kube-api-access-qq2ph\") pod \"nova-scheduler-0\" (UID: \"c1e7bee2-7394-4980-9f24-e50448cda21a\") " pod="openstack/nova-scheduler-0" Dec 02 23:04:35 crc kubenswrapper[4696]: I1202 23:04:35.066596 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1e7bee2-7394-4980-9f24-e50448cda21a-config-data\") pod \"nova-scheduler-0\" (UID: \"c1e7bee2-7394-4980-9f24-e50448cda21a\") " pod="openstack/nova-scheduler-0" Dec 02 23:04:35 crc kubenswrapper[4696]: 
I1202 23:04:35.168525 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1e7bee2-7394-4980-9f24-e50448cda21a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c1e7bee2-7394-4980-9f24-e50448cda21a\") " pod="openstack/nova-scheduler-0" Dec 02 23:04:35 crc kubenswrapper[4696]: I1202 23:04:35.169019 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qq2ph\" (UniqueName: \"kubernetes.io/projected/c1e7bee2-7394-4980-9f24-e50448cda21a-kube-api-access-qq2ph\") pod \"nova-scheduler-0\" (UID: \"c1e7bee2-7394-4980-9f24-e50448cda21a\") " pod="openstack/nova-scheduler-0" Dec 02 23:04:35 crc kubenswrapper[4696]: I1202 23:04:35.169049 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1e7bee2-7394-4980-9f24-e50448cda21a-config-data\") pod \"nova-scheduler-0\" (UID: \"c1e7bee2-7394-4980-9f24-e50448cda21a\") " pod="openstack/nova-scheduler-0" Dec 02 23:04:35 crc kubenswrapper[4696]: I1202 23:04:35.174921 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1e7bee2-7394-4980-9f24-e50448cda21a-config-data\") pod \"nova-scheduler-0\" (UID: \"c1e7bee2-7394-4980-9f24-e50448cda21a\") " pod="openstack/nova-scheduler-0" Dec 02 23:04:35 crc kubenswrapper[4696]: I1202 23:04:35.182176 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1e7bee2-7394-4980-9f24-e50448cda21a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c1e7bee2-7394-4980-9f24-e50448cda21a\") " pod="openstack/nova-scheduler-0" Dec 02 23:04:35 crc kubenswrapper[4696]: I1202 23:04:35.188401 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qq2ph\" (UniqueName: 
\"kubernetes.io/projected/c1e7bee2-7394-4980-9f24-e50448cda21a-kube-api-access-qq2ph\") pod \"nova-scheduler-0\" (UID: \"c1e7bee2-7394-4980-9f24-e50448cda21a\") " pod="openstack/nova-scheduler-0" Dec 02 23:04:35 crc kubenswrapper[4696]: I1202 23:04:35.407864 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 02 23:04:35 crc kubenswrapper[4696]: I1202 23:04:35.443324 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="456d1feb-dedd-4045-8440-d45ca71d3f46" path="/var/lib/kubelet/pods/456d1feb-dedd-4045-8440-d45ca71d3f46/volumes" Dec 02 23:04:35 crc kubenswrapper[4696]: I1202 23:04:35.839482 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"4c7be3d4-52ad-4671-8b48-8cc19cf98b4c","Type":"ContainerStarted","Data":"6e63ac97979ea13620397bddc16f1d74bf617c287826494863fc651e644d477d"} Dec 02 23:04:35 crc kubenswrapper[4696]: I1202 23:04:35.839923 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"4c7be3d4-52ad-4671-8b48-8cc19cf98b4c","Type":"ContainerStarted","Data":"b444f6c5904065b7249c1a43e9d60be7b84bcf7ac7c36f90696340b8bfe48209"} Dec 02 23:04:35 crc kubenswrapper[4696]: I1202 23:04:35.839972 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Dec 02 23:04:35 crc kubenswrapper[4696]: I1202 23:04:35.868591 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.868567077 podStartE2EDuration="2.868567077s" podCreationTimestamp="2025-12-02 23:04:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:04:35.856306569 +0000 UTC m=+1338.736986570" watchObservedRunningTime="2025-12-02 23:04:35.868567077 +0000 UTC m=+1338.749247078" Dec 02 23:04:35 crc 
kubenswrapper[4696]: I1202 23:04:35.890896 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 23:04:35 crc kubenswrapper[4696]: W1202 23:04:35.892204 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc1e7bee2_7394_4980_9f24_e50448cda21a.slice/crio-fd1bf7d79edfa1ceafc14004e346c5b9c7558129ee3050aaa28f92554097e0f0 WatchSource:0}: Error finding container fd1bf7d79edfa1ceafc14004e346c5b9c7558129ee3050aaa28f92554097e0f0: Status 404 returned error can't find the container with id fd1bf7d79edfa1ceafc14004e346c5b9c7558129ee3050aaa28f92554097e0f0 Dec 02 23:04:36 crc kubenswrapper[4696]: I1202 23:04:36.854550 4696 generic.go:334] "Generic (PLEG): container finished" podID="d013870f-cb7b-4854-82d2-80c8e35701e9" containerID="275f8d8fdd44d31fb543ad73f136b6c17d5a47b1f19bbc8e7ad891b3ebbea091" exitCode=0 Dec 02 23:04:36 crc kubenswrapper[4696]: I1202 23:04:36.854602 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d013870f-cb7b-4854-82d2-80c8e35701e9","Type":"ContainerDied","Data":"275f8d8fdd44d31fb543ad73f136b6c17d5a47b1f19bbc8e7ad891b3ebbea091"} Dec 02 23:04:36 crc kubenswrapper[4696]: I1202 23:04:36.858010 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c1e7bee2-7394-4980-9f24-e50448cda21a","Type":"ContainerStarted","Data":"33c333a603b93067c6fa6930dee826ee4c55e1cfae229e32f68d4d4eb427b0d2"} Dec 02 23:04:36 crc kubenswrapper[4696]: I1202 23:04:36.858576 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c1e7bee2-7394-4980-9f24-e50448cda21a","Type":"ContainerStarted","Data":"fd1bf7d79edfa1ceafc14004e346c5b9c7558129ee3050aaa28f92554097e0f0"} Dec 02 23:04:36 crc kubenswrapper[4696]: I1202 23:04:36.895814 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" 
podStartSLOduration=2.895724197 podStartE2EDuration="2.895724197s" podCreationTimestamp="2025-12-02 23:04:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:04:36.882175062 +0000 UTC m=+1339.762855073" watchObservedRunningTime="2025-12-02 23:04:36.895724197 +0000 UTC m=+1339.776404208" Dec 02 23:04:37 crc kubenswrapper[4696]: I1202 23:04:37.002363 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 02 23:04:37 crc kubenswrapper[4696]: I1202 23:04:37.016438 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d013870f-cb7b-4854-82d2-80c8e35701e9-config-data\") pod \"d013870f-cb7b-4854-82d2-80c8e35701e9\" (UID: \"d013870f-cb7b-4854-82d2-80c8e35701e9\") " Dec 02 23:04:37 crc kubenswrapper[4696]: I1202 23:04:37.016607 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xkjtb\" (UniqueName: \"kubernetes.io/projected/d013870f-cb7b-4854-82d2-80c8e35701e9-kube-api-access-xkjtb\") pod \"d013870f-cb7b-4854-82d2-80c8e35701e9\" (UID: \"d013870f-cb7b-4854-82d2-80c8e35701e9\") " Dec 02 23:04:37 crc kubenswrapper[4696]: I1202 23:04:37.016671 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d013870f-cb7b-4854-82d2-80c8e35701e9-combined-ca-bundle\") pod \"d013870f-cb7b-4854-82d2-80c8e35701e9\" (UID: \"d013870f-cb7b-4854-82d2-80c8e35701e9\") " Dec 02 23:04:37 crc kubenswrapper[4696]: I1202 23:04:37.016841 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d013870f-cb7b-4854-82d2-80c8e35701e9-logs\") pod \"d013870f-cb7b-4854-82d2-80c8e35701e9\" (UID: \"d013870f-cb7b-4854-82d2-80c8e35701e9\") " Dec 02 23:04:37 crc 
kubenswrapper[4696]: I1202 23:04:37.017482 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d013870f-cb7b-4854-82d2-80c8e35701e9-logs" (OuterVolumeSpecName: "logs") pod "d013870f-cb7b-4854-82d2-80c8e35701e9" (UID: "d013870f-cb7b-4854-82d2-80c8e35701e9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:04:37 crc kubenswrapper[4696]: I1202 23:04:37.017891 4696 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d013870f-cb7b-4854-82d2-80c8e35701e9-logs\") on node \"crc\" DevicePath \"\"" Dec 02 23:04:37 crc kubenswrapper[4696]: I1202 23:04:37.037995 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d013870f-cb7b-4854-82d2-80c8e35701e9-kube-api-access-xkjtb" (OuterVolumeSpecName: "kube-api-access-xkjtb") pod "d013870f-cb7b-4854-82d2-80c8e35701e9" (UID: "d013870f-cb7b-4854-82d2-80c8e35701e9"). InnerVolumeSpecName "kube-api-access-xkjtb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:04:37 crc kubenswrapper[4696]: I1202 23:04:37.073724 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d013870f-cb7b-4854-82d2-80c8e35701e9-config-data" (OuterVolumeSpecName: "config-data") pod "d013870f-cb7b-4854-82d2-80c8e35701e9" (UID: "d013870f-cb7b-4854-82d2-80c8e35701e9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:04:37 crc kubenswrapper[4696]: I1202 23:04:37.074596 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d013870f-cb7b-4854-82d2-80c8e35701e9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d013870f-cb7b-4854-82d2-80c8e35701e9" (UID: "d013870f-cb7b-4854-82d2-80c8e35701e9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:04:37 crc kubenswrapper[4696]: I1202 23:04:37.120800 4696 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d013870f-cb7b-4854-82d2-80c8e35701e9-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 23:04:37 crc kubenswrapper[4696]: I1202 23:04:37.120833 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xkjtb\" (UniqueName: \"kubernetes.io/projected/d013870f-cb7b-4854-82d2-80c8e35701e9-kube-api-access-xkjtb\") on node \"crc\" DevicePath \"\"" Dec 02 23:04:37 crc kubenswrapper[4696]: I1202 23:04:37.120845 4696 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d013870f-cb7b-4854-82d2-80c8e35701e9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 23:04:37 crc kubenswrapper[4696]: I1202 23:04:37.872169 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 02 23:04:37 crc kubenswrapper[4696]: I1202 23:04:37.872239 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d013870f-cb7b-4854-82d2-80c8e35701e9","Type":"ContainerDied","Data":"13e2704974db79ec4a77e6b618cab1100d4039ff5fd306143e020c58a547042d"} Dec 02 23:04:37 crc kubenswrapper[4696]: I1202 23:04:37.872282 4696 scope.go:117] "RemoveContainer" containerID="275f8d8fdd44d31fb543ad73f136b6c17d5a47b1f19bbc8e7ad891b3ebbea091" Dec 02 23:04:37 crc kubenswrapper[4696]: I1202 23:04:37.912214 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 02 23:04:37 crc kubenswrapper[4696]: I1202 23:04:37.912893 4696 scope.go:117] "RemoveContainer" containerID="a8d0db8098805d176af7979d70cace4249c42ae4cb311d63688762491e73e5c9" Dec 02 23:04:37 crc kubenswrapper[4696]: I1202 23:04:37.932370 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 02 23:04:37 
crc kubenswrapper[4696]: I1202 23:04:37.950019 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 02 23:04:37 crc kubenswrapper[4696]: E1202 23:04:37.950552 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d013870f-cb7b-4854-82d2-80c8e35701e9" containerName="nova-api-api" Dec 02 23:04:37 crc kubenswrapper[4696]: I1202 23:04:37.950573 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="d013870f-cb7b-4854-82d2-80c8e35701e9" containerName="nova-api-api" Dec 02 23:04:37 crc kubenswrapper[4696]: E1202 23:04:37.950617 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d013870f-cb7b-4854-82d2-80c8e35701e9" containerName="nova-api-log" Dec 02 23:04:37 crc kubenswrapper[4696]: I1202 23:04:37.950624 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="d013870f-cb7b-4854-82d2-80c8e35701e9" containerName="nova-api-log" Dec 02 23:04:37 crc kubenswrapper[4696]: I1202 23:04:37.951717 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="d013870f-cb7b-4854-82d2-80c8e35701e9" containerName="nova-api-log" Dec 02 23:04:37 crc kubenswrapper[4696]: I1202 23:04:37.951765 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="d013870f-cb7b-4854-82d2-80c8e35701e9" containerName="nova-api-api" Dec 02 23:04:37 crc kubenswrapper[4696]: I1202 23:04:37.953192 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 02 23:04:37 crc kubenswrapper[4696]: I1202 23:04:37.957512 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 02 23:04:37 crc kubenswrapper[4696]: I1202 23:04:37.973490 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 02 23:04:38 crc kubenswrapper[4696]: I1202 23:04:38.050098 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ds2fc\" (UniqueName: \"kubernetes.io/projected/f5c4775b-133c-454d-bdbc-8435912faecb-kube-api-access-ds2fc\") pod \"nova-api-0\" (UID: \"f5c4775b-133c-454d-bdbc-8435912faecb\") " pod="openstack/nova-api-0" Dec 02 23:04:38 crc kubenswrapper[4696]: I1202 23:04:38.050163 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f5c4775b-133c-454d-bdbc-8435912faecb-logs\") pod \"nova-api-0\" (UID: \"f5c4775b-133c-454d-bdbc-8435912faecb\") " pod="openstack/nova-api-0" Dec 02 23:04:38 crc kubenswrapper[4696]: I1202 23:04:38.050400 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5c4775b-133c-454d-bdbc-8435912faecb-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f5c4775b-133c-454d-bdbc-8435912faecb\") " pod="openstack/nova-api-0" Dec 02 23:04:38 crc kubenswrapper[4696]: I1202 23:04:38.050992 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5c4775b-133c-454d-bdbc-8435912faecb-config-data\") pod \"nova-api-0\" (UID: \"f5c4775b-133c-454d-bdbc-8435912faecb\") " pod="openstack/nova-api-0" Dec 02 23:04:38 crc kubenswrapper[4696]: I1202 23:04:38.153686 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/f5c4775b-133c-454d-bdbc-8435912faecb-logs\") pod \"nova-api-0\" (UID: \"f5c4775b-133c-454d-bdbc-8435912faecb\") " pod="openstack/nova-api-0" Dec 02 23:04:38 crc kubenswrapper[4696]: I1202 23:04:38.153773 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5c4775b-133c-454d-bdbc-8435912faecb-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f5c4775b-133c-454d-bdbc-8435912faecb\") " pod="openstack/nova-api-0" Dec 02 23:04:38 crc kubenswrapper[4696]: I1202 23:04:38.153904 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5c4775b-133c-454d-bdbc-8435912faecb-config-data\") pod \"nova-api-0\" (UID: \"f5c4775b-133c-454d-bdbc-8435912faecb\") " pod="openstack/nova-api-0" Dec 02 23:04:38 crc kubenswrapper[4696]: I1202 23:04:38.153957 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ds2fc\" (UniqueName: \"kubernetes.io/projected/f5c4775b-133c-454d-bdbc-8435912faecb-kube-api-access-ds2fc\") pod \"nova-api-0\" (UID: \"f5c4775b-133c-454d-bdbc-8435912faecb\") " pod="openstack/nova-api-0" Dec 02 23:04:38 crc kubenswrapper[4696]: I1202 23:04:38.154716 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f5c4775b-133c-454d-bdbc-8435912faecb-logs\") pod \"nova-api-0\" (UID: \"f5c4775b-133c-454d-bdbc-8435912faecb\") " pod="openstack/nova-api-0" Dec 02 23:04:38 crc kubenswrapper[4696]: I1202 23:04:38.161427 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5c4775b-133c-454d-bdbc-8435912faecb-config-data\") pod \"nova-api-0\" (UID: \"f5c4775b-133c-454d-bdbc-8435912faecb\") " pod="openstack/nova-api-0" Dec 02 23:04:38 crc kubenswrapper[4696]: I1202 23:04:38.161914 4696 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5c4775b-133c-454d-bdbc-8435912faecb-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f5c4775b-133c-454d-bdbc-8435912faecb\") " pod="openstack/nova-api-0" Dec 02 23:04:38 crc kubenswrapper[4696]: I1202 23:04:38.176458 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ds2fc\" (UniqueName: \"kubernetes.io/projected/f5c4775b-133c-454d-bdbc-8435912faecb-kube-api-access-ds2fc\") pod \"nova-api-0\" (UID: \"f5c4775b-133c-454d-bdbc-8435912faecb\") " pod="openstack/nova-api-0" Dec 02 23:04:38 crc kubenswrapper[4696]: I1202 23:04:38.251283 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 02 23:04:38 crc kubenswrapper[4696]: I1202 23:04:38.252602 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 02 23:04:38 crc kubenswrapper[4696]: I1202 23:04:38.286555 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 02 23:04:38 crc kubenswrapper[4696]: I1202 23:04:38.828856 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 02 23:04:38 crc kubenswrapper[4696]: W1202 23:04:38.836148 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5c4775b_133c_454d_bdbc_8435912faecb.slice/crio-e73b143af00487ec7af0ee04d777345c029f8d4791100549d88506f296a2d09f WatchSource:0}: Error finding container e73b143af00487ec7af0ee04d777345c029f8d4791100549d88506f296a2d09f: Status 404 returned error can't find the container with id e73b143af00487ec7af0ee04d777345c029f8d4791100549d88506f296a2d09f Dec 02 23:04:38 crc kubenswrapper[4696]: I1202 23:04:38.887932 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f5c4775b-133c-454d-bdbc-8435912faecb","Type":"ContainerStarted","Data":"e73b143af00487ec7af0ee04d777345c029f8d4791100549d88506f296a2d09f"} Dec 02 23:04:39 crc kubenswrapper[4696]: I1202 23:04:39.446182 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d013870f-cb7b-4854-82d2-80c8e35701e9" path="/var/lib/kubelet/pods/d013870f-cb7b-4854-82d2-80c8e35701e9/volumes" Dec 02 23:04:39 crc kubenswrapper[4696]: I1202 23:04:39.905342 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f5c4775b-133c-454d-bdbc-8435912faecb","Type":"ContainerStarted","Data":"e65f16266e459ea607a47b4ff2fa7b3bfed6f13c98434a44e7a76f01aeeef3b0"} Dec 02 23:04:39 crc kubenswrapper[4696]: I1202 23:04:39.905942 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f5c4775b-133c-454d-bdbc-8435912faecb","Type":"ContainerStarted","Data":"e57961f76d6bc86ac812a7c6c000e62d0588e2ebbc220ed87eb4945406ffb0b4"} Dec 02 23:04:39 crc kubenswrapper[4696]: I1202 23:04:39.936566 4696 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.936539232 podStartE2EDuration="2.936539232s" podCreationTimestamp="2025-12-02 23:04:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:04:39.933969679 +0000 UTC m=+1342.814649690" watchObservedRunningTime="2025-12-02 23:04:39.936539232 +0000 UTC m=+1342.817219243" Dec 02 23:04:40 crc kubenswrapper[4696]: I1202 23:04:40.408952 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 02 23:04:43 crc kubenswrapper[4696]: I1202 23:04:43.251406 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 02 23:04:43 crc kubenswrapper[4696]: I1202 23:04:43.252385 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 02 23:04:44 crc kubenswrapper[4696]: I1202 23:04:44.269969 4696 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="8441536b-8b39-4650-8fa9-3573038ffa49" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.214:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 02 23:04:44 crc kubenswrapper[4696]: I1202 23:04:44.270357 4696 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="8441536b-8b39-4650-8fa9-3573038ffa49" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.214:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 02 23:04:44 crc kubenswrapper[4696]: I1202 23:04:44.482367 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Dec 02 23:04:45 crc kubenswrapper[4696]: I1202 23:04:45.338101 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack/ceilometer-0" Dec 02 23:04:45 crc kubenswrapper[4696]: I1202 23:04:45.408380 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 02 23:04:45 crc kubenswrapper[4696]: I1202 23:04:45.450134 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 02 23:04:46 crc kubenswrapper[4696]: I1202 23:04:46.020169 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 02 23:04:48 crc kubenswrapper[4696]: I1202 23:04:48.287381 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 02 23:04:48 crc kubenswrapper[4696]: I1202 23:04:48.289483 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 02 23:04:49 crc kubenswrapper[4696]: I1202 23:04:49.371048 4696 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f5c4775b-133c-454d-bdbc-8435912faecb" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.217:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 02 23:04:49 crc kubenswrapper[4696]: I1202 23:04:49.371040 4696 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f5c4775b-133c-454d-bdbc-8435912faecb" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.217:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 02 23:04:49 crc kubenswrapper[4696]: I1202 23:04:49.630893 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 02 23:04:49 crc kubenswrapper[4696]: I1202 23:04:49.631514 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="89fbc0cf-3e41-4b34-bdd3-b415552fd1e6" 
containerName="kube-state-metrics" containerID="cri-o://9ea4fe4d79119b702b85dc57f5455fe3da7151c76af7e78aec68d6f8390427f9" gracePeriod=30 Dec 02 23:04:50 crc kubenswrapper[4696]: I1202 23:04:50.032123 4696 generic.go:334] "Generic (PLEG): container finished" podID="89fbc0cf-3e41-4b34-bdd3-b415552fd1e6" containerID="9ea4fe4d79119b702b85dc57f5455fe3da7151c76af7e78aec68d6f8390427f9" exitCode=2 Dec 02 23:04:50 crc kubenswrapper[4696]: I1202 23:04:50.032177 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"89fbc0cf-3e41-4b34-bdd3-b415552fd1e6","Type":"ContainerDied","Data":"9ea4fe4d79119b702b85dc57f5455fe3da7151c76af7e78aec68d6f8390427f9"} Dec 02 23:04:50 crc kubenswrapper[4696]: I1202 23:04:50.139670 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 02 23:04:50 crc kubenswrapper[4696]: I1202 23:04:50.234006 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mhkp7\" (UniqueName: \"kubernetes.io/projected/89fbc0cf-3e41-4b34-bdd3-b415552fd1e6-kube-api-access-mhkp7\") pod \"89fbc0cf-3e41-4b34-bdd3-b415552fd1e6\" (UID: \"89fbc0cf-3e41-4b34-bdd3-b415552fd1e6\") " Dec 02 23:04:50 crc kubenswrapper[4696]: I1202 23:04:50.244096 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89fbc0cf-3e41-4b34-bdd3-b415552fd1e6-kube-api-access-mhkp7" (OuterVolumeSpecName: "kube-api-access-mhkp7") pod "89fbc0cf-3e41-4b34-bdd3-b415552fd1e6" (UID: "89fbc0cf-3e41-4b34-bdd3-b415552fd1e6"). InnerVolumeSpecName "kube-api-access-mhkp7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:04:50 crc kubenswrapper[4696]: I1202 23:04:50.336927 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mhkp7\" (UniqueName: \"kubernetes.io/projected/89fbc0cf-3e41-4b34-bdd3-b415552fd1e6-kube-api-access-mhkp7\") on node \"crc\" DevicePath \"\"" Dec 02 23:04:51 crc kubenswrapper[4696]: I1202 23:04:51.049164 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"89fbc0cf-3e41-4b34-bdd3-b415552fd1e6","Type":"ContainerDied","Data":"0cf3bff85eb2b138daafe92d1c9d1ea9b0bbde6a676933eb5ccf7f912c231d37"} Dec 02 23:04:51 crc kubenswrapper[4696]: I1202 23:04:51.049867 4696 scope.go:117] "RemoveContainer" containerID="9ea4fe4d79119b702b85dc57f5455fe3da7151c76af7e78aec68d6f8390427f9" Dec 02 23:04:51 crc kubenswrapper[4696]: I1202 23:04:51.049282 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 02 23:04:51 crc kubenswrapper[4696]: I1202 23:04:51.117200 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 02 23:04:51 crc kubenswrapper[4696]: I1202 23:04:51.133820 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 02 23:04:51 crc kubenswrapper[4696]: I1202 23:04:51.147342 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 02 23:04:51 crc kubenswrapper[4696]: E1202 23:04:51.148029 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89fbc0cf-3e41-4b34-bdd3-b415552fd1e6" containerName="kube-state-metrics" Dec 02 23:04:51 crc kubenswrapper[4696]: I1202 23:04:51.148052 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="89fbc0cf-3e41-4b34-bdd3-b415552fd1e6" containerName="kube-state-metrics" Dec 02 23:04:51 crc kubenswrapper[4696]: I1202 23:04:51.148277 4696 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="89fbc0cf-3e41-4b34-bdd3-b415552fd1e6" containerName="kube-state-metrics" Dec 02 23:04:51 crc kubenswrapper[4696]: I1202 23:04:51.150200 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 02 23:04:51 crc kubenswrapper[4696]: I1202 23:04:51.152076 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Dec 02 23:04:51 crc kubenswrapper[4696]: I1202 23:04:51.152394 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Dec 02 23:04:51 crc kubenswrapper[4696]: I1202 23:04:51.152703 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c200fd15-55ea-4c23-a8d4-22c362deedee-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"c200fd15-55ea-4c23-a8d4-22c362deedee\") " pod="openstack/kube-state-metrics-0" Dec 02 23:04:51 crc kubenswrapper[4696]: I1202 23:04:51.152806 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sc6pl\" (UniqueName: \"kubernetes.io/projected/c200fd15-55ea-4c23-a8d4-22c362deedee-kube-api-access-sc6pl\") pod \"kube-state-metrics-0\" (UID: \"c200fd15-55ea-4c23-a8d4-22c362deedee\") " pod="openstack/kube-state-metrics-0" Dec 02 23:04:51 crc kubenswrapper[4696]: I1202 23:04:51.152853 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/c200fd15-55ea-4c23-a8d4-22c362deedee-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"c200fd15-55ea-4c23-a8d4-22c362deedee\") " pod="openstack/kube-state-metrics-0" Dec 02 23:04:51 crc kubenswrapper[4696]: I1202 23:04:51.152875 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/c200fd15-55ea-4c23-a8d4-22c362deedee-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"c200fd15-55ea-4c23-a8d4-22c362deedee\") " pod="openstack/kube-state-metrics-0" Dec 02 23:04:51 crc kubenswrapper[4696]: I1202 23:04:51.161512 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 02 23:04:51 crc kubenswrapper[4696]: I1202 23:04:51.255310 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sc6pl\" (UniqueName: \"kubernetes.io/projected/c200fd15-55ea-4c23-a8d4-22c362deedee-kube-api-access-sc6pl\") pod \"kube-state-metrics-0\" (UID: \"c200fd15-55ea-4c23-a8d4-22c362deedee\") " pod="openstack/kube-state-metrics-0" Dec 02 23:04:51 crc kubenswrapper[4696]: I1202 23:04:51.255384 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/c200fd15-55ea-4c23-a8d4-22c362deedee-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"c200fd15-55ea-4c23-a8d4-22c362deedee\") " pod="openstack/kube-state-metrics-0" Dec 02 23:04:51 crc kubenswrapper[4696]: I1202 23:04:51.255412 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/c200fd15-55ea-4c23-a8d4-22c362deedee-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"c200fd15-55ea-4c23-a8d4-22c362deedee\") " pod="openstack/kube-state-metrics-0" Dec 02 23:04:51 crc kubenswrapper[4696]: I1202 23:04:51.255509 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c200fd15-55ea-4c23-a8d4-22c362deedee-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"c200fd15-55ea-4c23-a8d4-22c362deedee\") " pod="openstack/kube-state-metrics-0" Dec 02 23:04:51 crc 
kubenswrapper[4696]: I1202 23:04:51.263000 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/c200fd15-55ea-4c23-a8d4-22c362deedee-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"c200fd15-55ea-4c23-a8d4-22c362deedee\") " pod="openstack/kube-state-metrics-0" Dec 02 23:04:51 crc kubenswrapper[4696]: I1202 23:04:51.263089 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c200fd15-55ea-4c23-a8d4-22c362deedee-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"c200fd15-55ea-4c23-a8d4-22c362deedee\") " pod="openstack/kube-state-metrics-0" Dec 02 23:04:51 crc kubenswrapper[4696]: I1202 23:04:51.263396 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/c200fd15-55ea-4c23-a8d4-22c362deedee-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"c200fd15-55ea-4c23-a8d4-22c362deedee\") " pod="openstack/kube-state-metrics-0" Dec 02 23:04:51 crc kubenswrapper[4696]: I1202 23:04:51.275969 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sc6pl\" (UniqueName: \"kubernetes.io/projected/c200fd15-55ea-4c23-a8d4-22c362deedee-kube-api-access-sc6pl\") pod \"kube-state-metrics-0\" (UID: \"c200fd15-55ea-4c23-a8d4-22c362deedee\") " pod="openstack/kube-state-metrics-0" Dec 02 23:04:51 crc kubenswrapper[4696]: I1202 23:04:51.443813 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89fbc0cf-3e41-4b34-bdd3-b415552fd1e6" path="/var/lib/kubelet/pods/89fbc0cf-3e41-4b34-bdd3-b415552fd1e6/volumes" Dec 02 23:04:51 crc kubenswrapper[4696]: I1202 23:04:51.468954 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 02 23:04:51 crc kubenswrapper[4696]: I1202 23:04:51.899303 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 23:04:51 crc kubenswrapper[4696]: I1202 23:04:51.900269 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="30aba68b-e85d-42d1-819a-2b8cb0886659" containerName="ceilometer-central-agent" containerID="cri-o://ef4edf254ee03442f99cbf1467dfd884e0f982edb02e4346a92264c5c65dd8c9" gracePeriod=30 Dec 02 23:04:51 crc kubenswrapper[4696]: I1202 23:04:51.900817 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="30aba68b-e85d-42d1-819a-2b8cb0886659" containerName="proxy-httpd" containerID="cri-o://74180b5dcde2ced35d927219adfa4797a5e58878a6cf9289c0951faee983021e" gracePeriod=30 Dec 02 23:04:51 crc kubenswrapper[4696]: I1202 23:04:51.900919 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="30aba68b-e85d-42d1-819a-2b8cb0886659" containerName="sg-core" containerID="cri-o://f302876a7ba26ca764a9dfa0d77841be81b62953da5f57027c5846495af15650" gracePeriod=30 Dec 02 23:04:51 crc kubenswrapper[4696]: I1202 23:04:51.901015 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="30aba68b-e85d-42d1-819a-2b8cb0886659" containerName="ceilometer-notification-agent" containerID="cri-o://e7b7779ebf9f81366c8b85d425c4b48d4b5c7513fbedfd3340efc3063e33f002" gracePeriod=30 Dec 02 23:04:52 crc kubenswrapper[4696]: I1202 23:04:52.088428 4696 generic.go:334] "Generic (PLEG): container finished" podID="30aba68b-e85d-42d1-819a-2b8cb0886659" containerID="f302876a7ba26ca764a9dfa0d77841be81b62953da5f57027c5846495af15650" exitCode=2 Dec 02 23:04:52 crc kubenswrapper[4696]: I1202 23:04:52.088654 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"30aba68b-e85d-42d1-819a-2b8cb0886659","Type":"ContainerDied","Data":"f302876a7ba26ca764a9dfa0d77841be81b62953da5f57027c5846495af15650"} Dec 02 23:04:52 crc kubenswrapper[4696]: I1202 23:04:52.107038 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 02 23:04:53 crc kubenswrapper[4696]: I1202 23:04:53.108442 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"c200fd15-55ea-4c23-a8d4-22c362deedee","Type":"ContainerStarted","Data":"d2bf59d71d57a047e169f83456e5be9b2e6f9b2c668a190768730db4615a0bfd"} Dec 02 23:04:53 crc kubenswrapper[4696]: I1202 23:04:53.108504 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"c200fd15-55ea-4c23-a8d4-22c362deedee","Type":"ContainerStarted","Data":"b07f07f47207f668f949139c7d447db16965bc590b96bb424bbf6a97b4130c32"} Dec 02 23:04:53 crc kubenswrapper[4696]: I1202 23:04:53.108583 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 02 23:04:53 crc kubenswrapper[4696]: I1202 23:04:53.112552 4696 generic.go:334] "Generic (PLEG): container finished" podID="30aba68b-e85d-42d1-819a-2b8cb0886659" containerID="74180b5dcde2ced35d927219adfa4797a5e58878a6cf9289c0951faee983021e" exitCode=0 Dec 02 23:04:53 crc kubenswrapper[4696]: I1202 23:04:53.112597 4696 generic.go:334] "Generic (PLEG): container finished" podID="30aba68b-e85d-42d1-819a-2b8cb0886659" containerID="ef4edf254ee03442f99cbf1467dfd884e0f982edb02e4346a92264c5c65dd8c9" exitCode=0 Dec 02 23:04:53 crc kubenswrapper[4696]: I1202 23:04:53.112618 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"30aba68b-e85d-42d1-819a-2b8cb0886659","Type":"ContainerDied","Data":"74180b5dcde2ced35d927219adfa4797a5e58878a6cf9289c0951faee983021e"} Dec 02 23:04:53 crc kubenswrapper[4696]: I1202 23:04:53.112656 4696 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"30aba68b-e85d-42d1-819a-2b8cb0886659","Type":"ContainerDied","Data":"ef4edf254ee03442f99cbf1467dfd884e0f982edb02e4346a92264c5c65dd8c9"} Dec 02 23:04:53 crc kubenswrapper[4696]: I1202 23:04:53.179175 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.7759428640000001 podStartE2EDuration="2.179144382s" podCreationTimestamp="2025-12-02 23:04:51 +0000 UTC" firstStartedPulling="2025-12-02 23:04:52.12476428 +0000 UTC m=+1355.005444281" lastFinishedPulling="2025-12-02 23:04:52.527965788 +0000 UTC m=+1355.408645799" observedRunningTime="2025-12-02 23:04:53.170190088 +0000 UTC m=+1356.050870089" watchObservedRunningTime="2025-12-02 23:04:53.179144382 +0000 UTC m=+1356.059824383" Dec 02 23:04:53 crc kubenswrapper[4696]: I1202 23:04:53.258936 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 02 23:04:53 crc kubenswrapper[4696]: I1202 23:04:53.264050 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 02 23:04:53 crc kubenswrapper[4696]: I1202 23:04:53.265424 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 02 23:04:54 crc kubenswrapper[4696]: I1202 23:04:54.136536 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 02 23:04:57 crc kubenswrapper[4696]: I1202 23:04:57.190892 4696 generic.go:334] "Generic (PLEG): container finished" podID="c7cf934c-9238-4692-90db-19e7faaf7bfd" containerID="a430826aabb5ed2b42223d8e21ce2e5431e66f139d24f0bf3f66024819591f93" exitCode=137 Dec 02 23:04:57 crc kubenswrapper[4696]: I1202 23:04:57.191754 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" 
event={"ID":"c7cf934c-9238-4692-90db-19e7faaf7bfd","Type":"ContainerDied","Data":"a430826aabb5ed2b42223d8e21ce2e5431e66f139d24f0bf3f66024819591f93"} Dec 02 23:04:57 crc kubenswrapper[4696]: I1202 23:04:57.308399 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 02 23:04:57 crc kubenswrapper[4696]: I1202 23:04:57.363855 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7cf934c-9238-4692-90db-19e7faaf7bfd-combined-ca-bundle\") pod \"c7cf934c-9238-4692-90db-19e7faaf7bfd\" (UID: \"c7cf934c-9238-4692-90db-19e7faaf7bfd\") " Dec 02 23:04:57 crc kubenswrapper[4696]: I1202 23:04:57.364201 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4m5k9\" (UniqueName: \"kubernetes.io/projected/c7cf934c-9238-4692-90db-19e7faaf7bfd-kube-api-access-4m5k9\") pod \"c7cf934c-9238-4692-90db-19e7faaf7bfd\" (UID: \"c7cf934c-9238-4692-90db-19e7faaf7bfd\") " Dec 02 23:04:57 crc kubenswrapper[4696]: I1202 23:04:57.364257 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7cf934c-9238-4692-90db-19e7faaf7bfd-config-data\") pod \"c7cf934c-9238-4692-90db-19e7faaf7bfd\" (UID: \"c7cf934c-9238-4692-90db-19e7faaf7bfd\") " Dec 02 23:04:57 crc kubenswrapper[4696]: I1202 23:04:57.371799 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7cf934c-9238-4692-90db-19e7faaf7bfd-kube-api-access-4m5k9" (OuterVolumeSpecName: "kube-api-access-4m5k9") pod "c7cf934c-9238-4692-90db-19e7faaf7bfd" (UID: "c7cf934c-9238-4692-90db-19e7faaf7bfd"). InnerVolumeSpecName "kube-api-access-4m5k9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:04:57 crc kubenswrapper[4696]: I1202 23:04:57.400638 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7cf934c-9238-4692-90db-19e7faaf7bfd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c7cf934c-9238-4692-90db-19e7faaf7bfd" (UID: "c7cf934c-9238-4692-90db-19e7faaf7bfd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:04:57 crc kubenswrapper[4696]: I1202 23:04:57.402349 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7cf934c-9238-4692-90db-19e7faaf7bfd-config-data" (OuterVolumeSpecName: "config-data") pod "c7cf934c-9238-4692-90db-19e7faaf7bfd" (UID: "c7cf934c-9238-4692-90db-19e7faaf7bfd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:04:57 crc kubenswrapper[4696]: I1202 23:04:57.465818 4696 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7cf934c-9238-4692-90db-19e7faaf7bfd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 23:04:57 crc kubenswrapper[4696]: I1202 23:04:57.465854 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4m5k9\" (UniqueName: \"kubernetes.io/projected/c7cf934c-9238-4692-90db-19e7faaf7bfd-kube-api-access-4m5k9\") on node \"crc\" DevicePath \"\"" Dec 02 23:04:57 crc kubenswrapper[4696]: I1202 23:04:57.465867 4696 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7cf934c-9238-4692-90db-19e7faaf7bfd-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 23:04:57 crc kubenswrapper[4696]: I1202 23:04:57.538453 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 23:04:57 crc kubenswrapper[4696]: I1202 23:04:57.567587 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/30aba68b-e85d-42d1-819a-2b8cb0886659-run-httpd\") pod \"30aba68b-e85d-42d1-819a-2b8cb0886659\" (UID: \"30aba68b-e85d-42d1-819a-2b8cb0886659\") " Dec 02 23:04:57 crc kubenswrapper[4696]: I1202 23:04:57.567668 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/30aba68b-e85d-42d1-819a-2b8cb0886659-sg-core-conf-yaml\") pod \"30aba68b-e85d-42d1-819a-2b8cb0886659\" (UID: \"30aba68b-e85d-42d1-819a-2b8cb0886659\") " Dec 02 23:04:57 crc kubenswrapper[4696]: I1202 23:04:57.567871 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30aba68b-e85d-42d1-819a-2b8cb0886659-config-data\") pod \"30aba68b-e85d-42d1-819a-2b8cb0886659\" (UID: \"30aba68b-e85d-42d1-819a-2b8cb0886659\") " Dec 02 23:04:57 crc kubenswrapper[4696]: I1202 23:04:57.568039 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z4nqp\" (UniqueName: \"kubernetes.io/projected/30aba68b-e85d-42d1-819a-2b8cb0886659-kube-api-access-z4nqp\") pod \"30aba68b-e85d-42d1-819a-2b8cb0886659\" (UID: \"30aba68b-e85d-42d1-819a-2b8cb0886659\") " Dec 02 23:04:57 crc kubenswrapper[4696]: I1202 23:04:57.568108 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30aba68b-e85d-42d1-819a-2b8cb0886659-scripts\") pod \"30aba68b-e85d-42d1-819a-2b8cb0886659\" (UID: \"30aba68b-e85d-42d1-819a-2b8cb0886659\") " Dec 02 23:04:57 crc kubenswrapper[4696]: I1202 23:04:57.568301 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/30aba68b-e85d-42d1-819a-2b8cb0886659-log-httpd\") pod \"30aba68b-e85d-42d1-819a-2b8cb0886659\" (UID: \"30aba68b-e85d-42d1-819a-2b8cb0886659\") " Dec 02 23:04:57 crc kubenswrapper[4696]: I1202 23:04:57.568365 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30aba68b-e85d-42d1-819a-2b8cb0886659-combined-ca-bundle\") pod \"30aba68b-e85d-42d1-819a-2b8cb0886659\" (UID: \"30aba68b-e85d-42d1-819a-2b8cb0886659\") " Dec 02 23:04:57 crc kubenswrapper[4696]: I1202 23:04:57.568582 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30aba68b-e85d-42d1-819a-2b8cb0886659-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "30aba68b-e85d-42d1-819a-2b8cb0886659" (UID: "30aba68b-e85d-42d1-819a-2b8cb0886659"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:04:57 crc kubenswrapper[4696]: I1202 23:04:57.569050 4696 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/30aba68b-e85d-42d1-819a-2b8cb0886659-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 23:04:57 crc kubenswrapper[4696]: I1202 23:04:57.572053 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30aba68b-e85d-42d1-819a-2b8cb0886659-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "30aba68b-e85d-42d1-819a-2b8cb0886659" (UID: "30aba68b-e85d-42d1-819a-2b8cb0886659"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:04:57 crc kubenswrapper[4696]: I1202 23:04:57.575933 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30aba68b-e85d-42d1-819a-2b8cb0886659-kube-api-access-z4nqp" (OuterVolumeSpecName: "kube-api-access-z4nqp") pod "30aba68b-e85d-42d1-819a-2b8cb0886659" (UID: "30aba68b-e85d-42d1-819a-2b8cb0886659"). InnerVolumeSpecName "kube-api-access-z4nqp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:04:57 crc kubenswrapper[4696]: I1202 23:04:57.581844 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30aba68b-e85d-42d1-819a-2b8cb0886659-scripts" (OuterVolumeSpecName: "scripts") pod "30aba68b-e85d-42d1-819a-2b8cb0886659" (UID: "30aba68b-e85d-42d1-819a-2b8cb0886659"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:04:57 crc kubenswrapper[4696]: I1202 23:04:57.612362 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30aba68b-e85d-42d1-819a-2b8cb0886659-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "30aba68b-e85d-42d1-819a-2b8cb0886659" (UID: "30aba68b-e85d-42d1-819a-2b8cb0886659"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:04:57 crc kubenswrapper[4696]: I1202 23:04:57.670416 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z4nqp\" (UniqueName: \"kubernetes.io/projected/30aba68b-e85d-42d1-819a-2b8cb0886659-kube-api-access-z4nqp\") on node \"crc\" DevicePath \"\"" Dec 02 23:04:57 crc kubenswrapper[4696]: I1202 23:04:57.670584 4696 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30aba68b-e85d-42d1-819a-2b8cb0886659-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 23:04:57 crc kubenswrapper[4696]: I1202 23:04:57.670656 4696 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/30aba68b-e85d-42d1-819a-2b8cb0886659-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 23:04:57 crc kubenswrapper[4696]: I1202 23:04:57.670724 4696 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/30aba68b-e85d-42d1-819a-2b8cb0886659-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 02 23:04:57 crc kubenswrapper[4696]: I1202 23:04:57.670450 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30aba68b-e85d-42d1-819a-2b8cb0886659-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "30aba68b-e85d-42d1-819a-2b8cb0886659" (UID: "30aba68b-e85d-42d1-819a-2b8cb0886659"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:04:57 crc kubenswrapper[4696]: I1202 23:04:57.724968 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30aba68b-e85d-42d1-819a-2b8cb0886659-config-data" (OuterVolumeSpecName: "config-data") pod "30aba68b-e85d-42d1-819a-2b8cb0886659" (UID: "30aba68b-e85d-42d1-819a-2b8cb0886659"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:04:57 crc kubenswrapper[4696]: I1202 23:04:57.773804 4696 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30aba68b-e85d-42d1-819a-2b8cb0886659-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 23:04:57 crc kubenswrapper[4696]: I1202 23:04:57.774399 4696 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30aba68b-e85d-42d1-819a-2b8cb0886659-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 23:04:58 crc kubenswrapper[4696]: I1202 23:04:58.208057 4696 generic.go:334] "Generic (PLEG): container finished" podID="30aba68b-e85d-42d1-819a-2b8cb0886659" containerID="e7b7779ebf9f81366c8b85d425c4b48d4b5c7513fbedfd3340efc3063e33f002" exitCode=0 Dec 02 23:04:58 crc kubenswrapper[4696]: I1202 23:04:58.208177 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"30aba68b-e85d-42d1-819a-2b8cb0886659","Type":"ContainerDied","Data":"e7b7779ebf9f81366c8b85d425c4b48d4b5c7513fbedfd3340efc3063e33f002"} Dec 02 23:04:58 crc kubenswrapper[4696]: I1202 23:04:58.208179 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 23:04:58 crc kubenswrapper[4696]: I1202 23:04:58.208212 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"30aba68b-e85d-42d1-819a-2b8cb0886659","Type":"ContainerDied","Data":"9ff4a1e8e3a4b226aa43cba6435d5ed07dcfaf0b2d04b37fa3787e4e1ff1af77"} Dec 02 23:04:58 crc kubenswrapper[4696]: I1202 23:04:58.208236 4696 scope.go:117] "RemoveContainer" containerID="74180b5dcde2ced35d927219adfa4797a5e58878a6cf9289c0951faee983021e" Dec 02 23:04:58 crc kubenswrapper[4696]: I1202 23:04:58.212958 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"c7cf934c-9238-4692-90db-19e7faaf7bfd","Type":"ContainerDied","Data":"de5811578f1e94753f11beee6f50b3390860c0b9c3c440fc4ff2a74ebd927d52"} Dec 02 23:04:58 crc kubenswrapper[4696]: I1202 23:04:58.213039 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 02 23:04:58 crc kubenswrapper[4696]: I1202 23:04:58.244984 4696 scope.go:117] "RemoveContainer" containerID="f302876a7ba26ca764a9dfa0d77841be81b62953da5f57027c5846495af15650" Dec 02 23:04:58 crc kubenswrapper[4696]: I1202 23:04:58.253214 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 02 23:04:58 crc kubenswrapper[4696]: I1202 23:04:58.272896 4696 scope.go:117] "RemoveContainer" containerID="e7b7779ebf9f81366c8b85d425c4b48d4b5c7513fbedfd3340efc3063e33f002" Dec 02 23:04:58 crc kubenswrapper[4696]: I1202 23:04:58.275419 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 02 23:04:58 crc kubenswrapper[4696]: I1202 23:04:58.309931 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 23:04:58 crc kubenswrapper[4696]: I1202 23:04:58.322277 4696 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/nova-cell1-novncproxy-0"] Dec 02 23:04:58 crc kubenswrapper[4696]: E1202 23:04:58.322768 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30aba68b-e85d-42d1-819a-2b8cb0886659" containerName="proxy-httpd" Dec 02 23:04:58 crc kubenswrapper[4696]: I1202 23:04:58.322785 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="30aba68b-e85d-42d1-819a-2b8cb0886659" containerName="proxy-httpd" Dec 02 23:04:58 crc kubenswrapper[4696]: E1202 23:04:58.322805 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30aba68b-e85d-42d1-819a-2b8cb0886659" containerName="ceilometer-notification-agent" Dec 02 23:04:58 crc kubenswrapper[4696]: I1202 23:04:58.322814 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="30aba68b-e85d-42d1-819a-2b8cb0886659" containerName="ceilometer-notification-agent" Dec 02 23:04:58 crc kubenswrapper[4696]: E1202 23:04:58.322823 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30aba68b-e85d-42d1-819a-2b8cb0886659" containerName="ceilometer-central-agent" Dec 02 23:04:58 crc kubenswrapper[4696]: I1202 23:04:58.322830 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="30aba68b-e85d-42d1-819a-2b8cb0886659" containerName="ceilometer-central-agent" Dec 02 23:04:58 crc kubenswrapper[4696]: E1202 23:04:58.322845 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30aba68b-e85d-42d1-819a-2b8cb0886659" containerName="sg-core" Dec 02 23:04:58 crc kubenswrapper[4696]: I1202 23:04:58.322851 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="30aba68b-e85d-42d1-819a-2b8cb0886659" containerName="sg-core" Dec 02 23:04:58 crc kubenswrapper[4696]: E1202 23:04:58.322864 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7cf934c-9238-4692-90db-19e7faaf7bfd" containerName="nova-cell1-novncproxy-novncproxy" Dec 02 23:04:58 crc kubenswrapper[4696]: I1202 23:04:58.322870 4696 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="c7cf934c-9238-4692-90db-19e7faaf7bfd" containerName="nova-cell1-novncproxy-novncproxy" Dec 02 23:04:58 crc kubenswrapper[4696]: I1202 23:04:58.323079 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="30aba68b-e85d-42d1-819a-2b8cb0886659" containerName="ceilometer-notification-agent" Dec 02 23:04:58 crc kubenswrapper[4696]: I1202 23:04:58.323097 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="30aba68b-e85d-42d1-819a-2b8cb0886659" containerName="ceilometer-central-agent" Dec 02 23:04:58 crc kubenswrapper[4696]: I1202 23:04:58.323109 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7cf934c-9238-4692-90db-19e7faaf7bfd" containerName="nova-cell1-novncproxy-novncproxy" Dec 02 23:04:58 crc kubenswrapper[4696]: I1202 23:04:58.323123 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="30aba68b-e85d-42d1-819a-2b8cb0886659" containerName="sg-core" Dec 02 23:04:58 crc kubenswrapper[4696]: I1202 23:04:58.323137 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="30aba68b-e85d-42d1-819a-2b8cb0886659" containerName="proxy-httpd" Dec 02 23:04:58 crc kubenswrapper[4696]: I1202 23:04:58.323909 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 02 23:04:58 crc kubenswrapper[4696]: I1202 23:04:58.332299 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 02 23:04:58 crc kubenswrapper[4696]: I1202 23:04:58.332516 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Dec 02 23:04:58 crc kubenswrapper[4696]: I1202 23:04:58.339095 4696 scope.go:117] "RemoveContainer" containerID="ef4edf254ee03442f99cbf1467dfd884e0f982edb02e4346a92264c5c65dd8c9" Dec 02 23:04:58 crc kubenswrapper[4696]: I1202 23:04:58.339252 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Dec 02 23:04:58 crc kubenswrapper[4696]: I1202 23:04:58.353499 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 02 23:04:58 crc kubenswrapper[4696]: I1202 23:04:58.373366 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 02 23:04:58 crc kubenswrapper[4696]: I1202 23:04:58.395948 4696 scope.go:117] "RemoveContainer" containerID="74180b5dcde2ced35d927219adfa4797a5e58878a6cf9289c0951faee983021e" Dec 02 23:04:58 crc kubenswrapper[4696]: I1202 23:04:58.396079 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 02 23:04:58 crc kubenswrapper[4696]: I1202 23:04:58.398812 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 23:04:58 crc kubenswrapper[4696]: E1202 23:04:58.399951 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74180b5dcde2ced35d927219adfa4797a5e58878a6cf9289c0951faee983021e\": container with ID starting with 74180b5dcde2ced35d927219adfa4797a5e58878a6cf9289c0951faee983021e not found: ID does not exist" containerID="74180b5dcde2ced35d927219adfa4797a5e58878a6cf9289c0951faee983021e" Dec 02 23:04:58 crc kubenswrapper[4696]: I1202 23:04:58.400022 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74180b5dcde2ced35d927219adfa4797a5e58878a6cf9289c0951faee983021e"} err="failed to get container status \"74180b5dcde2ced35d927219adfa4797a5e58878a6cf9289c0951faee983021e\": rpc error: code = NotFound desc = could not find container \"74180b5dcde2ced35d927219adfa4797a5e58878a6cf9289c0951faee983021e\": container with ID starting with 74180b5dcde2ced35d927219adfa4797a5e58878a6cf9289c0951faee983021e not found: ID does not exist" Dec 02 23:04:58 crc kubenswrapper[4696]: I1202 23:04:58.400059 4696 scope.go:117] "RemoveContainer" containerID="f302876a7ba26ca764a9dfa0d77841be81b62953da5f57027c5846495af15650" Dec 02 23:04:58 crc kubenswrapper[4696]: I1202 23:04:58.402492 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krjlk\" (UniqueName: \"kubernetes.io/projected/2ed66bd7-4f5d-4501-b81f-51939db42c64-kube-api-access-krjlk\") pod \"nova-cell1-novncproxy-0\" (UID: \"2ed66bd7-4f5d-4501-b81f-51939db42c64\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 23:04:58 crc kubenswrapper[4696]: I1202 23:04:58.402583 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ed66bd7-4f5d-4501-b81f-51939db42c64-nova-novncproxy-tls-certs\") 
pod \"nova-cell1-novncproxy-0\" (UID: \"2ed66bd7-4f5d-4501-b81f-51939db42c64\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 23:04:58 crc kubenswrapper[4696]: I1202 23:04:58.402623 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ed66bd7-4f5d-4501-b81f-51939db42c64-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"2ed66bd7-4f5d-4501-b81f-51939db42c64\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 23:04:58 crc kubenswrapper[4696]: I1202 23:04:58.402671 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ed66bd7-4f5d-4501-b81f-51939db42c64-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"2ed66bd7-4f5d-4501-b81f-51939db42c64\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 23:04:58 crc kubenswrapper[4696]: I1202 23:04:58.402774 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ed66bd7-4f5d-4501-b81f-51939db42c64-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"2ed66bd7-4f5d-4501-b81f-51939db42c64\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 23:04:58 crc kubenswrapper[4696]: I1202 23:04:58.404869 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 02 23:04:58 crc kubenswrapper[4696]: I1202 23:04:58.410101 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 02 23:04:58 crc kubenswrapper[4696]: I1202 23:04:58.411055 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 02 23:04:58 crc kubenswrapper[4696]: I1202 23:04:58.411379 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 02 23:04:58 
crc kubenswrapper[4696]: E1202 23:04:58.411576 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f302876a7ba26ca764a9dfa0d77841be81b62953da5f57027c5846495af15650\": container with ID starting with f302876a7ba26ca764a9dfa0d77841be81b62953da5f57027c5846495af15650 not found: ID does not exist" containerID="f302876a7ba26ca764a9dfa0d77841be81b62953da5f57027c5846495af15650" Dec 02 23:04:58 crc kubenswrapper[4696]: I1202 23:04:58.411620 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 02 23:04:58 crc kubenswrapper[4696]: I1202 23:04:58.411611 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f302876a7ba26ca764a9dfa0d77841be81b62953da5f57027c5846495af15650"} err="failed to get container status \"f302876a7ba26ca764a9dfa0d77841be81b62953da5f57027c5846495af15650\": rpc error: code = NotFound desc = could not find container \"f302876a7ba26ca764a9dfa0d77841be81b62953da5f57027c5846495af15650\": container with ID starting with f302876a7ba26ca764a9dfa0d77841be81b62953da5f57027c5846495af15650 not found: ID does not exist" Dec 02 23:04:58 crc kubenswrapper[4696]: I1202 23:04:58.411645 4696 scope.go:117] "RemoveContainer" containerID="e7b7779ebf9f81366c8b85d425c4b48d4b5c7513fbedfd3340efc3063e33f002" Dec 02 23:04:58 crc kubenswrapper[4696]: I1202 23:04:58.413353 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 23:04:58 crc kubenswrapper[4696]: E1202 23:04:58.416933 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7b7779ebf9f81366c8b85d425c4b48d4b5c7513fbedfd3340efc3063e33f002\": container with ID starting with e7b7779ebf9f81366c8b85d425c4b48d4b5c7513fbedfd3340efc3063e33f002 not found: ID does not exist" containerID="e7b7779ebf9f81366c8b85d425c4b48d4b5c7513fbedfd3340efc3063e33f002" Dec 02 
23:04:58 crc kubenswrapper[4696]: I1202 23:04:58.416993 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7b7779ebf9f81366c8b85d425c4b48d4b5c7513fbedfd3340efc3063e33f002"} err="failed to get container status \"e7b7779ebf9f81366c8b85d425c4b48d4b5c7513fbedfd3340efc3063e33f002\": rpc error: code = NotFound desc = could not find container \"e7b7779ebf9f81366c8b85d425c4b48d4b5c7513fbedfd3340efc3063e33f002\": container with ID starting with e7b7779ebf9f81366c8b85d425c4b48d4b5c7513fbedfd3340efc3063e33f002 not found: ID does not exist" Dec 02 23:04:58 crc kubenswrapper[4696]: I1202 23:04:58.417073 4696 scope.go:117] "RemoveContainer" containerID="ef4edf254ee03442f99cbf1467dfd884e0f982edb02e4346a92264c5c65dd8c9" Dec 02 23:04:58 crc kubenswrapper[4696]: E1202 23:04:58.420891 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef4edf254ee03442f99cbf1467dfd884e0f982edb02e4346a92264c5c65dd8c9\": container with ID starting with ef4edf254ee03442f99cbf1467dfd884e0f982edb02e4346a92264c5c65dd8c9 not found: ID does not exist" containerID="ef4edf254ee03442f99cbf1467dfd884e0f982edb02e4346a92264c5c65dd8c9" Dec 02 23:04:58 crc kubenswrapper[4696]: I1202 23:04:58.420961 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef4edf254ee03442f99cbf1467dfd884e0f982edb02e4346a92264c5c65dd8c9"} err="failed to get container status \"ef4edf254ee03442f99cbf1467dfd884e0f982edb02e4346a92264c5c65dd8c9\": rpc error: code = NotFound desc = could not find container \"ef4edf254ee03442f99cbf1467dfd884e0f982edb02e4346a92264c5c65dd8c9\": container with ID starting with ef4edf254ee03442f99cbf1467dfd884e0f982edb02e4346a92264c5c65dd8c9 not found: ID does not exist" Dec 02 23:04:58 crc kubenswrapper[4696]: I1202 23:04:58.420994 4696 scope.go:117] "RemoveContainer" 
containerID="a430826aabb5ed2b42223d8e21ce2e5431e66f139d24f0bf3f66024819591f93" Dec 02 23:04:58 crc kubenswrapper[4696]: I1202 23:04:58.423100 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 02 23:04:58 crc kubenswrapper[4696]: I1202 23:04:58.490889 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 02 23:04:58 crc kubenswrapper[4696]: I1202 23:04:58.506570 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ed66bd7-4f5d-4501-b81f-51939db42c64-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"2ed66bd7-4f5d-4501-b81f-51939db42c64\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 23:04:58 crc kubenswrapper[4696]: I1202 23:04:58.506657 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ded8d2ed-8de8-4e49-ade5-aeddbb91f72f-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ded8d2ed-8de8-4e49-ade5-aeddbb91f72f\") " pod="openstack/ceilometer-0" Dec 02 23:04:58 crc kubenswrapper[4696]: I1202 23:04:58.506683 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ded8d2ed-8de8-4e49-ade5-aeddbb91f72f-config-data\") pod \"ceilometer-0\" (UID: \"ded8d2ed-8de8-4e49-ade5-aeddbb91f72f\") " pod="openstack/ceilometer-0" Dec 02 23:04:58 crc kubenswrapper[4696]: I1202 23:04:58.506751 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ed66bd7-4f5d-4501-b81f-51939db42c64-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"2ed66bd7-4f5d-4501-b81f-51939db42c64\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 23:04:58 crc kubenswrapper[4696]: I1202 
23:04:58.506808 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xcjj\" (UniqueName: \"kubernetes.io/projected/ded8d2ed-8de8-4e49-ade5-aeddbb91f72f-kube-api-access-7xcjj\") pod \"ceilometer-0\" (UID: \"ded8d2ed-8de8-4e49-ade5-aeddbb91f72f\") " pod="openstack/ceilometer-0" Dec 02 23:04:58 crc kubenswrapper[4696]: I1202 23:04:58.506843 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ed66bd7-4f5d-4501-b81f-51939db42c64-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"2ed66bd7-4f5d-4501-b81f-51939db42c64\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 23:04:58 crc kubenswrapper[4696]: I1202 23:04:58.506881 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ded8d2ed-8de8-4e49-ade5-aeddbb91f72f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ded8d2ed-8de8-4e49-ade5-aeddbb91f72f\") " pod="openstack/ceilometer-0" Dec 02 23:04:58 crc kubenswrapper[4696]: I1202 23:04:58.506935 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ed66bd7-4f5d-4501-b81f-51939db42c64-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"2ed66bd7-4f5d-4501-b81f-51939db42c64\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 23:04:58 crc kubenswrapper[4696]: I1202 23:04:58.506955 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ded8d2ed-8de8-4e49-ade5-aeddbb91f72f-log-httpd\") pod \"ceilometer-0\" (UID: \"ded8d2ed-8de8-4e49-ade5-aeddbb91f72f\") " pod="openstack/ceilometer-0" Dec 02 23:04:58 crc kubenswrapper[4696]: I1202 23:04:58.506975 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ded8d2ed-8de8-4e49-ade5-aeddbb91f72f-run-httpd\") pod \"ceilometer-0\" (UID: \"ded8d2ed-8de8-4e49-ade5-aeddbb91f72f\") " pod="openstack/ceilometer-0" Dec 02 23:04:58 crc kubenswrapper[4696]: I1202 23:04:58.507003 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ded8d2ed-8de8-4e49-ade5-aeddbb91f72f-scripts\") pod \"ceilometer-0\" (UID: \"ded8d2ed-8de8-4e49-ade5-aeddbb91f72f\") " pod="openstack/ceilometer-0" Dec 02 23:04:58 crc kubenswrapper[4696]: I1202 23:04:58.507057 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krjlk\" (UniqueName: \"kubernetes.io/projected/2ed66bd7-4f5d-4501-b81f-51939db42c64-kube-api-access-krjlk\") pod \"nova-cell1-novncproxy-0\" (UID: \"2ed66bd7-4f5d-4501-b81f-51939db42c64\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 23:04:58 crc kubenswrapper[4696]: I1202 23:04:58.507083 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ded8d2ed-8de8-4e49-ade5-aeddbb91f72f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ded8d2ed-8de8-4e49-ade5-aeddbb91f72f\") " pod="openstack/ceilometer-0" Dec 02 23:04:58 crc kubenswrapper[4696]: I1202 23:04:58.523343 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ed66bd7-4f5d-4501-b81f-51939db42c64-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"2ed66bd7-4f5d-4501-b81f-51939db42c64\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 23:04:58 crc kubenswrapper[4696]: I1202 23:04:58.549935 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ed66bd7-4f5d-4501-b81f-51939db42c64-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"2ed66bd7-4f5d-4501-b81f-51939db42c64\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 23:04:58 crc kubenswrapper[4696]: I1202 23:04:58.568446 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ed66bd7-4f5d-4501-b81f-51939db42c64-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"2ed66bd7-4f5d-4501-b81f-51939db42c64\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 23:04:58 crc kubenswrapper[4696]: I1202 23:04:58.569105 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ed66bd7-4f5d-4501-b81f-51939db42c64-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"2ed66bd7-4f5d-4501-b81f-51939db42c64\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 23:04:58 crc kubenswrapper[4696]: I1202 23:04:58.577472 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krjlk\" (UniqueName: \"kubernetes.io/projected/2ed66bd7-4f5d-4501-b81f-51939db42c64-kube-api-access-krjlk\") pod \"nova-cell1-novncproxy-0\" (UID: \"2ed66bd7-4f5d-4501-b81f-51939db42c64\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 23:04:58 crc kubenswrapper[4696]: I1202 23:04:58.610129 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xcjj\" (UniqueName: \"kubernetes.io/projected/ded8d2ed-8de8-4e49-ade5-aeddbb91f72f-kube-api-access-7xcjj\") pod \"ceilometer-0\" (UID: \"ded8d2ed-8de8-4e49-ade5-aeddbb91f72f\") " pod="openstack/ceilometer-0" Dec 02 23:04:58 crc kubenswrapper[4696]: I1202 23:04:58.610202 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ded8d2ed-8de8-4e49-ade5-aeddbb91f72f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ded8d2ed-8de8-4e49-ade5-aeddbb91f72f\") " pod="openstack/ceilometer-0" Dec 02 23:04:58 crc 
kubenswrapper[4696]: I1202 23:04:58.610238 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ded8d2ed-8de8-4e49-ade5-aeddbb91f72f-log-httpd\") pod \"ceilometer-0\" (UID: \"ded8d2ed-8de8-4e49-ade5-aeddbb91f72f\") " pod="openstack/ceilometer-0" Dec 02 23:04:58 crc kubenswrapper[4696]: I1202 23:04:58.610259 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ded8d2ed-8de8-4e49-ade5-aeddbb91f72f-run-httpd\") pod \"ceilometer-0\" (UID: \"ded8d2ed-8de8-4e49-ade5-aeddbb91f72f\") " pod="openstack/ceilometer-0" Dec 02 23:04:58 crc kubenswrapper[4696]: I1202 23:04:58.610282 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ded8d2ed-8de8-4e49-ade5-aeddbb91f72f-scripts\") pod \"ceilometer-0\" (UID: \"ded8d2ed-8de8-4e49-ade5-aeddbb91f72f\") " pod="openstack/ceilometer-0" Dec 02 23:04:58 crc kubenswrapper[4696]: I1202 23:04:58.610320 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ded8d2ed-8de8-4e49-ade5-aeddbb91f72f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ded8d2ed-8de8-4e49-ade5-aeddbb91f72f\") " pod="openstack/ceilometer-0" Dec 02 23:04:58 crc kubenswrapper[4696]: I1202 23:04:58.610389 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ded8d2ed-8de8-4e49-ade5-aeddbb91f72f-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ded8d2ed-8de8-4e49-ade5-aeddbb91f72f\") " pod="openstack/ceilometer-0" Dec 02 23:04:58 crc kubenswrapper[4696]: I1202 23:04:58.610404 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ded8d2ed-8de8-4e49-ade5-aeddbb91f72f-config-data\") pod 
\"ceilometer-0\" (UID: \"ded8d2ed-8de8-4e49-ade5-aeddbb91f72f\") " pod="openstack/ceilometer-0" Dec 02 23:04:58 crc kubenswrapper[4696]: I1202 23:04:58.611140 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ded8d2ed-8de8-4e49-ade5-aeddbb91f72f-run-httpd\") pod \"ceilometer-0\" (UID: \"ded8d2ed-8de8-4e49-ade5-aeddbb91f72f\") " pod="openstack/ceilometer-0" Dec 02 23:04:58 crc kubenswrapper[4696]: I1202 23:04:58.611379 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ded8d2ed-8de8-4e49-ade5-aeddbb91f72f-log-httpd\") pod \"ceilometer-0\" (UID: \"ded8d2ed-8de8-4e49-ade5-aeddbb91f72f\") " pod="openstack/ceilometer-0" Dec 02 23:04:58 crc kubenswrapper[4696]: I1202 23:04:58.625233 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ded8d2ed-8de8-4e49-ade5-aeddbb91f72f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ded8d2ed-8de8-4e49-ade5-aeddbb91f72f\") " pod="openstack/ceilometer-0" Dec 02 23:04:58 crc kubenswrapper[4696]: I1202 23:04:58.632890 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ded8d2ed-8de8-4e49-ade5-aeddbb91f72f-config-data\") pod \"ceilometer-0\" (UID: \"ded8d2ed-8de8-4e49-ade5-aeddbb91f72f\") " pod="openstack/ceilometer-0" Dec 02 23:04:58 crc kubenswrapper[4696]: I1202 23:04:58.633669 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ded8d2ed-8de8-4e49-ade5-aeddbb91f72f-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ded8d2ed-8de8-4e49-ade5-aeddbb91f72f\") " pod="openstack/ceilometer-0" Dec 02 23:04:58 crc kubenswrapper[4696]: I1202 23:04:58.633673 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ded8d2ed-8de8-4e49-ade5-aeddbb91f72f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ded8d2ed-8de8-4e49-ade5-aeddbb91f72f\") " pod="openstack/ceilometer-0" Dec 02 23:04:58 crc kubenswrapper[4696]: I1202 23:04:58.647660 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xcjj\" (UniqueName: \"kubernetes.io/projected/ded8d2ed-8de8-4e49-ade5-aeddbb91f72f-kube-api-access-7xcjj\") pod \"ceilometer-0\" (UID: \"ded8d2ed-8de8-4e49-ade5-aeddbb91f72f\") " pod="openstack/ceilometer-0" Dec 02 23:04:58 crc kubenswrapper[4696]: I1202 23:04:58.649053 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ded8d2ed-8de8-4e49-ade5-aeddbb91f72f-scripts\") pod \"ceilometer-0\" (UID: \"ded8d2ed-8de8-4e49-ade5-aeddbb91f72f\") " pod="openstack/ceilometer-0" Dec 02 23:04:58 crc kubenswrapper[4696]: I1202 23:04:58.656332 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 02 23:04:58 crc kubenswrapper[4696]: I1202 23:04:58.732329 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 23:04:59 crc kubenswrapper[4696]: I1202 23:04:59.227295 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 02 23:04:59 crc kubenswrapper[4696]: I1202 23:04:59.235013 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 02 23:04:59 crc kubenswrapper[4696]: I1202 23:04:59.259669 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 02 23:04:59 crc kubenswrapper[4696]: I1202 23:04:59.343254 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 23:04:59 crc kubenswrapper[4696]: I1202 23:04:59.499659 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30aba68b-e85d-42d1-819a-2b8cb0886659" path="/var/lib/kubelet/pods/30aba68b-e85d-42d1-819a-2b8cb0886659/volumes" Dec 02 23:04:59 crc kubenswrapper[4696]: I1202 23:04:59.501128 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7cf934c-9238-4692-90db-19e7faaf7bfd" path="/var/lib/kubelet/pods/c7cf934c-9238-4692-90db-19e7faaf7bfd/volumes" Dec 02 23:04:59 crc kubenswrapper[4696]: I1202 23:04:59.507937 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-vsrpt"] Dec 02 23:04:59 crc kubenswrapper[4696]: I1202 23:04:59.509661 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-vsrpt"] Dec 02 23:04:59 crc kubenswrapper[4696]: I1202 23:04:59.509820 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cd5cbd7b9-vsrpt" Dec 02 23:04:59 crc kubenswrapper[4696]: I1202 23:04:59.548614 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5b23e912-6f82-4956-89d0-8074d9dbb121-dns-swift-storage-0\") pod \"dnsmasq-dns-cd5cbd7b9-vsrpt\" (UID: \"5b23e912-6f82-4956-89d0-8074d9dbb121\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-vsrpt" Dec 02 23:04:59 crc kubenswrapper[4696]: I1202 23:04:59.548692 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b23e912-6f82-4956-89d0-8074d9dbb121-config\") pod \"dnsmasq-dns-cd5cbd7b9-vsrpt\" (UID: \"5b23e912-6f82-4956-89d0-8074d9dbb121\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-vsrpt" Dec 02 23:04:59 crc kubenswrapper[4696]: I1202 23:04:59.548763 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmm4h\" (UniqueName: \"kubernetes.io/projected/5b23e912-6f82-4956-89d0-8074d9dbb121-kube-api-access-xmm4h\") pod \"dnsmasq-dns-cd5cbd7b9-vsrpt\" (UID: \"5b23e912-6f82-4956-89d0-8074d9dbb121\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-vsrpt" Dec 02 23:04:59 crc kubenswrapper[4696]: I1202 23:04:59.548797 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5b23e912-6f82-4956-89d0-8074d9dbb121-ovsdbserver-nb\") pod \"dnsmasq-dns-cd5cbd7b9-vsrpt\" (UID: \"5b23e912-6f82-4956-89d0-8074d9dbb121\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-vsrpt" Dec 02 23:04:59 crc kubenswrapper[4696]: I1202 23:04:59.548831 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5b23e912-6f82-4956-89d0-8074d9dbb121-ovsdbserver-sb\") pod 
\"dnsmasq-dns-cd5cbd7b9-vsrpt\" (UID: \"5b23e912-6f82-4956-89d0-8074d9dbb121\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-vsrpt" Dec 02 23:04:59 crc kubenswrapper[4696]: I1202 23:04:59.548882 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5b23e912-6f82-4956-89d0-8074d9dbb121-dns-svc\") pod \"dnsmasq-dns-cd5cbd7b9-vsrpt\" (UID: \"5b23e912-6f82-4956-89d0-8074d9dbb121\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-vsrpt" Dec 02 23:04:59 crc kubenswrapper[4696]: I1202 23:04:59.650790 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5b23e912-6f82-4956-89d0-8074d9dbb121-dns-svc\") pod \"dnsmasq-dns-cd5cbd7b9-vsrpt\" (UID: \"5b23e912-6f82-4956-89d0-8074d9dbb121\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-vsrpt" Dec 02 23:04:59 crc kubenswrapper[4696]: I1202 23:04:59.650910 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5b23e912-6f82-4956-89d0-8074d9dbb121-dns-swift-storage-0\") pod \"dnsmasq-dns-cd5cbd7b9-vsrpt\" (UID: \"5b23e912-6f82-4956-89d0-8074d9dbb121\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-vsrpt" Dec 02 23:04:59 crc kubenswrapper[4696]: I1202 23:04:59.650943 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b23e912-6f82-4956-89d0-8074d9dbb121-config\") pod \"dnsmasq-dns-cd5cbd7b9-vsrpt\" (UID: \"5b23e912-6f82-4956-89d0-8074d9dbb121\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-vsrpt" Dec 02 23:04:59 crc kubenswrapper[4696]: I1202 23:04:59.650975 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmm4h\" (UniqueName: \"kubernetes.io/projected/5b23e912-6f82-4956-89d0-8074d9dbb121-kube-api-access-xmm4h\") pod \"dnsmasq-dns-cd5cbd7b9-vsrpt\" (UID: 
\"5b23e912-6f82-4956-89d0-8074d9dbb121\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-vsrpt" Dec 02 23:04:59 crc kubenswrapper[4696]: I1202 23:04:59.650996 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5b23e912-6f82-4956-89d0-8074d9dbb121-ovsdbserver-nb\") pod \"dnsmasq-dns-cd5cbd7b9-vsrpt\" (UID: \"5b23e912-6f82-4956-89d0-8074d9dbb121\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-vsrpt" Dec 02 23:04:59 crc kubenswrapper[4696]: I1202 23:04:59.651028 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5b23e912-6f82-4956-89d0-8074d9dbb121-ovsdbserver-sb\") pod \"dnsmasq-dns-cd5cbd7b9-vsrpt\" (UID: \"5b23e912-6f82-4956-89d0-8074d9dbb121\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-vsrpt" Dec 02 23:04:59 crc kubenswrapper[4696]: I1202 23:04:59.652125 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5b23e912-6f82-4956-89d0-8074d9dbb121-ovsdbserver-sb\") pod \"dnsmasq-dns-cd5cbd7b9-vsrpt\" (UID: \"5b23e912-6f82-4956-89d0-8074d9dbb121\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-vsrpt" Dec 02 23:04:59 crc kubenswrapper[4696]: I1202 23:04:59.652889 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5b23e912-6f82-4956-89d0-8074d9dbb121-dns-swift-storage-0\") pod \"dnsmasq-dns-cd5cbd7b9-vsrpt\" (UID: \"5b23e912-6f82-4956-89d0-8074d9dbb121\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-vsrpt" Dec 02 23:04:59 crc kubenswrapper[4696]: I1202 23:04:59.652910 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5b23e912-6f82-4956-89d0-8074d9dbb121-ovsdbserver-nb\") pod \"dnsmasq-dns-cd5cbd7b9-vsrpt\" (UID: \"5b23e912-6f82-4956-89d0-8074d9dbb121\") " 
pod="openstack/dnsmasq-dns-cd5cbd7b9-vsrpt" Dec 02 23:04:59 crc kubenswrapper[4696]: I1202 23:04:59.652938 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b23e912-6f82-4956-89d0-8074d9dbb121-config\") pod \"dnsmasq-dns-cd5cbd7b9-vsrpt\" (UID: \"5b23e912-6f82-4956-89d0-8074d9dbb121\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-vsrpt" Dec 02 23:04:59 crc kubenswrapper[4696]: I1202 23:04:59.653261 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5b23e912-6f82-4956-89d0-8074d9dbb121-dns-svc\") pod \"dnsmasq-dns-cd5cbd7b9-vsrpt\" (UID: \"5b23e912-6f82-4956-89d0-8074d9dbb121\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-vsrpt" Dec 02 23:04:59 crc kubenswrapper[4696]: I1202 23:04:59.675576 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmm4h\" (UniqueName: \"kubernetes.io/projected/5b23e912-6f82-4956-89d0-8074d9dbb121-kube-api-access-xmm4h\") pod \"dnsmasq-dns-cd5cbd7b9-vsrpt\" (UID: \"5b23e912-6f82-4956-89d0-8074d9dbb121\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-vsrpt" Dec 02 23:04:59 crc kubenswrapper[4696]: I1202 23:04:59.860717 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cd5cbd7b9-vsrpt" Dec 02 23:05:00 crc kubenswrapper[4696]: I1202 23:05:00.241550 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"2ed66bd7-4f5d-4501-b81f-51939db42c64","Type":"ContainerStarted","Data":"22c190baeaf28dc08554ae3dae3b6b8de8e549b6b728cb35cd5dcda2c1ccb8e8"} Dec 02 23:05:00 crc kubenswrapper[4696]: I1202 23:05:00.242207 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"2ed66bd7-4f5d-4501-b81f-51939db42c64","Type":"ContainerStarted","Data":"bd8ab757930841ebbd4fc36c6aaee068948aa87d1d41408d4690649180bbc036"} Dec 02 23:05:00 crc kubenswrapper[4696]: I1202 23:05:00.246462 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ded8d2ed-8de8-4e49-ade5-aeddbb91f72f","Type":"ContainerStarted","Data":"c33db35aaf89906a7e9d9390d773fdd4fab695d387b59a56a68c64439f80a5b4"} Dec 02 23:05:00 crc kubenswrapper[4696]: I1202 23:05:00.278480 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.278452762 podStartE2EDuration="2.278452762s" podCreationTimestamp="2025-12-02 23:04:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:05:00.261568772 +0000 UTC m=+1363.142248773" watchObservedRunningTime="2025-12-02 23:05:00.278452762 +0000 UTC m=+1363.159132783" Dec 02 23:05:00 crc kubenswrapper[4696]: I1202 23:05:00.423341 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-vsrpt"] Dec 02 23:05:01 crc kubenswrapper[4696]: I1202 23:05:01.262871 4696 generic.go:334] "Generic (PLEG): container finished" podID="5b23e912-6f82-4956-89d0-8074d9dbb121" containerID="3c522f0b519aed36cf98228fddf6a68e6aefc87af07774c130352064deda1501" exitCode=0 Dec 02 23:05:01 crc 
kubenswrapper[4696]: I1202 23:05:01.265274 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-vsrpt" event={"ID":"5b23e912-6f82-4956-89d0-8074d9dbb121","Type":"ContainerDied","Data":"3c522f0b519aed36cf98228fddf6a68e6aefc87af07774c130352064deda1501"} Dec 02 23:05:01 crc kubenswrapper[4696]: I1202 23:05:01.265310 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-vsrpt" event={"ID":"5b23e912-6f82-4956-89d0-8074d9dbb121","Type":"ContainerStarted","Data":"22f6cb954bad252ad12ddca2859c5959d5f9a0994329c9b930304c9ba05dfd0c"} Dec 02 23:05:01 crc kubenswrapper[4696]: I1202 23:05:01.298206 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ded8d2ed-8de8-4e49-ade5-aeddbb91f72f","Type":"ContainerStarted","Data":"e6e96e4b3d9b9ff2b054eb9a3e734b0062b380099e86bf726c020422d3fa97ad"} Dec 02 23:05:01 crc kubenswrapper[4696]: I1202 23:05:01.497374 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 02 23:05:02 crc kubenswrapper[4696]: I1202 23:05:02.247061 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 02 23:05:02 crc kubenswrapper[4696]: I1202 23:05:02.312421 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-vsrpt" event={"ID":"5b23e912-6f82-4956-89d0-8074d9dbb121","Type":"ContainerStarted","Data":"513fe9cb0db20d1e0524d1e76bbec17f611f4f100bad06222a375baccd4a89a9"} Dec 02 23:05:02 crc kubenswrapper[4696]: I1202 23:05:02.312593 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-cd5cbd7b9-vsrpt" Dec 02 23:05:02 crc kubenswrapper[4696]: I1202 23:05:02.322280 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"ded8d2ed-8de8-4e49-ade5-aeddbb91f72f","Type":"ContainerStarted","Data":"d434f115f0286e4cf884ab4b65eaa08c14f8f841214c0eaf133902d37b3d41e3"} Dec 02 23:05:02 crc kubenswrapper[4696]: I1202 23:05:02.322362 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ded8d2ed-8de8-4e49-ade5-aeddbb91f72f","Type":"ContainerStarted","Data":"9d14eb5358a649b1d7bcdd91eddd8c0a1dc8a8ac449adfa03105350237d6730c"} Dec 02 23:05:02 crc kubenswrapper[4696]: I1202 23:05:02.322498 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="f5c4775b-133c-454d-bdbc-8435912faecb" containerName="nova-api-log" containerID="cri-o://e57961f76d6bc86ac812a7c6c000e62d0588e2ebbc220ed87eb4945406ffb0b4" gracePeriod=30 Dec 02 23:05:02 crc kubenswrapper[4696]: I1202 23:05:02.322571 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="f5c4775b-133c-454d-bdbc-8435912faecb" containerName="nova-api-api" containerID="cri-o://e65f16266e459ea607a47b4ff2fa7b3bfed6f13c98434a44e7a76f01aeeef3b0" gracePeriod=30 Dec 02 23:05:02 crc kubenswrapper[4696]: I1202 23:05:02.339032 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-cd5cbd7b9-vsrpt" podStartSLOduration=3.33901393 podStartE2EDuration="3.33901393s" podCreationTimestamp="2025-12-02 23:04:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:05:02.335207252 +0000 UTC m=+1365.215887253" watchObservedRunningTime="2025-12-02 23:05:02.33901393 +0000 UTC m=+1365.219693931" Dec 02 23:05:03 crc kubenswrapper[4696]: I1202 23:05:03.338321 4696 generic.go:334] "Generic (PLEG): container finished" podID="f5c4775b-133c-454d-bdbc-8435912faecb" containerID="e57961f76d6bc86ac812a7c6c000e62d0588e2ebbc220ed87eb4945406ffb0b4" exitCode=143 Dec 02 23:05:03 crc kubenswrapper[4696]: 
I1202 23:05:03.338603 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f5c4775b-133c-454d-bdbc-8435912faecb","Type":"ContainerDied","Data":"e57961f76d6bc86ac812a7c6c000e62d0588e2ebbc220ed87eb4945406ffb0b4"} Dec 02 23:05:03 crc kubenswrapper[4696]: I1202 23:05:03.605553 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 23:05:03 crc kubenswrapper[4696]: I1202 23:05:03.657849 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 02 23:05:04 crc kubenswrapper[4696]: I1202 23:05:04.349809 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ded8d2ed-8de8-4e49-ade5-aeddbb91f72f","Type":"ContainerStarted","Data":"93c71f9730f552492ed1944556fe0313ba6009d8332b19cd4907b825380f5238"} Dec 02 23:05:04 crc kubenswrapper[4696]: I1202 23:05:04.350043 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ded8d2ed-8de8-4e49-ade5-aeddbb91f72f" containerName="ceilometer-central-agent" containerID="cri-o://e6e96e4b3d9b9ff2b054eb9a3e734b0062b380099e86bf726c020422d3fa97ad" gracePeriod=30 Dec 02 23:05:04 crc kubenswrapper[4696]: I1202 23:05:04.350085 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ded8d2ed-8de8-4e49-ade5-aeddbb91f72f" containerName="sg-core" containerID="cri-o://d434f115f0286e4cf884ab4b65eaa08c14f8f841214c0eaf133902d37b3d41e3" gracePeriod=30 Dec 02 23:05:04 crc kubenswrapper[4696]: I1202 23:05:04.350157 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ded8d2ed-8de8-4e49-ade5-aeddbb91f72f" containerName="ceilometer-notification-agent" containerID="cri-o://9d14eb5358a649b1d7bcdd91eddd8c0a1dc8a8ac449adfa03105350237d6730c" gracePeriod=30 Dec 02 23:05:04 crc kubenswrapper[4696]: I1202 23:05:04.350061 4696 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ded8d2ed-8de8-4e49-ade5-aeddbb91f72f" containerName="proxy-httpd" containerID="cri-o://93c71f9730f552492ed1944556fe0313ba6009d8332b19cd4907b825380f5238" gracePeriod=30 Dec 02 23:05:04 crc kubenswrapper[4696]: I1202 23:05:04.350336 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 02 23:05:04 crc kubenswrapper[4696]: I1202 23:05:04.393914 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.305902902 podStartE2EDuration="6.393893256s" podCreationTimestamp="2025-12-02 23:04:58 +0000 UTC" firstStartedPulling="2025-12-02 23:04:59.370806499 +0000 UTC m=+1362.251486500" lastFinishedPulling="2025-12-02 23:05:03.458796863 +0000 UTC m=+1366.339476854" observedRunningTime="2025-12-02 23:05:04.384119718 +0000 UTC m=+1367.264799719" watchObservedRunningTime="2025-12-02 23:05:04.393893256 +0000 UTC m=+1367.274573257" Dec 02 23:05:05 crc kubenswrapper[4696]: I1202 23:05:05.365614 4696 generic.go:334] "Generic (PLEG): container finished" podID="ded8d2ed-8de8-4e49-ade5-aeddbb91f72f" containerID="93c71f9730f552492ed1944556fe0313ba6009d8332b19cd4907b825380f5238" exitCode=0 Dec 02 23:05:05 crc kubenswrapper[4696]: I1202 23:05:05.366163 4696 generic.go:334] "Generic (PLEG): container finished" podID="ded8d2ed-8de8-4e49-ade5-aeddbb91f72f" containerID="d434f115f0286e4cf884ab4b65eaa08c14f8f841214c0eaf133902d37b3d41e3" exitCode=2 Dec 02 23:05:05 crc kubenswrapper[4696]: I1202 23:05:05.366177 4696 generic.go:334] "Generic (PLEG): container finished" podID="ded8d2ed-8de8-4e49-ade5-aeddbb91f72f" containerID="9d14eb5358a649b1d7bcdd91eddd8c0a1dc8a8ac449adfa03105350237d6730c" exitCode=0 Dec 02 23:05:05 crc kubenswrapper[4696]: I1202 23:05:05.365678 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"ded8d2ed-8de8-4e49-ade5-aeddbb91f72f","Type":"ContainerDied","Data":"93c71f9730f552492ed1944556fe0313ba6009d8332b19cd4907b825380f5238"} Dec 02 23:05:05 crc kubenswrapper[4696]: I1202 23:05:05.366224 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ded8d2ed-8de8-4e49-ade5-aeddbb91f72f","Type":"ContainerDied","Data":"d434f115f0286e4cf884ab4b65eaa08c14f8f841214c0eaf133902d37b3d41e3"} Dec 02 23:05:05 crc kubenswrapper[4696]: I1202 23:05:05.366238 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ded8d2ed-8de8-4e49-ade5-aeddbb91f72f","Type":"ContainerDied","Data":"9d14eb5358a649b1d7bcdd91eddd8c0a1dc8a8ac449adfa03105350237d6730c"} Dec 02 23:05:06 crc kubenswrapper[4696]: I1202 23:05:06.099636 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 02 23:05:06 crc kubenswrapper[4696]: I1202 23:05:06.220110 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5c4775b-133c-454d-bdbc-8435912faecb-config-data\") pod \"f5c4775b-133c-454d-bdbc-8435912faecb\" (UID: \"f5c4775b-133c-454d-bdbc-8435912faecb\") " Dec 02 23:05:06 crc kubenswrapper[4696]: I1202 23:05:06.220344 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f5c4775b-133c-454d-bdbc-8435912faecb-logs\") pod \"f5c4775b-133c-454d-bdbc-8435912faecb\" (UID: \"f5c4775b-133c-454d-bdbc-8435912faecb\") " Dec 02 23:05:06 crc kubenswrapper[4696]: I1202 23:05:06.220426 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5c4775b-133c-454d-bdbc-8435912faecb-combined-ca-bundle\") pod \"f5c4775b-133c-454d-bdbc-8435912faecb\" (UID: \"f5c4775b-133c-454d-bdbc-8435912faecb\") " Dec 02 23:05:06 crc kubenswrapper[4696]: I1202 
23:05:06.220462 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ds2fc\" (UniqueName: \"kubernetes.io/projected/f5c4775b-133c-454d-bdbc-8435912faecb-kube-api-access-ds2fc\") pod \"f5c4775b-133c-454d-bdbc-8435912faecb\" (UID: \"f5c4775b-133c-454d-bdbc-8435912faecb\") " Dec 02 23:05:06 crc kubenswrapper[4696]: I1202 23:05:06.226603 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f5c4775b-133c-454d-bdbc-8435912faecb-logs" (OuterVolumeSpecName: "logs") pod "f5c4775b-133c-454d-bdbc-8435912faecb" (UID: "f5c4775b-133c-454d-bdbc-8435912faecb"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:05:06 crc kubenswrapper[4696]: I1202 23:05:06.235326 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5c4775b-133c-454d-bdbc-8435912faecb-kube-api-access-ds2fc" (OuterVolumeSpecName: "kube-api-access-ds2fc") pod "f5c4775b-133c-454d-bdbc-8435912faecb" (UID: "f5c4775b-133c-454d-bdbc-8435912faecb"). InnerVolumeSpecName "kube-api-access-ds2fc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:05:06 crc kubenswrapper[4696]: I1202 23:05:06.259243 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5c4775b-133c-454d-bdbc-8435912faecb-config-data" (OuterVolumeSpecName: "config-data") pod "f5c4775b-133c-454d-bdbc-8435912faecb" (UID: "f5c4775b-133c-454d-bdbc-8435912faecb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:05:06 crc kubenswrapper[4696]: I1202 23:05:06.270999 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5c4775b-133c-454d-bdbc-8435912faecb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f5c4775b-133c-454d-bdbc-8435912faecb" (UID: "f5c4775b-133c-454d-bdbc-8435912faecb"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:05:06 crc kubenswrapper[4696]: I1202 23:05:06.323494 4696 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5c4775b-133c-454d-bdbc-8435912faecb-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 23:05:06 crc kubenswrapper[4696]: I1202 23:05:06.323540 4696 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f5c4775b-133c-454d-bdbc-8435912faecb-logs\") on node \"crc\" DevicePath \"\"" Dec 02 23:05:06 crc kubenswrapper[4696]: I1202 23:05:06.323549 4696 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5c4775b-133c-454d-bdbc-8435912faecb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 23:05:06 crc kubenswrapper[4696]: I1202 23:05:06.323561 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ds2fc\" (UniqueName: \"kubernetes.io/projected/f5c4775b-133c-454d-bdbc-8435912faecb-kube-api-access-ds2fc\") on node \"crc\" DevicePath \"\"" Dec 02 23:05:06 crc kubenswrapper[4696]: I1202 23:05:06.380476 4696 generic.go:334] "Generic (PLEG): container finished" podID="f5c4775b-133c-454d-bdbc-8435912faecb" containerID="e65f16266e459ea607a47b4ff2fa7b3bfed6f13c98434a44e7a76f01aeeef3b0" exitCode=0 Dec 02 23:05:06 crc kubenswrapper[4696]: I1202 23:05:06.380535 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f5c4775b-133c-454d-bdbc-8435912faecb","Type":"ContainerDied","Data":"e65f16266e459ea607a47b4ff2fa7b3bfed6f13c98434a44e7a76f01aeeef3b0"} Dec 02 23:05:06 crc kubenswrapper[4696]: I1202 23:05:06.380576 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"f5c4775b-133c-454d-bdbc-8435912faecb","Type":"ContainerDied","Data":"e73b143af00487ec7af0ee04d777345c029f8d4791100549d88506f296a2d09f"} Dec 02 23:05:06 crc kubenswrapper[4696]: I1202 23:05:06.380604 4696 scope.go:117] "RemoveContainer" containerID="e65f16266e459ea607a47b4ff2fa7b3bfed6f13c98434a44e7a76f01aeeef3b0" Dec 02 23:05:06 crc kubenswrapper[4696]: I1202 23:05:06.380686 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 02 23:05:06 crc kubenswrapper[4696]: I1202 23:05:06.420425 4696 scope.go:117] "RemoveContainer" containerID="e57961f76d6bc86ac812a7c6c000e62d0588e2ebbc220ed87eb4945406ffb0b4" Dec 02 23:05:06 crc kubenswrapper[4696]: I1202 23:05:06.426436 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 02 23:05:06 crc kubenswrapper[4696]: I1202 23:05:06.444498 4696 scope.go:117] "RemoveContainer" containerID="e65f16266e459ea607a47b4ff2fa7b3bfed6f13c98434a44e7a76f01aeeef3b0" Dec 02 23:05:06 crc kubenswrapper[4696]: E1202 23:05:06.445247 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e65f16266e459ea607a47b4ff2fa7b3bfed6f13c98434a44e7a76f01aeeef3b0\": container with ID starting with e65f16266e459ea607a47b4ff2fa7b3bfed6f13c98434a44e7a76f01aeeef3b0 not found: ID does not exist" containerID="e65f16266e459ea607a47b4ff2fa7b3bfed6f13c98434a44e7a76f01aeeef3b0" Dec 02 23:05:06 crc kubenswrapper[4696]: I1202 23:05:06.445306 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e65f16266e459ea607a47b4ff2fa7b3bfed6f13c98434a44e7a76f01aeeef3b0"} err="failed to get container status \"e65f16266e459ea607a47b4ff2fa7b3bfed6f13c98434a44e7a76f01aeeef3b0\": rpc error: code = NotFound desc = could not find container \"e65f16266e459ea607a47b4ff2fa7b3bfed6f13c98434a44e7a76f01aeeef3b0\": container with ID starting with 
e65f16266e459ea607a47b4ff2fa7b3bfed6f13c98434a44e7a76f01aeeef3b0 not found: ID does not exist" Dec 02 23:05:06 crc kubenswrapper[4696]: I1202 23:05:06.445341 4696 scope.go:117] "RemoveContainer" containerID="e57961f76d6bc86ac812a7c6c000e62d0588e2ebbc220ed87eb4945406ffb0b4" Dec 02 23:05:06 crc kubenswrapper[4696]: E1202 23:05:06.446193 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e57961f76d6bc86ac812a7c6c000e62d0588e2ebbc220ed87eb4945406ffb0b4\": container with ID starting with e57961f76d6bc86ac812a7c6c000e62d0588e2ebbc220ed87eb4945406ffb0b4 not found: ID does not exist" containerID="e57961f76d6bc86ac812a7c6c000e62d0588e2ebbc220ed87eb4945406ffb0b4" Dec 02 23:05:06 crc kubenswrapper[4696]: I1202 23:05:06.446226 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e57961f76d6bc86ac812a7c6c000e62d0588e2ebbc220ed87eb4945406ffb0b4"} err="failed to get container status \"e57961f76d6bc86ac812a7c6c000e62d0588e2ebbc220ed87eb4945406ffb0b4\": rpc error: code = NotFound desc = could not find container \"e57961f76d6bc86ac812a7c6c000e62d0588e2ebbc220ed87eb4945406ffb0b4\": container with ID starting with e57961f76d6bc86ac812a7c6c000e62d0588e2ebbc220ed87eb4945406ffb0b4 not found: ID does not exist" Dec 02 23:05:06 crc kubenswrapper[4696]: I1202 23:05:06.452575 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 02 23:05:06 crc kubenswrapper[4696]: I1202 23:05:06.465166 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 02 23:05:06 crc kubenswrapper[4696]: E1202 23:05:06.465779 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5c4775b-133c-454d-bdbc-8435912faecb" containerName="nova-api-api" Dec 02 23:05:06 crc kubenswrapper[4696]: I1202 23:05:06.465796 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5c4775b-133c-454d-bdbc-8435912faecb" 
containerName="nova-api-api" Dec 02 23:05:06 crc kubenswrapper[4696]: E1202 23:05:06.465847 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5c4775b-133c-454d-bdbc-8435912faecb" containerName="nova-api-log" Dec 02 23:05:06 crc kubenswrapper[4696]: I1202 23:05:06.465853 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5c4775b-133c-454d-bdbc-8435912faecb" containerName="nova-api-log" Dec 02 23:05:06 crc kubenswrapper[4696]: I1202 23:05:06.466046 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5c4775b-133c-454d-bdbc-8435912faecb" containerName="nova-api-api" Dec 02 23:05:06 crc kubenswrapper[4696]: I1202 23:05:06.466077 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5c4775b-133c-454d-bdbc-8435912faecb" containerName="nova-api-log" Dec 02 23:05:06 crc kubenswrapper[4696]: I1202 23:05:06.467521 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 02 23:05:06 crc kubenswrapper[4696]: I1202 23:05:06.477204 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 02 23:05:06 crc kubenswrapper[4696]: I1202 23:05:06.477278 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 02 23:05:06 crc kubenswrapper[4696]: I1202 23:05:06.477209 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 02 23:05:06 crc kubenswrapper[4696]: I1202 23:05:06.479590 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 02 23:05:06 crc kubenswrapper[4696]: I1202 23:05:06.632934 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2cd25eba-1ef0-46ce-8340-4f342fe56530-public-tls-certs\") pod \"nova-api-0\" (UID: \"2cd25eba-1ef0-46ce-8340-4f342fe56530\") " 
pod="openstack/nova-api-0"
Dec 02 23:05:06 crc kubenswrapper[4696]: I1202 23:05:06.633028 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2cd25eba-1ef0-46ce-8340-4f342fe56530-internal-tls-certs\") pod \"nova-api-0\" (UID: \"2cd25eba-1ef0-46ce-8340-4f342fe56530\") " pod="openstack/nova-api-0"
Dec 02 23:05:06 crc kubenswrapper[4696]: I1202 23:05:06.633085 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cd25eba-1ef0-46ce-8340-4f342fe56530-config-data\") pod \"nova-api-0\" (UID: \"2cd25eba-1ef0-46ce-8340-4f342fe56530\") " pod="openstack/nova-api-0"
Dec 02 23:05:06 crc kubenswrapper[4696]: I1202 23:05:06.633187 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2bcn\" (UniqueName: \"kubernetes.io/projected/2cd25eba-1ef0-46ce-8340-4f342fe56530-kube-api-access-j2bcn\") pod \"nova-api-0\" (UID: \"2cd25eba-1ef0-46ce-8340-4f342fe56530\") " pod="openstack/nova-api-0"
Dec 02 23:05:06 crc kubenswrapper[4696]: I1202 23:05:06.633254 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2cd25eba-1ef0-46ce-8340-4f342fe56530-logs\") pod \"nova-api-0\" (UID: \"2cd25eba-1ef0-46ce-8340-4f342fe56530\") " pod="openstack/nova-api-0"
Dec 02 23:05:06 crc kubenswrapper[4696]: I1202 23:05:06.633373 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cd25eba-1ef0-46ce-8340-4f342fe56530-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2cd25eba-1ef0-46ce-8340-4f342fe56530\") " pod="openstack/nova-api-0"
Dec 02 23:05:06 crc kubenswrapper[4696]: I1202 23:05:06.735394 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cd25eba-1ef0-46ce-8340-4f342fe56530-config-data\") pod \"nova-api-0\" (UID: \"2cd25eba-1ef0-46ce-8340-4f342fe56530\") " pod="openstack/nova-api-0"
Dec 02 23:05:06 crc kubenswrapper[4696]: I1202 23:05:06.735537 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2bcn\" (UniqueName: \"kubernetes.io/projected/2cd25eba-1ef0-46ce-8340-4f342fe56530-kube-api-access-j2bcn\") pod \"nova-api-0\" (UID: \"2cd25eba-1ef0-46ce-8340-4f342fe56530\") " pod="openstack/nova-api-0"
Dec 02 23:05:06 crc kubenswrapper[4696]: I1202 23:05:06.735618 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2cd25eba-1ef0-46ce-8340-4f342fe56530-logs\") pod \"nova-api-0\" (UID: \"2cd25eba-1ef0-46ce-8340-4f342fe56530\") " pod="openstack/nova-api-0"
Dec 02 23:05:06 crc kubenswrapper[4696]: I1202 23:05:06.735709 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cd25eba-1ef0-46ce-8340-4f342fe56530-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2cd25eba-1ef0-46ce-8340-4f342fe56530\") " pod="openstack/nova-api-0"
Dec 02 23:05:06 crc kubenswrapper[4696]: I1202 23:05:06.735829 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2cd25eba-1ef0-46ce-8340-4f342fe56530-public-tls-certs\") pod \"nova-api-0\" (UID: \"2cd25eba-1ef0-46ce-8340-4f342fe56530\") " pod="openstack/nova-api-0"
Dec 02 23:05:06 crc kubenswrapper[4696]: I1202 23:05:06.735893 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2cd25eba-1ef0-46ce-8340-4f342fe56530-internal-tls-certs\") pod \"nova-api-0\" (UID: \"2cd25eba-1ef0-46ce-8340-4f342fe56530\") " pod="openstack/nova-api-0"
Dec 02 23:05:06 crc kubenswrapper[4696]: I1202 23:05:06.736463 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2cd25eba-1ef0-46ce-8340-4f342fe56530-logs\") pod \"nova-api-0\" (UID: \"2cd25eba-1ef0-46ce-8340-4f342fe56530\") " pod="openstack/nova-api-0"
Dec 02 23:05:06 crc kubenswrapper[4696]: I1202 23:05:06.741668 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2cd25eba-1ef0-46ce-8340-4f342fe56530-internal-tls-certs\") pod \"nova-api-0\" (UID: \"2cd25eba-1ef0-46ce-8340-4f342fe56530\") " pod="openstack/nova-api-0"
Dec 02 23:05:06 crc kubenswrapper[4696]: I1202 23:05:06.742035 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cd25eba-1ef0-46ce-8340-4f342fe56530-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2cd25eba-1ef0-46ce-8340-4f342fe56530\") " pod="openstack/nova-api-0"
Dec 02 23:05:06 crc kubenswrapper[4696]: I1202 23:05:06.744175 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cd25eba-1ef0-46ce-8340-4f342fe56530-config-data\") pod \"nova-api-0\" (UID: \"2cd25eba-1ef0-46ce-8340-4f342fe56530\") " pod="openstack/nova-api-0"
Dec 02 23:05:06 crc kubenswrapper[4696]: I1202 23:05:06.749271 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2cd25eba-1ef0-46ce-8340-4f342fe56530-public-tls-certs\") pod \"nova-api-0\" (UID: \"2cd25eba-1ef0-46ce-8340-4f342fe56530\") " pod="openstack/nova-api-0"
Dec 02 23:05:06 crc kubenswrapper[4696]: I1202 23:05:06.755865 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2bcn\" (UniqueName: \"kubernetes.io/projected/2cd25eba-1ef0-46ce-8340-4f342fe56530-kube-api-access-j2bcn\") pod \"nova-api-0\" (UID: \"2cd25eba-1ef0-46ce-8340-4f342fe56530\") " pod="openstack/nova-api-0"
Dec 02 23:05:06 crc kubenswrapper[4696]: I1202 23:05:06.796296 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 02 23:05:07 crc kubenswrapper[4696]: I1202 23:05:07.287341 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Dec 02 23:05:07 crc kubenswrapper[4696]: I1202 23:05:07.402703 4696 generic.go:334] "Generic (PLEG): container finished" podID="ded8d2ed-8de8-4e49-ade5-aeddbb91f72f" containerID="e6e96e4b3d9b9ff2b054eb9a3e734b0062b380099e86bf726c020422d3fa97ad" exitCode=0
Dec 02 23:05:07 crc kubenswrapper[4696]: I1202 23:05:07.402790 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ded8d2ed-8de8-4e49-ade5-aeddbb91f72f","Type":"ContainerDied","Data":"e6e96e4b3d9b9ff2b054eb9a3e734b0062b380099e86bf726c020422d3fa97ad"}
Dec 02 23:05:07 crc kubenswrapper[4696]: I1202 23:05:07.404781 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2cd25eba-1ef0-46ce-8340-4f342fe56530","Type":"ContainerStarted","Data":"310caa5ae63dce372b8df9993b4be6ea2fcc5d3490debe0c6f8836ab1e1cab13"}
Dec 02 23:05:07 crc kubenswrapper[4696]: I1202 23:05:07.447040 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5c4775b-133c-454d-bdbc-8435912faecb" path="/var/lib/kubelet/pods/f5c4775b-133c-454d-bdbc-8435912faecb/volumes"
Dec 02 23:05:08 crc kubenswrapper[4696]: I1202 23:05:08.036983 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 02 23:05:08 crc kubenswrapper[4696]: I1202 23:05:08.177089 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ded8d2ed-8de8-4e49-ade5-aeddbb91f72f-sg-core-conf-yaml\") pod \"ded8d2ed-8de8-4e49-ade5-aeddbb91f72f\" (UID: \"ded8d2ed-8de8-4e49-ade5-aeddbb91f72f\") "
Dec 02 23:05:08 crc kubenswrapper[4696]: I1202 23:05:08.177597 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ded8d2ed-8de8-4e49-ade5-aeddbb91f72f-combined-ca-bundle\") pod \"ded8d2ed-8de8-4e49-ade5-aeddbb91f72f\" (UID: \"ded8d2ed-8de8-4e49-ade5-aeddbb91f72f\") "
Dec 02 23:05:08 crc kubenswrapper[4696]: I1202 23:05:08.177709 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ded8d2ed-8de8-4e49-ade5-aeddbb91f72f-scripts\") pod \"ded8d2ed-8de8-4e49-ade5-aeddbb91f72f\" (UID: \"ded8d2ed-8de8-4e49-ade5-aeddbb91f72f\") "
Dec 02 23:05:08 crc kubenswrapper[4696]: I1202 23:05:08.177829 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7xcjj\" (UniqueName: \"kubernetes.io/projected/ded8d2ed-8de8-4e49-ade5-aeddbb91f72f-kube-api-access-7xcjj\") pod \"ded8d2ed-8de8-4e49-ade5-aeddbb91f72f\" (UID: \"ded8d2ed-8de8-4e49-ade5-aeddbb91f72f\") "
Dec 02 23:05:08 crc kubenswrapper[4696]: I1202 23:05:08.177886 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ded8d2ed-8de8-4e49-ade5-aeddbb91f72f-ceilometer-tls-certs\") pod \"ded8d2ed-8de8-4e49-ade5-aeddbb91f72f\" (UID: \"ded8d2ed-8de8-4e49-ade5-aeddbb91f72f\") "
Dec 02 23:05:08 crc kubenswrapper[4696]: I1202 23:05:08.177920 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ded8d2ed-8de8-4e49-ade5-aeddbb91f72f-log-httpd\") pod \"ded8d2ed-8de8-4e49-ade5-aeddbb91f72f\" (UID: \"ded8d2ed-8de8-4e49-ade5-aeddbb91f72f\") "
Dec 02 23:05:08 crc kubenswrapper[4696]: I1202 23:05:08.178058 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ded8d2ed-8de8-4e49-ade5-aeddbb91f72f-config-data\") pod \"ded8d2ed-8de8-4e49-ade5-aeddbb91f72f\" (UID: \"ded8d2ed-8de8-4e49-ade5-aeddbb91f72f\") "
Dec 02 23:05:08 crc kubenswrapper[4696]: I1202 23:05:08.178227 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ded8d2ed-8de8-4e49-ade5-aeddbb91f72f-run-httpd\") pod \"ded8d2ed-8de8-4e49-ade5-aeddbb91f72f\" (UID: \"ded8d2ed-8de8-4e49-ade5-aeddbb91f72f\") "
Dec 02 23:05:08 crc kubenswrapper[4696]: I1202 23:05:08.178641 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ded8d2ed-8de8-4e49-ade5-aeddbb91f72f-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ded8d2ed-8de8-4e49-ade5-aeddbb91f72f" (UID: "ded8d2ed-8de8-4e49-ade5-aeddbb91f72f"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 02 23:05:08 crc kubenswrapper[4696]: I1202 23:05:08.178850 4696 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ded8d2ed-8de8-4e49-ade5-aeddbb91f72f-log-httpd\") on node \"crc\" DevicePath \"\""
Dec 02 23:05:08 crc kubenswrapper[4696]: I1202 23:05:08.178974 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ded8d2ed-8de8-4e49-ade5-aeddbb91f72f-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ded8d2ed-8de8-4e49-ade5-aeddbb91f72f" (UID: "ded8d2ed-8de8-4e49-ade5-aeddbb91f72f"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 02 23:05:08 crc kubenswrapper[4696]: I1202 23:05:08.183121 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ded8d2ed-8de8-4e49-ade5-aeddbb91f72f-kube-api-access-7xcjj" (OuterVolumeSpecName: "kube-api-access-7xcjj") pod "ded8d2ed-8de8-4e49-ade5-aeddbb91f72f" (UID: "ded8d2ed-8de8-4e49-ade5-aeddbb91f72f"). InnerVolumeSpecName "kube-api-access-7xcjj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 23:05:08 crc kubenswrapper[4696]: I1202 23:05:08.184058 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ded8d2ed-8de8-4e49-ade5-aeddbb91f72f-scripts" (OuterVolumeSpecName: "scripts") pod "ded8d2ed-8de8-4e49-ade5-aeddbb91f72f" (UID: "ded8d2ed-8de8-4e49-ade5-aeddbb91f72f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 23:05:08 crc kubenswrapper[4696]: I1202 23:05:08.209598 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ded8d2ed-8de8-4e49-ade5-aeddbb91f72f-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ded8d2ed-8de8-4e49-ade5-aeddbb91f72f" (UID: "ded8d2ed-8de8-4e49-ade5-aeddbb91f72f"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 23:05:08 crc kubenswrapper[4696]: I1202 23:05:08.241991 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ded8d2ed-8de8-4e49-ade5-aeddbb91f72f-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "ded8d2ed-8de8-4e49-ade5-aeddbb91f72f" (UID: "ded8d2ed-8de8-4e49-ade5-aeddbb91f72f"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 23:05:08 crc kubenswrapper[4696]: I1202 23:05:08.277138 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ded8d2ed-8de8-4e49-ade5-aeddbb91f72f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ded8d2ed-8de8-4e49-ade5-aeddbb91f72f" (UID: "ded8d2ed-8de8-4e49-ade5-aeddbb91f72f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 23:05:08 crc kubenswrapper[4696]: I1202 23:05:08.280706 4696 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ded8d2ed-8de8-4e49-ade5-aeddbb91f72f-run-httpd\") on node \"crc\" DevicePath \"\""
Dec 02 23:05:08 crc kubenswrapper[4696]: I1202 23:05:08.280764 4696 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ded8d2ed-8de8-4e49-ade5-aeddbb91f72f-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Dec 02 23:05:08 crc kubenswrapper[4696]: I1202 23:05:08.280782 4696 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ded8d2ed-8de8-4e49-ade5-aeddbb91f72f-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 02 23:05:08 crc kubenswrapper[4696]: I1202 23:05:08.280792 4696 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ded8d2ed-8de8-4e49-ade5-aeddbb91f72f-scripts\") on node \"crc\" DevicePath \"\""
Dec 02 23:05:08 crc kubenswrapper[4696]: I1202 23:05:08.280801 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7xcjj\" (UniqueName: \"kubernetes.io/projected/ded8d2ed-8de8-4e49-ade5-aeddbb91f72f-kube-api-access-7xcjj\") on node \"crc\" DevicePath \"\""
Dec 02 23:05:08 crc kubenswrapper[4696]: I1202 23:05:08.280811 4696 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ded8d2ed-8de8-4e49-ade5-aeddbb91f72f-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\""
Dec 02 23:05:08 crc kubenswrapper[4696]: I1202 23:05:08.305242 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ded8d2ed-8de8-4e49-ade5-aeddbb91f72f-config-data" (OuterVolumeSpecName: "config-data") pod "ded8d2ed-8de8-4e49-ade5-aeddbb91f72f" (UID: "ded8d2ed-8de8-4e49-ade5-aeddbb91f72f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 23:05:08 crc kubenswrapper[4696]: I1202 23:05:08.382772 4696 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ded8d2ed-8de8-4e49-ade5-aeddbb91f72f-config-data\") on node \"crc\" DevicePath \"\""
Dec 02 23:05:08 crc kubenswrapper[4696]: I1202 23:05:08.421270 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ded8d2ed-8de8-4e49-ade5-aeddbb91f72f","Type":"ContainerDied","Data":"c33db35aaf89906a7e9d9390d773fdd4fab695d387b59a56a68c64439f80a5b4"}
Dec 02 23:05:08 crc kubenswrapper[4696]: I1202 23:05:08.421337 4696 scope.go:117] "RemoveContainer" containerID="93c71f9730f552492ed1944556fe0313ba6009d8332b19cd4907b825380f5238"
Dec 02 23:05:08 crc kubenswrapper[4696]: I1202 23:05:08.421466 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 02 23:05:08 crc kubenswrapper[4696]: I1202 23:05:08.426023 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2cd25eba-1ef0-46ce-8340-4f342fe56530","Type":"ContainerStarted","Data":"db065061f0ccbdd2fed3ee9b286deb800b688c695e835e73822dff9d457d5a99"}
Dec 02 23:05:08 crc kubenswrapper[4696]: I1202 23:05:08.426093 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2cd25eba-1ef0-46ce-8340-4f342fe56530","Type":"ContainerStarted","Data":"9b6c1c68762b2d78489edfe6b5ec93492171c3ea620041d7387c377dca5d50ad"}
Dec 02 23:05:08 crc kubenswrapper[4696]: I1202 23:05:08.452432 4696 scope.go:117] "RemoveContainer" containerID="d434f115f0286e4cf884ab4b65eaa08c14f8f841214c0eaf133902d37b3d41e3"
Dec 02 23:05:08 crc kubenswrapper[4696]: I1202 23:05:08.463547 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.463507817 podStartE2EDuration="2.463507817s" podCreationTimestamp="2025-12-02 23:05:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:05:08.449483318 +0000 UTC m=+1371.330163319" watchObservedRunningTime="2025-12-02 23:05:08.463507817 +0000 UTC m=+1371.344187838"
Dec 02 23:05:08 crc kubenswrapper[4696]: I1202 23:05:08.493374 4696 scope.go:117] "RemoveContainer" containerID="9d14eb5358a649b1d7bcdd91eddd8c0a1dc8a8ac449adfa03105350237d6730c"
Dec 02 23:05:08 crc kubenswrapper[4696]: I1202 23:05:08.500720 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Dec 02 23:05:08 crc kubenswrapper[4696]: I1202 23:05:08.519079 4696 scope.go:117] "RemoveContainer" containerID="e6e96e4b3d9b9ff2b054eb9a3e734b0062b380099e86bf726c020422d3fa97ad"
Dec 02 23:05:08 crc kubenswrapper[4696]: I1202 23:05:08.524613 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Dec 02 23:05:08 crc kubenswrapper[4696]: I1202 23:05:08.537924 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Dec 02 23:05:08 crc kubenswrapper[4696]: E1202 23:05:08.539427 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ded8d2ed-8de8-4e49-ade5-aeddbb91f72f" containerName="ceilometer-central-agent"
Dec 02 23:05:08 crc kubenswrapper[4696]: I1202 23:05:08.539454 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="ded8d2ed-8de8-4e49-ade5-aeddbb91f72f" containerName="ceilometer-central-agent"
Dec 02 23:05:08 crc kubenswrapper[4696]: E1202 23:05:08.539472 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ded8d2ed-8de8-4e49-ade5-aeddbb91f72f" containerName="proxy-httpd"
Dec 02 23:05:08 crc kubenswrapper[4696]: I1202 23:05:08.539480 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="ded8d2ed-8de8-4e49-ade5-aeddbb91f72f" containerName="proxy-httpd"
Dec 02 23:05:08 crc kubenswrapper[4696]: E1202 23:05:08.539499 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ded8d2ed-8de8-4e49-ade5-aeddbb91f72f" containerName="ceilometer-notification-agent"
Dec 02 23:05:08 crc kubenswrapper[4696]: I1202 23:05:08.539508 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="ded8d2ed-8de8-4e49-ade5-aeddbb91f72f" containerName="ceilometer-notification-agent"
Dec 02 23:05:08 crc kubenswrapper[4696]: E1202 23:05:08.539523 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ded8d2ed-8de8-4e49-ade5-aeddbb91f72f" containerName="sg-core"
Dec 02 23:05:08 crc kubenswrapper[4696]: I1202 23:05:08.539532 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="ded8d2ed-8de8-4e49-ade5-aeddbb91f72f" containerName="sg-core"
Dec 02 23:05:08 crc kubenswrapper[4696]: I1202 23:05:08.539755 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="ded8d2ed-8de8-4e49-ade5-aeddbb91f72f" containerName="ceilometer-central-agent"
Dec 02 23:05:08 crc kubenswrapper[4696]: I1202 23:05:08.539776 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="ded8d2ed-8de8-4e49-ade5-aeddbb91f72f" containerName="ceilometer-notification-agent"
Dec 02 23:05:08 crc kubenswrapper[4696]: I1202 23:05:08.539790 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="ded8d2ed-8de8-4e49-ade5-aeddbb91f72f" containerName="sg-core"
Dec 02 23:05:08 crc kubenswrapper[4696]: I1202 23:05:08.539811 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="ded8d2ed-8de8-4e49-ade5-aeddbb91f72f" containerName="proxy-httpd"
Dec 02 23:05:08 crc kubenswrapper[4696]: I1202 23:05:08.542331 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 02 23:05:08 crc kubenswrapper[4696]: I1202 23:05:08.549208 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Dec 02 23:05:08 crc kubenswrapper[4696]: I1202 23:05:08.549542 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc"
Dec 02 23:05:08 crc kubenswrapper[4696]: I1202 23:05:08.549668 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Dec 02 23:05:08 crc kubenswrapper[4696]: I1202 23:05:08.550918 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Dec 02 23:05:08 crc kubenswrapper[4696]: I1202 23:05:08.660410 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0"
Dec 02 23:05:08 crc kubenswrapper[4696]: I1202 23:05:08.689095 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0947b5ae-aeca-481d-a2b9-3bd3db5a33c0-scripts\") pod \"ceilometer-0\" (UID: \"0947b5ae-aeca-481d-a2b9-3bd3db5a33c0\") " pod="openstack/ceilometer-0"
Dec 02 23:05:08 crc kubenswrapper[4696]: I1202 23:05:08.689276 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0947b5ae-aeca-481d-a2b9-3bd3db5a33c0-config-data\") pod \"ceilometer-0\" (UID: \"0947b5ae-aeca-481d-a2b9-3bd3db5a33c0\") " pod="openstack/ceilometer-0"
Dec 02 23:05:08 crc kubenswrapper[4696]: I1202 23:05:08.689390 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0947b5ae-aeca-481d-a2b9-3bd3db5a33c0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0947b5ae-aeca-481d-a2b9-3bd3db5a33c0\") " pod="openstack/ceilometer-0"
Dec 02 23:05:08 crc kubenswrapper[4696]: I1202 23:05:08.689518 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xczk6\" (UniqueName: \"kubernetes.io/projected/0947b5ae-aeca-481d-a2b9-3bd3db5a33c0-kube-api-access-xczk6\") pod \"ceilometer-0\" (UID: \"0947b5ae-aeca-481d-a2b9-3bd3db5a33c0\") " pod="openstack/ceilometer-0"
Dec 02 23:05:08 crc kubenswrapper[4696]: I1202 23:05:08.689543 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0947b5ae-aeca-481d-a2b9-3bd3db5a33c0-run-httpd\") pod \"ceilometer-0\" (UID: \"0947b5ae-aeca-481d-a2b9-3bd3db5a33c0\") " pod="openstack/ceilometer-0"
Dec 02 23:05:08 crc kubenswrapper[4696]: I1202 23:05:08.689695 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0947b5ae-aeca-481d-a2b9-3bd3db5a33c0-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"0947b5ae-aeca-481d-a2b9-3bd3db5a33c0\") " pod="openstack/ceilometer-0"
Dec 02 23:05:08 crc kubenswrapper[4696]: I1202 23:05:08.689880 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0947b5ae-aeca-481d-a2b9-3bd3db5a33c0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0947b5ae-aeca-481d-a2b9-3bd3db5a33c0\") " pod="openstack/ceilometer-0"
Dec 02 23:05:08 crc kubenswrapper[4696]: I1202 23:05:08.689943 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0947b5ae-aeca-481d-a2b9-3bd3db5a33c0-log-httpd\") pod \"ceilometer-0\" (UID: \"0947b5ae-aeca-481d-a2b9-3bd3db5a33c0\") " pod="openstack/ceilometer-0"
Dec 02 23:05:08 crc kubenswrapper[4696]: I1202 23:05:08.690639 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0"
Dec 02 23:05:08 crc kubenswrapper[4696]: I1202 23:05:08.791886 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xczk6\" (UniqueName: \"kubernetes.io/projected/0947b5ae-aeca-481d-a2b9-3bd3db5a33c0-kube-api-access-xczk6\") pod \"ceilometer-0\" (UID: \"0947b5ae-aeca-481d-a2b9-3bd3db5a33c0\") " pod="openstack/ceilometer-0"
Dec 02 23:05:08 crc kubenswrapper[4696]: I1202 23:05:08.791973 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0947b5ae-aeca-481d-a2b9-3bd3db5a33c0-run-httpd\") pod \"ceilometer-0\" (UID: \"0947b5ae-aeca-481d-a2b9-3bd3db5a33c0\") " pod="openstack/ceilometer-0"
Dec 02 23:05:08 crc kubenswrapper[4696]: I1202 23:05:08.792037 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0947b5ae-aeca-481d-a2b9-3bd3db5a33c0-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"0947b5ae-aeca-481d-a2b9-3bd3db5a33c0\") " pod="openstack/ceilometer-0"
Dec 02 23:05:08 crc kubenswrapper[4696]: I1202 23:05:08.792075 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0947b5ae-aeca-481d-a2b9-3bd3db5a33c0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0947b5ae-aeca-481d-a2b9-3bd3db5a33c0\") " pod="openstack/ceilometer-0"
Dec 02 23:05:08 crc kubenswrapper[4696]: I1202 23:05:08.792096 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0947b5ae-aeca-481d-a2b9-3bd3db5a33c0-log-httpd\") pod \"ceilometer-0\" (UID: \"0947b5ae-aeca-481d-a2b9-3bd3db5a33c0\") " pod="openstack/ceilometer-0"
Dec 02 23:05:08 crc kubenswrapper[4696]: I1202 23:05:08.792158 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0947b5ae-aeca-481d-a2b9-3bd3db5a33c0-scripts\") pod \"ceilometer-0\" (UID: \"0947b5ae-aeca-481d-a2b9-3bd3db5a33c0\") " pod="openstack/ceilometer-0"
Dec 02 23:05:08 crc kubenswrapper[4696]: I1202 23:05:08.792199 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0947b5ae-aeca-481d-a2b9-3bd3db5a33c0-config-data\") pod \"ceilometer-0\" (UID: \"0947b5ae-aeca-481d-a2b9-3bd3db5a33c0\") " pod="openstack/ceilometer-0"
Dec 02 23:05:08 crc kubenswrapper[4696]: I1202 23:05:08.792246 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0947b5ae-aeca-481d-a2b9-3bd3db5a33c0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0947b5ae-aeca-481d-a2b9-3bd3db5a33c0\") " pod="openstack/ceilometer-0"
Dec 02 23:05:08 crc kubenswrapper[4696]: I1202 23:05:08.793392 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0947b5ae-aeca-481d-a2b9-3bd3db5a33c0-log-httpd\") pod \"ceilometer-0\" (UID: \"0947b5ae-aeca-481d-a2b9-3bd3db5a33c0\") " pod="openstack/ceilometer-0"
Dec 02 23:05:08 crc kubenswrapper[4696]: I1202 23:05:08.793930 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0947b5ae-aeca-481d-a2b9-3bd3db5a33c0-run-httpd\") pod \"ceilometer-0\" (UID: \"0947b5ae-aeca-481d-a2b9-3bd3db5a33c0\") " pod="openstack/ceilometer-0"
Dec 02 23:05:08 crc kubenswrapper[4696]: I1202 23:05:08.798534 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0947b5ae-aeca-481d-a2b9-3bd3db5a33c0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0947b5ae-aeca-481d-a2b9-3bd3db5a33c0\") " pod="openstack/ceilometer-0"
Dec 02 23:05:08 crc kubenswrapper[4696]: I1202 23:05:08.799012 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0947b5ae-aeca-481d-a2b9-3bd3db5a33c0-config-data\") pod \"ceilometer-0\" (UID: \"0947b5ae-aeca-481d-a2b9-3bd3db5a33c0\") " pod="openstack/ceilometer-0"
Dec 02 23:05:08 crc kubenswrapper[4696]: I1202 23:05:08.801205 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0947b5ae-aeca-481d-a2b9-3bd3db5a33c0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0947b5ae-aeca-481d-a2b9-3bd3db5a33c0\") " pod="openstack/ceilometer-0"
Dec 02 23:05:08 crc kubenswrapper[4696]: I1202 23:05:08.801784 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0947b5ae-aeca-481d-a2b9-3bd3db5a33c0-scripts\") pod \"ceilometer-0\" (UID: \"0947b5ae-aeca-481d-a2b9-3bd3db5a33c0\") " pod="openstack/ceilometer-0"
Dec 02 23:05:08 crc kubenswrapper[4696]: I1202 23:05:08.803781 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0947b5ae-aeca-481d-a2b9-3bd3db5a33c0-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"0947b5ae-aeca-481d-a2b9-3bd3db5a33c0\") " pod="openstack/ceilometer-0"
Dec 02 23:05:08 crc kubenswrapper[4696]: I1202 23:05:08.819901 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xczk6\" (UniqueName: \"kubernetes.io/projected/0947b5ae-aeca-481d-a2b9-3bd3db5a33c0-kube-api-access-xczk6\") pod \"ceilometer-0\" (UID: \"0947b5ae-aeca-481d-a2b9-3bd3db5a33c0\") " pod="openstack/ceilometer-0"
Dec 02 23:05:08 crc kubenswrapper[4696]: I1202 23:05:08.872958 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 02 23:05:09 crc kubenswrapper[4696]: W1202 23:05:09.372076 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0947b5ae_aeca_481d_a2b9_3bd3db5a33c0.slice/crio-92b3687475f54df448a58075801d9980d8c421a20a7ac9f9ac9059dca5abf6df WatchSource:0}: Error finding container 92b3687475f54df448a58075801d9980d8c421a20a7ac9f9ac9059dca5abf6df: Status 404 returned error can't find the container with id 92b3687475f54df448a58075801d9980d8c421a20a7ac9f9ac9059dca5abf6df
Dec 02 23:05:09 crc kubenswrapper[4696]: I1202 23:05:09.375279 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Dec 02 23:05:09 crc kubenswrapper[4696]: I1202 23:05:09.458922 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ded8d2ed-8de8-4e49-ade5-aeddbb91f72f" path="/var/lib/kubelet/pods/ded8d2ed-8de8-4e49-ade5-aeddbb91f72f/volumes"
Dec 02 23:05:09 crc kubenswrapper[4696]: I1202 23:05:09.459985 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0947b5ae-aeca-481d-a2b9-3bd3db5a33c0","Type":"ContainerStarted","Data":"92b3687475f54df448a58075801d9980d8c421a20a7ac9f9ac9059dca5abf6df"}
Dec 02 23:05:09 crc kubenswrapper[4696]: I1202 23:05:09.479522 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0"
Dec 02 23:05:09 crc kubenswrapper[4696]: I1202 23:05:09.709513 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-r7rj2"]
Dec 02 23:05:09 crc kubenswrapper[4696]: I1202 23:05:09.711325 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-r7rj2"
Dec 02 23:05:09 crc kubenswrapper[4696]: I1202 23:05:09.714147 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data"
Dec 02 23:05:09 crc kubenswrapper[4696]: I1202 23:05:09.718046 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts"
Dec 02 23:05:09 crc kubenswrapper[4696]: I1202 23:05:09.725493 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-r7rj2"]
Dec 02 23:05:09 crc kubenswrapper[4696]: I1202 23:05:09.817340 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0bb6c8ad-6b5e-4f55-8495-c2c30699ab91-config-data\") pod \"nova-cell1-cell-mapping-r7rj2\" (UID: \"0bb6c8ad-6b5e-4f55-8495-c2c30699ab91\") " pod="openstack/nova-cell1-cell-mapping-r7rj2"
Dec 02 23:05:09 crc kubenswrapper[4696]: I1202 23:05:09.817398 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bb6c8ad-6b5e-4f55-8495-c2c30699ab91-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-r7rj2\" (UID: \"0bb6c8ad-6b5e-4f55-8495-c2c30699ab91\") " pod="openstack/nova-cell1-cell-mapping-r7rj2"
Dec 02 23:05:09 crc kubenswrapper[4696]: I1202 23:05:09.817885 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0bb6c8ad-6b5e-4f55-8495-c2c30699ab91-scripts\") pod \"nova-cell1-cell-mapping-r7rj2\" (UID: \"0bb6c8ad-6b5e-4f55-8495-c2c30699ab91\") " pod="openstack/nova-cell1-cell-mapping-r7rj2"
Dec 02 23:05:09 crc kubenswrapper[4696]: I1202 23:05:09.818167 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mcts\" (UniqueName: \"kubernetes.io/projected/0bb6c8ad-6b5e-4f55-8495-c2c30699ab91-kube-api-access-4mcts\") pod \"nova-cell1-cell-mapping-r7rj2\" (UID: \"0bb6c8ad-6b5e-4f55-8495-c2c30699ab91\") " pod="openstack/nova-cell1-cell-mapping-r7rj2"
Dec 02 23:05:09 crc kubenswrapper[4696]: I1202 23:05:09.863036 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-cd5cbd7b9-vsrpt"
Dec 02 23:05:09 crc kubenswrapper[4696]: I1202 23:05:09.931253 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0bb6c8ad-6b5e-4f55-8495-c2c30699ab91-config-data\") pod \"nova-cell1-cell-mapping-r7rj2\" (UID: \"0bb6c8ad-6b5e-4f55-8495-c2c30699ab91\") " pod="openstack/nova-cell1-cell-mapping-r7rj2"
Dec 02 23:05:09 crc kubenswrapper[4696]: I1202 23:05:09.931356 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bb6c8ad-6b5e-4f55-8495-c2c30699ab91-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-r7rj2\" (UID: \"0bb6c8ad-6b5e-4f55-8495-c2c30699ab91\") " pod="openstack/nova-cell1-cell-mapping-r7rj2"
Dec 02 23:05:09 crc kubenswrapper[4696]: I1202 23:05:09.931682 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0bb6c8ad-6b5e-4f55-8495-c2c30699ab91-scripts\") pod \"nova-cell1-cell-mapping-r7rj2\" (UID: \"0bb6c8ad-6b5e-4f55-8495-c2c30699ab91\") " pod="openstack/nova-cell1-cell-mapping-r7rj2"
Dec 02 23:05:09 crc kubenswrapper[4696]: I1202 23:05:09.931935 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4mcts\" (UniqueName: \"kubernetes.io/projected/0bb6c8ad-6b5e-4f55-8495-c2c30699ab91-kube-api-access-4mcts\") pod \"nova-cell1-cell-mapping-r7rj2\" (UID: \"0bb6c8ad-6b5e-4f55-8495-c2c30699ab91\") " pod="openstack/nova-cell1-cell-mapping-r7rj2"
Dec 02 23:05:09 crc kubenswrapper[4696]: I1202 23:05:09.947663 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bb6c8ad-6b5e-4f55-8495-c2c30699ab91-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-r7rj2\" (UID: \"0bb6c8ad-6b5e-4f55-8495-c2c30699ab91\") " pod="openstack/nova-cell1-cell-mapping-r7rj2"
Dec 02 23:05:09 crc kubenswrapper[4696]: I1202 23:05:09.953639 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0bb6c8ad-6b5e-4f55-8495-c2c30699ab91-scripts\") pod \"nova-cell1-cell-mapping-r7rj2\" (UID: \"0bb6c8ad-6b5e-4f55-8495-c2c30699ab91\") " pod="openstack/nova-cell1-cell-mapping-r7rj2"
Dec 02 23:05:09 crc kubenswrapper[4696]: I1202 23:05:09.954195 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0bb6c8ad-6b5e-4f55-8495-c2c30699ab91-config-data\") pod \"nova-cell1-cell-mapping-r7rj2\" (UID: \"0bb6c8ad-6b5e-4f55-8495-c2c30699ab91\") " pod="openstack/nova-cell1-cell-mapping-r7rj2"
Dec 02 23:05:09 crc kubenswrapper[4696]: I1202 23:05:09.954576 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-spkxd"]
Dec 02 23:05:09 crc kubenswrapper[4696]: I1202 23:05:09.954903 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-bccf8f775-spkxd" podUID="a097c9e9-bde7-443d-b08b-74859c58d517" containerName="dnsmasq-dns" containerID="cri-o://0c8f9b2aa71f3bd103a176c588f4efab692d5c78dd1a8959c2a9963d73ec4d34" gracePeriod=10
Dec 02 23:05:09 crc kubenswrapper[4696]: I1202 23:05:09.980148 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mcts\" (UniqueName: \"kubernetes.io/projected/0bb6c8ad-6b5e-4f55-8495-c2c30699ab91-kube-api-access-4mcts\") pod \"nova-cell1-cell-mapping-r7rj2\" (UID: \"0bb6c8ad-6b5e-4f55-8495-c2c30699ab91\") " pod="openstack/nova-cell1-cell-mapping-r7rj2"
Dec 02 23:05:10 crc kubenswrapper[4696]: I1202 23:05:10.035942 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-r7rj2"
Dec 02 23:05:10 crc kubenswrapper[4696]: I1202 23:05:10.480873 4696 generic.go:334] "Generic (PLEG): container finished" podID="a097c9e9-bde7-443d-b08b-74859c58d517" containerID="0c8f9b2aa71f3bd103a176c588f4efab692d5c78dd1a8959c2a9963d73ec4d34" exitCode=0
Dec 02 23:05:10 crc kubenswrapper[4696]: I1202 23:05:10.481033 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-spkxd" event={"ID":"a097c9e9-bde7-443d-b08b-74859c58d517","Type":"ContainerDied","Data":"0c8f9b2aa71f3bd103a176c588f4efab692d5c78dd1a8959c2a9963d73ec4d34"}
Dec 02 23:05:10 crc kubenswrapper[4696]: I1202 23:05:10.502151 4696 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-spkxd" Dec 02 23:05:10 crc kubenswrapper[4696]: I1202 23:05:10.637734 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-r7rj2"] Dec 02 23:05:10 crc kubenswrapper[4696]: W1202 23:05:10.638970 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0bb6c8ad_6b5e_4f55_8495_c2c30699ab91.slice/crio-6cbc9031e05660d0760bdd97ccd95eef8752d58f44c0eb34258058a4ece5aacb WatchSource:0}: Error finding container 6cbc9031e05660d0760bdd97ccd95eef8752d58f44c0eb34258058a4ece5aacb: Status 404 returned error can't find the container with id 6cbc9031e05660d0760bdd97ccd95eef8752d58f44c0eb34258058a4ece5aacb Dec 02 23:05:10 crc kubenswrapper[4696]: I1202 23:05:10.652833 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k5zb6\" (UniqueName: \"kubernetes.io/projected/a097c9e9-bde7-443d-b08b-74859c58d517-kube-api-access-k5zb6\") pod \"a097c9e9-bde7-443d-b08b-74859c58d517\" (UID: \"a097c9e9-bde7-443d-b08b-74859c58d517\") " Dec 02 23:05:10 crc kubenswrapper[4696]: I1202 23:05:10.652924 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a097c9e9-bde7-443d-b08b-74859c58d517-ovsdbserver-sb\") pod \"a097c9e9-bde7-443d-b08b-74859c58d517\" (UID: \"a097c9e9-bde7-443d-b08b-74859c58d517\") " Dec 02 23:05:10 crc kubenswrapper[4696]: I1202 23:05:10.653006 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a097c9e9-bde7-443d-b08b-74859c58d517-config\") pod \"a097c9e9-bde7-443d-b08b-74859c58d517\" (UID: \"a097c9e9-bde7-443d-b08b-74859c58d517\") " Dec 02 23:05:10 crc kubenswrapper[4696]: I1202 23:05:10.653034 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a097c9e9-bde7-443d-b08b-74859c58d517-dns-swift-storage-0\") pod \"a097c9e9-bde7-443d-b08b-74859c58d517\" (UID: \"a097c9e9-bde7-443d-b08b-74859c58d517\") " Dec 02 23:05:10 crc kubenswrapper[4696]: I1202 23:05:10.653105 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a097c9e9-bde7-443d-b08b-74859c58d517-ovsdbserver-nb\") pod \"a097c9e9-bde7-443d-b08b-74859c58d517\" (UID: \"a097c9e9-bde7-443d-b08b-74859c58d517\") " Dec 02 23:05:10 crc kubenswrapper[4696]: I1202 23:05:10.653163 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a097c9e9-bde7-443d-b08b-74859c58d517-dns-svc\") pod \"a097c9e9-bde7-443d-b08b-74859c58d517\" (UID: \"a097c9e9-bde7-443d-b08b-74859c58d517\") " Dec 02 23:05:10 crc kubenswrapper[4696]: I1202 23:05:10.671538 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a097c9e9-bde7-443d-b08b-74859c58d517-kube-api-access-k5zb6" (OuterVolumeSpecName: "kube-api-access-k5zb6") pod "a097c9e9-bde7-443d-b08b-74859c58d517" (UID: "a097c9e9-bde7-443d-b08b-74859c58d517"). InnerVolumeSpecName "kube-api-access-k5zb6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:05:10 crc kubenswrapper[4696]: I1202 23:05:10.738466 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a097c9e9-bde7-443d-b08b-74859c58d517-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "a097c9e9-bde7-443d-b08b-74859c58d517" (UID: "a097c9e9-bde7-443d-b08b-74859c58d517"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:05:10 crc kubenswrapper[4696]: I1202 23:05:10.738501 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a097c9e9-bde7-443d-b08b-74859c58d517-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a097c9e9-bde7-443d-b08b-74859c58d517" (UID: "a097c9e9-bde7-443d-b08b-74859c58d517"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:05:10 crc kubenswrapper[4696]: I1202 23:05:10.739721 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a097c9e9-bde7-443d-b08b-74859c58d517-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a097c9e9-bde7-443d-b08b-74859c58d517" (UID: "a097c9e9-bde7-443d-b08b-74859c58d517"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:05:10 crc kubenswrapper[4696]: I1202 23:05:10.745176 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a097c9e9-bde7-443d-b08b-74859c58d517-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a097c9e9-bde7-443d-b08b-74859c58d517" (UID: "a097c9e9-bde7-443d-b08b-74859c58d517"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:05:10 crc kubenswrapper[4696]: I1202 23:05:10.758663 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k5zb6\" (UniqueName: \"kubernetes.io/projected/a097c9e9-bde7-443d-b08b-74859c58d517-kube-api-access-k5zb6\") on node \"crc\" DevicePath \"\"" Dec 02 23:05:10 crc kubenswrapper[4696]: I1202 23:05:10.759025 4696 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a097c9e9-bde7-443d-b08b-74859c58d517-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 02 23:05:10 crc kubenswrapper[4696]: I1202 23:05:10.759139 4696 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a097c9e9-bde7-443d-b08b-74859c58d517-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 02 23:05:10 crc kubenswrapper[4696]: I1202 23:05:10.759479 4696 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a097c9e9-bde7-443d-b08b-74859c58d517-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 02 23:05:10 crc kubenswrapper[4696]: I1202 23:05:10.759606 4696 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a097c9e9-bde7-443d-b08b-74859c58d517-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 23:05:10 crc kubenswrapper[4696]: I1202 23:05:10.759544 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a097c9e9-bde7-443d-b08b-74859c58d517-config" (OuterVolumeSpecName: "config") pod "a097c9e9-bde7-443d-b08b-74859c58d517" (UID: "a097c9e9-bde7-443d-b08b-74859c58d517"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:05:10 crc kubenswrapper[4696]: I1202 23:05:10.861772 4696 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a097c9e9-bde7-443d-b08b-74859c58d517-config\") on node \"crc\" DevicePath \"\"" Dec 02 23:05:11 crc kubenswrapper[4696]: I1202 23:05:11.508161 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-r7rj2" event={"ID":"0bb6c8ad-6b5e-4f55-8495-c2c30699ab91","Type":"ContainerStarted","Data":"d578a50080f50ef7f54dce9d91a6f8b7b76cae522bce55baacb541cdf8b17f9a"} Dec 02 23:05:11 crc kubenswrapper[4696]: I1202 23:05:11.508492 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-r7rj2" event={"ID":"0bb6c8ad-6b5e-4f55-8495-c2c30699ab91","Type":"ContainerStarted","Data":"6cbc9031e05660d0760bdd97ccd95eef8752d58f44c0eb34258058a4ece5aacb"} Dec 02 23:05:11 crc kubenswrapper[4696]: I1202 23:05:11.511021 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-spkxd" event={"ID":"a097c9e9-bde7-443d-b08b-74859c58d517","Type":"ContainerDied","Data":"f562ff40eba809e37e487f839fdcbe2e783e112f508b3b965d814c5f953141bd"} Dec 02 23:05:11 crc kubenswrapper[4696]: I1202 23:05:11.511061 4696 scope.go:117] "RemoveContainer" containerID="0c8f9b2aa71f3bd103a176c588f4efab692d5c78dd1a8959c2a9963d73ec4d34" Dec 02 23:05:11 crc kubenswrapper[4696]: I1202 23:05:11.511103 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-spkxd" Dec 02 23:05:11 crc kubenswrapper[4696]: I1202 23:05:11.513091 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0947b5ae-aeca-481d-a2b9-3bd3db5a33c0","Type":"ContainerStarted","Data":"bd30f318583e01a4303242c71a01e43eb6146664fdd1344f52f128460de698dc"} Dec 02 23:05:11 crc kubenswrapper[4696]: I1202 23:05:11.513229 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0947b5ae-aeca-481d-a2b9-3bd3db5a33c0","Type":"ContainerStarted","Data":"1becca3a7151709692211c988e0e87ec4a7691c8ff233b506a6695a3250e07ac"} Dec 02 23:05:11 crc kubenswrapper[4696]: I1202 23:05:11.532400 4696 scope.go:117] "RemoveContainer" containerID="ac989ba9a60cf21c24afdb5261960e7a4b32551b8a726082a866e112ba9d62fd" Dec 02 23:05:11 crc kubenswrapper[4696]: I1202 23:05:11.537331 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-r7rj2" podStartSLOduration=2.537303939 podStartE2EDuration="2.537303939s" podCreationTimestamp="2025-12-02 23:05:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:05:11.529030184 +0000 UTC m=+1374.409710195" watchObservedRunningTime="2025-12-02 23:05:11.537303939 +0000 UTC m=+1374.417983940" Dec 02 23:05:11 crc kubenswrapper[4696]: I1202 23:05:11.555360 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-spkxd"] Dec 02 23:05:11 crc kubenswrapper[4696]: I1202 23:05:11.566053 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-spkxd"] Dec 02 23:05:12 crc kubenswrapper[4696]: I1202 23:05:12.533314 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"0947b5ae-aeca-481d-a2b9-3bd3db5a33c0","Type":"ContainerStarted","Data":"295adcd333a1bc8293863f57e97057dad5d70c6f8119e71f2d68a45e7ff41393"} Dec 02 23:05:13 crc kubenswrapper[4696]: I1202 23:05:13.442929 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a097c9e9-bde7-443d-b08b-74859c58d517" path="/var/lib/kubelet/pods/a097c9e9-bde7-443d-b08b-74859c58d517/volumes" Dec 02 23:05:16 crc kubenswrapper[4696]: I1202 23:05:16.583431 4696 generic.go:334] "Generic (PLEG): container finished" podID="0bb6c8ad-6b5e-4f55-8495-c2c30699ab91" containerID="d578a50080f50ef7f54dce9d91a6f8b7b76cae522bce55baacb541cdf8b17f9a" exitCode=0 Dec 02 23:05:16 crc kubenswrapper[4696]: I1202 23:05:16.583563 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-r7rj2" event={"ID":"0bb6c8ad-6b5e-4f55-8495-c2c30699ab91","Type":"ContainerDied","Data":"d578a50080f50ef7f54dce9d91a6f8b7b76cae522bce55baacb541cdf8b17f9a"} Dec 02 23:05:16 crc kubenswrapper[4696]: I1202 23:05:16.589126 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0947b5ae-aeca-481d-a2b9-3bd3db5a33c0","Type":"ContainerStarted","Data":"b6d18e43275ae222956c41ba9b1cf252336ac8334a022232b26ae642ddb235bd"} Dec 02 23:05:16 crc kubenswrapper[4696]: I1202 23:05:16.589488 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 02 23:05:16 crc kubenswrapper[4696]: I1202 23:05:16.655174 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.79453069 podStartE2EDuration="8.655131559s" podCreationTimestamp="2025-12-02 23:05:08 +0000 UTC" firstStartedPulling="2025-12-02 23:05:09.375192215 +0000 UTC m=+1372.255872206" lastFinishedPulling="2025-12-02 23:05:15.235793034 +0000 UTC m=+1378.116473075" observedRunningTime="2025-12-02 23:05:16.644654522 +0000 UTC m=+1379.525334523" watchObservedRunningTime="2025-12-02 
23:05:16.655131559 +0000 UTC m=+1379.535811570" Dec 02 23:05:16 crc kubenswrapper[4696]: I1202 23:05:16.797171 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 02 23:05:16 crc kubenswrapper[4696]: I1202 23:05:16.797413 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 02 23:05:17 crc kubenswrapper[4696]: I1202 23:05:17.809952 4696 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="2cd25eba-1ef0-46ce-8340-4f342fe56530" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.222:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 02 23:05:17 crc kubenswrapper[4696]: I1202 23:05:17.810091 4696 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="2cd25eba-1ef0-46ce-8340-4f342fe56530" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.222:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 02 23:05:18 crc kubenswrapper[4696]: I1202 23:05:18.077311 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-r7rj2" Dec 02 23:05:18 crc kubenswrapper[4696]: I1202 23:05:18.246599 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bb6c8ad-6b5e-4f55-8495-c2c30699ab91-combined-ca-bundle\") pod \"0bb6c8ad-6b5e-4f55-8495-c2c30699ab91\" (UID: \"0bb6c8ad-6b5e-4f55-8495-c2c30699ab91\") " Dec 02 23:05:18 crc kubenswrapper[4696]: I1202 23:05:18.246831 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0bb6c8ad-6b5e-4f55-8495-c2c30699ab91-config-data\") pod \"0bb6c8ad-6b5e-4f55-8495-c2c30699ab91\" (UID: \"0bb6c8ad-6b5e-4f55-8495-c2c30699ab91\") " Dec 02 23:05:18 crc kubenswrapper[4696]: I1202 23:05:18.246973 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4mcts\" (UniqueName: \"kubernetes.io/projected/0bb6c8ad-6b5e-4f55-8495-c2c30699ab91-kube-api-access-4mcts\") pod \"0bb6c8ad-6b5e-4f55-8495-c2c30699ab91\" (UID: \"0bb6c8ad-6b5e-4f55-8495-c2c30699ab91\") " Dec 02 23:05:18 crc kubenswrapper[4696]: I1202 23:05:18.247068 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0bb6c8ad-6b5e-4f55-8495-c2c30699ab91-scripts\") pod \"0bb6c8ad-6b5e-4f55-8495-c2c30699ab91\" (UID: \"0bb6c8ad-6b5e-4f55-8495-c2c30699ab91\") " Dec 02 23:05:18 crc kubenswrapper[4696]: I1202 23:05:18.254368 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0bb6c8ad-6b5e-4f55-8495-c2c30699ab91-scripts" (OuterVolumeSpecName: "scripts") pod "0bb6c8ad-6b5e-4f55-8495-c2c30699ab91" (UID: "0bb6c8ad-6b5e-4f55-8495-c2c30699ab91"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:05:18 crc kubenswrapper[4696]: I1202 23:05:18.254607 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0bb6c8ad-6b5e-4f55-8495-c2c30699ab91-kube-api-access-4mcts" (OuterVolumeSpecName: "kube-api-access-4mcts") pod "0bb6c8ad-6b5e-4f55-8495-c2c30699ab91" (UID: "0bb6c8ad-6b5e-4f55-8495-c2c30699ab91"). InnerVolumeSpecName "kube-api-access-4mcts". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:05:18 crc kubenswrapper[4696]: I1202 23:05:18.281733 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0bb6c8ad-6b5e-4f55-8495-c2c30699ab91-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0bb6c8ad-6b5e-4f55-8495-c2c30699ab91" (UID: "0bb6c8ad-6b5e-4f55-8495-c2c30699ab91"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:05:18 crc kubenswrapper[4696]: I1202 23:05:18.299380 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0bb6c8ad-6b5e-4f55-8495-c2c30699ab91-config-data" (OuterVolumeSpecName: "config-data") pod "0bb6c8ad-6b5e-4f55-8495-c2c30699ab91" (UID: "0bb6c8ad-6b5e-4f55-8495-c2c30699ab91"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:05:18 crc kubenswrapper[4696]: I1202 23:05:18.349467 4696 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0bb6c8ad-6b5e-4f55-8495-c2c30699ab91-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 23:05:18 crc kubenswrapper[4696]: I1202 23:05:18.349526 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4mcts\" (UniqueName: \"kubernetes.io/projected/0bb6c8ad-6b5e-4f55-8495-c2c30699ab91-kube-api-access-4mcts\") on node \"crc\" DevicePath \"\"" Dec 02 23:05:18 crc kubenswrapper[4696]: I1202 23:05:18.349540 4696 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0bb6c8ad-6b5e-4f55-8495-c2c30699ab91-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 23:05:18 crc kubenswrapper[4696]: I1202 23:05:18.349555 4696 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bb6c8ad-6b5e-4f55-8495-c2c30699ab91-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 23:05:18 crc kubenswrapper[4696]: I1202 23:05:18.635594 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-r7rj2" event={"ID":"0bb6c8ad-6b5e-4f55-8495-c2c30699ab91","Type":"ContainerDied","Data":"6cbc9031e05660d0760bdd97ccd95eef8752d58f44c0eb34258058a4ece5aacb"} Dec 02 23:05:18 crc kubenswrapper[4696]: I1202 23:05:18.635666 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6cbc9031e05660d0760bdd97ccd95eef8752d58f44c0eb34258058a4ece5aacb" Dec 02 23:05:18 crc kubenswrapper[4696]: I1202 23:05:18.635695 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-r7rj2" Dec 02 23:05:18 crc kubenswrapper[4696]: I1202 23:05:18.834313 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 02 23:05:18 crc kubenswrapper[4696]: I1202 23:05:18.834603 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="2cd25eba-1ef0-46ce-8340-4f342fe56530" containerName="nova-api-log" containerID="cri-o://9b6c1c68762b2d78489edfe6b5ec93492171c3ea620041d7387c377dca5d50ad" gracePeriod=30 Dec 02 23:05:18 crc kubenswrapper[4696]: I1202 23:05:18.834699 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="2cd25eba-1ef0-46ce-8340-4f342fe56530" containerName="nova-api-api" containerID="cri-o://db065061f0ccbdd2fed3ee9b286deb800b688c695e835e73822dff9d457d5a99" gracePeriod=30 Dec 02 23:05:18 crc kubenswrapper[4696]: I1202 23:05:18.869815 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 23:05:18 crc kubenswrapper[4696]: I1202 23:05:18.871116 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="c1e7bee2-7394-4980-9f24-e50448cda21a" containerName="nova-scheduler-scheduler" containerID="cri-o://33c333a603b93067c6fa6930dee826ee4c55e1cfae229e32f68d4d4eb427b0d2" gracePeriod=30 Dec 02 23:05:18 crc kubenswrapper[4696]: I1202 23:05:18.926191 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 23:05:18 crc kubenswrapper[4696]: I1202 23:05:18.926510 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="8441536b-8b39-4650-8fa9-3573038ffa49" containerName="nova-metadata-log" containerID="cri-o://808853f58cebf2446af333cdab24858f4a06ed675db45a6905cfe420059d4033" gracePeriod=30 Dec 02 23:05:18 crc kubenswrapper[4696]: I1202 23:05:18.926697 4696 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="8441536b-8b39-4650-8fa9-3573038ffa49" containerName="nova-metadata-metadata" containerID="cri-o://84a1b63e79991ac85f68c5e9773e8a54b0e4b733fec2233174160919474f0302" gracePeriod=30 Dec 02 23:05:19 crc kubenswrapper[4696]: I1202 23:05:19.649281 4696 generic.go:334] "Generic (PLEG): container finished" podID="2cd25eba-1ef0-46ce-8340-4f342fe56530" containerID="9b6c1c68762b2d78489edfe6b5ec93492171c3ea620041d7387c377dca5d50ad" exitCode=143 Dec 02 23:05:19 crc kubenswrapper[4696]: I1202 23:05:19.649427 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2cd25eba-1ef0-46ce-8340-4f342fe56530","Type":"ContainerDied","Data":"9b6c1c68762b2d78489edfe6b5ec93492171c3ea620041d7387c377dca5d50ad"} Dec 02 23:05:19 crc kubenswrapper[4696]: I1202 23:05:19.651315 4696 generic.go:334] "Generic (PLEG): container finished" podID="8441536b-8b39-4650-8fa9-3573038ffa49" containerID="808853f58cebf2446af333cdab24858f4a06ed675db45a6905cfe420059d4033" exitCode=143 Dec 02 23:05:19 crc kubenswrapper[4696]: I1202 23:05:19.651355 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8441536b-8b39-4650-8fa9-3573038ffa49","Type":"ContainerDied","Data":"808853f58cebf2446af333cdab24858f4a06ed675db45a6905cfe420059d4033"} Dec 02 23:05:20 crc kubenswrapper[4696]: E1202 23:05:20.411139 4696 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="33c333a603b93067c6fa6930dee826ee4c55e1cfae229e32f68d4d4eb427b0d2" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 02 23:05:20 crc kubenswrapper[4696]: E1202 23:05:20.413217 4696 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: 
container is stopping, stdout: , stderr: , exit code -1" containerID="33c333a603b93067c6fa6930dee826ee4c55e1cfae229e32f68d4d4eb427b0d2" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 02 23:05:20 crc kubenswrapper[4696]: E1202 23:05:20.414406 4696 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="33c333a603b93067c6fa6930dee826ee4c55e1cfae229e32f68d4d4eb427b0d2" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 02 23:05:20 crc kubenswrapper[4696]: E1202 23:05:20.414528 4696 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="c1e7bee2-7394-4980-9f24-e50448cda21a" containerName="nova-scheduler-scheduler" Dec 02 23:05:22 crc kubenswrapper[4696]: I1202 23:05:22.581538 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 02 23:05:22 crc kubenswrapper[4696]: I1202 23:05:22.650751 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9ngfp\" (UniqueName: \"kubernetes.io/projected/8441536b-8b39-4650-8fa9-3573038ffa49-kube-api-access-9ngfp\") pod \"8441536b-8b39-4650-8fa9-3573038ffa49\" (UID: \"8441536b-8b39-4650-8fa9-3573038ffa49\") " Dec 02 23:05:22 crc kubenswrapper[4696]: I1202 23:05:22.650811 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8441536b-8b39-4650-8fa9-3573038ffa49-config-data\") pod \"8441536b-8b39-4650-8fa9-3573038ffa49\" (UID: \"8441536b-8b39-4650-8fa9-3573038ffa49\") " Dec 02 23:05:22 crc kubenswrapper[4696]: I1202 23:05:22.650833 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8441536b-8b39-4650-8fa9-3573038ffa49-nova-metadata-tls-certs\") pod \"8441536b-8b39-4650-8fa9-3573038ffa49\" (UID: \"8441536b-8b39-4650-8fa9-3573038ffa49\") " Dec 02 23:05:22 crc kubenswrapper[4696]: I1202 23:05:22.651064 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8441536b-8b39-4650-8fa9-3573038ffa49-logs\") pod \"8441536b-8b39-4650-8fa9-3573038ffa49\" (UID: \"8441536b-8b39-4650-8fa9-3573038ffa49\") " Dec 02 23:05:22 crc kubenswrapper[4696]: I1202 23:05:22.651105 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8441536b-8b39-4650-8fa9-3573038ffa49-combined-ca-bundle\") pod \"8441536b-8b39-4650-8fa9-3573038ffa49\" (UID: \"8441536b-8b39-4650-8fa9-3573038ffa49\") " Dec 02 23:05:22 crc kubenswrapper[4696]: I1202 23:05:22.652309 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/8441536b-8b39-4650-8fa9-3573038ffa49-logs" (OuterVolumeSpecName: "logs") pod "8441536b-8b39-4650-8fa9-3573038ffa49" (UID: "8441536b-8b39-4650-8fa9-3573038ffa49"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:05:22 crc kubenswrapper[4696]: I1202 23:05:22.652528 4696 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8441536b-8b39-4650-8fa9-3573038ffa49-logs\") on node \"crc\" DevicePath \"\"" Dec 02 23:05:22 crc kubenswrapper[4696]: I1202 23:05:22.667440 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8441536b-8b39-4650-8fa9-3573038ffa49-kube-api-access-9ngfp" (OuterVolumeSpecName: "kube-api-access-9ngfp") pod "8441536b-8b39-4650-8fa9-3573038ffa49" (UID: "8441536b-8b39-4650-8fa9-3573038ffa49"). InnerVolumeSpecName "kube-api-access-9ngfp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:05:22 crc kubenswrapper[4696]: I1202 23:05:22.696177 4696 generic.go:334] "Generic (PLEG): container finished" podID="8441536b-8b39-4650-8fa9-3573038ffa49" containerID="84a1b63e79991ac85f68c5e9773e8a54b0e4b733fec2233174160919474f0302" exitCode=0 Dec 02 23:05:22 crc kubenswrapper[4696]: I1202 23:05:22.696240 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8441536b-8b39-4650-8fa9-3573038ffa49","Type":"ContainerDied","Data":"84a1b63e79991ac85f68c5e9773e8a54b0e4b733fec2233174160919474f0302"} Dec 02 23:05:22 crc kubenswrapper[4696]: I1202 23:05:22.696280 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8441536b-8b39-4650-8fa9-3573038ffa49","Type":"ContainerDied","Data":"545cabcad2ae396cc2a6bcfba0ea06ade06cba66a556d9b3eea64f328b075afd"} Dec 02 23:05:22 crc kubenswrapper[4696]: I1202 23:05:22.696305 4696 scope.go:117] "RemoveContainer" 
containerID="84a1b63e79991ac85f68c5e9773e8a54b0e4b733fec2233174160919474f0302" Dec 02 23:05:22 crc kubenswrapper[4696]: I1202 23:05:22.696506 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 02 23:05:22 crc kubenswrapper[4696]: I1202 23:05:22.713765 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8441536b-8b39-4650-8fa9-3573038ffa49-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8441536b-8b39-4650-8fa9-3573038ffa49" (UID: "8441536b-8b39-4650-8fa9-3573038ffa49"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:05:22 crc kubenswrapper[4696]: I1202 23:05:22.735579 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8441536b-8b39-4650-8fa9-3573038ffa49-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "8441536b-8b39-4650-8fa9-3573038ffa49" (UID: "8441536b-8b39-4650-8fa9-3573038ffa49"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:05:22 crc kubenswrapper[4696]: I1202 23:05:22.754130 4696 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8441536b-8b39-4650-8fa9-3573038ffa49-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 23:05:22 crc kubenswrapper[4696]: I1202 23:05:22.754171 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9ngfp\" (UniqueName: \"kubernetes.io/projected/8441536b-8b39-4650-8fa9-3573038ffa49-kube-api-access-9ngfp\") on node \"crc\" DevicePath \"\"" Dec 02 23:05:22 crc kubenswrapper[4696]: I1202 23:05:22.754186 4696 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8441536b-8b39-4650-8fa9-3573038ffa49-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 02 23:05:22 crc kubenswrapper[4696]: I1202 23:05:22.757211 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8441536b-8b39-4650-8fa9-3573038ffa49-config-data" (OuterVolumeSpecName: "config-data") pod "8441536b-8b39-4650-8fa9-3573038ffa49" (UID: "8441536b-8b39-4650-8fa9-3573038ffa49"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:05:22 crc kubenswrapper[4696]: I1202 23:05:22.785798 4696 scope.go:117] "RemoveContainer" containerID="808853f58cebf2446af333cdab24858f4a06ed675db45a6905cfe420059d4033" Dec 02 23:05:22 crc kubenswrapper[4696]: I1202 23:05:22.857734 4696 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8441536b-8b39-4650-8fa9-3573038ffa49-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 23:05:22 crc kubenswrapper[4696]: I1202 23:05:22.883026 4696 scope.go:117] "RemoveContainer" containerID="84a1b63e79991ac85f68c5e9773e8a54b0e4b733fec2233174160919474f0302" Dec 02 23:05:22 crc kubenswrapper[4696]: E1202 23:05:22.883638 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84a1b63e79991ac85f68c5e9773e8a54b0e4b733fec2233174160919474f0302\": container with ID starting with 84a1b63e79991ac85f68c5e9773e8a54b0e4b733fec2233174160919474f0302 not found: ID does not exist" containerID="84a1b63e79991ac85f68c5e9773e8a54b0e4b733fec2233174160919474f0302" Dec 02 23:05:22 crc kubenswrapper[4696]: I1202 23:05:22.883695 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84a1b63e79991ac85f68c5e9773e8a54b0e4b733fec2233174160919474f0302"} err="failed to get container status \"84a1b63e79991ac85f68c5e9773e8a54b0e4b733fec2233174160919474f0302\": rpc error: code = NotFound desc = could not find container \"84a1b63e79991ac85f68c5e9773e8a54b0e4b733fec2233174160919474f0302\": container with ID starting with 84a1b63e79991ac85f68c5e9773e8a54b0e4b733fec2233174160919474f0302 not found: ID does not exist" Dec 02 23:05:22 crc kubenswrapper[4696]: I1202 23:05:22.883755 4696 scope.go:117] "RemoveContainer" containerID="808853f58cebf2446af333cdab24858f4a06ed675db45a6905cfe420059d4033" Dec 02 23:05:22 crc kubenswrapper[4696]: E1202 23:05:22.884152 4696 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"808853f58cebf2446af333cdab24858f4a06ed675db45a6905cfe420059d4033\": container with ID starting with 808853f58cebf2446af333cdab24858f4a06ed675db45a6905cfe420059d4033 not found: ID does not exist" containerID="808853f58cebf2446af333cdab24858f4a06ed675db45a6905cfe420059d4033" Dec 02 23:05:22 crc kubenswrapper[4696]: I1202 23:05:22.884191 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"808853f58cebf2446af333cdab24858f4a06ed675db45a6905cfe420059d4033"} err="failed to get container status \"808853f58cebf2446af333cdab24858f4a06ed675db45a6905cfe420059d4033\": rpc error: code = NotFound desc = could not find container \"808853f58cebf2446af333cdab24858f4a06ed675db45a6905cfe420059d4033\": container with ID starting with 808853f58cebf2446af333cdab24858f4a06ed675db45a6905cfe420059d4033 not found: ID does not exist" Dec 02 23:05:23 crc kubenswrapper[4696]: I1202 23:05:23.041816 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 23:05:23 crc kubenswrapper[4696]: I1202 23:05:23.053176 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 23:05:23 crc kubenswrapper[4696]: I1202 23:05:23.091850 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 02 23:05:23 crc kubenswrapper[4696]: E1202 23:05:23.092505 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a097c9e9-bde7-443d-b08b-74859c58d517" containerName="dnsmasq-dns" Dec 02 23:05:23 crc kubenswrapper[4696]: I1202 23:05:23.092534 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="a097c9e9-bde7-443d-b08b-74859c58d517" containerName="dnsmasq-dns" Dec 02 23:05:23 crc kubenswrapper[4696]: E1202 23:05:23.092555 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8441536b-8b39-4650-8fa9-3573038ffa49" 
containerName="nova-metadata-log" Dec 02 23:05:23 crc kubenswrapper[4696]: I1202 23:05:23.092565 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="8441536b-8b39-4650-8fa9-3573038ffa49" containerName="nova-metadata-log" Dec 02 23:05:23 crc kubenswrapper[4696]: E1202 23:05:23.092587 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bb6c8ad-6b5e-4f55-8495-c2c30699ab91" containerName="nova-manage" Dec 02 23:05:23 crc kubenswrapper[4696]: I1202 23:05:23.092596 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bb6c8ad-6b5e-4f55-8495-c2c30699ab91" containerName="nova-manage" Dec 02 23:05:23 crc kubenswrapper[4696]: E1202 23:05:23.092629 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a097c9e9-bde7-443d-b08b-74859c58d517" containerName="init" Dec 02 23:05:23 crc kubenswrapper[4696]: I1202 23:05:23.092637 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="a097c9e9-bde7-443d-b08b-74859c58d517" containerName="init" Dec 02 23:05:23 crc kubenswrapper[4696]: E1202 23:05:23.092663 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8441536b-8b39-4650-8fa9-3573038ffa49" containerName="nova-metadata-metadata" Dec 02 23:05:23 crc kubenswrapper[4696]: I1202 23:05:23.092670 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="8441536b-8b39-4650-8fa9-3573038ffa49" containerName="nova-metadata-metadata" Dec 02 23:05:23 crc kubenswrapper[4696]: I1202 23:05:23.092879 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="0bb6c8ad-6b5e-4f55-8495-c2c30699ab91" containerName="nova-manage" Dec 02 23:05:23 crc kubenswrapper[4696]: I1202 23:05:23.092892 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="a097c9e9-bde7-443d-b08b-74859c58d517" containerName="dnsmasq-dns" Dec 02 23:05:23 crc kubenswrapper[4696]: I1202 23:05:23.092910 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="8441536b-8b39-4650-8fa9-3573038ffa49" 
containerName="nova-metadata-log" Dec 02 23:05:23 crc kubenswrapper[4696]: I1202 23:05:23.092930 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="8441536b-8b39-4650-8fa9-3573038ffa49" containerName="nova-metadata-metadata" Dec 02 23:05:23 crc kubenswrapper[4696]: I1202 23:05:23.094324 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 02 23:05:23 crc kubenswrapper[4696]: I1202 23:05:23.101025 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 02 23:05:23 crc kubenswrapper[4696]: I1202 23:05:23.101025 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 02 23:05:23 crc kubenswrapper[4696]: I1202 23:05:23.111680 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 23:05:23 crc kubenswrapper[4696]: I1202 23:05:23.170699 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60554cd9-644e-40c0-90c9-57610b92846e-logs\") pod \"nova-metadata-0\" (UID: \"60554cd9-644e-40c0-90c9-57610b92846e\") " pod="openstack/nova-metadata-0" Dec 02 23:05:23 crc kubenswrapper[4696]: I1202 23:05:23.170830 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60554cd9-644e-40c0-90c9-57610b92846e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"60554cd9-644e-40c0-90c9-57610b92846e\") " pod="openstack/nova-metadata-0" Dec 02 23:05:23 crc kubenswrapper[4696]: I1202 23:05:23.170887 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sn9f7\" (UniqueName: \"kubernetes.io/projected/60554cd9-644e-40c0-90c9-57610b92846e-kube-api-access-sn9f7\") pod \"nova-metadata-0\" (UID: 
\"60554cd9-644e-40c0-90c9-57610b92846e\") " pod="openstack/nova-metadata-0" Dec 02 23:05:23 crc kubenswrapper[4696]: I1202 23:05:23.171014 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/60554cd9-644e-40c0-90c9-57610b92846e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"60554cd9-644e-40c0-90c9-57610b92846e\") " pod="openstack/nova-metadata-0" Dec 02 23:05:23 crc kubenswrapper[4696]: I1202 23:05:23.171112 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60554cd9-644e-40c0-90c9-57610b92846e-config-data\") pod \"nova-metadata-0\" (UID: \"60554cd9-644e-40c0-90c9-57610b92846e\") " pod="openstack/nova-metadata-0" Dec 02 23:05:23 crc kubenswrapper[4696]: I1202 23:05:23.273806 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sn9f7\" (UniqueName: \"kubernetes.io/projected/60554cd9-644e-40c0-90c9-57610b92846e-kube-api-access-sn9f7\") pod \"nova-metadata-0\" (UID: \"60554cd9-644e-40c0-90c9-57610b92846e\") " pod="openstack/nova-metadata-0" Dec 02 23:05:23 crc kubenswrapper[4696]: I1202 23:05:23.273953 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/60554cd9-644e-40c0-90c9-57610b92846e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"60554cd9-644e-40c0-90c9-57610b92846e\") " pod="openstack/nova-metadata-0" Dec 02 23:05:23 crc kubenswrapper[4696]: I1202 23:05:23.274040 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60554cd9-644e-40c0-90c9-57610b92846e-config-data\") pod \"nova-metadata-0\" (UID: \"60554cd9-644e-40c0-90c9-57610b92846e\") " pod="openstack/nova-metadata-0" Dec 02 23:05:23 crc kubenswrapper[4696]: 
I1202 23:05:23.274140 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60554cd9-644e-40c0-90c9-57610b92846e-logs\") pod \"nova-metadata-0\" (UID: \"60554cd9-644e-40c0-90c9-57610b92846e\") " pod="openstack/nova-metadata-0" Dec 02 23:05:23 crc kubenswrapper[4696]: I1202 23:05:23.274193 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60554cd9-644e-40c0-90c9-57610b92846e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"60554cd9-644e-40c0-90c9-57610b92846e\") " pod="openstack/nova-metadata-0" Dec 02 23:05:23 crc kubenswrapper[4696]: I1202 23:05:23.274792 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60554cd9-644e-40c0-90c9-57610b92846e-logs\") pod \"nova-metadata-0\" (UID: \"60554cd9-644e-40c0-90c9-57610b92846e\") " pod="openstack/nova-metadata-0" Dec 02 23:05:23 crc kubenswrapper[4696]: I1202 23:05:23.279801 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/60554cd9-644e-40c0-90c9-57610b92846e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"60554cd9-644e-40c0-90c9-57610b92846e\") " pod="openstack/nova-metadata-0" Dec 02 23:05:23 crc kubenswrapper[4696]: I1202 23:05:23.279850 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60554cd9-644e-40c0-90c9-57610b92846e-config-data\") pod \"nova-metadata-0\" (UID: \"60554cd9-644e-40c0-90c9-57610b92846e\") " pod="openstack/nova-metadata-0" Dec 02 23:05:23 crc kubenswrapper[4696]: I1202 23:05:23.280310 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60554cd9-644e-40c0-90c9-57610b92846e-combined-ca-bundle\") pod \"nova-metadata-0\" 
(UID: \"60554cd9-644e-40c0-90c9-57610b92846e\") " pod="openstack/nova-metadata-0" Dec 02 23:05:23 crc kubenswrapper[4696]: I1202 23:05:23.293991 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sn9f7\" (UniqueName: \"kubernetes.io/projected/60554cd9-644e-40c0-90c9-57610b92846e-kube-api-access-sn9f7\") pod \"nova-metadata-0\" (UID: \"60554cd9-644e-40c0-90c9-57610b92846e\") " pod="openstack/nova-metadata-0" Dec 02 23:05:23 crc kubenswrapper[4696]: I1202 23:05:23.419960 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 02 23:05:23 crc kubenswrapper[4696]: I1202 23:05:23.453849 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8441536b-8b39-4650-8fa9-3573038ffa49" path="/var/lib/kubelet/pods/8441536b-8b39-4650-8fa9-3573038ffa49/volumes" Dec 02 23:05:23 crc kubenswrapper[4696]: I1202 23:05:23.716222 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 02 23:05:23 crc kubenswrapper[4696]: I1202 23:05:23.731253 4696 generic.go:334] "Generic (PLEG): container finished" podID="2cd25eba-1ef0-46ce-8340-4f342fe56530" containerID="db065061f0ccbdd2fed3ee9b286deb800b688c695e835e73822dff9d457d5a99" exitCode=0 Dec 02 23:05:23 crc kubenswrapper[4696]: I1202 23:05:23.731390 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 02 23:05:23 crc kubenswrapper[4696]: I1202 23:05:23.731429 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2cd25eba-1ef0-46ce-8340-4f342fe56530","Type":"ContainerDied","Data":"db065061f0ccbdd2fed3ee9b286deb800b688c695e835e73822dff9d457d5a99"} Dec 02 23:05:23 crc kubenswrapper[4696]: I1202 23:05:23.732062 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2cd25eba-1ef0-46ce-8340-4f342fe56530","Type":"ContainerDied","Data":"310caa5ae63dce372b8df9993b4be6ea2fcc5d3490debe0c6f8836ab1e1cab13"} Dec 02 23:05:23 crc kubenswrapper[4696]: I1202 23:05:23.732107 4696 scope.go:117] "RemoveContainer" containerID="db065061f0ccbdd2fed3ee9b286deb800b688c695e835e73822dff9d457d5a99" Dec 02 23:05:23 crc kubenswrapper[4696]: I1202 23:05:23.769564 4696 scope.go:117] "RemoveContainer" containerID="9b6c1c68762b2d78489edfe6b5ec93492171c3ea620041d7387c377dca5d50ad" Dec 02 23:05:23 crc kubenswrapper[4696]: I1202 23:05:23.788027 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cd25eba-1ef0-46ce-8340-4f342fe56530-combined-ca-bundle\") pod \"2cd25eba-1ef0-46ce-8340-4f342fe56530\" (UID: \"2cd25eba-1ef0-46ce-8340-4f342fe56530\") " Dec 02 23:05:23 crc kubenswrapper[4696]: I1202 23:05:23.788101 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cd25eba-1ef0-46ce-8340-4f342fe56530-config-data\") pod \"2cd25eba-1ef0-46ce-8340-4f342fe56530\" (UID: \"2cd25eba-1ef0-46ce-8340-4f342fe56530\") " Dec 02 23:05:23 crc kubenswrapper[4696]: I1202 23:05:23.788142 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j2bcn\" (UniqueName: \"kubernetes.io/projected/2cd25eba-1ef0-46ce-8340-4f342fe56530-kube-api-access-j2bcn\") pod 
\"2cd25eba-1ef0-46ce-8340-4f342fe56530\" (UID: \"2cd25eba-1ef0-46ce-8340-4f342fe56530\") " Dec 02 23:05:23 crc kubenswrapper[4696]: I1202 23:05:23.788277 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2cd25eba-1ef0-46ce-8340-4f342fe56530-public-tls-certs\") pod \"2cd25eba-1ef0-46ce-8340-4f342fe56530\" (UID: \"2cd25eba-1ef0-46ce-8340-4f342fe56530\") " Dec 02 23:05:23 crc kubenswrapper[4696]: I1202 23:05:23.788372 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2cd25eba-1ef0-46ce-8340-4f342fe56530-internal-tls-certs\") pod \"2cd25eba-1ef0-46ce-8340-4f342fe56530\" (UID: \"2cd25eba-1ef0-46ce-8340-4f342fe56530\") " Dec 02 23:05:23 crc kubenswrapper[4696]: I1202 23:05:23.788561 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2cd25eba-1ef0-46ce-8340-4f342fe56530-logs\") pod \"2cd25eba-1ef0-46ce-8340-4f342fe56530\" (UID: \"2cd25eba-1ef0-46ce-8340-4f342fe56530\") " Dec 02 23:05:23 crc kubenswrapper[4696]: I1202 23:05:23.789520 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2cd25eba-1ef0-46ce-8340-4f342fe56530-logs" (OuterVolumeSpecName: "logs") pod "2cd25eba-1ef0-46ce-8340-4f342fe56530" (UID: "2cd25eba-1ef0-46ce-8340-4f342fe56530"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:05:23 crc kubenswrapper[4696]: I1202 23:05:23.789959 4696 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2cd25eba-1ef0-46ce-8340-4f342fe56530-logs\") on node \"crc\" DevicePath \"\"" Dec 02 23:05:23 crc kubenswrapper[4696]: I1202 23:05:23.794198 4696 scope.go:117] "RemoveContainer" containerID="db065061f0ccbdd2fed3ee9b286deb800b688c695e835e73822dff9d457d5a99" Dec 02 23:05:23 crc kubenswrapper[4696]: E1202 23:05:23.794952 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db065061f0ccbdd2fed3ee9b286deb800b688c695e835e73822dff9d457d5a99\": container with ID starting with db065061f0ccbdd2fed3ee9b286deb800b688c695e835e73822dff9d457d5a99 not found: ID does not exist" containerID="db065061f0ccbdd2fed3ee9b286deb800b688c695e835e73822dff9d457d5a99" Dec 02 23:05:23 crc kubenswrapper[4696]: I1202 23:05:23.794997 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db065061f0ccbdd2fed3ee9b286deb800b688c695e835e73822dff9d457d5a99"} err="failed to get container status \"db065061f0ccbdd2fed3ee9b286deb800b688c695e835e73822dff9d457d5a99\": rpc error: code = NotFound desc = could not find container \"db065061f0ccbdd2fed3ee9b286deb800b688c695e835e73822dff9d457d5a99\": container with ID starting with db065061f0ccbdd2fed3ee9b286deb800b688c695e835e73822dff9d457d5a99 not found: ID does not exist" Dec 02 23:05:23 crc kubenswrapper[4696]: I1202 23:05:23.795029 4696 scope.go:117] "RemoveContainer" containerID="9b6c1c68762b2d78489edfe6b5ec93492171c3ea620041d7387c377dca5d50ad" Dec 02 23:05:23 crc kubenswrapper[4696]: E1202 23:05:23.795409 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b6c1c68762b2d78489edfe6b5ec93492171c3ea620041d7387c377dca5d50ad\": container with ID 
starting with 9b6c1c68762b2d78489edfe6b5ec93492171c3ea620041d7387c377dca5d50ad not found: ID does not exist" containerID="9b6c1c68762b2d78489edfe6b5ec93492171c3ea620041d7387c377dca5d50ad" Dec 02 23:05:23 crc kubenswrapper[4696]: I1202 23:05:23.795450 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b6c1c68762b2d78489edfe6b5ec93492171c3ea620041d7387c377dca5d50ad"} err="failed to get container status \"9b6c1c68762b2d78489edfe6b5ec93492171c3ea620041d7387c377dca5d50ad\": rpc error: code = NotFound desc = could not find container \"9b6c1c68762b2d78489edfe6b5ec93492171c3ea620041d7387c377dca5d50ad\": container with ID starting with 9b6c1c68762b2d78489edfe6b5ec93492171c3ea620041d7387c377dca5d50ad not found: ID does not exist" Dec 02 23:05:23 crc kubenswrapper[4696]: I1202 23:05:23.800386 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2cd25eba-1ef0-46ce-8340-4f342fe56530-kube-api-access-j2bcn" (OuterVolumeSpecName: "kube-api-access-j2bcn") pod "2cd25eba-1ef0-46ce-8340-4f342fe56530" (UID: "2cd25eba-1ef0-46ce-8340-4f342fe56530"). InnerVolumeSpecName "kube-api-access-j2bcn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:05:23 crc kubenswrapper[4696]: I1202 23:05:23.828862 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cd25eba-1ef0-46ce-8340-4f342fe56530-config-data" (OuterVolumeSpecName: "config-data") pod "2cd25eba-1ef0-46ce-8340-4f342fe56530" (UID: "2cd25eba-1ef0-46ce-8340-4f342fe56530"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:05:23 crc kubenswrapper[4696]: I1202 23:05:23.832612 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cd25eba-1ef0-46ce-8340-4f342fe56530-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2cd25eba-1ef0-46ce-8340-4f342fe56530" (UID: "2cd25eba-1ef0-46ce-8340-4f342fe56530"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:05:23 crc kubenswrapper[4696]: I1202 23:05:23.860834 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cd25eba-1ef0-46ce-8340-4f342fe56530-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "2cd25eba-1ef0-46ce-8340-4f342fe56530" (UID: "2cd25eba-1ef0-46ce-8340-4f342fe56530"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:05:23 crc kubenswrapper[4696]: I1202 23:05:23.865332 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cd25eba-1ef0-46ce-8340-4f342fe56530-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "2cd25eba-1ef0-46ce-8340-4f342fe56530" (UID: "2cd25eba-1ef0-46ce-8340-4f342fe56530"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:05:23 crc kubenswrapper[4696]: I1202 23:05:23.891780 4696 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cd25eba-1ef0-46ce-8340-4f342fe56530-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 23:05:23 crc kubenswrapper[4696]: I1202 23:05:23.892067 4696 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cd25eba-1ef0-46ce-8340-4f342fe56530-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 23:05:23 crc kubenswrapper[4696]: I1202 23:05:23.892163 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j2bcn\" (UniqueName: \"kubernetes.io/projected/2cd25eba-1ef0-46ce-8340-4f342fe56530-kube-api-access-j2bcn\") on node \"crc\" DevicePath \"\"" Dec 02 23:05:23 crc kubenswrapper[4696]: I1202 23:05:23.892233 4696 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2cd25eba-1ef0-46ce-8340-4f342fe56530-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 02 23:05:23 crc kubenswrapper[4696]: I1202 23:05:23.892300 4696 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2cd25eba-1ef0-46ce-8340-4f342fe56530-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 02 23:05:24 crc kubenswrapper[4696]: W1202 23:05:24.036500 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod60554cd9_644e_40c0_90c9_57610b92846e.slice/crio-fa8822e4bb33011530dada6e3bc44a632371a1c92118fde893a616b7367ea6eb WatchSource:0}: Error finding container fa8822e4bb33011530dada6e3bc44a632371a1c92118fde893a616b7367ea6eb: Status 404 returned error can't find the container with id fa8822e4bb33011530dada6e3bc44a632371a1c92118fde893a616b7367ea6eb Dec 02 23:05:24 crc kubenswrapper[4696]: I1202 
23:05:24.037777 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 23:05:24 crc kubenswrapper[4696]: I1202 23:05:24.086132 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 02 23:05:24 crc kubenswrapper[4696]: I1202 23:05:24.102808 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 02 23:05:24 crc kubenswrapper[4696]: I1202 23:05:24.120802 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 02 23:05:24 crc kubenswrapper[4696]: E1202 23:05:24.121481 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cd25eba-1ef0-46ce-8340-4f342fe56530" containerName="nova-api-api" Dec 02 23:05:24 crc kubenswrapper[4696]: I1202 23:05:24.121498 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cd25eba-1ef0-46ce-8340-4f342fe56530" containerName="nova-api-api" Dec 02 23:05:24 crc kubenswrapper[4696]: E1202 23:05:24.121515 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cd25eba-1ef0-46ce-8340-4f342fe56530" containerName="nova-api-log" Dec 02 23:05:24 crc kubenswrapper[4696]: I1202 23:05:24.121524 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cd25eba-1ef0-46ce-8340-4f342fe56530" containerName="nova-api-log" Dec 02 23:05:24 crc kubenswrapper[4696]: I1202 23:05:24.121774 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="2cd25eba-1ef0-46ce-8340-4f342fe56530" containerName="nova-api-log" Dec 02 23:05:24 crc kubenswrapper[4696]: I1202 23:05:24.121799 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="2cd25eba-1ef0-46ce-8340-4f342fe56530" containerName="nova-api-api" Dec 02 23:05:24 crc kubenswrapper[4696]: I1202 23:05:24.123097 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 02 23:05:24 crc kubenswrapper[4696]: I1202 23:05:24.127118 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 02 23:05:24 crc kubenswrapper[4696]: I1202 23:05:24.127288 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 02 23:05:24 crc kubenswrapper[4696]: I1202 23:05:24.128862 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 02 23:05:24 crc kubenswrapper[4696]: I1202 23:05:24.140055 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 02 23:05:24 crc kubenswrapper[4696]: I1202 23:05:24.199625 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6cb033f1-9348-4822-b022-daef2e06af49-internal-tls-certs\") pod \"nova-api-0\" (UID: \"6cb033f1-9348-4822-b022-daef2e06af49\") " pod="openstack/nova-api-0" Dec 02 23:05:24 crc kubenswrapper[4696]: I1202 23:05:24.199683 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cb033f1-9348-4822-b022-daef2e06af49-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6cb033f1-9348-4822-b022-daef2e06af49\") " pod="openstack/nova-api-0" Dec 02 23:05:24 crc kubenswrapper[4696]: I1202 23:05:24.199713 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6cb033f1-9348-4822-b022-daef2e06af49-public-tls-certs\") pod \"nova-api-0\" (UID: \"6cb033f1-9348-4822-b022-daef2e06af49\") " pod="openstack/nova-api-0" Dec 02 23:05:24 crc kubenswrapper[4696]: I1202 23:05:24.199766 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/6cb033f1-9348-4822-b022-daef2e06af49-config-data\") pod \"nova-api-0\" (UID: \"6cb033f1-9348-4822-b022-daef2e06af49\") " pod="openstack/nova-api-0" Dec 02 23:05:24 crc kubenswrapper[4696]: I1202 23:05:24.200092 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ck5t8\" (UniqueName: \"kubernetes.io/projected/6cb033f1-9348-4822-b022-daef2e06af49-kube-api-access-ck5t8\") pod \"nova-api-0\" (UID: \"6cb033f1-9348-4822-b022-daef2e06af49\") " pod="openstack/nova-api-0" Dec 02 23:05:24 crc kubenswrapper[4696]: I1202 23:05:24.200568 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6cb033f1-9348-4822-b022-daef2e06af49-logs\") pod \"nova-api-0\" (UID: \"6cb033f1-9348-4822-b022-daef2e06af49\") " pod="openstack/nova-api-0" Dec 02 23:05:24 crc kubenswrapper[4696]: I1202 23:05:24.303031 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6cb033f1-9348-4822-b022-daef2e06af49-logs\") pod \"nova-api-0\" (UID: \"6cb033f1-9348-4822-b022-daef2e06af49\") " pod="openstack/nova-api-0" Dec 02 23:05:24 crc kubenswrapper[4696]: I1202 23:05:24.303163 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6cb033f1-9348-4822-b022-daef2e06af49-internal-tls-certs\") pod \"nova-api-0\" (UID: \"6cb033f1-9348-4822-b022-daef2e06af49\") " pod="openstack/nova-api-0" Dec 02 23:05:24 crc kubenswrapper[4696]: I1202 23:05:24.303232 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cb033f1-9348-4822-b022-daef2e06af49-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6cb033f1-9348-4822-b022-daef2e06af49\") " pod="openstack/nova-api-0" Dec 02 
23:05:24 crc kubenswrapper[4696]: I1202 23:05:24.303514 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6cb033f1-9348-4822-b022-daef2e06af49-logs\") pod \"nova-api-0\" (UID: \"6cb033f1-9348-4822-b022-daef2e06af49\") " pod="openstack/nova-api-0" Dec 02 23:05:24 crc kubenswrapper[4696]: I1202 23:05:24.303268 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6cb033f1-9348-4822-b022-daef2e06af49-public-tls-certs\") pod \"nova-api-0\" (UID: \"6cb033f1-9348-4822-b022-daef2e06af49\") " pod="openstack/nova-api-0" Dec 02 23:05:24 crc kubenswrapper[4696]: I1202 23:05:24.304407 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6cb033f1-9348-4822-b022-daef2e06af49-config-data\") pod \"nova-api-0\" (UID: \"6cb033f1-9348-4822-b022-daef2e06af49\") " pod="openstack/nova-api-0" Dec 02 23:05:24 crc kubenswrapper[4696]: I1202 23:05:24.304489 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ck5t8\" (UniqueName: \"kubernetes.io/projected/6cb033f1-9348-4822-b022-daef2e06af49-kube-api-access-ck5t8\") pod \"nova-api-0\" (UID: \"6cb033f1-9348-4822-b022-daef2e06af49\") " pod="openstack/nova-api-0" Dec 02 23:05:24 crc kubenswrapper[4696]: I1202 23:05:24.308707 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cb033f1-9348-4822-b022-daef2e06af49-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6cb033f1-9348-4822-b022-daef2e06af49\") " pod="openstack/nova-api-0" Dec 02 23:05:24 crc kubenswrapper[4696]: I1202 23:05:24.309852 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6cb033f1-9348-4822-b022-daef2e06af49-public-tls-certs\") pod 
\"nova-api-0\" (UID: \"6cb033f1-9348-4822-b022-daef2e06af49\") " pod="openstack/nova-api-0" Dec 02 23:05:24 crc kubenswrapper[4696]: I1202 23:05:24.318051 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6cb033f1-9348-4822-b022-daef2e06af49-internal-tls-certs\") pod \"nova-api-0\" (UID: \"6cb033f1-9348-4822-b022-daef2e06af49\") " pod="openstack/nova-api-0" Dec 02 23:05:24 crc kubenswrapper[4696]: I1202 23:05:24.319573 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6cb033f1-9348-4822-b022-daef2e06af49-config-data\") pod \"nova-api-0\" (UID: \"6cb033f1-9348-4822-b022-daef2e06af49\") " pod="openstack/nova-api-0" Dec 02 23:05:24 crc kubenswrapper[4696]: I1202 23:05:24.323661 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ck5t8\" (UniqueName: \"kubernetes.io/projected/6cb033f1-9348-4822-b022-daef2e06af49-kube-api-access-ck5t8\") pod \"nova-api-0\" (UID: \"6cb033f1-9348-4822-b022-daef2e06af49\") " pod="openstack/nova-api-0" Dec 02 23:05:24 crc kubenswrapper[4696]: I1202 23:05:24.465055 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 02 23:05:24 crc kubenswrapper[4696]: I1202 23:05:24.757662 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"60554cd9-644e-40c0-90c9-57610b92846e","Type":"ContainerStarted","Data":"46f0b71e0917f2d30f43ee03bf883fb7317d864ac33d6a3e5281ff993e6d1baa"} Dec 02 23:05:24 crc kubenswrapper[4696]: I1202 23:05:24.758352 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"60554cd9-644e-40c0-90c9-57610b92846e","Type":"ContainerStarted","Data":"fa8822e4bb33011530dada6e3bc44a632371a1c92118fde893a616b7367ea6eb"} Dec 02 23:05:24 crc kubenswrapper[4696]: I1202 23:05:24.762328 4696 generic.go:334] "Generic (PLEG): container finished" podID="c1e7bee2-7394-4980-9f24-e50448cda21a" containerID="33c333a603b93067c6fa6930dee826ee4c55e1cfae229e32f68d4d4eb427b0d2" exitCode=0 Dec 02 23:05:24 crc kubenswrapper[4696]: I1202 23:05:24.762440 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c1e7bee2-7394-4980-9f24-e50448cda21a","Type":"ContainerDied","Data":"33c333a603b93067c6fa6930dee826ee4c55e1cfae229e32f68d4d4eb427b0d2"} Dec 02 23:05:25 crc kubenswrapper[4696]: I1202 23:05:25.007626 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 02 23:05:25 crc kubenswrapper[4696]: E1202 23:05:25.411321 4696 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 33c333a603b93067c6fa6930dee826ee4c55e1cfae229e32f68d4d4eb427b0d2 is running failed: container process not found" containerID="33c333a603b93067c6fa6930dee826ee4c55e1cfae229e32f68d4d4eb427b0d2" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 02 23:05:25 crc kubenswrapper[4696]: E1202 23:05:25.412611 4696 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not 
created or running: checking if PID of 33c333a603b93067c6fa6930dee826ee4c55e1cfae229e32f68d4d4eb427b0d2 is running failed: container process not found" containerID="33c333a603b93067c6fa6930dee826ee4c55e1cfae229e32f68d4d4eb427b0d2" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 02 23:05:25 crc kubenswrapper[4696]: E1202 23:05:25.413081 4696 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 33c333a603b93067c6fa6930dee826ee4c55e1cfae229e32f68d4d4eb427b0d2 is running failed: container process not found" containerID="33c333a603b93067c6fa6930dee826ee4c55e1cfae229e32f68d4d4eb427b0d2" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 02 23:05:25 crc kubenswrapper[4696]: E1202 23:05:25.413137 4696 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 33c333a603b93067c6fa6930dee826ee4c55e1cfae229e32f68d4d4eb427b0d2 is running failed: container process not found" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="c1e7bee2-7394-4980-9f24-e50448cda21a" containerName="nova-scheduler-scheduler" Dec 02 23:05:25 crc kubenswrapper[4696]: I1202 23:05:25.446930 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2cd25eba-1ef0-46ce-8340-4f342fe56530" path="/var/lib/kubelet/pods/2cd25eba-1ef0-46ce-8340-4f342fe56530/volumes" Dec 02 23:05:26 crc kubenswrapper[4696]: I1202 23:05:25.647112 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 02 23:05:26 crc kubenswrapper[4696]: I1202 23:05:25.735205 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1e7bee2-7394-4980-9f24-e50448cda21a-combined-ca-bundle\") pod \"c1e7bee2-7394-4980-9f24-e50448cda21a\" (UID: \"c1e7bee2-7394-4980-9f24-e50448cda21a\") " Dec 02 23:05:26 crc kubenswrapper[4696]: I1202 23:05:25.735815 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1e7bee2-7394-4980-9f24-e50448cda21a-config-data\") pod \"c1e7bee2-7394-4980-9f24-e50448cda21a\" (UID: \"c1e7bee2-7394-4980-9f24-e50448cda21a\") " Dec 02 23:05:26 crc kubenswrapper[4696]: I1202 23:05:25.735947 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qq2ph\" (UniqueName: \"kubernetes.io/projected/c1e7bee2-7394-4980-9f24-e50448cda21a-kube-api-access-qq2ph\") pod \"c1e7bee2-7394-4980-9f24-e50448cda21a\" (UID: \"c1e7bee2-7394-4980-9f24-e50448cda21a\") " Dec 02 23:05:26 crc kubenswrapper[4696]: I1202 23:05:25.741421 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1e7bee2-7394-4980-9f24-e50448cda21a-kube-api-access-qq2ph" (OuterVolumeSpecName: "kube-api-access-qq2ph") pod "c1e7bee2-7394-4980-9f24-e50448cda21a" (UID: "c1e7bee2-7394-4980-9f24-e50448cda21a"). InnerVolumeSpecName "kube-api-access-qq2ph". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:05:26 crc kubenswrapper[4696]: I1202 23:05:25.768092 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1e7bee2-7394-4980-9f24-e50448cda21a-config-data" (OuterVolumeSpecName: "config-data") pod "c1e7bee2-7394-4980-9f24-e50448cda21a" (UID: "c1e7bee2-7394-4980-9f24-e50448cda21a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:05:26 crc kubenswrapper[4696]: I1202 23:05:25.801816 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1e7bee2-7394-4980-9f24-e50448cda21a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c1e7bee2-7394-4980-9f24-e50448cda21a" (UID: "c1e7bee2-7394-4980-9f24-e50448cda21a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:05:26 crc kubenswrapper[4696]: I1202 23:05:25.803121 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"60554cd9-644e-40c0-90c9-57610b92846e","Type":"ContainerStarted","Data":"60347d2cd47edb9092676033bd8f05fcb2f2a026e5e9c337c848136d2d8c1498"} Dec 02 23:05:26 crc kubenswrapper[4696]: I1202 23:05:25.810536 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6cb033f1-9348-4822-b022-daef2e06af49","Type":"ContainerStarted","Data":"9b660acdf4c091748231d213e7239bce5ea9517a52d166434e1437888abce7e2"} Dec 02 23:05:26 crc kubenswrapper[4696]: I1202 23:05:25.810619 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6cb033f1-9348-4822-b022-daef2e06af49","Type":"ContainerStarted","Data":"6fc7af156a2d26a6cd484bfcbaf3175cb2aad05cfe174a16538b0f052a9ff72f"} Dec 02 23:05:26 crc kubenswrapper[4696]: I1202 23:05:25.810632 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6cb033f1-9348-4822-b022-daef2e06af49","Type":"ContainerStarted","Data":"33ef2ac3db94a442b63bd124183fbbdbdf20b99045aa3aab1c3517832ce7ba3c"} Dec 02 23:05:26 crc kubenswrapper[4696]: I1202 23:05:25.817109 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c1e7bee2-7394-4980-9f24-e50448cda21a","Type":"ContainerDied","Data":"fd1bf7d79edfa1ceafc14004e346c5b9c7558129ee3050aaa28f92554097e0f0"} Dec 
02 23:05:26 crc kubenswrapper[4696]: I1202 23:05:25.817582 4696 scope.go:117] "RemoveContainer" containerID="33c333a603b93067c6fa6930dee826ee4c55e1cfae229e32f68d4d4eb427b0d2" Dec 02 23:05:26 crc kubenswrapper[4696]: I1202 23:05:25.817322 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 02 23:05:26 crc kubenswrapper[4696]: I1202 23:05:25.832604 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.832575625 podStartE2EDuration="2.832575625s" podCreationTimestamp="2025-12-02 23:05:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:05:25.832241586 +0000 UTC m=+1388.712921587" watchObservedRunningTime="2025-12-02 23:05:25.832575625 +0000 UTC m=+1388.713255636" Dec 02 23:05:26 crc kubenswrapper[4696]: I1202 23:05:25.853693 4696 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1e7bee2-7394-4980-9f24-e50448cda21a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 23:05:26 crc kubenswrapper[4696]: I1202 23:05:25.853784 4696 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1e7bee2-7394-4980-9f24-e50448cda21a-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 23:05:26 crc kubenswrapper[4696]: I1202 23:05:25.853800 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qq2ph\" (UniqueName: \"kubernetes.io/projected/c1e7bee2-7394-4980-9f24-e50448cda21a-kube-api-access-qq2ph\") on node \"crc\" DevicePath \"\"" Dec 02 23:05:26 crc kubenswrapper[4696]: I1202 23:05:25.883569 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=1.883547424 podStartE2EDuration="1.883547424s" podCreationTimestamp="2025-12-02 23:05:24 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:05:25.870364109 +0000 UTC m=+1388.751044110" watchObservedRunningTime="2025-12-02 23:05:25.883547424 +0000 UTC m=+1388.764227425" Dec 02 23:05:26 crc kubenswrapper[4696]: I1202 23:05:25.895472 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 23:05:26 crc kubenswrapper[4696]: I1202 23:05:25.905575 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 23:05:26 crc kubenswrapper[4696]: I1202 23:05:25.915230 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 23:05:26 crc kubenswrapper[4696]: E1202 23:05:25.915736 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1e7bee2-7394-4980-9f24-e50448cda21a" containerName="nova-scheduler-scheduler" Dec 02 23:05:26 crc kubenswrapper[4696]: I1202 23:05:25.915766 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1e7bee2-7394-4980-9f24-e50448cda21a" containerName="nova-scheduler-scheduler" Dec 02 23:05:26 crc kubenswrapper[4696]: I1202 23:05:25.916053 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1e7bee2-7394-4980-9f24-e50448cda21a" containerName="nova-scheduler-scheduler" Dec 02 23:05:26 crc kubenswrapper[4696]: I1202 23:05:25.916860 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 02 23:05:26 crc kubenswrapper[4696]: I1202 23:05:25.927853 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 02 23:05:26 crc kubenswrapper[4696]: I1202 23:05:25.929069 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 23:05:26 crc kubenswrapper[4696]: I1202 23:05:26.058155 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90b9eaeb-1865-4418-8665-1e65f0fb8151-config-data\") pod \"nova-scheduler-0\" (UID: \"90b9eaeb-1865-4418-8665-1e65f0fb8151\") " pod="openstack/nova-scheduler-0" Dec 02 23:05:26 crc kubenswrapper[4696]: I1202 23:05:26.058248 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4prv2\" (UniqueName: \"kubernetes.io/projected/90b9eaeb-1865-4418-8665-1e65f0fb8151-kube-api-access-4prv2\") pod \"nova-scheduler-0\" (UID: \"90b9eaeb-1865-4418-8665-1e65f0fb8151\") " pod="openstack/nova-scheduler-0" Dec 02 23:05:26 crc kubenswrapper[4696]: I1202 23:05:26.058950 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90b9eaeb-1865-4418-8665-1e65f0fb8151-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"90b9eaeb-1865-4418-8665-1e65f0fb8151\") " pod="openstack/nova-scheduler-0" Dec 02 23:05:26 crc kubenswrapper[4696]: I1202 23:05:26.162143 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90b9eaeb-1865-4418-8665-1e65f0fb8151-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"90b9eaeb-1865-4418-8665-1e65f0fb8151\") " pod="openstack/nova-scheduler-0" Dec 02 23:05:26 crc kubenswrapper[4696]: I1202 23:05:26.162294 4696 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90b9eaeb-1865-4418-8665-1e65f0fb8151-config-data\") pod \"nova-scheduler-0\" (UID: \"90b9eaeb-1865-4418-8665-1e65f0fb8151\") " pod="openstack/nova-scheduler-0" Dec 02 23:05:26 crc kubenswrapper[4696]: I1202 23:05:26.162337 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4prv2\" (UniqueName: \"kubernetes.io/projected/90b9eaeb-1865-4418-8665-1e65f0fb8151-kube-api-access-4prv2\") pod \"nova-scheduler-0\" (UID: \"90b9eaeb-1865-4418-8665-1e65f0fb8151\") " pod="openstack/nova-scheduler-0" Dec 02 23:05:26 crc kubenswrapper[4696]: I1202 23:05:26.166169 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90b9eaeb-1865-4418-8665-1e65f0fb8151-config-data\") pod \"nova-scheduler-0\" (UID: \"90b9eaeb-1865-4418-8665-1e65f0fb8151\") " pod="openstack/nova-scheduler-0" Dec 02 23:05:26 crc kubenswrapper[4696]: I1202 23:05:26.167991 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90b9eaeb-1865-4418-8665-1e65f0fb8151-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"90b9eaeb-1865-4418-8665-1e65f0fb8151\") " pod="openstack/nova-scheduler-0" Dec 02 23:05:26 crc kubenswrapper[4696]: I1202 23:05:26.182111 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4prv2\" (UniqueName: \"kubernetes.io/projected/90b9eaeb-1865-4418-8665-1e65f0fb8151-kube-api-access-4prv2\") pod \"nova-scheduler-0\" (UID: \"90b9eaeb-1865-4418-8665-1e65f0fb8151\") " pod="openstack/nova-scheduler-0" Dec 02 23:05:26 crc kubenswrapper[4696]: I1202 23:05:26.241308 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 02 23:05:27 crc kubenswrapper[4696]: I1202 23:05:27.237446 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 23:05:27 crc kubenswrapper[4696]: W1202 23:05:27.250141 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod90b9eaeb_1865_4418_8665_1e65f0fb8151.slice/crio-6617c42217ca085f99d8d443a4ad7be84e8f5adc16e5bb2ffb0bd8b796a9a615 WatchSource:0}: Error finding container 6617c42217ca085f99d8d443a4ad7be84e8f5adc16e5bb2ffb0bd8b796a9a615: Status 404 returned error can't find the container with id 6617c42217ca085f99d8d443a4ad7be84e8f5adc16e5bb2ffb0bd8b796a9a615 Dec 02 23:05:27 crc kubenswrapper[4696]: I1202 23:05:27.445202 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1e7bee2-7394-4980-9f24-e50448cda21a" path="/var/lib/kubelet/pods/c1e7bee2-7394-4980-9f24-e50448cda21a/volumes" Dec 02 23:05:27 crc kubenswrapper[4696]: I1202 23:05:27.844689 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"90b9eaeb-1865-4418-8665-1e65f0fb8151","Type":"ContainerStarted","Data":"ab769d9e618f3b9ebdf5b0a0de40680a8f5c05fdb7c0bf2cb817047124f15d2b"} Dec 02 23:05:27 crc kubenswrapper[4696]: I1202 23:05:27.845250 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"90b9eaeb-1865-4418-8665-1e65f0fb8151","Type":"ContainerStarted","Data":"6617c42217ca085f99d8d443a4ad7be84e8f5adc16e5bb2ffb0bd8b796a9a615"} Dec 02 23:05:27 crc kubenswrapper[4696]: I1202 23:05:27.868250 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.868211775 podStartE2EDuration="2.868211775s" podCreationTimestamp="2025-12-02 23:05:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-12-02 23:05:27.863347577 +0000 UTC m=+1390.744027608" watchObservedRunningTime="2025-12-02 23:05:27.868211775 +0000 UTC m=+1390.748891806" Dec 02 23:05:28 crc kubenswrapper[4696]: I1202 23:05:28.421914 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 02 23:05:28 crc kubenswrapper[4696]: I1202 23:05:28.421986 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 02 23:05:31 crc kubenswrapper[4696]: I1202 23:05:31.242219 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 02 23:05:33 crc kubenswrapper[4696]: I1202 23:05:33.421959 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 02 23:05:33 crc kubenswrapper[4696]: I1202 23:05:33.422825 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 02 23:05:34 crc kubenswrapper[4696]: I1202 23:05:34.437042 4696 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="60554cd9-644e-40c0-90c9-57610b92846e" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.225:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 02 23:05:34 crc kubenswrapper[4696]: I1202 23:05:34.450967 4696 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="60554cd9-644e-40c0-90c9-57610b92846e" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.225:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 02 23:05:34 crc kubenswrapper[4696]: I1202 23:05:34.465546 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 02 23:05:34 crc kubenswrapper[4696]: I1202 23:05:34.465608 4696 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 02 23:05:35 crc kubenswrapper[4696]: I1202 23:05:35.487100 4696 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="6cb033f1-9348-4822-b022-daef2e06af49" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.226:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 02 23:05:35 crc kubenswrapper[4696]: I1202 23:05:35.487087 4696 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="6cb033f1-9348-4822-b022-daef2e06af49" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.226:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 02 23:05:36 crc kubenswrapper[4696]: I1202 23:05:36.241992 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 02 23:05:36 crc kubenswrapper[4696]: I1202 23:05:36.297342 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 02 23:05:37 crc kubenswrapper[4696]: I1202 23:05:37.034548 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 02 23:05:38 crc kubenswrapper[4696]: I1202 23:05:38.882441 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 02 23:05:39 crc kubenswrapper[4696]: I1202 23:05:39.338871 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-pwbvq"] Dec 02 23:05:39 crc kubenswrapper[4696]: I1202 23:05:39.342006 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pwbvq" Dec 02 23:05:39 crc kubenswrapper[4696]: I1202 23:05:39.355278 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pwbvq"] Dec 02 23:05:39 crc kubenswrapper[4696]: I1202 23:05:39.379620 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e798188-37b3-4546-b39c-2f6119b56a50-utilities\") pod \"redhat-operators-pwbvq\" (UID: \"5e798188-37b3-4546-b39c-2f6119b56a50\") " pod="openshift-marketplace/redhat-operators-pwbvq" Dec 02 23:05:39 crc kubenswrapper[4696]: I1202 23:05:39.380125 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e798188-37b3-4546-b39c-2f6119b56a50-catalog-content\") pod \"redhat-operators-pwbvq\" (UID: \"5e798188-37b3-4546-b39c-2f6119b56a50\") " pod="openshift-marketplace/redhat-operators-pwbvq" Dec 02 23:05:39 crc kubenswrapper[4696]: I1202 23:05:39.380217 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kcff\" (UniqueName: \"kubernetes.io/projected/5e798188-37b3-4546-b39c-2f6119b56a50-kube-api-access-6kcff\") pod \"redhat-operators-pwbvq\" (UID: \"5e798188-37b3-4546-b39c-2f6119b56a50\") " pod="openshift-marketplace/redhat-operators-pwbvq" Dec 02 23:05:39 crc kubenswrapper[4696]: I1202 23:05:39.482366 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e798188-37b3-4546-b39c-2f6119b56a50-catalog-content\") pod \"redhat-operators-pwbvq\" (UID: \"5e798188-37b3-4546-b39c-2f6119b56a50\") " pod="openshift-marketplace/redhat-operators-pwbvq" Dec 02 23:05:39 crc kubenswrapper[4696]: I1202 23:05:39.482435 4696 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-6kcff\" (UniqueName: \"kubernetes.io/projected/5e798188-37b3-4546-b39c-2f6119b56a50-kube-api-access-6kcff\") pod \"redhat-operators-pwbvq\" (UID: \"5e798188-37b3-4546-b39c-2f6119b56a50\") " pod="openshift-marketplace/redhat-operators-pwbvq" Dec 02 23:05:39 crc kubenswrapper[4696]: I1202 23:05:39.482518 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e798188-37b3-4546-b39c-2f6119b56a50-utilities\") pod \"redhat-operators-pwbvq\" (UID: \"5e798188-37b3-4546-b39c-2f6119b56a50\") " pod="openshift-marketplace/redhat-operators-pwbvq" Dec 02 23:05:39 crc kubenswrapper[4696]: I1202 23:05:39.483080 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e798188-37b3-4546-b39c-2f6119b56a50-utilities\") pod \"redhat-operators-pwbvq\" (UID: \"5e798188-37b3-4546-b39c-2f6119b56a50\") " pod="openshift-marketplace/redhat-operators-pwbvq" Dec 02 23:05:39 crc kubenswrapper[4696]: I1202 23:05:39.483332 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e798188-37b3-4546-b39c-2f6119b56a50-catalog-content\") pod \"redhat-operators-pwbvq\" (UID: \"5e798188-37b3-4546-b39c-2f6119b56a50\") " pod="openshift-marketplace/redhat-operators-pwbvq" Dec 02 23:05:39 crc kubenswrapper[4696]: I1202 23:05:39.510294 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kcff\" (UniqueName: \"kubernetes.io/projected/5e798188-37b3-4546-b39c-2f6119b56a50-kube-api-access-6kcff\") pod \"redhat-operators-pwbvq\" (UID: \"5e798188-37b3-4546-b39c-2f6119b56a50\") " pod="openshift-marketplace/redhat-operators-pwbvq" Dec 02 23:05:39 crc kubenswrapper[4696]: I1202 23:05:39.665213 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pwbvq" Dec 02 23:05:40 crc kubenswrapper[4696]: I1202 23:05:40.221142 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pwbvq"] Dec 02 23:05:41 crc kubenswrapper[4696]: I1202 23:05:41.031443 4696 generic.go:334] "Generic (PLEG): container finished" podID="5e798188-37b3-4546-b39c-2f6119b56a50" containerID="1f24a157dc163c32586cdd6432e697ce2cbfdad1c674ad45fdaa885ad8d011bd" exitCode=0 Dec 02 23:05:41 crc kubenswrapper[4696]: I1202 23:05:41.031519 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pwbvq" event={"ID":"5e798188-37b3-4546-b39c-2f6119b56a50","Type":"ContainerDied","Data":"1f24a157dc163c32586cdd6432e697ce2cbfdad1c674ad45fdaa885ad8d011bd"} Dec 02 23:05:41 crc kubenswrapper[4696]: I1202 23:05:41.031892 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pwbvq" event={"ID":"5e798188-37b3-4546-b39c-2f6119b56a50","Type":"ContainerStarted","Data":"66258bb8da24f990ed7cdf6fb1b1f94d6644c15dd8b2753e41fc5431f0fa4588"} Dec 02 23:05:42 crc kubenswrapper[4696]: I1202 23:05:42.054661 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pwbvq" event={"ID":"5e798188-37b3-4546-b39c-2f6119b56a50","Type":"ContainerStarted","Data":"6b6d91195314c39ca6770017d2c0c68e530675d852c6213a74689d76dc01c05a"} Dec 02 23:05:43 crc kubenswrapper[4696]: I1202 23:05:43.482319 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 02 23:05:43 crc kubenswrapper[4696]: I1202 23:05:43.482966 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 02 23:05:43 crc kubenswrapper[4696]: I1202 23:05:43.676306 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 02 23:05:43 crc 
kubenswrapper[4696]: I1202 23:05:43.709715 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 02 23:05:44 crc kubenswrapper[4696]: I1202 23:05:44.096719 4696 generic.go:334] "Generic (PLEG): container finished" podID="5e798188-37b3-4546-b39c-2f6119b56a50" containerID="6b6d91195314c39ca6770017d2c0c68e530675d852c6213a74689d76dc01c05a" exitCode=0 Dec 02 23:05:44 crc kubenswrapper[4696]: I1202 23:05:44.096897 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pwbvq" event={"ID":"5e798188-37b3-4546-b39c-2f6119b56a50","Type":"ContainerDied","Data":"6b6d91195314c39ca6770017d2c0c68e530675d852c6213a74689d76dc01c05a"} Dec 02 23:05:44 crc kubenswrapper[4696]: I1202 23:05:44.478668 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 02 23:05:44 crc kubenswrapper[4696]: I1202 23:05:44.479502 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 02 23:05:44 crc kubenswrapper[4696]: I1202 23:05:44.479572 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 02 23:05:44 crc kubenswrapper[4696]: I1202 23:05:44.490843 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 02 23:05:45 crc kubenswrapper[4696]: I1202 23:05:45.107794 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 02 23:05:45 crc kubenswrapper[4696]: I1202 23:05:45.117773 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 02 23:05:47 crc kubenswrapper[4696]: I1202 23:05:47.134786 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pwbvq" 
event={"ID":"5e798188-37b3-4546-b39c-2f6119b56a50","Type":"ContainerStarted","Data":"e0b1273f89bdb1f8bb35da1426454cf5151dd64612f3833b17ba93b1f652676e"} Dec 02 23:05:47 crc kubenswrapper[4696]: I1202 23:05:47.159067 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-pwbvq" podStartSLOduration=2.757176463 podStartE2EDuration="8.159043286s" podCreationTimestamp="2025-12-02 23:05:39 +0000 UTC" firstStartedPulling="2025-12-02 23:05:41.033793547 +0000 UTC m=+1403.914473548" lastFinishedPulling="2025-12-02 23:05:46.43566037 +0000 UTC m=+1409.316340371" observedRunningTime="2025-12-02 23:05:47.158342216 +0000 UTC m=+1410.039022237" watchObservedRunningTime="2025-12-02 23:05:47.159043286 +0000 UTC m=+1410.039723287" Dec 02 23:05:49 crc kubenswrapper[4696]: I1202 23:05:49.666354 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-pwbvq" Dec 02 23:05:49 crc kubenswrapper[4696]: I1202 23:05:49.667048 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-pwbvq" Dec 02 23:05:50 crc kubenswrapper[4696]: I1202 23:05:50.747888 4696 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-pwbvq" podUID="5e798188-37b3-4546-b39c-2f6119b56a50" containerName="registry-server" probeResult="failure" output=< Dec 02 23:05:50 crc kubenswrapper[4696]: timeout: failed to connect service ":50051" within 1s Dec 02 23:05:50 crc kubenswrapper[4696]: > Dec 02 23:05:52 crc kubenswrapper[4696]: I1202 23:05:52.973552 4696 patch_prober.go:28] interesting pod/machine-config-daemon-chq65 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 23:05:52 crc kubenswrapper[4696]: I1202 23:05:52.973659 4696 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 23:05:53 crc kubenswrapper[4696]: I1202 23:05:53.826456 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 02 23:05:55 crc kubenswrapper[4696]: I1202 23:05:55.179829 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 02 23:05:58 crc kubenswrapper[4696]: I1202 23:05:58.805628 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="aa9b93b0-4131-4a4b-a1a8-27ccf68716c0" containerName="rabbitmq" containerID="cri-o://3fe2d4b6bb592a2ba71ca2794fe2992b29c9d86d77463744428a5f27988b2ede" gracePeriod=604796 Dec 02 23:05:59 crc kubenswrapper[4696]: I1202 23:05:59.731293 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-pwbvq" Dec 02 23:05:59 crc kubenswrapper[4696]: I1202 23:05:59.792579 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-pwbvq" Dec 02 23:05:59 crc kubenswrapper[4696]: I1202 23:05:59.989381 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pwbvq"] Dec 02 23:06:00 crc kubenswrapper[4696]: I1202 23:06:00.006831 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="aa29c81c-0a87-47f5-be45-8a0e5b083758" containerName="rabbitmq" containerID="cri-o://0134045e0742110b4505830045b5e3aaf129f3b5a4702f5058b74a7956add855" gracePeriod=604796 Dec 02 23:06:01 crc kubenswrapper[4696]: I1202 23:06:01.355624 4696 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openshift-marketplace/redhat-operators-pwbvq" podUID="5e798188-37b3-4546-b39c-2f6119b56a50" containerName="registry-server" containerID="cri-o://e0b1273f89bdb1f8bb35da1426454cf5151dd64612f3833b17ba93b1f652676e" gracePeriod=2 Dec 02 23:06:01 crc kubenswrapper[4696]: I1202 23:06:01.853397 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pwbvq" Dec 02 23:06:01 crc kubenswrapper[4696]: I1202 23:06:01.942766 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e798188-37b3-4546-b39c-2f6119b56a50-utilities\") pod \"5e798188-37b3-4546-b39c-2f6119b56a50\" (UID: \"5e798188-37b3-4546-b39c-2f6119b56a50\") " Dec 02 23:06:01 crc kubenswrapper[4696]: I1202 23:06:01.943055 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e798188-37b3-4546-b39c-2f6119b56a50-catalog-content\") pod \"5e798188-37b3-4546-b39c-2f6119b56a50\" (UID: \"5e798188-37b3-4546-b39c-2f6119b56a50\") " Dec 02 23:06:01 crc kubenswrapper[4696]: I1202 23:06:01.943128 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6kcff\" (UniqueName: \"kubernetes.io/projected/5e798188-37b3-4546-b39c-2f6119b56a50-kube-api-access-6kcff\") pod \"5e798188-37b3-4546-b39c-2f6119b56a50\" (UID: \"5e798188-37b3-4546-b39c-2f6119b56a50\") " Dec 02 23:06:01 crc kubenswrapper[4696]: I1202 23:06:01.943566 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e798188-37b3-4546-b39c-2f6119b56a50-utilities" (OuterVolumeSpecName: "utilities") pod "5e798188-37b3-4546-b39c-2f6119b56a50" (UID: "5e798188-37b3-4546-b39c-2f6119b56a50"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:06:01 crc kubenswrapper[4696]: I1202 23:06:01.950989 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e798188-37b3-4546-b39c-2f6119b56a50-kube-api-access-6kcff" (OuterVolumeSpecName: "kube-api-access-6kcff") pod "5e798188-37b3-4546-b39c-2f6119b56a50" (UID: "5e798188-37b3-4546-b39c-2f6119b56a50"). InnerVolumeSpecName "kube-api-access-6kcff". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:06:02 crc kubenswrapper[4696]: I1202 23:06:02.045426 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6kcff\" (UniqueName: \"kubernetes.io/projected/5e798188-37b3-4546-b39c-2f6119b56a50-kube-api-access-6kcff\") on node \"crc\" DevicePath \"\"" Dec 02 23:06:02 crc kubenswrapper[4696]: I1202 23:06:02.045482 4696 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e798188-37b3-4546-b39c-2f6119b56a50-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 23:06:02 crc kubenswrapper[4696]: I1202 23:06:02.062229 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e798188-37b3-4546-b39c-2f6119b56a50-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5e798188-37b3-4546-b39c-2f6119b56a50" (UID: "5e798188-37b3-4546-b39c-2f6119b56a50"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:06:02 crc kubenswrapper[4696]: I1202 23:06:02.147636 4696 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e798188-37b3-4546-b39c-2f6119b56a50-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 23:06:02 crc kubenswrapper[4696]: I1202 23:06:02.369829 4696 generic.go:334] "Generic (PLEG): container finished" podID="5e798188-37b3-4546-b39c-2f6119b56a50" containerID="e0b1273f89bdb1f8bb35da1426454cf5151dd64612f3833b17ba93b1f652676e" exitCode=0 Dec 02 23:06:02 crc kubenswrapper[4696]: I1202 23:06:02.369870 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pwbvq" event={"ID":"5e798188-37b3-4546-b39c-2f6119b56a50","Type":"ContainerDied","Data":"e0b1273f89bdb1f8bb35da1426454cf5151dd64612f3833b17ba93b1f652676e"} Dec 02 23:06:02 crc kubenswrapper[4696]: I1202 23:06:02.369925 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pwbvq" event={"ID":"5e798188-37b3-4546-b39c-2f6119b56a50","Type":"ContainerDied","Data":"66258bb8da24f990ed7cdf6fb1b1f94d6644c15dd8b2753e41fc5431f0fa4588"} Dec 02 23:06:02 crc kubenswrapper[4696]: I1202 23:06:02.369950 4696 scope.go:117] "RemoveContainer" containerID="e0b1273f89bdb1f8bb35da1426454cf5151dd64612f3833b17ba93b1f652676e" Dec 02 23:06:02 crc kubenswrapper[4696]: I1202 23:06:02.369891 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pwbvq" Dec 02 23:06:02 crc kubenswrapper[4696]: I1202 23:06:02.406040 4696 scope.go:117] "RemoveContainer" containerID="6b6d91195314c39ca6770017d2c0c68e530675d852c6213a74689d76dc01c05a" Dec 02 23:06:02 crc kubenswrapper[4696]: I1202 23:06:02.420450 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pwbvq"] Dec 02 23:06:02 crc kubenswrapper[4696]: I1202 23:06:02.435081 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-pwbvq"] Dec 02 23:06:02 crc kubenswrapper[4696]: I1202 23:06:02.448146 4696 scope.go:117] "RemoveContainer" containerID="1f24a157dc163c32586cdd6432e697ce2cbfdad1c674ad45fdaa885ad8d011bd" Dec 02 23:06:02 crc kubenswrapper[4696]: I1202 23:06:02.493956 4696 scope.go:117] "RemoveContainer" containerID="e0b1273f89bdb1f8bb35da1426454cf5151dd64612f3833b17ba93b1f652676e" Dec 02 23:06:02 crc kubenswrapper[4696]: E1202 23:06:02.494989 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0b1273f89bdb1f8bb35da1426454cf5151dd64612f3833b17ba93b1f652676e\": container with ID starting with e0b1273f89bdb1f8bb35da1426454cf5151dd64612f3833b17ba93b1f652676e not found: ID does not exist" containerID="e0b1273f89bdb1f8bb35da1426454cf5151dd64612f3833b17ba93b1f652676e" Dec 02 23:06:02 crc kubenswrapper[4696]: I1202 23:06:02.495500 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0b1273f89bdb1f8bb35da1426454cf5151dd64612f3833b17ba93b1f652676e"} err="failed to get container status \"e0b1273f89bdb1f8bb35da1426454cf5151dd64612f3833b17ba93b1f652676e\": rpc error: code = NotFound desc = could not find container \"e0b1273f89bdb1f8bb35da1426454cf5151dd64612f3833b17ba93b1f652676e\": container with ID starting with e0b1273f89bdb1f8bb35da1426454cf5151dd64612f3833b17ba93b1f652676e not found: ID does 
not exist" Dec 02 23:06:02 crc kubenswrapper[4696]: I1202 23:06:02.495578 4696 scope.go:117] "RemoveContainer" containerID="6b6d91195314c39ca6770017d2c0c68e530675d852c6213a74689d76dc01c05a" Dec 02 23:06:02 crc kubenswrapper[4696]: E1202 23:06:02.496227 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b6d91195314c39ca6770017d2c0c68e530675d852c6213a74689d76dc01c05a\": container with ID starting with 6b6d91195314c39ca6770017d2c0c68e530675d852c6213a74689d76dc01c05a not found: ID does not exist" containerID="6b6d91195314c39ca6770017d2c0c68e530675d852c6213a74689d76dc01c05a" Dec 02 23:06:02 crc kubenswrapper[4696]: I1202 23:06:02.496320 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b6d91195314c39ca6770017d2c0c68e530675d852c6213a74689d76dc01c05a"} err="failed to get container status \"6b6d91195314c39ca6770017d2c0c68e530675d852c6213a74689d76dc01c05a\": rpc error: code = NotFound desc = could not find container \"6b6d91195314c39ca6770017d2c0c68e530675d852c6213a74689d76dc01c05a\": container with ID starting with 6b6d91195314c39ca6770017d2c0c68e530675d852c6213a74689d76dc01c05a not found: ID does not exist" Dec 02 23:06:02 crc kubenswrapper[4696]: I1202 23:06:02.496346 4696 scope.go:117] "RemoveContainer" containerID="1f24a157dc163c32586cdd6432e697ce2cbfdad1c674ad45fdaa885ad8d011bd" Dec 02 23:06:02 crc kubenswrapper[4696]: E1202 23:06:02.496896 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f24a157dc163c32586cdd6432e697ce2cbfdad1c674ad45fdaa885ad8d011bd\": container with ID starting with 1f24a157dc163c32586cdd6432e697ce2cbfdad1c674ad45fdaa885ad8d011bd not found: ID does not exist" containerID="1f24a157dc163c32586cdd6432e697ce2cbfdad1c674ad45fdaa885ad8d011bd" Dec 02 23:06:02 crc kubenswrapper[4696]: I1202 23:06:02.496934 4696 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f24a157dc163c32586cdd6432e697ce2cbfdad1c674ad45fdaa885ad8d011bd"} err="failed to get container status \"1f24a157dc163c32586cdd6432e697ce2cbfdad1c674ad45fdaa885ad8d011bd\": rpc error: code = NotFound desc = could not find container \"1f24a157dc163c32586cdd6432e697ce2cbfdad1c674ad45fdaa885ad8d011bd\": container with ID starting with 1f24a157dc163c32586cdd6432e697ce2cbfdad1c674ad45fdaa885ad8d011bd not found: ID does not exist" Dec 02 23:06:03 crc kubenswrapper[4696]: I1202 23:06:03.452964 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e798188-37b3-4546-b39c-2f6119b56a50" path="/var/lib/kubelet/pods/5e798188-37b3-4546-b39c-2f6119b56a50/volumes" Dec 02 23:06:03 crc kubenswrapper[4696]: I1202 23:06:03.663238 4696 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="aa9b93b0-4131-4a4b-a1a8-27ccf68716c0" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.104:5671: connect: connection refused" Dec 02 23:06:04 crc kubenswrapper[4696]: I1202 23:06:04.093522 4696 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="aa29c81c-0a87-47f5-be45-8a0e5b083758" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.105:5671: connect: connection refused" Dec 02 23:06:05 crc kubenswrapper[4696]: I1202 23:06:05.414778 4696 generic.go:334] "Generic (PLEG): container finished" podID="aa9b93b0-4131-4a4b-a1a8-27ccf68716c0" containerID="3fe2d4b6bb592a2ba71ca2794fe2992b29c9d86d77463744428a5f27988b2ede" exitCode=0 Dec 02 23:06:05 crc kubenswrapper[4696]: I1202 23:06:05.414855 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"aa9b93b0-4131-4a4b-a1a8-27ccf68716c0","Type":"ContainerDied","Data":"3fe2d4b6bb592a2ba71ca2794fe2992b29c9d86d77463744428a5f27988b2ede"} Dec 02 23:06:05 crc kubenswrapper[4696]: I1202 23:06:05.416002 
4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"aa9b93b0-4131-4a4b-a1a8-27ccf68716c0","Type":"ContainerDied","Data":"2b76c58fbc961bfd186687cd4e13818ca91c224d9b75f3a495116953a54b9ffc"} Dec 02 23:06:05 crc kubenswrapper[4696]: I1202 23:06:05.416050 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2b76c58fbc961bfd186687cd4e13818ca91c224d9b75f3a495116953a54b9ffc" Dec 02 23:06:05 crc kubenswrapper[4696]: I1202 23:06:05.445724 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 02 23:06:05 crc kubenswrapper[4696]: I1202 23:06:05.537199 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/aa9b93b0-4131-4a4b-a1a8-27ccf68716c0-rabbitmq-erlang-cookie\") pod \"aa9b93b0-4131-4a4b-a1a8-27ccf68716c0\" (UID: \"aa9b93b0-4131-4a4b-a1a8-27ccf68716c0\") " Dec 02 23:06:05 crc kubenswrapper[4696]: I1202 23:06:05.537473 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/aa9b93b0-4131-4a4b-a1a8-27ccf68716c0-rabbitmq-plugins\") pod \"aa9b93b0-4131-4a4b-a1a8-27ccf68716c0\" (UID: \"aa9b93b0-4131-4a4b-a1a8-27ccf68716c0\") " Dec 02 23:06:05 crc kubenswrapper[4696]: I1202 23:06:05.537627 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"aa9b93b0-4131-4a4b-a1a8-27ccf68716c0\" (UID: \"aa9b93b0-4131-4a4b-a1a8-27ccf68716c0\") " Dec 02 23:06:05 crc kubenswrapper[4696]: I1202 23:06:05.537796 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/aa9b93b0-4131-4a4b-a1a8-27ccf68716c0-pod-info\") pod \"aa9b93b0-4131-4a4b-a1a8-27ccf68716c0\" (UID: 
\"aa9b93b0-4131-4a4b-a1a8-27ccf68716c0\") " Dec 02 23:06:05 crc kubenswrapper[4696]: I1202 23:06:05.537935 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/aa9b93b0-4131-4a4b-a1a8-27ccf68716c0-config-data\") pod \"aa9b93b0-4131-4a4b-a1a8-27ccf68716c0\" (UID: \"aa9b93b0-4131-4a4b-a1a8-27ccf68716c0\") " Dec 02 23:06:05 crc kubenswrapper[4696]: I1202 23:06:05.538112 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/aa9b93b0-4131-4a4b-a1a8-27ccf68716c0-rabbitmq-confd\") pod \"aa9b93b0-4131-4a4b-a1a8-27ccf68716c0\" (UID: \"aa9b93b0-4131-4a4b-a1a8-27ccf68716c0\") " Dec 02 23:06:05 crc kubenswrapper[4696]: I1202 23:06:05.538210 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/aa9b93b0-4131-4a4b-a1a8-27ccf68716c0-erlang-cookie-secret\") pod \"aa9b93b0-4131-4a4b-a1a8-27ccf68716c0\" (UID: \"aa9b93b0-4131-4a4b-a1a8-27ccf68716c0\") " Dec 02 23:06:05 crc kubenswrapper[4696]: I1202 23:06:05.538315 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/aa9b93b0-4131-4a4b-a1a8-27ccf68716c0-rabbitmq-tls\") pod \"aa9b93b0-4131-4a4b-a1a8-27ccf68716c0\" (UID: \"aa9b93b0-4131-4a4b-a1a8-27ccf68716c0\") " Dec 02 23:06:05 crc kubenswrapper[4696]: I1202 23:06:05.538375 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/aa9b93b0-4131-4a4b-a1a8-27ccf68716c0-server-conf\") pod \"aa9b93b0-4131-4a4b-a1a8-27ccf68716c0\" (UID: \"aa9b93b0-4131-4a4b-a1a8-27ccf68716c0\") " Dec 02 23:06:05 crc kubenswrapper[4696]: I1202 23:06:05.538474 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/aa9b93b0-4131-4a4b-a1a8-27ccf68716c0-plugins-conf\") pod \"aa9b93b0-4131-4a4b-a1a8-27ccf68716c0\" (UID: \"aa9b93b0-4131-4a4b-a1a8-27ccf68716c0\") " Dec 02 23:06:05 crc kubenswrapper[4696]: I1202 23:06:05.538546 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5vn9k\" (UniqueName: \"kubernetes.io/projected/aa9b93b0-4131-4a4b-a1a8-27ccf68716c0-kube-api-access-5vn9k\") pod \"aa9b93b0-4131-4a4b-a1a8-27ccf68716c0\" (UID: \"aa9b93b0-4131-4a4b-a1a8-27ccf68716c0\") " Dec 02 23:06:05 crc kubenswrapper[4696]: I1202 23:06:05.539202 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa9b93b0-4131-4a4b-a1a8-27ccf68716c0-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "aa9b93b0-4131-4a4b-a1a8-27ccf68716c0" (UID: "aa9b93b0-4131-4a4b-a1a8-27ccf68716c0"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:06:05 crc kubenswrapper[4696]: I1202 23:06:05.540026 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa9b93b0-4131-4a4b-a1a8-27ccf68716c0-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "aa9b93b0-4131-4a4b-a1a8-27ccf68716c0" (UID: "aa9b93b0-4131-4a4b-a1a8-27ccf68716c0"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:06:05 crc kubenswrapper[4696]: I1202 23:06:05.540877 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa9b93b0-4131-4a4b-a1a8-27ccf68716c0-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "aa9b93b0-4131-4a4b-a1a8-27ccf68716c0" (UID: "aa9b93b0-4131-4a4b-a1a8-27ccf68716c0"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:06:05 crc kubenswrapper[4696]: I1202 23:06:05.571855 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "persistence") pod "aa9b93b0-4131-4a4b-a1a8-27ccf68716c0" (UID: "aa9b93b0-4131-4a4b-a1a8-27ccf68716c0"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 02 23:06:05 crc kubenswrapper[4696]: I1202 23:06:05.572007 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/aa9b93b0-4131-4a4b-a1a8-27ccf68716c0-pod-info" (OuterVolumeSpecName: "pod-info") pod "aa9b93b0-4131-4a4b-a1a8-27ccf68716c0" (UID: "aa9b93b0-4131-4a4b-a1a8-27ccf68716c0"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 02 23:06:05 crc kubenswrapper[4696]: I1202 23:06:05.572070 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa9b93b0-4131-4a4b-a1a8-27ccf68716c0-kube-api-access-5vn9k" (OuterVolumeSpecName: "kube-api-access-5vn9k") pod "aa9b93b0-4131-4a4b-a1a8-27ccf68716c0" (UID: "aa9b93b0-4131-4a4b-a1a8-27ccf68716c0"). InnerVolumeSpecName "kube-api-access-5vn9k". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:06:05 crc kubenswrapper[4696]: I1202 23:06:05.572132 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa9b93b0-4131-4a4b-a1a8-27ccf68716c0-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "aa9b93b0-4131-4a4b-a1a8-27ccf68716c0" (UID: "aa9b93b0-4131-4a4b-a1a8-27ccf68716c0"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:06:05 crc kubenswrapper[4696]: I1202 23:06:05.572186 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa9b93b0-4131-4a4b-a1a8-27ccf68716c0-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "aa9b93b0-4131-4a4b-a1a8-27ccf68716c0" (UID: "aa9b93b0-4131-4a4b-a1a8-27ccf68716c0"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:06:05 crc kubenswrapper[4696]: I1202 23:06:05.588468 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa9b93b0-4131-4a4b-a1a8-27ccf68716c0-config-data" (OuterVolumeSpecName: "config-data") pod "aa9b93b0-4131-4a4b-a1a8-27ccf68716c0" (UID: "aa9b93b0-4131-4a4b-a1a8-27ccf68716c0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:06:05 crc kubenswrapper[4696]: I1202 23:06:05.641974 4696 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Dec 02 23:06:05 crc kubenswrapper[4696]: I1202 23:06:05.642017 4696 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/aa9b93b0-4131-4a4b-a1a8-27ccf68716c0-pod-info\") on node \"crc\" DevicePath \"\"" Dec 02 23:06:05 crc kubenswrapper[4696]: I1202 23:06:05.642032 4696 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/aa9b93b0-4131-4a4b-a1a8-27ccf68716c0-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 23:06:05 crc kubenswrapper[4696]: I1202 23:06:05.642044 4696 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/aa9b93b0-4131-4a4b-a1a8-27ccf68716c0-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 02 23:06:05 crc 
kubenswrapper[4696]: I1202 23:06:05.642058 4696 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/aa9b93b0-4131-4a4b-a1a8-27ccf68716c0-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 02 23:06:05 crc kubenswrapper[4696]: I1202 23:06:05.642070 4696 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/aa9b93b0-4131-4a4b-a1a8-27ccf68716c0-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 02 23:06:05 crc kubenswrapper[4696]: I1202 23:06:05.642086 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5vn9k\" (UniqueName: \"kubernetes.io/projected/aa9b93b0-4131-4a4b-a1a8-27ccf68716c0-kube-api-access-5vn9k\") on node \"crc\" DevicePath \"\"" Dec 02 23:06:05 crc kubenswrapper[4696]: I1202 23:06:05.642099 4696 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/aa9b93b0-4131-4a4b-a1a8-27ccf68716c0-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 02 23:06:05 crc kubenswrapper[4696]: I1202 23:06:05.642109 4696 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/aa9b93b0-4131-4a4b-a1a8-27ccf68716c0-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 02 23:06:05 crc kubenswrapper[4696]: I1202 23:06:05.658939 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa9b93b0-4131-4a4b-a1a8-27ccf68716c0-server-conf" (OuterVolumeSpecName: "server-conf") pod "aa9b93b0-4131-4a4b-a1a8-27ccf68716c0" (UID: "aa9b93b0-4131-4a4b-a1a8-27ccf68716c0"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:06:05 crc kubenswrapper[4696]: I1202 23:06:05.674261 4696 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Dec 02 23:06:05 crc kubenswrapper[4696]: I1202 23:06:05.716082 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa9b93b0-4131-4a4b-a1a8-27ccf68716c0-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "aa9b93b0-4131-4a4b-a1a8-27ccf68716c0" (UID: "aa9b93b0-4131-4a4b-a1a8-27ccf68716c0"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:06:05 crc kubenswrapper[4696]: I1202 23:06:05.744469 4696 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/aa9b93b0-4131-4a4b-a1a8-27ccf68716c0-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 02 23:06:05 crc kubenswrapper[4696]: I1202 23:06:05.744973 4696 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/aa9b93b0-4131-4a4b-a1a8-27ccf68716c0-server-conf\") on node \"crc\" DevicePath \"\"" Dec 02 23:06:05 crc kubenswrapper[4696]: I1202 23:06:05.744987 4696 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Dec 02 23:06:06 crc kubenswrapper[4696]: I1202 23:06:06.439431 4696 generic.go:334] "Generic (PLEG): container finished" podID="aa29c81c-0a87-47f5-be45-8a0e5b083758" containerID="0134045e0742110b4505830045b5e3aaf129f3b5a4702f5058b74a7956add855" exitCode=0 Dec 02 23:06:06 crc kubenswrapper[4696]: I1202 23:06:06.439575 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 02 23:06:06 crc kubenswrapper[4696]: I1202 23:06:06.439907 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"aa29c81c-0a87-47f5-be45-8a0e5b083758","Type":"ContainerDied","Data":"0134045e0742110b4505830045b5e3aaf129f3b5a4702f5058b74a7956add855"} Dec 02 23:06:06 crc kubenswrapper[4696]: I1202 23:06:06.512165 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 02 23:06:06 crc kubenswrapper[4696]: I1202 23:06:06.519259 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 02 23:06:06 crc kubenswrapper[4696]: I1202 23:06:06.551710 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 02 23:06:06 crc kubenswrapper[4696]: E1202 23:06:06.552237 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e798188-37b3-4546-b39c-2f6119b56a50" containerName="registry-server" Dec 02 23:06:06 crc kubenswrapper[4696]: I1202 23:06:06.552253 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e798188-37b3-4546-b39c-2f6119b56a50" containerName="registry-server" Dec 02 23:06:06 crc kubenswrapper[4696]: E1202 23:06:06.552267 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e798188-37b3-4546-b39c-2f6119b56a50" containerName="extract-utilities" Dec 02 23:06:06 crc kubenswrapper[4696]: I1202 23:06:06.552273 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e798188-37b3-4546-b39c-2f6119b56a50" containerName="extract-utilities" Dec 02 23:06:06 crc kubenswrapper[4696]: E1202 23:06:06.552284 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa9b93b0-4131-4a4b-a1a8-27ccf68716c0" containerName="setup-container" Dec 02 23:06:06 crc kubenswrapper[4696]: I1202 23:06:06.552290 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa9b93b0-4131-4a4b-a1a8-27ccf68716c0" 
containerName="setup-container" Dec 02 23:06:06 crc kubenswrapper[4696]: E1202 23:06:06.552311 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e798188-37b3-4546-b39c-2f6119b56a50" containerName="extract-content" Dec 02 23:06:06 crc kubenswrapper[4696]: I1202 23:06:06.552317 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e798188-37b3-4546-b39c-2f6119b56a50" containerName="extract-content" Dec 02 23:06:06 crc kubenswrapper[4696]: E1202 23:06:06.552348 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa9b93b0-4131-4a4b-a1a8-27ccf68716c0" containerName="rabbitmq" Dec 02 23:06:06 crc kubenswrapper[4696]: I1202 23:06:06.552356 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa9b93b0-4131-4a4b-a1a8-27ccf68716c0" containerName="rabbitmq" Dec 02 23:06:06 crc kubenswrapper[4696]: I1202 23:06:06.552546 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa9b93b0-4131-4a4b-a1a8-27ccf68716c0" containerName="rabbitmq" Dec 02 23:06:06 crc kubenswrapper[4696]: I1202 23:06:06.552569 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e798188-37b3-4546-b39c-2f6119b56a50" containerName="registry-server" Dec 02 23:06:06 crc kubenswrapper[4696]: I1202 23:06:06.554186 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 02 23:06:06 crc kubenswrapper[4696]: I1202 23:06:06.556981 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Dec 02 23:06:06 crc kubenswrapper[4696]: I1202 23:06:06.557409 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 02 23:06:06 crc kubenswrapper[4696]: I1202 23:06:06.557727 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 02 23:06:06 crc kubenswrapper[4696]: I1202 23:06:06.558052 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Dec 02 23:06:06 crc kubenswrapper[4696]: I1202 23:06:06.558632 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-nd7wm" Dec 02 23:06:06 crc kubenswrapper[4696]: I1202 23:06:06.558890 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 02 23:06:06 crc kubenswrapper[4696]: I1202 23:06:06.559199 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 02 23:06:06 crc kubenswrapper[4696]: I1202 23:06:06.573683 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 02 23:06:06 crc kubenswrapper[4696]: I1202 23:06:06.666887 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fbc07453-3ac7-469b-ab0e-23ca695250e6-config-data\") pod \"rabbitmq-server-0\" (UID: \"fbc07453-3ac7-469b-ab0e-23ca695250e6\") " pod="openstack/rabbitmq-server-0" Dec 02 23:06:06 crc kubenswrapper[4696]: I1202 23:06:06.667062 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/fbc07453-3ac7-469b-ab0e-23ca695250e6-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"fbc07453-3ac7-469b-ab0e-23ca695250e6\") " pod="openstack/rabbitmq-server-0" Dec 02 23:06:06 crc kubenswrapper[4696]: I1202 23:06:06.667147 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/fbc07453-3ac7-469b-ab0e-23ca695250e6-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"fbc07453-3ac7-469b-ab0e-23ca695250e6\") " pod="openstack/rabbitmq-server-0" Dec 02 23:06:06 crc kubenswrapper[4696]: I1202 23:06:06.667192 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/fbc07453-3ac7-469b-ab0e-23ca695250e6-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"fbc07453-3ac7-469b-ab0e-23ca695250e6\") " pod="openstack/rabbitmq-server-0" Dec 02 23:06:06 crc kubenswrapper[4696]: I1202 23:06:06.667229 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/fbc07453-3ac7-469b-ab0e-23ca695250e6-pod-info\") pod \"rabbitmq-server-0\" (UID: \"fbc07453-3ac7-469b-ab0e-23ca695250e6\") " pod="openstack/rabbitmq-server-0" Dec 02 23:06:06 crc kubenswrapper[4696]: I1202 23:06:06.667274 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/fbc07453-3ac7-469b-ab0e-23ca695250e6-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"fbc07453-3ac7-469b-ab0e-23ca695250e6\") " pod="openstack/rabbitmq-server-0" Dec 02 23:06:06 crc kubenswrapper[4696]: I1202 23:06:06.667298 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/fbc07453-3ac7-469b-ab0e-23ca695250e6-server-conf\") pod \"rabbitmq-server-0\" (UID: \"fbc07453-3ac7-469b-ab0e-23ca695250e6\") " pod="openstack/rabbitmq-server-0" Dec 02 23:06:06 crc kubenswrapper[4696]: I1202 23:06:06.667493 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76fll\" (UniqueName: \"kubernetes.io/projected/fbc07453-3ac7-469b-ab0e-23ca695250e6-kube-api-access-76fll\") pod \"rabbitmq-server-0\" (UID: \"fbc07453-3ac7-469b-ab0e-23ca695250e6\") " pod="openstack/rabbitmq-server-0" Dec 02 23:06:06 crc kubenswrapper[4696]: I1202 23:06:06.667596 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/fbc07453-3ac7-469b-ab0e-23ca695250e6-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"fbc07453-3ac7-469b-ab0e-23ca695250e6\") " pod="openstack/rabbitmq-server-0" Dec 02 23:06:06 crc kubenswrapper[4696]: I1202 23:06:06.667635 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"fbc07453-3ac7-469b-ab0e-23ca695250e6\") " pod="openstack/rabbitmq-server-0" Dec 02 23:06:06 crc kubenswrapper[4696]: I1202 23:06:06.667666 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/fbc07453-3ac7-469b-ab0e-23ca695250e6-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"fbc07453-3ac7-469b-ab0e-23ca695250e6\") " pod="openstack/rabbitmq-server-0" Dec 02 23:06:06 crc kubenswrapper[4696]: I1202 23:06:06.691652 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 02 23:06:06 crc kubenswrapper[4696]: I1202 23:06:06.774960 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/fbc07453-3ac7-469b-ab0e-23ca695250e6-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"fbc07453-3ac7-469b-ab0e-23ca695250e6\") " pod="openstack/rabbitmq-server-0" Dec 02 23:06:06 crc kubenswrapper[4696]: I1202 23:06:06.775075 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/fbc07453-3ac7-469b-ab0e-23ca695250e6-pod-info\") pod \"rabbitmq-server-0\" (UID: \"fbc07453-3ac7-469b-ab0e-23ca695250e6\") " pod="openstack/rabbitmq-server-0" Dec 02 23:06:06 crc kubenswrapper[4696]: I1202 23:06:06.775187 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/fbc07453-3ac7-469b-ab0e-23ca695250e6-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"fbc07453-3ac7-469b-ab0e-23ca695250e6\") " pod="openstack/rabbitmq-server-0" Dec 02 23:06:06 crc kubenswrapper[4696]: I1202 23:06:06.775226 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/fbc07453-3ac7-469b-ab0e-23ca695250e6-server-conf\") pod \"rabbitmq-server-0\" (UID: \"fbc07453-3ac7-469b-ab0e-23ca695250e6\") " pod="openstack/rabbitmq-server-0" Dec 02 23:06:06 crc kubenswrapper[4696]: I1202 23:06:06.775284 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76fll\" (UniqueName: \"kubernetes.io/projected/fbc07453-3ac7-469b-ab0e-23ca695250e6-kube-api-access-76fll\") pod \"rabbitmq-server-0\" (UID: \"fbc07453-3ac7-469b-ab0e-23ca695250e6\") " pod="openstack/rabbitmq-server-0" Dec 02 23:06:06 crc kubenswrapper[4696]: I1202 23:06:06.775326 4696 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/fbc07453-3ac7-469b-ab0e-23ca695250e6-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"fbc07453-3ac7-469b-ab0e-23ca695250e6\") " pod="openstack/rabbitmq-server-0" Dec 02 23:06:06 crc kubenswrapper[4696]: I1202 23:06:06.775358 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"fbc07453-3ac7-469b-ab0e-23ca695250e6\") " pod="openstack/rabbitmq-server-0" Dec 02 23:06:06 crc kubenswrapper[4696]: I1202 23:06:06.775391 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/fbc07453-3ac7-469b-ab0e-23ca695250e6-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"fbc07453-3ac7-469b-ab0e-23ca695250e6\") " pod="openstack/rabbitmq-server-0" Dec 02 23:06:06 crc kubenswrapper[4696]: I1202 23:06:06.775452 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fbc07453-3ac7-469b-ab0e-23ca695250e6-config-data\") pod \"rabbitmq-server-0\" (UID: \"fbc07453-3ac7-469b-ab0e-23ca695250e6\") " pod="openstack/rabbitmq-server-0" Dec 02 23:06:06 crc kubenswrapper[4696]: I1202 23:06:06.775631 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/fbc07453-3ac7-469b-ab0e-23ca695250e6-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"fbc07453-3ac7-469b-ab0e-23ca695250e6\") " pod="openstack/rabbitmq-server-0" Dec 02 23:06:06 crc kubenswrapper[4696]: I1202 23:06:06.775720 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/fbc07453-3ac7-469b-ab0e-23ca695250e6-erlang-cookie-secret\") 
pod \"rabbitmq-server-0\" (UID: \"fbc07453-3ac7-469b-ab0e-23ca695250e6\") " pod="openstack/rabbitmq-server-0" Dec 02 23:06:06 crc kubenswrapper[4696]: I1202 23:06:06.776848 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/fbc07453-3ac7-469b-ab0e-23ca695250e6-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"fbc07453-3ac7-469b-ab0e-23ca695250e6\") " pod="openstack/rabbitmq-server-0" Dec 02 23:06:06 crc kubenswrapper[4696]: I1202 23:06:06.777329 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/fbc07453-3ac7-469b-ab0e-23ca695250e6-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"fbc07453-3ac7-469b-ab0e-23ca695250e6\") " pod="openstack/rabbitmq-server-0" Dec 02 23:06:06 crc kubenswrapper[4696]: I1202 23:06:06.777981 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fbc07453-3ac7-469b-ab0e-23ca695250e6-config-data\") pod \"rabbitmq-server-0\" (UID: \"fbc07453-3ac7-469b-ab0e-23ca695250e6\") " pod="openstack/rabbitmq-server-0" Dec 02 23:06:06 crc kubenswrapper[4696]: I1202 23:06:06.782905 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/fbc07453-3ac7-469b-ab0e-23ca695250e6-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"fbc07453-3ac7-469b-ab0e-23ca695250e6\") " pod="openstack/rabbitmq-server-0" Dec 02 23:06:06 crc kubenswrapper[4696]: I1202 23:06:06.783132 4696 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"fbc07453-3ac7-469b-ab0e-23ca695250e6\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/rabbitmq-server-0" Dec 02 23:06:06 crc kubenswrapper[4696]: I1202 
23:06:06.783463 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/fbc07453-3ac7-469b-ab0e-23ca695250e6-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"fbc07453-3ac7-469b-ab0e-23ca695250e6\") " pod="openstack/rabbitmq-server-0" Dec 02 23:06:06 crc kubenswrapper[4696]: I1202 23:06:06.783759 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/fbc07453-3ac7-469b-ab0e-23ca695250e6-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"fbc07453-3ac7-469b-ab0e-23ca695250e6\") " pod="openstack/rabbitmq-server-0" Dec 02 23:06:06 crc kubenswrapper[4696]: I1202 23:06:06.789894 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/fbc07453-3ac7-469b-ab0e-23ca695250e6-server-conf\") pod \"rabbitmq-server-0\" (UID: \"fbc07453-3ac7-469b-ab0e-23ca695250e6\") " pod="openstack/rabbitmq-server-0" Dec 02 23:06:06 crc kubenswrapper[4696]: I1202 23:06:06.793422 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/fbc07453-3ac7-469b-ab0e-23ca695250e6-pod-info\") pod \"rabbitmq-server-0\" (UID: \"fbc07453-3ac7-469b-ab0e-23ca695250e6\") " pod="openstack/rabbitmq-server-0" Dec 02 23:06:06 crc kubenswrapper[4696]: I1202 23:06:06.808298 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/fbc07453-3ac7-469b-ab0e-23ca695250e6-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"fbc07453-3ac7-469b-ab0e-23ca695250e6\") " pod="openstack/rabbitmq-server-0" Dec 02 23:06:06 crc kubenswrapper[4696]: I1202 23:06:06.817652 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76fll\" (UniqueName: 
\"kubernetes.io/projected/fbc07453-3ac7-469b-ab0e-23ca695250e6-kube-api-access-76fll\") pod \"rabbitmq-server-0\" (UID: \"fbc07453-3ac7-469b-ab0e-23ca695250e6\") " pod="openstack/rabbitmq-server-0" Dec 02 23:06:06 crc kubenswrapper[4696]: I1202 23:06:06.865659 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"fbc07453-3ac7-469b-ab0e-23ca695250e6\") " pod="openstack/rabbitmq-server-0" Dec 02 23:06:06 crc kubenswrapper[4696]: I1202 23:06:06.876979 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/aa29c81c-0a87-47f5-be45-8a0e5b083758-erlang-cookie-secret\") pod \"aa29c81c-0a87-47f5-be45-8a0e5b083758\" (UID: \"aa29c81c-0a87-47f5-be45-8a0e5b083758\") " Dec 02 23:06:06 crc kubenswrapper[4696]: I1202 23:06:06.877694 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/aa29c81c-0a87-47f5-be45-8a0e5b083758-config-data\") pod \"aa29c81c-0a87-47f5-be45-8a0e5b083758\" (UID: \"aa29c81c-0a87-47f5-be45-8a0e5b083758\") " Dec 02 23:06:06 crc kubenswrapper[4696]: I1202 23:06:06.877777 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/aa29c81c-0a87-47f5-be45-8a0e5b083758-rabbitmq-erlang-cookie\") pod \"aa29c81c-0a87-47f5-be45-8a0e5b083758\" (UID: \"aa29c81c-0a87-47f5-be45-8a0e5b083758\") " Dec 02 23:06:06 crc kubenswrapper[4696]: I1202 23:06:06.877848 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/aa29c81c-0a87-47f5-be45-8a0e5b083758-plugins-conf\") pod \"aa29c81c-0a87-47f5-be45-8a0e5b083758\" (UID: \"aa29c81c-0a87-47f5-be45-8a0e5b083758\") " Dec 02 23:06:06 crc 
kubenswrapper[4696]: I1202 23:06:06.877928 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/aa29c81c-0a87-47f5-be45-8a0e5b083758-rabbitmq-plugins\") pod \"aa29c81c-0a87-47f5-be45-8a0e5b083758\" (UID: \"aa29c81c-0a87-47f5-be45-8a0e5b083758\") " Dec 02 23:06:06 crc kubenswrapper[4696]: I1202 23:06:06.877976 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/aa29c81c-0a87-47f5-be45-8a0e5b083758-rabbitmq-confd\") pod \"aa29c81c-0a87-47f5-be45-8a0e5b083758\" (UID: \"aa29c81c-0a87-47f5-be45-8a0e5b083758\") " Dec 02 23:06:06 crc kubenswrapper[4696]: I1202 23:06:06.877997 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/aa29c81c-0a87-47f5-be45-8a0e5b083758-pod-info\") pod \"aa29c81c-0a87-47f5-be45-8a0e5b083758\" (UID: \"aa29c81c-0a87-47f5-be45-8a0e5b083758\") " Dec 02 23:06:06 crc kubenswrapper[4696]: I1202 23:06:06.878016 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"aa29c81c-0a87-47f5-be45-8a0e5b083758\" (UID: \"aa29c81c-0a87-47f5-be45-8a0e5b083758\") " Dec 02 23:06:06 crc kubenswrapper[4696]: I1202 23:06:06.878065 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/aa29c81c-0a87-47f5-be45-8a0e5b083758-server-conf\") pod \"aa29c81c-0a87-47f5-be45-8a0e5b083758\" (UID: \"aa29c81c-0a87-47f5-be45-8a0e5b083758\") " Dec 02 23:06:06 crc kubenswrapper[4696]: I1202 23:06:06.878172 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/aa29c81c-0a87-47f5-be45-8a0e5b083758-rabbitmq-tls\") pod 
\"aa29c81c-0a87-47f5-be45-8a0e5b083758\" (UID: \"aa29c81c-0a87-47f5-be45-8a0e5b083758\") " Dec 02 23:06:06 crc kubenswrapper[4696]: I1202 23:06:06.878303 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa29c81c-0a87-47f5-be45-8a0e5b083758-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "aa29c81c-0a87-47f5-be45-8a0e5b083758" (UID: "aa29c81c-0a87-47f5-be45-8a0e5b083758"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:06:06 crc kubenswrapper[4696]: I1202 23:06:06.878593 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-24wt8\" (UniqueName: \"kubernetes.io/projected/aa29c81c-0a87-47f5-be45-8a0e5b083758-kube-api-access-24wt8\") pod \"aa29c81c-0a87-47f5-be45-8a0e5b083758\" (UID: \"aa29c81c-0a87-47f5-be45-8a0e5b083758\") " Dec 02 23:06:06 crc kubenswrapper[4696]: I1202 23:06:06.878844 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa29c81c-0a87-47f5-be45-8a0e5b083758-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "aa29c81c-0a87-47f5-be45-8a0e5b083758" (UID: "aa29c81c-0a87-47f5-be45-8a0e5b083758"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:06:06 crc kubenswrapper[4696]: I1202 23:06:06.878940 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa29c81c-0a87-47f5-be45-8a0e5b083758-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "aa29c81c-0a87-47f5-be45-8a0e5b083758" (UID: "aa29c81c-0a87-47f5-be45-8a0e5b083758"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:06:06 crc kubenswrapper[4696]: I1202 23:06:06.879426 4696 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/aa29c81c-0a87-47f5-be45-8a0e5b083758-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 02 23:06:06 crc kubenswrapper[4696]: I1202 23:06:06.879462 4696 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/aa29c81c-0a87-47f5-be45-8a0e5b083758-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 02 23:06:06 crc kubenswrapper[4696]: I1202 23:06:06.879473 4696 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/aa29c81c-0a87-47f5-be45-8a0e5b083758-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 02 23:06:06 crc kubenswrapper[4696]: I1202 23:06:06.881958 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa29c81c-0a87-47f5-be45-8a0e5b083758-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "aa29c81c-0a87-47f5-be45-8a0e5b083758" (UID: "aa29c81c-0a87-47f5-be45-8a0e5b083758"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:06:06 crc kubenswrapper[4696]: I1202 23:06:06.888478 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "persistence") pod "aa29c81c-0a87-47f5-be45-8a0e5b083758" (UID: "aa29c81c-0a87-47f5-be45-8a0e5b083758"). InnerVolumeSpecName "local-storage05-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 02 23:06:06 crc kubenswrapper[4696]: I1202 23:06:06.888588 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa29c81c-0a87-47f5-be45-8a0e5b083758-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "aa29c81c-0a87-47f5-be45-8a0e5b083758" (UID: "aa29c81c-0a87-47f5-be45-8a0e5b083758"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:06:06 crc kubenswrapper[4696]: I1202 23:06:06.889433 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa29c81c-0a87-47f5-be45-8a0e5b083758-kube-api-access-24wt8" (OuterVolumeSpecName: "kube-api-access-24wt8") pod "aa29c81c-0a87-47f5-be45-8a0e5b083758" (UID: "aa29c81c-0a87-47f5-be45-8a0e5b083758"). InnerVolumeSpecName "kube-api-access-24wt8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:06:06 crc kubenswrapper[4696]: I1202 23:06:06.895058 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/aa29c81c-0a87-47f5-be45-8a0e5b083758-pod-info" (OuterVolumeSpecName: "pod-info") pod "aa29c81c-0a87-47f5-be45-8a0e5b083758" (UID: "aa29c81c-0a87-47f5-be45-8a0e5b083758"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 02 23:06:06 crc kubenswrapper[4696]: I1202 23:06:06.931375 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa29c81c-0a87-47f5-be45-8a0e5b083758-config-data" (OuterVolumeSpecName: "config-data") pod "aa29c81c-0a87-47f5-be45-8a0e5b083758" (UID: "aa29c81c-0a87-47f5-be45-8a0e5b083758"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:06:06 crc kubenswrapper[4696]: I1202 23:06:06.948989 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa29c81c-0a87-47f5-be45-8a0e5b083758-server-conf" (OuterVolumeSpecName: "server-conf") pod "aa29c81c-0a87-47f5-be45-8a0e5b083758" (UID: "aa29c81c-0a87-47f5-be45-8a0e5b083758"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:06:06 crc kubenswrapper[4696]: I1202 23:06:06.994699 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 02 23:06:06 crc kubenswrapper[4696]: I1202 23:06:06.999603 4696 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/aa29c81c-0a87-47f5-be45-8a0e5b083758-pod-info\") on node \"crc\" DevicePath \"\"" Dec 02 23:06:06 crc kubenswrapper[4696]: I1202 23:06:06.999692 4696 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Dec 02 23:06:06 crc kubenswrapper[4696]: I1202 23:06:06.999721 4696 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/aa29c81c-0a87-47f5-be45-8a0e5b083758-server-conf\") on node \"crc\" DevicePath \"\"" Dec 02 23:06:06 crc kubenswrapper[4696]: I1202 23:06:06.999732 4696 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/aa29c81c-0a87-47f5-be45-8a0e5b083758-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 02 23:06:06 crc kubenswrapper[4696]: I1202 23:06:06.999760 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-24wt8\" (UniqueName: \"kubernetes.io/projected/aa29c81c-0a87-47f5-be45-8a0e5b083758-kube-api-access-24wt8\") on node \"crc\" DevicePath \"\"" Dec 02 23:06:06 
crc kubenswrapper[4696]: I1202 23:06:06.999772 4696 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/aa29c81c-0a87-47f5-be45-8a0e5b083758-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 02 23:06:06 crc kubenswrapper[4696]: I1202 23:06:06.999789 4696 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/aa29c81c-0a87-47f5-be45-8a0e5b083758-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 23:06:07 crc kubenswrapper[4696]: I1202 23:06:07.037727 4696 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Dec 02 23:06:07 crc kubenswrapper[4696]: I1202 23:06:07.088298 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa29c81c-0a87-47f5-be45-8a0e5b083758-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "aa29c81c-0a87-47f5-be45-8a0e5b083758" (UID: "aa29c81c-0a87-47f5-be45-8a0e5b083758"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:06:07 crc kubenswrapper[4696]: I1202 23:06:07.101899 4696 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/aa29c81c-0a87-47f5-be45-8a0e5b083758-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 02 23:06:07 crc kubenswrapper[4696]: I1202 23:06:07.101955 4696 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Dec 02 23:06:07 crc kubenswrapper[4696]: I1202 23:06:07.450278 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa9b93b0-4131-4a4b-a1a8-27ccf68716c0" path="/var/lib/kubelet/pods/aa9b93b0-4131-4a4b-a1a8-27ccf68716c0/volumes" Dec 02 23:06:07 crc kubenswrapper[4696]: I1202 23:06:07.471401 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"aa29c81c-0a87-47f5-be45-8a0e5b083758","Type":"ContainerDied","Data":"ab48093ee8cde2ab7545077bde8edcbe66f76ab1abedef263fe9077cf09f066d"} Dec 02 23:06:07 crc kubenswrapper[4696]: I1202 23:06:07.471880 4696 scope.go:117] "RemoveContainer" containerID="0134045e0742110b4505830045b5e3aaf129f3b5a4702f5058b74a7956add855" Dec 02 23:06:07 crc kubenswrapper[4696]: I1202 23:06:07.472172 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 02 23:06:07 crc kubenswrapper[4696]: I1202 23:06:07.511156 4696 scope.go:117] "RemoveContainer" containerID="72dadaa892b741f7a71d45cdf9cad76ee8227c1f1237e87e17a38b7792ae3aa3" Dec 02 23:06:07 crc kubenswrapper[4696]: I1202 23:06:07.513143 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 02 23:06:07 crc kubenswrapper[4696]: I1202 23:06:07.544851 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 02 23:06:07 crc kubenswrapper[4696]: I1202 23:06:07.570410 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 02 23:06:07 crc kubenswrapper[4696]: I1202 23:06:07.582346 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 02 23:06:07 crc kubenswrapper[4696]: E1202 23:06:07.583063 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa29c81c-0a87-47f5-be45-8a0e5b083758" containerName="setup-container" Dec 02 23:06:07 crc kubenswrapper[4696]: I1202 23:06:07.583084 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa29c81c-0a87-47f5-be45-8a0e5b083758" containerName="setup-container" Dec 02 23:06:07 crc kubenswrapper[4696]: E1202 23:06:07.583102 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa29c81c-0a87-47f5-be45-8a0e5b083758" containerName="rabbitmq" Dec 02 23:06:07 crc kubenswrapper[4696]: I1202 23:06:07.583112 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa29c81c-0a87-47f5-be45-8a0e5b083758" containerName="rabbitmq" Dec 02 23:06:07 crc kubenswrapper[4696]: I1202 23:06:07.583417 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa29c81c-0a87-47f5-be45-8a0e5b083758" containerName="rabbitmq" Dec 02 23:06:07 crc kubenswrapper[4696]: I1202 23:06:07.585333 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 02 23:06:07 crc kubenswrapper[4696]: I1202 23:06:07.594277 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 02 23:06:07 crc kubenswrapper[4696]: I1202 23:06:07.630643 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 02 23:06:07 crc kubenswrapper[4696]: I1202 23:06:07.630916 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 02 23:06:07 crc kubenswrapper[4696]: I1202 23:06:07.631073 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 02 23:06:07 crc kubenswrapper[4696]: I1202 23:06:07.631267 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-wr8xb" Dec 02 23:06:07 crc kubenswrapper[4696]: I1202 23:06:07.631391 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 02 23:06:07 crc kubenswrapper[4696]: I1202 23:06:07.631539 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Dec 02 23:06:07 crc kubenswrapper[4696]: I1202 23:06:07.653583 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Dec 02 23:06:07 crc kubenswrapper[4696]: I1202 23:06:07.701565 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-d558885bc-glxg9"] Dec 02 23:06:07 crc kubenswrapper[4696]: I1202 23:06:07.703593 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-d558885bc-glxg9" Dec 02 23:06:07 crc kubenswrapper[4696]: I1202 23:06:07.711680 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Dec 02 23:06:07 crc kubenswrapper[4696]: I1202 23:06:07.735293 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/fcde5666-44ba-4867-a0ed-afb36ecfafc9-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"fcde5666-44ba-4867-a0ed-afb36ecfafc9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 23:06:07 crc kubenswrapper[4696]: I1202 23:06:07.735369 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/fcde5666-44ba-4867-a0ed-afb36ecfafc9-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"fcde5666-44ba-4867-a0ed-afb36ecfafc9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 23:06:07 crc kubenswrapper[4696]: I1202 23:06:07.735435 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/fcde5666-44ba-4867-a0ed-afb36ecfafc9-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"fcde5666-44ba-4867-a0ed-afb36ecfafc9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 23:06:07 crc kubenswrapper[4696]: I1202 23:06:07.735542 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fcde5666-44ba-4867-a0ed-afb36ecfafc9-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"fcde5666-44ba-4867-a0ed-afb36ecfafc9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 23:06:07 crc kubenswrapper[4696]: I1202 23:06:07.735588 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" 
(UniqueName: \"kubernetes.io/projected/fcde5666-44ba-4867-a0ed-afb36ecfafc9-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"fcde5666-44ba-4867-a0ed-afb36ecfafc9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 23:06:07 crc kubenswrapper[4696]: I1202 23:06:07.735610 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/fcde5666-44ba-4867-a0ed-afb36ecfafc9-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"fcde5666-44ba-4867-a0ed-afb36ecfafc9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 23:06:07 crc kubenswrapper[4696]: I1202 23:06:07.735639 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"fcde5666-44ba-4867-a0ed-afb36ecfafc9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 23:06:07 crc kubenswrapper[4696]: I1202 23:06:07.735659 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/fcde5666-44ba-4867-a0ed-afb36ecfafc9-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"fcde5666-44ba-4867-a0ed-afb36ecfafc9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 23:06:07 crc kubenswrapper[4696]: I1202 23:06:07.735682 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/fcde5666-44ba-4867-a0ed-afb36ecfafc9-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"fcde5666-44ba-4867-a0ed-afb36ecfafc9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 23:06:07 crc kubenswrapper[4696]: I1202 23:06:07.735722 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/fcde5666-44ba-4867-a0ed-afb36ecfafc9-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"fcde5666-44ba-4867-a0ed-afb36ecfafc9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 23:06:07 crc kubenswrapper[4696]: I1202 23:06:07.735768 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bm9ck\" (UniqueName: \"kubernetes.io/projected/fcde5666-44ba-4867-a0ed-afb36ecfafc9-kube-api-access-bm9ck\") pod \"rabbitmq-cell1-server-0\" (UID: \"fcde5666-44ba-4867-a0ed-afb36ecfafc9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 23:06:07 crc kubenswrapper[4696]: I1202 23:06:07.755511 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d558885bc-glxg9"] Dec 02 23:06:07 crc kubenswrapper[4696]: I1202 23:06:07.840477 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/fcde5666-44ba-4867-a0ed-afb36ecfafc9-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"fcde5666-44ba-4867-a0ed-afb36ecfafc9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 23:06:07 crc kubenswrapper[4696]: I1202 23:06:07.840778 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bm9ck\" (UniqueName: \"kubernetes.io/projected/fcde5666-44ba-4867-a0ed-afb36ecfafc9-kube-api-access-bm9ck\") pod \"rabbitmq-cell1-server-0\" (UID: \"fcde5666-44ba-4867-a0ed-afb36ecfafc9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 23:06:07 crc kubenswrapper[4696]: I1202 23:06:07.840871 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/fcde5666-44ba-4867-a0ed-afb36ecfafc9-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"fcde5666-44ba-4867-a0ed-afb36ecfafc9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 23:06:07 crc kubenswrapper[4696]: I1202 23:06:07.840951 4696 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/fcde5666-44ba-4867-a0ed-afb36ecfafc9-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"fcde5666-44ba-4867-a0ed-afb36ecfafc9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 23:06:07 crc kubenswrapper[4696]: I1202 23:06:07.841082 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3ec557d2-975c-4f5f-8654-8521004c0d98-dns-swift-storage-0\") pod \"dnsmasq-dns-d558885bc-glxg9\" (UID: \"3ec557d2-975c-4f5f-8654-8521004c0d98\") " pod="openstack/dnsmasq-dns-d558885bc-glxg9" Dec 02 23:06:07 crc kubenswrapper[4696]: I1202 23:06:07.841156 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/fcde5666-44ba-4867-a0ed-afb36ecfafc9-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"fcde5666-44ba-4867-a0ed-afb36ecfafc9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 23:06:07 crc kubenswrapper[4696]: I1202 23:06:07.841229 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3ec557d2-975c-4f5f-8654-8521004c0d98-ovsdbserver-sb\") pod \"dnsmasq-dns-d558885bc-glxg9\" (UID: \"3ec557d2-975c-4f5f-8654-8521004c0d98\") " pod="openstack/dnsmasq-dns-d558885bc-glxg9" Dec 02 23:06:07 crc kubenswrapper[4696]: I1202 23:06:07.841298 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/3ec557d2-975c-4f5f-8654-8521004c0d98-openstack-edpm-ipam\") pod \"dnsmasq-dns-d558885bc-glxg9\" (UID: \"3ec557d2-975c-4f5f-8654-8521004c0d98\") " pod="openstack/dnsmasq-dns-d558885bc-glxg9" Dec 02 23:06:07 crc kubenswrapper[4696]: I1202 23:06:07.841406 4696 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvfx6\" (UniqueName: \"kubernetes.io/projected/3ec557d2-975c-4f5f-8654-8521004c0d98-kube-api-access-qvfx6\") pod \"dnsmasq-dns-d558885bc-glxg9\" (UID: \"3ec557d2-975c-4f5f-8654-8521004c0d98\") " pod="openstack/dnsmasq-dns-d558885bc-glxg9" Dec 02 23:06:07 crc kubenswrapper[4696]: I1202 23:06:07.841486 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ec557d2-975c-4f5f-8654-8521004c0d98-config\") pod \"dnsmasq-dns-d558885bc-glxg9\" (UID: \"3ec557d2-975c-4f5f-8654-8521004c0d98\") " pod="openstack/dnsmasq-dns-d558885bc-glxg9" Dec 02 23:06:07 crc kubenswrapper[4696]: I1202 23:06:07.841556 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fcde5666-44ba-4867-a0ed-afb36ecfafc9-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"fcde5666-44ba-4867-a0ed-afb36ecfafc9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 23:06:07 crc kubenswrapper[4696]: I1202 23:06:07.841636 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/fcde5666-44ba-4867-a0ed-afb36ecfafc9-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"fcde5666-44ba-4867-a0ed-afb36ecfafc9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 23:06:07 crc kubenswrapper[4696]: I1202 23:06:07.841702 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/fcde5666-44ba-4867-a0ed-afb36ecfafc9-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"fcde5666-44ba-4867-a0ed-afb36ecfafc9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 23:06:07 crc kubenswrapper[4696]: I1202 23:06:07.841781 4696 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/fcde5666-44ba-4867-a0ed-afb36ecfafc9-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"fcde5666-44ba-4867-a0ed-afb36ecfafc9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 23:06:07 crc kubenswrapper[4696]: I1202 23:06:07.841861 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"fcde5666-44ba-4867-a0ed-afb36ecfafc9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 23:06:07 crc kubenswrapper[4696]: I1202 23:06:07.841931 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3ec557d2-975c-4f5f-8654-8521004c0d98-dns-svc\") pod \"dnsmasq-dns-d558885bc-glxg9\" (UID: \"3ec557d2-975c-4f5f-8654-8521004c0d98\") " pod="openstack/dnsmasq-dns-d558885bc-glxg9" Dec 02 23:06:07 crc kubenswrapper[4696]: I1202 23:06:07.841995 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/fcde5666-44ba-4867-a0ed-afb36ecfafc9-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"fcde5666-44ba-4867-a0ed-afb36ecfafc9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 23:06:07 crc kubenswrapper[4696]: I1202 23:06:07.842064 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3ec557d2-975c-4f5f-8654-8521004c0d98-ovsdbserver-nb\") pod \"dnsmasq-dns-d558885bc-glxg9\" (UID: \"3ec557d2-975c-4f5f-8654-8521004c0d98\") " pod="openstack/dnsmasq-dns-d558885bc-glxg9" Dec 02 23:06:07 crc kubenswrapper[4696]: I1202 23:06:07.842991 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/fcde5666-44ba-4867-a0ed-afb36ecfafc9-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"fcde5666-44ba-4867-a0ed-afb36ecfafc9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 23:06:07 crc kubenswrapper[4696]: I1202 23:06:07.843059 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/fcde5666-44ba-4867-a0ed-afb36ecfafc9-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"fcde5666-44ba-4867-a0ed-afb36ecfafc9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 23:06:07 crc kubenswrapper[4696]: I1202 23:06:07.843419 4696 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"fcde5666-44ba-4867-a0ed-afb36ecfafc9\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/rabbitmq-cell1-server-0" Dec 02 23:06:07 crc kubenswrapper[4696]: I1202 23:06:07.842022 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/fcde5666-44ba-4867-a0ed-afb36ecfafc9-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"fcde5666-44ba-4867-a0ed-afb36ecfafc9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 23:06:07 crc kubenswrapper[4696]: I1202 23:06:07.844256 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/fcde5666-44ba-4867-a0ed-afb36ecfafc9-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"fcde5666-44ba-4867-a0ed-afb36ecfafc9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 23:06:07 crc kubenswrapper[4696]: I1202 23:06:07.849200 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/fcde5666-44ba-4867-a0ed-afb36ecfafc9-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"fcde5666-44ba-4867-a0ed-afb36ecfafc9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 23:06:07 crc kubenswrapper[4696]: I1202 23:06:07.854169 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/fcde5666-44ba-4867-a0ed-afb36ecfafc9-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"fcde5666-44ba-4867-a0ed-afb36ecfafc9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 23:06:07 crc kubenswrapper[4696]: I1202 23:06:07.866066 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/fcde5666-44ba-4867-a0ed-afb36ecfafc9-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"fcde5666-44ba-4867-a0ed-afb36ecfafc9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 23:06:07 crc kubenswrapper[4696]: I1202 23:06:07.867612 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/fcde5666-44ba-4867-a0ed-afb36ecfafc9-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"fcde5666-44ba-4867-a0ed-afb36ecfafc9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 23:06:07 crc kubenswrapper[4696]: I1202 23:06:07.868820 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bm9ck\" (UniqueName: \"kubernetes.io/projected/fcde5666-44ba-4867-a0ed-afb36ecfafc9-kube-api-access-bm9ck\") pod \"rabbitmq-cell1-server-0\" (UID: \"fcde5666-44ba-4867-a0ed-afb36ecfafc9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 23:06:07 crc kubenswrapper[4696]: I1202 23:06:07.878294 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/fcde5666-44ba-4867-a0ed-afb36ecfafc9-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"fcde5666-44ba-4867-a0ed-afb36ecfafc9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 23:06:07 crc 
kubenswrapper[4696]: I1202 23:06:07.943969 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3ec557d2-975c-4f5f-8654-8521004c0d98-dns-svc\") pod \"dnsmasq-dns-d558885bc-glxg9\" (UID: \"3ec557d2-975c-4f5f-8654-8521004c0d98\") " pod="openstack/dnsmasq-dns-d558885bc-glxg9" Dec 02 23:06:07 crc kubenswrapper[4696]: I1202 23:06:07.944708 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3ec557d2-975c-4f5f-8654-8521004c0d98-ovsdbserver-nb\") pod \"dnsmasq-dns-d558885bc-glxg9\" (UID: \"3ec557d2-975c-4f5f-8654-8521004c0d98\") " pod="openstack/dnsmasq-dns-d558885bc-glxg9" Dec 02 23:06:07 crc kubenswrapper[4696]: I1202 23:06:07.944876 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3ec557d2-975c-4f5f-8654-8521004c0d98-dns-swift-storage-0\") pod \"dnsmasq-dns-d558885bc-glxg9\" (UID: \"3ec557d2-975c-4f5f-8654-8521004c0d98\") " pod="openstack/dnsmasq-dns-d558885bc-glxg9" Dec 02 23:06:07 crc kubenswrapper[4696]: I1202 23:06:07.944998 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3ec557d2-975c-4f5f-8654-8521004c0d98-ovsdbserver-sb\") pod \"dnsmasq-dns-d558885bc-glxg9\" (UID: \"3ec557d2-975c-4f5f-8654-8521004c0d98\") " pod="openstack/dnsmasq-dns-d558885bc-glxg9" Dec 02 23:06:07 crc kubenswrapper[4696]: I1202 23:06:07.945068 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/3ec557d2-975c-4f5f-8654-8521004c0d98-openstack-edpm-ipam\") pod \"dnsmasq-dns-d558885bc-glxg9\" (UID: \"3ec557d2-975c-4f5f-8654-8521004c0d98\") " pod="openstack/dnsmasq-dns-d558885bc-glxg9" Dec 02 23:06:07 crc kubenswrapper[4696]: I1202 23:06:07.945174 4696 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvfx6\" (UniqueName: \"kubernetes.io/projected/3ec557d2-975c-4f5f-8654-8521004c0d98-kube-api-access-qvfx6\") pod \"dnsmasq-dns-d558885bc-glxg9\" (UID: \"3ec557d2-975c-4f5f-8654-8521004c0d98\") " pod="openstack/dnsmasq-dns-d558885bc-glxg9" Dec 02 23:06:07 crc kubenswrapper[4696]: I1202 23:06:07.945241 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ec557d2-975c-4f5f-8654-8521004c0d98-config\") pod \"dnsmasq-dns-d558885bc-glxg9\" (UID: \"3ec557d2-975c-4f5f-8654-8521004c0d98\") " pod="openstack/dnsmasq-dns-d558885bc-glxg9" Dec 02 23:06:07 crc kubenswrapper[4696]: I1202 23:06:07.946118 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3ec557d2-975c-4f5f-8654-8521004c0d98-ovsdbserver-nb\") pod \"dnsmasq-dns-d558885bc-glxg9\" (UID: \"3ec557d2-975c-4f5f-8654-8521004c0d98\") " pod="openstack/dnsmasq-dns-d558885bc-glxg9" Dec 02 23:06:07 crc kubenswrapper[4696]: I1202 23:06:07.946328 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ec557d2-975c-4f5f-8654-8521004c0d98-config\") pod \"dnsmasq-dns-d558885bc-glxg9\" (UID: \"3ec557d2-975c-4f5f-8654-8521004c0d98\") " pod="openstack/dnsmasq-dns-d558885bc-glxg9" Dec 02 23:06:07 crc kubenswrapper[4696]: I1202 23:06:07.954587 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"fcde5666-44ba-4867-a0ed-afb36ecfafc9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 23:06:07 crc kubenswrapper[4696]: I1202 23:06:07.955963 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/3ec557d2-975c-4f5f-8654-8521004c0d98-dns-svc\") pod \"dnsmasq-dns-d558885bc-glxg9\" (UID: \"3ec557d2-975c-4f5f-8654-8521004c0d98\") " pod="openstack/dnsmasq-dns-d558885bc-glxg9" Dec 02 23:06:07 crc kubenswrapper[4696]: I1202 23:06:07.971971 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3ec557d2-975c-4f5f-8654-8521004c0d98-dns-swift-storage-0\") pod \"dnsmasq-dns-d558885bc-glxg9\" (UID: \"3ec557d2-975c-4f5f-8654-8521004c0d98\") " pod="openstack/dnsmasq-dns-d558885bc-glxg9" Dec 02 23:06:07 crc kubenswrapper[4696]: I1202 23:06:07.976465 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/3ec557d2-975c-4f5f-8654-8521004c0d98-openstack-edpm-ipam\") pod \"dnsmasq-dns-d558885bc-glxg9\" (UID: \"3ec557d2-975c-4f5f-8654-8521004c0d98\") " pod="openstack/dnsmasq-dns-d558885bc-glxg9" Dec 02 23:06:07 crc kubenswrapper[4696]: I1202 23:06:07.979558 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3ec557d2-975c-4f5f-8654-8521004c0d98-ovsdbserver-sb\") pod \"dnsmasq-dns-d558885bc-glxg9\" (UID: \"3ec557d2-975c-4f5f-8654-8521004c0d98\") " pod="openstack/dnsmasq-dns-d558885bc-glxg9" Dec 02 23:06:07 crc kubenswrapper[4696]: I1202 23:06:07.984729 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvfx6\" (UniqueName: \"kubernetes.io/projected/3ec557d2-975c-4f5f-8654-8521004c0d98-kube-api-access-qvfx6\") pod \"dnsmasq-dns-d558885bc-glxg9\" (UID: \"3ec557d2-975c-4f5f-8654-8521004c0d98\") " pod="openstack/dnsmasq-dns-d558885bc-glxg9" Dec 02 23:06:08 crc kubenswrapper[4696]: I1202 23:06:08.024151 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 02 23:06:08 crc kubenswrapper[4696]: I1202 23:06:08.061545 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d558885bc-glxg9" Dec 02 23:06:08 crc kubenswrapper[4696]: I1202 23:06:08.449380 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d558885bc-glxg9"] Dec 02 23:06:08 crc kubenswrapper[4696]: I1202 23:06:08.493124 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"fbc07453-3ac7-469b-ab0e-23ca695250e6","Type":"ContainerStarted","Data":"422300724ab5a8de64c7b04e4767bc8b0ce589d71441a4329dc34dd8da5f138d"} Dec 02 23:06:08 crc kubenswrapper[4696]: I1202 23:06:08.497248 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d558885bc-glxg9" event={"ID":"3ec557d2-975c-4f5f-8654-8521004c0d98","Type":"ContainerStarted","Data":"42e1efc4ca397a52d9975c2f9dfa3fe2586febe3def673963800aed168e1b7d2"} Dec 02 23:06:08 crc kubenswrapper[4696]: I1202 23:06:08.551881 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 02 23:06:08 crc kubenswrapper[4696]: W1202 23:06:08.555944 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfcde5666_44ba_4867_a0ed_afb36ecfafc9.slice/crio-b8a0abe0598ba54ccc76e1892e12b5ad0613845c31608077f2b2472feb053b33 WatchSource:0}: Error finding container b8a0abe0598ba54ccc76e1892e12b5ad0613845c31608077f2b2472feb053b33: Status 404 returned error can't find the container with id b8a0abe0598ba54ccc76e1892e12b5ad0613845c31608077f2b2472feb053b33 Dec 02 23:06:09 crc kubenswrapper[4696]: I1202 23:06:09.448703 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa29c81c-0a87-47f5-be45-8a0e5b083758" path="/var/lib/kubelet/pods/aa29c81c-0a87-47f5-be45-8a0e5b083758/volumes" Dec 02 23:06:09 crc 
kubenswrapper[4696]: I1202 23:06:09.518326 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"fcde5666-44ba-4867-a0ed-afb36ecfafc9","Type":"ContainerStarted","Data":"b8a0abe0598ba54ccc76e1892e12b5ad0613845c31608077f2b2472feb053b33"} Dec 02 23:06:09 crc kubenswrapper[4696]: I1202 23:06:09.520677 4696 generic.go:334] "Generic (PLEG): container finished" podID="3ec557d2-975c-4f5f-8654-8521004c0d98" containerID="5885723f9d155818e78643c655827db0131ceda88bf1b61c3df3782c682b029f" exitCode=0 Dec 02 23:06:09 crc kubenswrapper[4696]: I1202 23:06:09.520716 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d558885bc-glxg9" event={"ID":"3ec557d2-975c-4f5f-8654-8521004c0d98","Type":"ContainerDied","Data":"5885723f9d155818e78643c655827db0131ceda88bf1b61c3df3782c682b029f"} Dec 02 23:06:10 crc kubenswrapper[4696]: I1202 23:06:10.534806 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"fbc07453-3ac7-469b-ab0e-23ca695250e6","Type":"ContainerStarted","Data":"8cb59ad371e39ce048221a8a760f1382c026f4d73ec453bc6dbf945235773271"} Dec 02 23:06:10 crc kubenswrapper[4696]: I1202 23:06:10.536825 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d558885bc-glxg9" event={"ID":"3ec557d2-975c-4f5f-8654-8521004c0d98","Type":"ContainerStarted","Data":"a294037f1a45267827bc234a59e00f8d155bd419addc04df177bffef3e1ed786"} Dec 02 23:06:10 crc kubenswrapper[4696]: I1202 23:06:10.537145 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-d558885bc-glxg9" Dec 02 23:06:10 crc kubenswrapper[4696]: I1202 23:06:10.607526 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-d558885bc-glxg9" podStartSLOduration=3.607492642 podStartE2EDuration="3.607492642s" podCreationTimestamp="2025-12-02 23:06:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:06:10.592696526 +0000 UTC m=+1433.473376527" watchObservedRunningTime="2025-12-02 23:06:10.607492642 +0000 UTC m=+1433.488172643" Dec 02 23:06:11 crc kubenswrapper[4696]: I1202 23:06:11.554416 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"fcde5666-44ba-4867-a0ed-afb36ecfafc9","Type":"ContainerStarted","Data":"7f480fc1cef7e934d557c1a9fe0cb794b44a77773d14f3a27898cafec36e2bf3"} Dec 02 23:06:18 crc kubenswrapper[4696]: I1202 23:06:18.064034 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-d558885bc-glxg9" Dec 02 23:06:18 crc kubenswrapper[4696]: I1202 23:06:18.169492 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-vsrpt"] Dec 02 23:06:18 crc kubenswrapper[4696]: I1202 23:06:18.169849 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-cd5cbd7b9-vsrpt" podUID="5b23e912-6f82-4956-89d0-8074d9dbb121" containerName="dnsmasq-dns" containerID="cri-o://513fe9cb0db20d1e0524d1e76bbec17f611f4f100bad06222a375baccd4a89a9" gracePeriod=10 Dec 02 23:06:18 crc kubenswrapper[4696]: I1202 23:06:18.335010 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6b6dc74c5-2zjwn"] Dec 02 23:06:18 crc kubenswrapper[4696]: I1202 23:06:18.338771 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b6dc74c5-2zjwn" Dec 02 23:06:18 crc kubenswrapper[4696]: I1202 23:06:18.366161 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b6dc74c5-2zjwn"] Dec 02 23:06:18 crc kubenswrapper[4696]: I1202 23:06:18.511625 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ngml\" (UniqueName: \"kubernetes.io/projected/2300da34-d1de-4f62-a360-4d9cb16d48b7-kube-api-access-4ngml\") pod \"dnsmasq-dns-6b6dc74c5-2zjwn\" (UID: \"2300da34-d1de-4f62-a360-4d9cb16d48b7\") " pod="openstack/dnsmasq-dns-6b6dc74c5-2zjwn" Dec 02 23:06:18 crc kubenswrapper[4696]: I1202 23:06:18.511708 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2300da34-d1de-4f62-a360-4d9cb16d48b7-dns-svc\") pod \"dnsmasq-dns-6b6dc74c5-2zjwn\" (UID: \"2300da34-d1de-4f62-a360-4d9cb16d48b7\") " pod="openstack/dnsmasq-dns-6b6dc74c5-2zjwn" Dec 02 23:06:18 crc kubenswrapper[4696]: I1202 23:06:18.511728 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2300da34-d1de-4f62-a360-4d9cb16d48b7-ovsdbserver-nb\") pod \"dnsmasq-dns-6b6dc74c5-2zjwn\" (UID: \"2300da34-d1de-4f62-a360-4d9cb16d48b7\") " pod="openstack/dnsmasq-dns-6b6dc74c5-2zjwn" Dec 02 23:06:18 crc kubenswrapper[4696]: I1202 23:06:18.511912 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2300da34-d1de-4f62-a360-4d9cb16d48b7-config\") pod \"dnsmasq-dns-6b6dc74c5-2zjwn\" (UID: \"2300da34-d1de-4f62-a360-4d9cb16d48b7\") " pod="openstack/dnsmasq-dns-6b6dc74c5-2zjwn" Dec 02 23:06:18 crc kubenswrapper[4696]: I1202 23:06:18.512153 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/2300da34-d1de-4f62-a360-4d9cb16d48b7-openstack-edpm-ipam\") pod \"dnsmasq-dns-6b6dc74c5-2zjwn\" (UID: \"2300da34-d1de-4f62-a360-4d9cb16d48b7\") " pod="openstack/dnsmasq-dns-6b6dc74c5-2zjwn" Dec 02 23:06:18 crc kubenswrapper[4696]: I1202 23:06:18.512472 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2300da34-d1de-4f62-a360-4d9cb16d48b7-ovsdbserver-sb\") pod \"dnsmasq-dns-6b6dc74c5-2zjwn\" (UID: \"2300da34-d1de-4f62-a360-4d9cb16d48b7\") " pod="openstack/dnsmasq-dns-6b6dc74c5-2zjwn" Dec 02 23:06:18 crc kubenswrapper[4696]: I1202 23:06:18.512601 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2300da34-d1de-4f62-a360-4d9cb16d48b7-dns-swift-storage-0\") pod \"dnsmasq-dns-6b6dc74c5-2zjwn\" (UID: \"2300da34-d1de-4f62-a360-4d9cb16d48b7\") " pod="openstack/dnsmasq-dns-6b6dc74c5-2zjwn" Dec 02 23:06:18 crc kubenswrapper[4696]: I1202 23:06:18.617811 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2300da34-d1de-4f62-a360-4d9cb16d48b7-ovsdbserver-sb\") pod \"dnsmasq-dns-6b6dc74c5-2zjwn\" (UID: \"2300da34-d1de-4f62-a360-4d9cb16d48b7\") " pod="openstack/dnsmasq-dns-6b6dc74c5-2zjwn" Dec 02 23:06:18 crc kubenswrapper[4696]: I1202 23:06:18.617880 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2300da34-d1de-4f62-a360-4d9cb16d48b7-dns-swift-storage-0\") pod \"dnsmasq-dns-6b6dc74c5-2zjwn\" (UID: \"2300da34-d1de-4f62-a360-4d9cb16d48b7\") " pod="openstack/dnsmasq-dns-6b6dc74c5-2zjwn" Dec 02 23:06:18 crc kubenswrapper[4696]: I1202 23:06:18.618011 4696 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-4ngml\" (UniqueName: \"kubernetes.io/projected/2300da34-d1de-4f62-a360-4d9cb16d48b7-kube-api-access-4ngml\") pod \"dnsmasq-dns-6b6dc74c5-2zjwn\" (UID: \"2300da34-d1de-4f62-a360-4d9cb16d48b7\") " pod="openstack/dnsmasq-dns-6b6dc74c5-2zjwn" Dec 02 23:06:18 crc kubenswrapper[4696]: I1202 23:06:18.618104 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2300da34-d1de-4f62-a360-4d9cb16d48b7-dns-svc\") pod \"dnsmasq-dns-6b6dc74c5-2zjwn\" (UID: \"2300da34-d1de-4f62-a360-4d9cb16d48b7\") " pod="openstack/dnsmasq-dns-6b6dc74c5-2zjwn" Dec 02 23:06:18 crc kubenswrapper[4696]: I1202 23:06:18.618127 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2300da34-d1de-4f62-a360-4d9cb16d48b7-ovsdbserver-nb\") pod \"dnsmasq-dns-6b6dc74c5-2zjwn\" (UID: \"2300da34-d1de-4f62-a360-4d9cb16d48b7\") " pod="openstack/dnsmasq-dns-6b6dc74c5-2zjwn" Dec 02 23:06:18 crc kubenswrapper[4696]: I1202 23:06:18.618153 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2300da34-d1de-4f62-a360-4d9cb16d48b7-config\") pod \"dnsmasq-dns-6b6dc74c5-2zjwn\" (UID: \"2300da34-d1de-4f62-a360-4d9cb16d48b7\") " pod="openstack/dnsmasq-dns-6b6dc74c5-2zjwn" Dec 02 23:06:18 crc kubenswrapper[4696]: I1202 23:06:18.618229 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/2300da34-d1de-4f62-a360-4d9cb16d48b7-openstack-edpm-ipam\") pod \"dnsmasq-dns-6b6dc74c5-2zjwn\" (UID: \"2300da34-d1de-4f62-a360-4d9cb16d48b7\") " pod="openstack/dnsmasq-dns-6b6dc74c5-2zjwn" Dec 02 23:06:18 crc kubenswrapper[4696]: I1202 23:06:18.620846 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/2300da34-d1de-4f62-a360-4d9cb16d48b7-dns-svc\") pod \"dnsmasq-dns-6b6dc74c5-2zjwn\" (UID: \"2300da34-d1de-4f62-a360-4d9cb16d48b7\") " pod="openstack/dnsmasq-dns-6b6dc74c5-2zjwn" Dec 02 23:06:18 crc kubenswrapper[4696]: I1202 23:06:18.620866 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2300da34-d1de-4f62-a360-4d9cb16d48b7-ovsdbserver-sb\") pod \"dnsmasq-dns-6b6dc74c5-2zjwn\" (UID: \"2300da34-d1de-4f62-a360-4d9cb16d48b7\") " pod="openstack/dnsmasq-dns-6b6dc74c5-2zjwn" Dec 02 23:06:18 crc kubenswrapper[4696]: I1202 23:06:18.622617 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2300da34-d1de-4f62-a360-4d9cb16d48b7-ovsdbserver-nb\") pod \"dnsmasq-dns-6b6dc74c5-2zjwn\" (UID: \"2300da34-d1de-4f62-a360-4d9cb16d48b7\") " pod="openstack/dnsmasq-dns-6b6dc74c5-2zjwn" Dec 02 23:06:18 crc kubenswrapper[4696]: I1202 23:06:18.624808 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/2300da34-d1de-4f62-a360-4d9cb16d48b7-openstack-edpm-ipam\") pod \"dnsmasq-dns-6b6dc74c5-2zjwn\" (UID: \"2300da34-d1de-4f62-a360-4d9cb16d48b7\") " pod="openstack/dnsmasq-dns-6b6dc74c5-2zjwn" Dec 02 23:06:18 crc kubenswrapper[4696]: I1202 23:06:18.637867 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2300da34-d1de-4f62-a360-4d9cb16d48b7-config\") pod \"dnsmasq-dns-6b6dc74c5-2zjwn\" (UID: \"2300da34-d1de-4f62-a360-4d9cb16d48b7\") " pod="openstack/dnsmasq-dns-6b6dc74c5-2zjwn" Dec 02 23:06:18 crc kubenswrapper[4696]: I1202 23:06:18.638208 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2300da34-d1de-4f62-a360-4d9cb16d48b7-dns-swift-storage-0\") pod 
\"dnsmasq-dns-6b6dc74c5-2zjwn\" (UID: \"2300da34-d1de-4f62-a360-4d9cb16d48b7\") " pod="openstack/dnsmasq-dns-6b6dc74c5-2zjwn" Dec 02 23:06:18 crc kubenswrapper[4696]: I1202 23:06:18.644938 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ngml\" (UniqueName: \"kubernetes.io/projected/2300da34-d1de-4f62-a360-4d9cb16d48b7-kube-api-access-4ngml\") pod \"dnsmasq-dns-6b6dc74c5-2zjwn\" (UID: \"2300da34-d1de-4f62-a360-4d9cb16d48b7\") " pod="openstack/dnsmasq-dns-6b6dc74c5-2zjwn" Dec 02 23:06:18 crc kubenswrapper[4696]: I1202 23:06:18.659860 4696 generic.go:334] "Generic (PLEG): container finished" podID="5b23e912-6f82-4956-89d0-8074d9dbb121" containerID="513fe9cb0db20d1e0524d1e76bbec17f611f4f100bad06222a375baccd4a89a9" exitCode=0 Dec 02 23:06:18 crc kubenswrapper[4696]: I1202 23:06:18.659928 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-vsrpt" event={"ID":"5b23e912-6f82-4956-89d0-8074d9dbb121","Type":"ContainerDied","Data":"513fe9cb0db20d1e0524d1e76bbec17f611f4f100bad06222a375baccd4a89a9"} Dec 02 23:06:18 crc kubenswrapper[4696]: I1202 23:06:18.659958 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-vsrpt" event={"ID":"5b23e912-6f82-4956-89d0-8074d9dbb121","Type":"ContainerDied","Data":"22f6cb954bad252ad12ddca2859c5959d5f9a0994329c9b930304c9ba05dfd0c"} Dec 02 23:06:18 crc kubenswrapper[4696]: I1202 23:06:18.660056 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="22f6cb954bad252ad12ddca2859c5959d5f9a0994329c9b930304c9ba05dfd0c" Dec 02 23:06:18 crc kubenswrapper[4696]: I1202 23:06:18.670846 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b6dc74c5-2zjwn" Dec 02 23:06:18 crc kubenswrapper[4696]: I1202 23:06:18.803095 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cd5cbd7b9-vsrpt" Dec 02 23:06:18 crc kubenswrapper[4696]: I1202 23:06:18.929382 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xmm4h\" (UniqueName: \"kubernetes.io/projected/5b23e912-6f82-4956-89d0-8074d9dbb121-kube-api-access-xmm4h\") pod \"5b23e912-6f82-4956-89d0-8074d9dbb121\" (UID: \"5b23e912-6f82-4956-89d0-8074d9dbb121\") " Dec 02 23:06:18 crc kubenswrapper[4696]: I1202 23:06:18.929557 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b23e912-6f82-4956-89d0-8074d9dbb121-config\") pod \"5b23e912-6f82-4956-89d0-8074d9dbb121\" (UID: \"5b23e912-6f82-4956-89d0-8074d9dbb121\") " Dec 02 23:06:18 crc kubenswrapper[4696]: I1202 23:06:18.929783 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5b23e912-6f82-4956-89d0-8074d9dbb121-ovsdbserver-sb\") pod \"5b23e912-6f82-4956-89d0-8074d9dbb121\" (UID: \"5b23e912-6f82-4956-89d0-8074d9dbb121\") " Dec 02 23:06:18 crc kubenswrapper[4696]: I1202 23:06:18.929979 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5b23e912-6f82-4956-89d0-8074d9dbb121-dns-svc\") pod \"5b23e912-6f82-4956-89d0-8074d9dbb121\" (UID: \"5b23e912-6f82-4956-89d0-8074d9dbb121\") " Dec 02 23:06:18 crc kubenswrapper[4696]: I1202 23:06:18.930023 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5b23e912-6f82-4956-89d0-8074d9dbb121-dns-swift-storage-0\") pod \"5b23e912-6f82-4956-89d0-8074d9dbb121\" (UID: \"5b23e912-6f82-4956-89d0-8074d9dbb121\") " Dec 02 23:06:18 crc kubenswrapper[4696]: I1202 23:06:18.930080 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/5b23e912-6f82-4956-89d0-8074d9dbb121-ovsdbserver-nb\") pod \"5b23e912-6f82-4956-89d0-8074d9dbb121\" (UID: \"5b23e912-6f82-4956-89d0-8074d9dbb121\") " Dec 02 23:06:18 crc kubenswrapper[4696]: I1202 23:06:18.936689 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b23e912-6f82-4956-89d0-8074d9dbb121-kube-api-access-xmm4h" (OuterVolumeSpecName: "kube-api-access-xmm4h") pod "5b23e912-6f82-4956-89d0-8074d9dbb121" (UID: "5b23e912-6f82-4956-89d0-8074d9dbb121"). InnerVolumeSpecName "kube-api-access-xmm4h". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:06:18 crc kubenswrapper[4696]: I1202 23:06:18.991292 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b23e912-6f82-4956-89d0-8074d9dbb121-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5b23e912-6f82-4956-89d0-8074d9dbb121" (UID: "5b23e912-6f82-4956-89d0-8074d9dbb121"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:06:18 crc kubenswrapper[4696]: I1202 23:06:18.996104 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b23e912-6f82-4956-89d0-8074d9dbb121-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "5b23e912-6f82-4956-89d0-8074d9dbb121" (UID: "5b23e912-6f82-4956-89d0-8074d9dbb121"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:06:18 crc kubenswrapper[4696]: I1202 23:06:18.999617 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b23e912-6f82-4956-89d0-8074d9dbb121-config" (OuterVolumeSpecName: "config") pod "5b23e912-6f82-4956-89d0-8074d9dbb121" (UID: "5b23e912-6f82-4956-89d0-8074d9dbb121"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:06:19 crc kubenswrapper[4696]: I1202 23:06:19.004277 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b23e912-6f82-4956-89d0-8074d9dbb121-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5b23e912-6f82-4956-89d0-8074d9dbb121" (UID: "5b23e912-6f82-4956-89d0-8074d9dbb121"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:06:19 crc kubenswrapper[4696]: I1202 23:06:19.008469 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b23e912-6f82-4956-89d0-8074d9dbb121-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5b23e912-6f82-4956-89d0-8074d9dbb121" (UID: "5b23e912-6f82-4956-89d0-8074d9dbb121"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:06:19 crc kubenswrapper[4696]: I1202 23:06:19.032989 4696 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5b23e912-6f82-4956-89d0-8074d9dbb121-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 02 23:06:19 crc kubenswrapper[4696]: I1202 23:06:19.033300 4696 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5b23e912-6f82-4956-89d0-8074d9dbb121-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 02 23:06:19 crc kubenswrapper[4696]: I1202 23:06:19.033386 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xmm4h\" (UniqueName: \"kubernetes.io/projected/5b23e912-6f82-4956-89d0-8074d9dbb121-kube-api-access-xmm4h\") on node \"crc\" DevicePath \"\"" Dec 02 23:06:19 crc kubenswrapper[4696]: I1202 23:06:19.033470 4696 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b23e912-6f82-4956-89d0-8074d9dbb121-config\") on node \"crc\" DevicePath \"\"" 
Dec 02 23:06:19 crc kubenswrapper[4696]: I1202 23:06:19.033538 4696 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5b23e912-6f82-4956-89d0-8074d9dbb121-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 02 23:06:19 crc kubenswrapper[4696]: I1202 23:06:19.033616 4696 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5b23e912-6f82-4956-89d0-8074d9dbb121-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 23:06:19 crc kubenswrapper[4696]: I1202 23:06:19.154125 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b6dc74c5-2zjwn"] Dec 02 23:06:19 crc kubenswrapper[4696]: W1202 23:06:19.158605 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2300da34_d1de_4f62_a360_4d9cb16d48b7.slice/crio-7d0598f169b23d5be96d82d0702b74df9c6d16248ab1371d7164dae77072d8fc WatchSource:0}: Error finding container 7d0598f169b23d5be96d82d0702b74df9c6d16248ab1371d7164dae77072d8fc: Status 404 returned error can't find the container with id 7d0598f169b23d5be96d82d0702b74df9c6d16248ab1371d7164dae77072d8fc Dec 02 23:06:19 crc kubenswrapper[4696]: I1202 23:06:19.673909 4696 generic.go:334] "Generic (PLEG): container finished" podID="2300da34-d1de-4f62-a360-4d9cb16d48b7" containerID="ba36e35cfbe67766cb6ee9649bf16db441ac786daa17e13586d8084ae087a070" exitCode=0 Dec 02 23:06:19 crc kubenswrapper[4696]: I1202 23:06:19.674025 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b6dc74c5-2zjwn" event={"ID":"2300da34-d1de-4f62-a360-4d9cb16d48b7","Type":"ContainerDied","Data":"ba36e35cfbe67766cb6ee9649bf16db441ac786daa17e13586d8084ae087a070"} Dec 02 23:06:19 crc kubenswrapper[4696]: I1202 23:06:19.674600 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cd5cbd7b9-vsrpt" Dec 02 23:06:19 crc kubenswrapper[4696]: I1202 23:06:19.674617 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b6dc74c5-2zjwn" event={"ID":"2300da34-d1de-4f62-a360-4d9cb16d48b7","Type":"ContainerStarted","Data":"7d0598f169b23d5be96d82d0702b74df9c6d16248ab1371d7164dae77072d8fc"} Dec 02 23:06:19 crc kubenswrapper[4696]: I1202 23:06:19.738504 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-vsrpt"] Dec 02 23:06:19 crc kubenswrapper[4696]: I1202 23:06:19.749040 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-vsrpt"] Dec 02 23:06:20 crc kubenswrapper[4696]: I1202 23:06:20.688562 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b6dc74c5-2zjwn" event={"ID":"2300da34-d1de-4f62-a360-4d9cb16d48b7","Type":"ContainerStarted","Data":"43bc7cb3d0a4e701dd510e3e265d07396b9c95e822e104c4be3096b32a84c635"} Dec 02 23:06:20 crc kubenswrapper[4696]: I1202 23:06:20.689085 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6b6dc74c5-2zjwn" Dec 02 23:06:20 crc kubenswrapper[4696]: I1202 23:06:20.724545 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6b6dc74c5-2zjwn" podStartSLOduration=2.724521702 podStartE2EDuration="2.724521702s" podCreationTimestamp="2025-12-02 23:06:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:06:20.718946615 +0000 UTC m=+1443.599626626" watchObservedRunningTime="2025-12-02 23:06:20.724521702 +0000 UTC m=+1443.605201703" Dec 02 23:06:21 crc kubenswrapper[4696]: I1202 23:06:21.457734 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b23e912-6f82-4956-89d0-8074d9dbb121" 
path="/var/lib/kubelet/pods/5b23e912-6f82-4956-89d0-8074d9dbb121/volumes" Dec 02 23:06:22 crc kubenswrapper[4696]: I1202 23:06:22.973690 4696 patch_prober.go:28] interesting pod/machine-config-daemon-chq65 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 23:06:22 crc kubenswrapper[4696]: I1202 23:06:22.973854 4696 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 23:06:28 crc kubenswrapper[4696]: I1202 23:06:28.673102 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6b6dc74c5-2zjwn" Dec 02 23:06:28 crc kubenswrapper[4696]: I1202 23:06:28.743024 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d558885bc-glxg9"] Dec 02 23:06:28 crc kubenswrapper[4696]: I1202 23:06:28.743571 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-d558885bc-glxg9" podUID="3ec557d2-975c-4f5f-8654-8521004c0d98" containerName="dnsmasq-dns" containerID="cri-o://a294037f1a45267827bc234a59e00f8d155bd419addc04df177bffef3e1ed786" gracePeriod=10 Dec 02 23:06:29 crc kubenswrapper[4696]: I1202 23:06:29.272894 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-d558885bc-glxg9" Dec 02 23:06:29 crc kubenswrapper[4696]: I1202 23:06:29.415292 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3ec557d2-975c-4f5f-8654-8521004c0d98-ovsdbserver-nb\") pod \"3ec557d2-975c-4f5f-8654-8521004c0d98\" (UID: \"3ec557d2-975c-4f5f-8654-8521004c0d98\") " Dec 02 23:06:29 crc kubenswrapper[4696]: I1202 23:06:29.415367 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ec557d2-975c-4f5f-8654-8521004c0d98-config\") pod \"3ec557d2-975c-4f5f-8654-8521004c0d98\" (UID: \"3ec557d2-975c-4f5f-8654-8521004c0d98\") " Dec 02 23:06:29 crc kubenswrapper[4696]: I1202 23:06:29.415398 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/3ec557d2-975c-4f5f-8654-8521004c0d98-openstack-edpm-ipam\") pod \"3ec557d2-975c-4f5f-8654-8521004c0d98\" (UID: \"3ec557d2-975c-4f5f-8654-8521004c0d98\") " Dec 02 23:06:29 crc kubenswrapper[4696]: I1202 23:06:29.415480 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3ec557d2-975c-4f5f-8654-8521004c0d98-dns-svc\") pod \"3ec557d2-975c-4f5f-8654-8521004c0d98\" (UID: \"3ec557d2-975c-4f5f-8654-8521004c0d98\") " Dec 02 23:06:29 crc kubenswrapper[4696]: I1202 23:06:29.415613 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qvfx6\" (UniqueName: \"kubernetes.io/projected/3ec557d2-975c-4f5f-8654-8521004c0d98-kube-api-access-qvfx6\") pod \"3ec557d2-975c-4f5f-8654-8521004c0d98\" (UID: \"3ec557d2-975c-4f5f-8654-8521004c0d98\") " Dec 02 23:06:29 crc kubenswrapper[4696]: I1202 23:06:29.416470 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3ec557d2-975c-4f5f-8654-8521004c0d98-dns-swift-storage-0\") pod \"3ec557d2-975c-4f5f-8654-8521004c0d98\" (UID: \"3ec557d2-975c-4f5f-8654-8521004c0d98\") " Dec 02 23:06:29 crc kubenswrapper[4696]: I1202 23:06:29.416530 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3ec557d2-975c-4f5f-8654-8521004c0d98-ovsdbserver-sb\") pod \"3ec557d2-975c-4f5f-8654-8521004c0d98\" (UID: \"3ec557d2-975c-4f5f-8654-8521004c0d98\") " Dec 02 23:06:29 crc kubenswrapper[4696]: I1202 23:06:29.428860 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ec557d2-975c-4f5f-8654-8521004c0d98-kube-api-access-qvfx6" (OuterVolumeSpecName: "kube-api-access-qvfx6") pod "3ec557d2-975c-4f5f-8654-8521004c0d98" (UID: "3ec557d2-975c-4f5f-8654-8521004c0d98"). InnerVolumeSpecName "kube-api-access-qvfx6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:06:29 crc kubenswrapper[4696]: I1202 23:06:29.491359 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ec557d2-975c-4f5f-8654-8521004c0d98-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3ec557d2-975c-4f5f-8654-8521004c0d98" (UID: "3ec557d2-975c-4f5f-8654-8521004c0d98"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:06:29 crc kubenswrapper[4696]: I1202 23:06:29.494187 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ec557d2-975c-4f5f-8654-8521004c0d98-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "3ec557d2-975c-4f5f-8654-8521004c0d98" (UID: "3ec557d2-975c-4f5f-8654-8521004c0d98"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:06:29 crc kubenswrapper[4696]: I1202 23:06:29.495092 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ec557d2-975c-4f5f-8654-8521004c0d98-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "3ec557d2-975c-4f5f-8654-8521004c0d98" (UID: "3ec557d2-975c-4f5f-8654-8521004c0d98"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:06:29 crc kubenswrapper[4696]: I1202 23:06:29.498397 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ec557d2-975c-4f5f-8654-8521004c0d98-config" (OuterVolumeSpecName: "config") pod "3ec557d2-975c-4f5f-8654-8521004c0d98" (UID: "3ec557d2-975c-4f5f-8654-8521004c0d98"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:06:29 crc kubenswrapper[4696]: I1202 23:06:29.501502 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ec557d2-975c-4f5f-8654-8521004c0d98-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3ec557d2-975c-4f5f-8654-8521004c0d98" (UID: "3ec557d2-975c-4f5f-8654-8521004c0d98"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:06:29 crc kubenswrapper[4696]: I1202 23:06:29.507481 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ec557d2-975c-4f5f-8654-8521004c0d98-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3ec557d2-975c-4f5f-8654-8521004c0d98" (UID: "3ec557d2-975c-4f5f-8654-8521004c0d98"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:06:29 crc kubenswrapper[4696]: I1202 23:06:29.519912 4696 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3ec557d2-975c-4f5f-8654-8521004c0d98-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 02 23:06:29 crc kubenswrapper[4696]: I1202 23:06:29.520149 4696 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ec557d2-975c-4f5f-8654-8521004c0d98-config\") on node \"crc\" DevicePath \"\"" Dec 02 23:06:29 crc kubenswrapper[4696]: I1202 23:06:29.520216 4696 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/3ec557d2-975c-4f5f-8654-8521004c0d98-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Dec 02 23:06:29 crc kubenswrapper[4696]: I1202 23:06:29.520271 4696 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3ec557d2-975c-4f5f-8654-8521004c0d98-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 23:06:29 crc kubenswrapper[4696]: I1202 23:06:29.520324 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qvfx6\" (UniqueName: \"kubernetes.io/projected/3ec557d2-975c-4f5f-8654-8521004c0d98-kube-api-access-qvfx6\") on node \"crc\" DevicePath \"\"" Dec 02 23:06:29 crc kubenswrapper[4696]: I1202 23:06:29.520379 4696 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3ec557d2-975c-4f5f-8654-8521004c0d98-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 02 23:06:29 crc kubenswrapper[4696]: I1202 23:06:29.520429 4696 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3ec557d2-975c-4f5f-8654-8521004c0d98-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 02 23:06:29 crc kubenswrapper[4696]: I1202 23:06:29.832009 
4696 generic.go:334] "Generic (PLEG): container finished" podID="3ec557d2-975c-4f5f-8654-8521004c0d98" containerID="a294037f1a45267827bc234a59e00f8d155bd419addc04df177bffef3e1ed786" exitCode=0 Dec 02 23:06:29 crc kubenswrapper[4696]: I1202 23:06:29.832064 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d558885bc-glxg9" event={"ID":"3ec557d2-975c-4f5f-8654-8521004c0d98","Type":"ContainerDied","Data":"a294037f1a45267827bc234a59e00f8d155bd419addc04df177bffef3e1ed786"} Dec 02 23:06:29 crc kubenswrapper[4696]: I1202 23:06:29.832097 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d558885bc-glxg9" event={"ID":"3ec557d2-975c-4f5f-8654-8521004c0d98","Type":"ContainerDied","Data":"42e1efc4ca397a52d9975c2f9dfa3fe2586febe3def673963800aed168e1b7d2"} Dec 02 23:06:29 crc kubenswrapper[4696]: I1202 23:06:29.832095 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d558885bc-glxg9" Dec 02 23:06:29 crc kubenswrapper[4696]: I1202 23:06:29.832116 4696 scope.go:117] "RemoveContainer" containerID="a294037f1a45267827bc234a59e00f8d155bd419addc04df177bffef3e1ed786" Dec 02 23:06:29 crc kubenswrapper[4696]: I1202 23:06:29.878401 4696 scope.go:117] "RemoveContainer" containerID="5885723f9d155818e78643c655827db0131ceda88bf1b61c3df3782c682b029f" Dec 02 23:06:29 crc kubenswrapper[4696]: I1202 23:06:29.882056 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d558885bc-glxg9"] Dec 02 23:06:29 crc kubenswrapper[4696]: I1202 23:06:29.897644 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-d558885bc-glxg9"] Dec 02 23:06:29 crc kubenswrapper[4696]: I1202 23:06:29.915689 4696 scope.go:117] "RemoveContainer" containerID="a294037f1a45267827bc234a59e00f8d155bd419addc04df177bffef3e1ed786" Dec 02 23:06:29 crc kubenswrapper[4696]: E1202 23:06:29.916359 4696 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"a294037f1a45267827bc234a59e00f8d155bd419addc04df177bffef3e1ed786\": container with ID starting with a294037f1a45267827bc234a59e00f8d155bd419addc04df177bffef3e1ed786 not found: ID does not exist" containerID="a294037f1a45267827bc234a59e00f8d155bd419addc04df177bffef3e1ed786" Dec 02 23:06:29 crc kubenswrapper[4696]: I1202 23:06:29.916436 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a294037f1a45267827bc234a59e00f8d155bd419addc04df177bffef3e1ed786"} err="failed to get container status \"a294037f1a45267827bc234a59e00f8d155bd419addc04df177bffef3e1ed786\": rpc error: code = NotFound desc = could not find container \"a294037f1a45267827bc234a59e00f8d155bd419addc04df177bffef3e1ed786\": container with ID starting with a294037f1a45267827bc234a59e00f8d155bd419addc04df177bffef3e1ed786 not found: ID does not exist" Dec 02 23:06:29 crc kubenswrapper[4696]: I1202 23:06:29.916516 4696 scope.go:117] "RemoveContainer" containerID="5885723f9d155818e78643c655827db0131ceda88bf1b61c3df3782c682b029f" Dec 02 23:06:29 crc kubenswrapper[4696]: E1202 23:06:29.917233 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5885723f9d155818e78643c655827db0131ceda88bf1b61c3df3782c682b029f\": container with ID starting with 5885723f9d155818e78643c655827db0131ceda88bf1b61c3df3782c682b029f not found: ID does not exist" containerID="5885723f9d155818e78643c655827db0131ceda88bf1b61c3df3782c682b029f" Dec 02 23:06:29 crc kubenswrapper[4696]: I1202 23:06:29.917307 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5885723f9d155818e78643c655827db0131ceda88bf1b61c3df3782c682b029f"} err="failed to get container status \"5885723f9d155818e78643c655827db0131ceda88bf1b61c3df3782c682b029f\": rpc error: code = NotFound desc = could not find container 
\"5885723f9d155818e78643c655827db0131ceda88bf1b61c3df3782c682b029f\": container with ID starting with 5885723f9d155818e78643c655827db0131ceda88bf1b61c3df3782c682b029f not found: ID does not exist" Dec 02 23:06:31 crc kubenswrapper[4696]: I1202 23:06:31.449842 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ec557d2-975c-4f5f-8654-8521004c0d98" path="/var/lib/kubelet/pods/3ec557d2-975c-4f5f-8654-8521004c0d98/volumes" Dec 02 23:06:33 crc kubenswrapper[4696]: I1202 23:06:33.706912 4696 scope.go:117] "RemoveContainer" containerID="b1b6c8104bd2c4548eaf0e047605a944629556d000273f410634d938caa54ca8" Dec 02 23:06:43 crc kubenswrapper[4696]: I1202 23:06:43.052596 4696 generic.go:334] "Generic (PLEG): container finished" podID="fbc07453-3ac7-469b-ab0e-23ca695250e6" containerID="8cb59ad371e39ce048221a8a760f1382c026f4d73ec453bc6dbf945235773271" exitCode=0 Dec 02 23:06:43 crc kubenswrapper[4696]: I1202 23:06:43.052732 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"fbc07453-3ac7-469b-ab0e-23ca695250e6","Type":"ContainerDied","Data":"8cb59ad371e39ce048221a8a760f1382c026f4d73ec453bc6dbf945235773271"} Dec 02 23:06:44 crc kubenswrapper[4696]: I1202 23:06:44.078313 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"fbc07453-3ac7-469b-ab0e-23ca695250e6","Type":"ContainerStarted","Data":"724bb20942786bf96c2560d3fd5585a989c93aa9b17c0c91d3a0c707c19fbaae"} Dec 02 23:06:44 crc kubenswrapper[4696]: I1202 23:06:44.079720 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 02 23:06:44 crc kubenswrapper[4696]: I1202 23:06:44.082821 4696 generic.go:334] "Generic (PLEG): container finished" podID="fcde5666-44ba-4867-a0ed-afb36ecfafc9" containerID="7f480fc1cef7e934d557c1a9fe0cb794b44a77773d14f3a27898cafec36e2bf3" exitCode=0 Dec 02 23:06:44 crc kubenswrapper[4696]: I1202 23:06:44.082895 4696 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"fcde5666-44ba-4867-a0ed-afb36ecfafc9","Type":"ContainerDied","Data":"7f480fc1cef7e934d557c1a9fe0cb794b44a77773d14f3a27898cafec36e2bf3"} Dec 02 23:06:44 crc kubenswrapper[4696]: I1202 23:06:44.125304 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=38.125275824 podStartE2EDuration="38.125275824s" podCreationTimestamp="2025-12-02 23:06:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:06:44.121846208 +0000 UTC m=+1467.002526249" watchObservedRunningTime="2025-12-02 23:06:44.125275824 +0000 UTC m=+1467.005955815" Dec 02 23:06:45 crc kubenswrapper[4696]: I1202 23:06:45.096498 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"fcde5666-44ba-4867-a0ed-afb36ecfafc9","Type":"ContainerStarted","Data":"0c100bcaa05ce6c1a6cadf7c0a7af43bf6e0e7ef6cd569743e40e96278d93999"} Dec 02 23:06:45 crc kubenswrapper[4696]: I1202 23:06:45.097205 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 02 23:06:45 crc kubenswrapper[4696]: I1202 23:06:45.135634 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=38.13560448 podStartE2EDuration="38.13560448s" podCreationTimestamp="2025-12-02 23:06:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:06:45.122582403 +0000 UTC m=+1468.003262394" watchObservedRunningTime="2025-12-02 23:06:45.13560448 +0000 UTC m=+1468.016284481" Dec 02 23:06:47 crc kubenswrapper[4696]: I1202 23:06:47.037197 4696 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ndfd4"] Dec 02 23:06:47 crc kubenswrapper[4696]: E1202 23:06:47.038304 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ec557d2-975c-4f5f-8654-8521004c0d98" containerName="init" Dec 02 23:06:47 crc kubenswrapper[4696]: I1202 23:06:47.038323 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ec557d2-975c-4f5f-8654-8521004c0d98" containerName="init" Dec 02 23:06:47 crc kubenswrapper[4696]: E1202 23:06:47.038368 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b23e912-6f82-4956-89d0-8074d9dbb121" containerName="dnsmasq-dns" Dec 02 23:06:47 crc kubenswrapper[4696]: I1202 23:06:47.038376 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b23e912-6f82-4956-89d0-8074d9dbb121" containerName="dnsmasq-dns" Dec 02 23:06:47 crc kubenswrapper[4696]: E1202 23:06:47.038393 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b23e912-6f82-4956-89d0-8074d9dbb121" containerName="init" Dec 02 23:06:47 crc kubenswrapper[4696]: I1202 23:06:47.038401 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b23e912-6f82-4956-89d0-8074d9dbb121" containerName="init" Dec 02 23:06:47 crc kubenswrapper[4696]: E1202 23:06:47.038434 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ec557d2-975c-4f5f-8654-8521004c0d98" containerName="dnsmasq-dns" Dec 02 23:06:47 crc kubenswrapper[4696]: I1202 23:06:47.038441 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ec557d2-975c-4f5f-8654-8521004c0d98" containerName="dnsmasq-dns" Dec 02 23:06:47 crc kubenswrapper[4696]: I1202 23:06:47.038674 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ec557d2-975c-4f5f-8654-8521004c0d98" containerName="dnsmasq-dns" Dec 02 23:06:47 crc kubenswrapper[4696]: I1202 23:06:47.038710 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b23e912-6f82-4956-89d0-8074d9dbb121" containerName="dnsmasq-dns" 
Dec 02 23:06:47 crc kubenswrapper[4696]: I1202 23:06:47.039589 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ndfd4" Dec 02 23:06:47 crc kubenswrapper[4696]: I1202 23:06:47.044724 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 02 23:06:47 crc kubenswrapper[4696]: I1202 23:06:47.044966 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 02 23:06:47 crc kubenswrapper[4696]: I1202 23:06:47.047488 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 23:06:47 crc kubenswrapper[4696]: I1202 23:06:47.059067 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-cvzmc" Dec 02 23:06:47 crc kubenswrapper[4696]: I1202 23:06:47.065062 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ndfd4"] Dec 02 23:06:47 crc kubenswrapper[4696]: I1202 23:06:47.149991 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d6b40a87-ecaf-4f50-a3e0-04235ce0f029-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ndfd4\" (UID: \"d6b40a87-ecaf-4f50-a3e0-04235ce0f029\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ndfd4" Dec 02 23:06:47 crc kubenswrapper[4696]: I1202 23:06:47.150136 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xcqv\" (UniqueName: \"kubernetes.io/projected/d6b40a87-ecaf-4f50-a3e0-04235ce0f029-kube-api-access-8xcqv\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ndfd4\" (UID: \"d6b40a87-ecaf-4f50-a3e0-04235ce0f029\") " 
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ndfd4" Dec 02 23:06:47 crc kubenswrapper[4696]: I1202 23:06:47.150242 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d6b40a87-ecaf-4f50-a3e0-04235ce0f029-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ndfd4\" (UID: \"d6b40a87-ecaf-4f50-a3e0-04235ce0f029\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ndfd4" Dec 02 23:06:47 crc kubenswrapper[4696]: I1202 23:06:47.150350 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6b40a87-ecaf-4f50-a3e0-04235ce0f029-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ndfd4\" (UID: \"d6b40a87-ecaf-4f50-a3e0-04235ce0f029\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ndfd4" Dec 02 23:06:47 crc kubenswrapper[4696]: I1202 23:06:47.253009 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6b40a87-ecaf-4f50-a3e0-04235ce0f029-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ndfd4\" (UID: \"d6b40a87-ecaf-4f50-a3e0-04235ce0f029\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ndfd4" Dec 02 23:06:47 crc kubenswrapper[4696]: I1202 23:06:47.253085 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d6b40a87-ecaf-4f50-a3e0-04235ce0f029-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ndfd4\" (UID: \"d6b40a87-ecaf-4f50-a3e0-04235ce0f029\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ndfd4" Dec 02 23:06:47 crc kubenswrapper[4696]: I1202 23:06:47.253141 4696 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-8xcqv\" (UniqueName: \"kubernetes.io/projected/d6b40a87-ecaf-4f50-a3e0-04235ce0f029-kube-api-access-8xcqv\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ndfd4\" (UID: \"d6b40a87-ecaf-4f50-a3e0-04235ce0f029\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ndfd4" Dec 02 23:06:47 crc kubenswrapper[4696]: I1202 23:06:47.253228 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d6b40a87-ecaf-4f50-a3e0-04235ce0f029-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ndfd4\" (UID: \"d6b40a87-ecaf-4f50-a3e0-04235ce0f029\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ndfd4" Dec 02 23:06:47 crc kubenswrapper[4696]: I1202 23:06:47.268948 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d6b40a87-ecaf-4f50-a3e0-04235ce0f029-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ndfd4\" (UID: \"d6b40a87-ecaf-4f50-a3e0-04235ce0f029\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ndfd4" Dec 02 23:06:47 crc kubenswrapper[4696]: I1202 23:06:47.268962 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d6b40a87-ecaf-4f50-a3e0-04235ce0f029-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ndfd4\" (UID: \"d6b40a87-ecaf-4f50-a3e0-04235ce0f029\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ndfd4" Dec 02 23:06:47 crc kubenswrapper[4696]: I1202 23:06:47.269239 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6b40a87-ecaf-4f50-a3e0-04235ce0f029-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ndfd4\" (UID: \"d6b40a87-ecaf-4f50-a3e0-04235ce0f029\") " 
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ndfd4" Dec 02 23:06:47 crc kubenswrapper[4696]: I1202 23:06:47.273614 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xcqv\" (UniqueName: \"kubernetes.io/projected/d6b40a87-ecaf-4f50-a3e0-04235ce0f029-kube-api-access-8xcqv\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ndfd4\" (UID: \"d6b40a87-ecaf-4f50-a3e0-04235ce0f029\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ndfd4" Dec 02 23:06:47 crc kubenswrapper[4696]: I1202 23:06:47.366880 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ndfd4" Dec 02 23:06:47 crc kubenswrapper[4696]: I1202 23:06:47.966430 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ndfd4"] Dec 02 23:06:48 crc kubenswrapper[4696]: I1202 23:06:48.138193 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ndfd4" event={"ID":"d6b40a87-ecaf-4f50-a3e0-04235ce0f029","Type":"ContainerStarted","Data":"0c1354723996ea78f8b53a07f1a85e158e5d1ba3549f0811cc4cdba57fb6ab1a"} Dec 02 23:06:52 crc kubenswrapper[4696]: I1202 23:06:52.974764 4696 patch_prober.go:28] interesting pod/machine-config-daemon-chq65 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 23:06:52 crc kubenswrapper[4696]: I1202 23:06:52.975672 4696 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 
02 23:06:52 crc kubenswrapper[4696]: I1202 23:06:52.975760 4696 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-chq65" Dec 02 23:06:52 crc kubenswrapper[4696]: I1202 23:06:52.976872 4696 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d569701d45b5d99a649219a29d07b9038d47beb7daf9fa209eda0483aa45abb9"} pod="openshift-machine-config-operator/machine-config-daemon-chq65" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 23:06:52 crc kubenswrapper[4696]: I1202 23:06:52.976953 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" containerName="machine-config-daemon" containerID="cri-o://d569701d45b5d99a649219a29d07b9038d47beb7daf9fa209eda0483aa45abb9" gracePeriod=600 Dec 02 23:06:53 crc kubenswrapper[4696]: I1202 23:06:53.209555 4696 generic.go:334] "Generic (PLEG): container finished" podID="53353260-c7c9-435c-91eb-3d5a1b441c4a" containerID="d569701d45b5d99a649219a29d07b9038d47beb7daf9fa209eda0483aa45abb9" exitCode=0 Dec 02 23:06:53 crc kubenswrapper[4696]: I1202 23:06:53.209657 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-chq65" event={"ID":"53353260-c7c9-435c-91eb-3d5a1b441c4a","Type":"ContainerDied","Data":"d569701d45b5d99a649219a29d07b9038d47beb7daf9fa209eda0483aa45abb9"} Dec 02 23:06:53 crc kubenswrapper[4696]: I1202 23:06:53.209777 4696 scope.go:117] "RemoveContainer" containerID="3ad056c5d440f52aaf3e529aaaa0adb5466b2661f6219a6364c0d70692a5e85b" Dec 02 23:06:57 crc kubenswrapper[4696]: I1202 23:06:57.000060 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 02 23:06:58 crc 
kubenswrapper[4696]: I1202 23:06:58.027066 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 02 23:06:58 crc kubenswrapper[4696]: I1202 23:06:58.334467 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-chq65" event={"ID":"53353260-c7c9-435c-91eb-3d5a1b441c4a","Type":"ContainerStarted","Data":"631e32e77a29a8c8bd80f1cb8c03794b8f094b77b9ba9fa687732346c1ca36df"} Dec 02 23:06:58 crc kubenswrapper[4696]: I1202 23:06:58.344290 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ndfd4" event={"ID":"d6b40a87-ecaf-4f50-a3e0-04235ce0f029","Type":"ContainerStarted","Data":"0e0331656b5918e7d08fdd96d9702ce52f0d59abe0fe9dc4c6dcc82e889d2598"} Dec 02 23:06:58 crc kubenswrapper[4696]: I1202 23:06:58.392004 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ndfd4" podStartSLOduration=1.483340467 podStartE2EDuration="11.391979324s" podCreationTimestamp="2025-12-02 23:06:47 +0000 UTC" firstStartedPulling="2025-12-02 23:06:47.986609462 +0000 UTC m=+1470.867289473" lastFinishedPulling="2025-12-02 23:06:57.895248329 +0000 UTC m=+1480.775928330" observedRunningTime="2025-12-02 23:06:58.384224155 +0000 UTC m=+1481.264904166" watchObservedRunningTime="2025-12-02 23:06:58.391979324 +0000 UTC m=+1481.272659325" Dec 02 23:07:10 crc kubenswrapper[4696]: I1202 23:07:10.490014 4696 generic.go:334] "Generic (PLEG): container finished" podID="d6b40a87-ecaf-4f50-a3e0-04235ce0f029" containerID="0e0331656b5918e7d08fdd96d9702ce52f0d59abe0fe9dc4c6dcc82e889d2598" exitCode=0 Dec 02 23:07:10 crc kubenswrapper[4696]: I1202 23:07:10.490107 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ndfd4" 
event={"ID":"d6b40a87-ecaf-4f50-a3e0-04235ce0f029","Type":"ContainerDied","Data":"0e0331656b5918e7d08fdd96d9702ce52f0d59abe0fe9dc4c6dcc82e889d2598"} Dec 02 23:07:11 crc kubenswrapper[4696]: I1202 23:07:11.988856 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ndfd4" Dec 02 23:07:12 crc kubenswrapper[4696]: I1202 23:07:12.063179 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d6b40a87-ecaf-4f50-a3e0-04235ce0f029-ssh-key\") pod \"d6b40a87-ecaf-4f50-a3e0-04235ce0f029\" (UID: \"d6b40a87-ecaf-4f50-a3e0-04235ce0f029\") " Dec 02 23:07:12 crc kubenswrapper[4696]: I1202 23:07:12.063540 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6b40a87-ecaf-4f50-a3e0-04235ce0f029-repo-setup-combined-ca-bundle\") pod \"d6b40a87-ecaf-4f50-a3e0-04235ce0f029\" (UID: \"d6b40a87-ecaf-4f50-a3e0-04235ce0f029\") " Dec 02 23:07:12 crc kubenswrapper[4696]: I1202 23:07:12.063682 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d6b40a87-ecaf-4f50-a3e0-04235ce0f029-inventory\") pod \"d6b40a87-ecaf-4f50-a3e0-04235ce0f029\" (UID: \"d6b40a87-ecaf-4f50-a3e0-04235ce0f029\") " Dec 02 23:07:12 crc kubenswrapper[4696]: I1202 23:07:12.063858 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8xcqv\" (UniqueName: \"kubernetes.io/projected/d6b40a87-ecaf-4f50-a3e0-04235ce0f029-kube-api-access-8xcqv\") pod \"d6b40a87-ecaf-4f50-a3e0-04235ce0f029\" (UID: \"d6b40a87-ecaf-4f50-a3e0-04235ce0f029\") " Dec 02 23:07:12 crc kubenswrapper[4696]: I1202 23:07:12.071182 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/d6b40a87-ecaf-4f50-a3e0-04235ce0f029-kube-api-access-8xcqv" (OuterVolumeSpecName: "kube-api-access-8xcqv") pod "d6b40a87-ecaf-4f50-a3e0-04235ce0f029" (UID: "d6b40a87-ecaf-4f50-a3e0-04235ce0f029"). InnerVolumeSpecName "kube-api-access-8xcqv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:07:12 crc kubenswrapper[4696]: I1202 23:07:12.080413 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6b40a87-ecaf-4f50-a3e0-04235ce0f029-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "d6b40a87-ecaf-4f50-a3e0-04235ce0f029" (UID: "d6b40a87-ecaf-4f50-a3e0-04235ce0f029"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:07:12 crc kubenswrapper[4696]: I1202 23:07:12.097191 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6b40a87-ecaf-4f50-a3e0-04235ce0f029-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d6b40a87-ecaf-4f50-a3e0-04235ce0f029" (UID: "d6b40a87-ecaf-4f50-a3e0-04235ce0f029"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:07:12 crc kubenswrapper[4696]: I1202 23:07:12.097431 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6b40a87-ecaf-4f50-a3e0-04235ce0f029-inventory" (OuterVolumeSpecName: "inventory") pod "d6b40a87-ecaf-4f50-a3e0-04235ce0f029" (UID: "d6b40a87-ecaf-4f50-a3e0-04235ce0f029"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:07:12 crc kubenswrapper[4696]: I1202 23:07:12.167182 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8xcqv\" (UniqueName: \"kubernetes.io/projected/d6b40a87-ecaf-4f50-a3e0-04235ce0f029-kube-api-access-8xcqv\") on node \"crc\" DevicePath \"\"" Dec 02 23:07:12 crc kubenswrapper[4696]: I1202 23:07:12.167225 4696 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d6b40a87-ecaf-4f50-a3e0-04235ce0f029-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 23:07:12 crc kubenswrapper[4696]: I1202 23:07:12.167245 4696 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6b40a87-ecaf-4f50-a3e0-04235ce0f029-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 23:07:12 crc kubenswrapper[4696]: I1202 23:07:12.167258 4696 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d6b40a87-ecaf-4f50-a3e0-04235ce0f029-inventory\") on node \"crc\" DevicePath \"\"" Dec 02 23:07:12 crc kubenswrapper[4696]: I1202 23:07:12.516098 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ndfd4" event={"ID":"d6b40a87-ecaf-4f50-a3e0-04235ce0f029","Type":"ContainerDied","Data":"0c1354723996ea78f8b53a07f1a85e158e5d1ba3549f0811cc4cdba57fb6ab1a"} Dec 02 23:07:12 crc kubenswrapper[4696]: I1202 23:07:12.516159 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0c1354723996ea78f8b53a07f1a85e158e5d1ba3549f0811cc4cdba57fb6ab1a" Dec 02 23:07:12 crc kubenswrapper[4696]: I1202 23:07:12.516245 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ndfd4" Dec 02 23:07:12 crc kubenswrapper[4696]: I1202 23:07:12.618043 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-fjhhb"] Dec 02 23:07:12 crc kubenswrapper[4696]: E1202 23:07:12.618723 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6b40a87-ecaf-4f50-a3e0-04235ce0f029" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 02 23:07:12 crc kubenswrapper[4696]: I1202 23:07:12.618768 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6b40a87-ecaf-4f50-a3e0-04235ce0f029" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 02 23:07:12 crc kubenswrapper[4696]: I1202 23:07:12.619103 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6b40a87-ecaf-4f50-a3e0-04235ce0f029" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 02 23:07:12 crc kubenswrapper[4696]: I1202 23:07:12.620124 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fjhhb" Dec 02 23:07:12 crc kubenswrapper[4696]: I1202 23:07:12.624237 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-cvzmc" Dec 02 23:07:12 crc kubenswrapper[4696]: I1202 23:07:12.624244 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 02 23:07:12 crc kubenswrapper[4696]: I1202 23:07:12.625153 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 23:07:12 crc kubenswrapper[4696]: I1202 23:07:12.625438 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 02 23:07:12 crc kubenswrapper[4696]: I1202 23:07:12.632496 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-fjhhb"] Dec 02 23:07:12 crc kubenswrapper[4696]: I1202 23:07:12.679764 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ba5e6341-d5c7-41b8-adf8-59f3036d3838-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-fjhhb\" (UID: \"ba5e6341-d5c7-41b8-adf8-59f3036d3838\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fjhhb" Dec 02 23:07:12 crc kubenswrapper[4696]: I1202 23:07:12.679969 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9bpg\" (UniqueName: \"kubernetes.io/projected/ba5e6341-d5c7-41b8-adf8-59f3036d3838-kube-api-access-q9bpg\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-fjhhb\" (UID: \"ba5e6341-d5c7-41b8-adf8-59f3036d3838\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fjhhb" Dec 02 23:07:12 crc kubenswrapper[4696]: I1202 23:07:12.680028 4696 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ba5e6341-d5c7-41b8-adf8-59f3036d3838-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-fjhhb\" (UID: \"ba5e6341-d5c7-41b8-adf8-59f3036d3838\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fjhhb" Dec 02 23:07:12 crc kubenswrapper[4696]: I1202 23:07:12.781868 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9bpg\" (UniqueName: \"kubernetes.io/projected/ba5e6341-d5c7-41b8-adf8-59f3036d3838-kube-api-access-q9bpg\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-fjhhb\" (UID: \"ba5e6341-d5c7-41b8-adf8-59f3036d3838\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fjhhb" Dec 02 23:07:12 crc kubenswrapper[4696]: I1202 23:07:12.781939 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ba5e6341-d5c7-41b8-adf8-59f3036d3838-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-fjhhb\" (UID: \"ba5e6341-d5c7-41b8-adf8-59f3036d3838\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fjhhb" Dec 02 23:07:12 crc kubenswrapper[4696]: I1202 23:07:12.781995 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ba5e6341-d5c7-41b8-adf8-59f3036d3838-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-fjhhb\" (UID: \"ba5e6341-d5c7-41b8-adf8-59f3036d3838\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fjhhb" Dec 02 23:07:12 crc kubenswrapper[4696]: I1202 23:07:12.790222 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ba5e6341-d5c7-41b8-adf8-59f3036d3838-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-fjhhb\" (UID: \"ba5e6341-d5c7-41b8-adf8-59f3036d3838\") " 
pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fjhhb" Dec 02 23:07:12 crc kubenswrapper[4696]: I1202 23:07:12.790402 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ba5e6341-d5c7-41b8-adf8-59f3036d3838-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-fjhhb\" (UID: \"ba5e6341-d5c7-41b8-adf8-59f3036d3838\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fjhhb" Dec 02 23:07:12 crc kubenswrapper[4696]: I1202 23:07:12.810001 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9bpg\" (UniqueName: \"kubernetes.io/projected/ba5e6341-d5c7-41b8-adf8-59f3036d3838-kube-api-access-q9bpg\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-fjhhb\" (UID: \"ba5e6341-d5c7-41b8-adf8-59f3036d3838\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fjhhb" Dec 02 23:07:12 crc kubenswrapper[4696]: I1202 23:07:12.952126 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fjhhb" Dec 02 23:07:13 crc kubenswrapper[4696]: W1202 23:07:13.503688 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podba5e6341_d5c7_41b8_adf8_59f3036d3838.slice/crio-09d39d6ecaa1c00b55f853438ce2674a641808c50d6395ddf5e37f50246d7c53 WatchSource:0}: Error finding container 09d39d6ecaa1c00b55f853438ce2674a641808c50d6395ddf5e37f50246d7c53: Status 404 returned error can't find the container with id 09d39d6ecaa1c00b55f853438ce2674a641808c50d6395ddf5e37f50246d7c53 Dec 02 23:07:13 crc kubenswrapper[4696]: I1202 23:07:13.504157 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-fjhhb"] Dec 02 23:07:13 crc kubenswrapper[4696]: I1202 23:07:13.529716 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fjhhb" event={"ID":"ba5e6341-d5c7-41b8-adf8-59f3036d3838","Type":"ContainerStarted","Data":"09d39d6ecaa1c00b55f853438ce2674a641808c50d6395ddf5e37f50246d7c53"} Dec 02 23:07:14 crc kubenswrapper[4696]: I1202 23:07:14.574507 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fjhhb" event={"ID":"ba5e6341-d5c7-41b8-adf8-59f3036d3838","Type":"ContainerStarted","Data":"c659041f1f9f0317f14b79757814d31ca28a9bf371e5a0a27c3b152f2553789d"} Dec 02 23:07:14 crc kubenswrapper[4696]: I1202 23:07:14.649172 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fjhhb" podStartSLOduration=2.209614619 podStartE2EDuration="2.649146625s" podCreationTimestamp="2025-12-02 23:07:12 +0000 UTC" firstStartedPulling="2025-12-02 23:07:13.506579099 +0000 UTC m=+1496.387259090" lastFinishedPulling="2025-12-02 23:07:13.946111095 +0000 UTC m=+1496.826791096" observedRunningTime="2025-12-02 
23:07:14.613218364 +0000 UTC m=+1497.493898365" watchObservedRunningTime="2025-12-02 23:07:14.649146625 +0000 UTC m=+1497.529826636" Dec 02 23:07:17 crc kubenswrapper[4696]: I1202 23:07:17.615392 4696 generic.go:334] "Generic (PLEG): container finished" podID="ba5e6341-d5c7-41b8-adf8-59f3036d3838" containerID="c659041f1f9f0317f14b79757814d31ca28a9bf371e5a0a27c3b152f2553789d" exitCode=0 Dec 02 23:07:17 crc kubenswrapper[4696]: I1202 23:07:17.616451 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fjhhb" event={"ID":"ba5e6341-d5c7-41b8-adf8-59f3036d3838","Type":"ContainerDied","Data":"c659041f1f9f0317f14b79757814d31ca28a9bf371e5a0a27c3b152f2553789d"} Dec 02 23:07:19 crc kubenswrapper[4696]: I1202 23:07:19.129482 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fjhhb" Dec 02 23:07:19 crc kubenswrapper[4696]: I1202 23:07:19.294036 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ba5e6341-d5c7-41b8-adf8-59f3036d3838-inventory\") pod \"ba5e6341-d5c7-41b8-adf8-59f3036d3838\" (UID: \"ba5e6341-d5c7-41b8-adf8-59f3036d3838\") " Dec 02 23:07:19 crc kubenswrapper[4696]: I1202 23:07:19.294579 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q9bpg\" (UniqueName: \"kubernetes.io/projected/ba5e6341-d5c7-41b8-adf8-59f3036d3838-kube-api-access-q9bpg\") pod \"ba5e6341-d5c7-41b8-adf8-59f3036d3838\" (UID: \"ba5e6341-d5c7-41b8-adf8-59f3036d3838\") " Dec 02 23:07:19 crc kubenswrapper[4696]: I1202 23:07:19.294775 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ba5e6341-d5c7-41b8-adf8-59f3036d3838-ssh-key\") pod \"ba5e6341-d5c7-41b8-adf8-59f3036d3838\" (UID: \"ba5e6341-d5c7-41b8-adf8-59f3036d3838\") " Dec 02 23:07:19 crc 
kubenswrapper[4696]: I1202 23:07:19.302773 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba5e6341-d5c7-41b8-adf8-59f3036d3838-kube-api-access-q9bpg" (OuterVolumeSpecName: "kube-api-access-q9bpg") pod "ba5e6341-d5c7-41b8-adf8-59f3036d3838" (UID: "ba5e6341-d5c7-41b8-adf8-59f3036d3838"). InnerVolumeSpecName "kube-api-access-q9bpg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:07:19 crc kubenswrapper[4696]: I1202 23:07:19.325466 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba5e6341-d5c7-41b8-adf8-59f3036d3838-inventory" (OuterVolumeSpecName: "inventory") pod "ba5e6341-d5c7-41b8-adf8-59f3036d3838" (UID: "ba5e6341-d5c7-41b8-adf8-59f3036d3838"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:07:19 crc kubenswrapper[4696]: I1202 23:07:19.327911 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba5e6341-d5c7-41b8-adf8-59f3036d3838-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "ba5e6341-d5c7-41b8-adf8-59f3036d3838" (UID: "ba5e6341-d5c7-41b8-adf8-59f3036d3838"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:07:19 crc kubenswrapper[4696]: I1202 23:07:19.397240 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q9bpg\" (UniqueName: \"kubernetes.io/projected/ba5e6341-d5c7-41b8-adf8-59f3036d3838-kube-api-access-q9bpg\") on node \"crc\" DevicePath \"\"" Dec 02 23:07:19 crc kubenswrapper[4696]: I1202 23:07:19.397287 4696 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ba5e6341-d5c7-41b8-adf8-59f3036d3838-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 23:07:19 crc kubenswrapper[4696]: I1202 23:07:19.397298 4696 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ba5e6341-d5c7-41b8-adf8-59f3036d3838-inventory\") on node \"crc\" DevicePath \"\"" Dec 02 23:07:19 crc kubenswrapper[4696]: I1202 23:07:19.657015 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fjhhb" event={"ID":"ba5e6341-d5c7-41b8-adf8-59f3036d3838","Type":"ContainerDied","Data":"09d39d6ecaa1c00b55f853438ce2674a641808c50d6395ddf5e37f50246d7c53"} Dec 02 23:07:19 crc kubenswrapper[4696]: I1202 23:07:19.657088 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="09d39d6ecaa1c00b55f853438ce2674a641808c50d6395ddf5e37f50246d7c53" Dec 02 23:07:19 crc kubenswrapper[4696]: I1202 23:07:19.657190 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fjhhb" Dec 02 23:07:19 crc kubenswrapper[4696]: I1202 23:07:19.751998 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rmcht"] Dec 02 23:07:19 crc kubenswrapper[4696]: E1202 23:07:19.752994 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba5e6341-d5c7-41b8-adf8-59f3036d3838" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Dec 02 23:07:19 crc kubenswrapper[4696]: I1202 23:07:19.753033 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba5e6341-d5c7-41b8-adf8-59f3036d3838" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Dec 02 23:07:19 crc kubenswrapper[4696]: I1202 23:07:19.753426 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba5e6341-d5c7-41b8-adf8-59f3036d3838" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Dec 02 23:07:19 crc kubenswrapper[4696]: I1202 23:07:19.755163 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rmcht" Dec 02 23:07:19 crc kubenswrapper[4696]: I1202 23:07:19.759262 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 02 23:07:19 crc kubenswrapper[4696]: I1202 23:07:19.759389 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-cvzmc" Dec 02 23:07:19 crc kubenswrapper[4696]: I1202 23:07:19.759419 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 02 23:07:19 crc kubenswrapper[4696]: I1202 23:07:19.760868 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 23:07:19 crc kubenswrapper[4696]: I1202 23:07:19.763658 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rmcht"] Dec 02 23:07:19 crc kubenswrapper[4696]: I1202 23:07:19.912096 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/10db9578-c367-420b-ba4f-93729e4d9483-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rmcht\" (UID: \"10db9578-c367-420b-ba4f-93729e4d9483\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rmcht" Dec 02 23:07:19 crc kubenswrapper[4696]: I1202 23:07:19.912185 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/10db9578-c367-420b-ba4f-93729e4d9483-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rmcht\" (UID: \"10db9578-c367-420b-ba4f-93729e4d9483\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rmcht" Dec 02 23:07:19 crc kubenswrapper[4696]: I1202 23:07:19.912238 4696 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxsw4\" (UniqueName: \"kubernetes.io/projected/10db9578-c367-420b-ba4f-93729e4d9483-kube-api-access-mxsw4\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rmcht\" (UID: \"10db9578-c367-420b-ba4f-93729e4d9483\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rmcht" Dec 02 23:07:19 crc kubenswrapper[4696]: I1202 23:07:19.912674 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10db9578-c367-420b-ba4f-93729e4d9483-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rmcht\" (UID: \"10db9578-c367-420b-ba4f-93729e4d9483\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rmcht" Dec 02 23:07:20 crc kubenswrapper[4696]: I1202 23:07:20.014971 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/10db9578-c367-420b-ba4f-93729e4d9483-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rmcht\" (UID: \"10db9578-c367-420b-ba4f-93729e4d9483\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rmcht" Dec 02 23:07:20 crc kubenswrapper[4696]: I1202 23:07:20.015037 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/10db9578-c367-420b-ba4f-93729e4d9483-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rmcht\" (UID: \"10db9578-c367-420b-ba4f-93729e4d9483\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rmcht" Dec 02 23:07:20 crc kubenswrapper[4696]: I1202 23:07:20.015073 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxsw4\" (UniqueName: \"kubernetes.io/projected/10db9578-c367-420b-ba4f-93729e4d9483-kube-api-access-mxsw4\") pod 
\"bootstrap-edpm-deployment-openstack-edpm-ipam-rmcht\" (UID: \"10db9578-c367-420b-ba4f-93729e4d9483\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rmcht" Dec 02 23:07:20 crc kubenswrapper[4696]: I1202 23:07:20.015198 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10db9578-c367-420b-ba4f-93729e4d9483-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rmcht\" (UID: \"10db9578-c367-420b-ba4f-93729e4d9483\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rmcht" Dec 02 23:07:20 crc kubenswrapper[4696]: I1202 23:07:20.022434 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/10db9578-c367-420b-ba4f-93729e4d9483-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rmcht\" (UID: \"10db9578-c367-420b-ba4f-93729e4d9483\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rmcht" Dec 02 23:07:20 crc kubenswrapper[4696]: I1202 23:07:20.023724 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10db9578-c367-420b-ba4f-93729e4d9483-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rmcht\" (UID: \"10db9578-c367-420b-ba4f-93729e4d9483\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rmcht" Dec 02 23:07:20 crc kubenswrapper[4696]: I1202 23:07:20.024379 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/10db9578-c367-420b-ba4f-93729e4d9483-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rmcht\" (UID: \"10db9578-c367-420b-ba4f-93729e4d9483\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rmcht" Dec 02 23:07:20 crc kubenswrapper[4696]: I1202 23:07:20.039963 4696 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxsw4\" (UniqueName: \"kubernetes.io/projected/10db9578-c367-420b-ba4f-93729e4d9483-kube-api-access-mxsw4\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rmcht\" (UID: \"10db9578-c367-420b-ba4f-93729e4d9483\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rmcht" Dec 02 23:07:20 crc kubenswrapper[4696]: I1202 23:07:20.081091 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rmcht" Dec 02 23:07:20 crc kubenswrapper[4696]: I1202 23:07:20.687306 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rmcht"] Dec 02 23:07:20 crc kubenswrapper[4696]: W1202 23:07:20.690282 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod10db9578_c367_420b_ba4f_93729e4d9483.slice/crio-b80ac0de928627a031f793ce6472107552566652e1ea193dc83d4c9ad6dd59c0 WatchSource:0}: Error finding container b80ac0de928627a031f793ce6472107552566652e1ea193dc83d4c9ad6dd59c0: Status 404 returned error can't find the container with id b80ac0de928627a031f793ce6472107552566652e1ea193dc83d4c9ad6dd59c0 Dec 02 23:07:21 crc kubenswrapper[4696]: I1202 23:07:21.688589 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rmcht" event={"ID":"10db9578-c367-420b-ba4f-93729e4d9483","Type":"ContainerStarted","Data":"5ea5bda22a8b717be409a23093863c920c5c28c1e5590eb83df653884f600ffd"} Dec 02 23:07:21 crc kubenswrapper[4696]: I1202 23:07:21.689291 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rmcht" event={"ID":"10db9578-c367-420b-ba4f-93729e4d9483","Type":"ContainerStarted","Data":"b80ac0de928627a031f793ce6472107552566652e1ea193dc83d4c9ad6dd59c0"} Dec 02 23:07:21 
crc kubenswrapper[4696]: I1202 23:07:21.726321 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rmcht" podStartSLOduration=2.284473378 podStartE2EDuration="2.726284218s" podCreationTimestamp="2025-12-02 23:07:19 +0000 UTC" firstStartedPulling="2025-12-02 23:07:20.692555264 +0000 UTC m=+1503.573235285" lastFinishedPulling="2025-12-02 23:07:21.134366124 +0000 UTC m=+1504.015046125" observedRunningTime="2025-12-02 23:07:21.711584504 +0000 UTC m=+1504.592264585" watchObservedRunningTime="2025-12-02 23:07:21.726284218 +0000 UTC m=+1504.606964249" Dec 02 23:07:33 crc kubenswrapper[4696]: I1202 23:07:33.811341 4696 scope.go:117] "RemoveContainer" containerID="65ca14fe80d54e327a3edabf25ddf17d55ca0f28a98fbc5f635f51e6c8cd1ed9" Dec 02 23:07:33 crc kubenswrapper[4696]: I1202 23:07:33.870244 4696 scope.go:117] "RemoveContainer" containerID="3fe2d4b6bb592a2ba71ca2794fe2992b29c9d86d77463744428a5f27988b2ede" Dec 02 23:07:33 crc kubenswrapper[4696]: I1202 23:07:33.938240 4696 scope.go:117] "RemoveContainer" containerID="c21c71b6337a1ce2a71c7b36736be88b11a1b7649d078b9bc5a9a24cd00c8afa" Dec 02 23:08:21 crc kubenswrapper[4696]: I1202 23:08:21.561126 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-69sqj"] Dec 02 23:08:21 crc kubenswrapper[4696]: I1202 23:08:21.564455 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-69sqj" Dec 02 23:08:21 crc kubenswrapper[4696]: I1202 23:08:21.575631 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-69sqj"] Dec 02 23:08:21 crc kubenswrapper[4696]: I1202 23:08:21.612264 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a9e9802-4b41-4aec-a04a-ed54682b1307-utilities\") pod \"community-operators-69sqj\" (UID: \"6a9e9802-4b41-4aec-a04a-ed54682b1307\") " pod="openshift-marketplace/community-operators-69sqj" Dec 02 23:08:21 crc kubenswrapper[4696]: I1202 23:08:21.612332 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a9e9802-4b41-4aec-a04a-ed54682b1307-catalog-content\") pod \"community-operators-69sqj\" (UID: \"6a9e9802-4b41-4aec-a04a-ed54682b1307\") " pod="openshift-marketplace/community-operators-69sqj" Dec 02 23:08:21 crc kubenswrapper[4696]: I1202 23:08:21.612396 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5qf9\" (UniqueName: \"kubernetes.io/projected/6a9e9802-4b41-4aec-a04a-ed54682b1307-kube-api-access-r5qf9\") pod \"community-operators-69sqj\" (UID: \"6a9e9802-4b41-4aec-a04a-ed54682b1307\") " pod="openshift-marketplace/community-operators-69sqj" Dec 02 23:08:21 crc kubenswrapper[4696]: I1202 23:08:21.715358 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a9e9802-4b41-4aec-a04a-ed54682b1307-utilities\") pod \"community-operators-69sqj\" (UID: \"6a9e9802-4b41-4aec-a04a-ed54682b1307\") " pod="openshift-marketplace/community-operators-69sqj" Dec 02 23:08:21 crc kubenswrapper[4696]: I1202 23:08:21.715444 4696 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a9e9802-4b41-4aec-a04a-ed54682b1307-catalog-content\") pod \"community-operators-69sqj\" (UID: \"6a9e9802-4b41-4aec-a04a-ed54682b1307\") " pod="openshift-marketplace/community-operators-69sqj" Dec 02 23:08:21 crc kubenswrapper[4696]: I1202 23:08:21.715513 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5qf9\" (UniqueName: \"kubernetes.io/projected/6a9e9802-4b41-4aec-a04a-ed54682b1307-kube-api-access-r5qf9\") pod \"community-operators-69sqj\" (UID: \"6a9e9802-4b41-4aec-a04a-ed54682b1307\") " pod="openshift-marketplace/community-operators-69sqj" Dec 02 23:08:21 crc kubenswrapper[4696]: I1202 23:08:21.716134 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a9e9802-4b41-4aec-a04a-ed54682b1307-utilities\") pod \"community-operators-69sqj\" (UID: \"6a9e9802-4b41-4aec-a04a-ed54682b1307\") " pod="openshift-marketplace/community-operators-69sqj" Dec 02 23:08:21 crc kubenswrapper[4696]: I1202 23:08:21.716431 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a9e9802-4b41-4aec-a04a-ed54682b1307-catalog-content\") pod \"community-operators-69sqj\" (UID: \"6a9e9802-4b41-4aec-a04a-ed54682b1307\") " pod="openshift-marketplace/community-operators-69sqj" Dec 02 23:08:21 crc kubenswrapper[4696]: I1202 23:08:21.740366 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5qf9\" (UniqueName: \"kubernetes.io/projected/6a9e9802-4b41-4aec-a04a-ed54682b1307-kube-api-access-r5qf9\") pod \"community-operators-69sqj\" (UID: \"6a9e9802-4b41-4aec-a04a-ed54682b1307\") " pod="openshift-marketplace/community-operators-69sqj" Dec 02 23:08:21 crc kubenswrapper[4696]: I1202 23:08:21.903469 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-69sqj" Dec 02 23:08:22 crc kubenswrapper[4696]: I1202 23:08:22.474630 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-69sqj"] Dec 02 23:08:23 crc kubenswrapper[4696]: I1202 23:08:23.488156 4696 generic.go:334] "Generic (PLEG): container finished" podID="6a9e9802-4b41-4aec-a04a-ed54682b1307" containerID="3dc2866b3fd945ad37c47f871ae3bfe476b9fd6e7020faff46b29791ffe580f5" exitCode=0 Dec 02 23:08:23 crc kubenswrapper[4696]: I1202 23:08:23.488599 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-69sqj" event={"ID":"6a9e9802-4b41-4aec-a04a-ed54682b1307","Type":"ContainerDied","Data":"3dc2866b3fd945ad37c47f871ae3bfe476b9fd6e7020faff46b29791ffe580f5"} Dec 02 23:08:23 crc kubenswrapper[4696]: I1202 23:08:23.488642 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-69sqj" event={"ID":"6a9e9802-4b41-4aec-a04a-ed54682b1307","Type":"ContainerStarted","Data":"fa28ef7dd04ff6f0795ca8e26569a41ea5e531f707e47eb6bc3367f1c26d8669"} Dec 02 23:08:24 crc kubenswrapper[4696]: I1202 23:08:24.501599 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-69sqj" event={"ID":"6a9e9802-4b41-4aec-a04a-ed54682b1307","Type":"ContainerStarted","Data":"9c813bac8cccacf3c80ce9c863583e807b394278e0c2750122984479f2c8595e"} Dec 02 23:08:25 crc kubenswrapper[4696]: I1202 23:08:25.521574 4696 generic.go:334] "Generic (PLEG): container finished" podID="6a9e9802-4b41-4aec-a04a-ed54682b1307" containerID="9c813bac8cccacf3c80ce9c863583e807b394278e0c2750122984479f2c8595e" exitCode=0 Dec 02 23:08:25 crc kubenswrapper[4696]: I1202 23:08:25.521636 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-69sqj" 
event={"ID":"6a9e9802-4b41-4aec-a04a-ed54682b1307","Type":"ContainerDied","Data":"9c813bac8cccacf3c80ce9c863583e807b394278e0c2750122984479f2c8595e"} Dec 02 23:08:26 crc kubenswrapper[4696]: I1202 23:08:26.536597 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-69sqj" event={"ID":"6a9e9802-4b41-4aec-a04a-ed54682b1307","Type":"ContainerStarted","Data":"160c0b7ed827089ac3347979bb7a1783d512d1effbda610790de63cfb2ee0388"} Dec 02 23:08:26 crc kubenswrapper[4696]: I1202 23:08:26.563709 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-69sqj" podStartSLOduration=3.089044963 podStartE2EDuration="5.563682727s" podCreationTimestamp="2025-12-02 23:08:21 +0000 UTC" firstStartedPulling="2025-12-02 23:08:23.491381153 +0000 UTC m=+1566.372061154" lastFinishedPulling="2025-12-02 23:08:25.966018917 +0000 UTC m=+1568.846698918" observedRunningTime="2025-12-02 23:08:26.557876216 +0000 UTC m=+1569.438556257" watchObservedRunningTime="2025-12-02 23:08:26.563682727 +0000 UTC m=+1569.444362728" Dec 02 23:08:31 crc kubenswrapper[4696]: I1202 23:08:31.904578 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-69sqj" Dec 02 23:08:31 crc kubenswrapper[4696]: I1202 23:08:31.905009 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-69sqj" Dec 02 23:08:31 crc kubenswrapper[4696]: I1202 23:08:31.972133 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-69sqj" Dec 02 23:08:32 crc kubenswrapper[4696]: I1202 23:08:32.697153 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-69sqj" Dec 02 23:08:32 crc kubenswrapper[4696]: I1202 23:08:32.768618 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-69sqj"] Dec 02 23:08:34 crc kubenswrapper[4696]: I1202 23:08:34.070487 4696 scope.go:117] "RemoveContainer" containerID="abaf4949223eb7ae80e1cea1e57969b340c4095d0c83f03d0effb1be18e79890" Dec 02 23:08:34 crc kubenswrapper[4696]: I1202 23:08:34.126264 4696 scope.go:117] "RemoveContainer" containerID="4bfb19b97cf856834313da9df5e9b696ff7b6748b5adb44897277ce5b4e4fd93" Dec 02 23:08:34 crc kubenswrapper[4696]: I1202 23:08:34.188791 4696 scope.go:117] "RemoveContainer" containerID="d7c660807a714eebb09c333f9c4eba8c74111d369b051290641bd9d2ad6dc472" Dec 02 23:08:34 crc kubenswrapper[4696]: I1202 23:08:34.654759 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-69sqj" podUID="6a9e9802-4b41-4aec-a04a-ed54682b1307" containerName="registry-server" containerID="cri-o://160c0b7ed827089ac3347979bb7a1783d512d1effbda610790de63cfb2ee0388" gracePeriod=2 Dec 02 23:08:35 crc kubenswrapper[4696]: I1202 23:08:35.197166 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-69sqj" Dec 02 23:08:35 crc kubenswrapper[4696]: I1202 23:08:35.255682 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a9e9802-4b41-4aec-a04a-ed54682b1307-catalog-content\") pod \"6a9e9802-4b41-4aec-a04a-ed54682b1307\" (UID: \"6a9e9802-4b41-4aec-a04a-ed54682b1307\") " Dec 02 23:08:35 crc kubenswrapper[4696]: I1202 23:08:35.255779 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r5qf9\" (UniqueName: \"kubernetes.io/projected/6a9e9802-4b41-4aec-a04a-ed54682b1307-kube-api-access-r5qf9\") pod \"6a9e9802-4b41-4aec-a04a-ed54682b1307\" (UID: \"6a9e9802-4b41-4aec-a04a-ed54682b1307\") " Dec 02 23:08:35 crc kubenswrapper[4696]: I1202 23:08:35.255828 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a9e9802-4b41-4aec-a04a-ed54682b1307-utilities\") pod \"6a9e9802-4b41-4aec-a04a-ed54682b1307\" (UID: \"6a9e9802-4b41-4aec-a04a-ed54682b1307\") " Dec 02 23:08:35 crc kubenswrapper[4696]: I1202 23:08:35.256999 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a9e9802-4b41-4aec-a04a-ed54682b1307-utilities" (OuterVolumeSpecName: "utilities") pod "6a9e9802-4b41-4aec-a04a-ed54682b1307" (UID: "6a9e9802-4b41-4aec-a04a-ed54682b1307"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:08:35 crc kubenswrapper[4696]: I1202 23:08:35.262874 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a9e9802-4b41-4aec-a04a-ed54682b1307-kube-api-access-r5qf9" (OuterVolumeSpecName: "kube-api-access-r5qf9") pod "6a9e9802-4b41-4aec-a04a-ed54682b1307" (UID: "6a9e9802-4b41-4aec-a04a-ed54682b1307"). InnerVolumeSpecName "kube-api-access-r5qf9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:08:35 crc kubenswrapper[4696]: I1202 23:08:35.319521 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a9e9802-4b41-4aec-a04a-ed54682b1307-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6a9e9802-4b41-4aec-a04a-ed54682b1307" (UID: "6a9e9802-4b41-4aec-a04a-ed54682b1307"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:08:35 crc kubenswrapper[4696]: I1202 23:08:35.360256 4696 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a9e9802-4b41-4aec-a04a-ed54682b1307-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 23:08:35 crc kubenswrapper[4696]: I1202 23:08:35.360296 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r5qf9\" (UniqueName: \"kubernetes.io/projected/6a9e9802-4b41-4aec-a04a-ed54682b1307-kube-api-access-r5qf9\") on node \"crc\" DevicePath \"\"" Dec 02 23:08:35 crc kubenswrapper[4696]: I1202 23:08:35.360312 4696 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a9e9802-4b41-4aec-a04a-ed54682b1307-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 23:08:35 crc kubenswrapper[4696]: I1202 23:08:35.667950 4696 generic.go:334] "Generic (PLEG): container finished" podID="6a9e9802-4b41-4aec-a04a-ed54682b1307" containerID="160c0b7ed827089ac3347979bb7a1783d512d1effbda610790de63cfb2ee0388" exitCode=0 Dec 02 23:08:35 crc kubenswrapper[4696]: I1202 23:08:35.668032 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-69sqj" event={"ID":"6a9e9802-4b41-4aec-a04a-ed54682b1307","Type":"ContainerDied","Data":"160c0b7ed827089ac3347979bb7a1783d512d1effbda610790de63cfb2ee0388"} Dec 02 23:08:35 crc kubenswrapper[4696]: I1202 23:08:35.668076 4696 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/community-operators-69sqj" Dec 02 23:08:35 crc kubenswrapper[4696]: I1202 23:08:35.668126 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-69sqj" event={"ID":"6a9e9802-4b41-4aec-a04a-ed54682b1307","Type":"ContainerDied","Data":"fa28ef7dd04ff6f0795ca8e26569a41ea5e531f707e47eb6bc3367f1c26d8669"} Dec 02 23:08:35 crc kubenswrapper[4696]: I1202 23:08:35.668157 4696 scope.go:117] "RemoveContainer" containerID="160c0b7ed827089ac3347979bb7a1783d512d1effbda610790de63cfb2ee0388" Dec 02 23:08:35 crc kubenswrapper[4696]: I1202 23:08:35.702920 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-69sqj"] Dec 02 23:08:35 crc kubenswrapper[4696]: I1202 23:08:35.708754 4696 scope.go:117] "RemoveContainer" containerID="9c813bac8cccacf3c80ce9c863583e807b394278e0c2750122984479f2c8595e" Dec 02 23:08:35 crc kubenswrapper[4696]: I1202 23:08:35.715272 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-69sqj"] Dec 02 23:08:35 crc kubenswrapper[4696]: I1202 23:08:35.738413 4696 scope.go:117] "RemoveContainer" containerID="3dc2866b3fd945ad37c47f871ae3bfe476b9fd6e7020faff46b29791ffe580f5" Dec 02 23:08:35 crc kubenswrapper[4696]: I1202 23:08:35.794040 4696 scope.go:117] "RemoveContainer" containerID="160c0b7ed827089ac3347979bb7a1783d512d1effbda610790de63cfb2ee0388" Dec 02 23:08:35 crc kubenswrapper[4696]: E1202 23:08:35.794455 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"160c0b7ed827089ac3347979bb7a1783d512d1effbda610790de63cfb2ee0388\": container with ID starting with 160c0b7ed827089ac3347979bb7a1783d512d1effbda610790de63cfb2ee0388 not found: ID does not exist" containerID="160c0b7ed827089ac3347979bb7a1783d512d1effbda610790de63cfb2ee0388" Dec 02 23:08:35 crc kubenswrapper[4696]: I1202 23:08:35.794500 
4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"160c0b7ed827089ac3347979bb7a1783d512d1effbda610790de63cfb2ee0388"} err="failed to get container status \"160c0b7ed827089ac3347979bb7a1783d512d1effbda610790de63cfb2ee0388\": rpc error: code = NotFound desc = could not find container \"160c0b7ed827089ac3347979bb7a1783d512d1effbda610790de63cfb2ee0388\": container with ID starting with 160c0b7ed827089ac3347979bb7a1783d512d1effbda610790de63cfb2ee0388 not found: ID does not exist" Dec 02 23:08:35 crc kubenswrapper[4696]: I1202 23:08:35.794532 4696 scope.go:117] "RemoveContainer" containerID="9c813bac8cccacf3c80ce9c863583e807b394278e0c2750122984479f2c8595e" Dec 02 23:08:35 crc kubenswrapper[4696]: E1202 23:08:35.795045 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c813bac8cccacf3c80ce9c863583e807b394278e0c2750122984479f2c8595e\": container with ID starting with 9c813bac8cccacf3c80ce9c863583e807b394278e0c2750122984479f2c8595e not found: ID does not exist" containerID="9c813bac8cccacf3c80ce9c863583e807b394278e0c2750122984479f2c8595e" Dec 02 23:08:35 crc kubenswrapper[4696]: I1202 23:08:35.795070 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c813bac8cccacf3c80ce9c863583e807b394278e0c2750122984479f2c8595e"} err="failed to get container status \"9c813bac8cccacf3c80ce9c863583e807b394278e0c2750122984479f2c8595e\": rpc error: code = NotFound desc = could not find container \"9c813bac8cccacf3c80ce9c863583e807b394278e0c2750122984479f2c8595e\": container with ID starting with 9c813bac8cccacf3c80ce9c863583e807b394278e0c2750122984479f2c8595e not found: ID does not exist" Dec 02 23:08:35 crc kubenswrapper[4696]: I1202 23:08:35.795085 4696 scope.go:117] "RemoveContainer" containerID="3dc2866b3fd945ad37c47f871ae3bfe476b9fd6e7020faff46b29791ffe580f5" Dec 02 23:08:35 crc kubenswrapper[4696]: E1202 
23:08:35.795330 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3dc2866b3fd945ad37c47f871ae3bfe476b9fd6e7020faff46b29791ffe580f5\": container with ID starting with 3dc2866b3fd945ad37c47f871ae3bfe476b9fd6e7020faff46b29791ffe580f5 not found: ID does not exist" containerID="3dc2866b3fd945ad37c47f871ae3bfe476b9fd6e7020faff46b29791ffe580f5" Dec 02 23:08:35 crc kubenswrapper[4696]: I1202 23:08:35.795357 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3dc2866b3fd945ad37c47f871ae3bfe476b9fd6e7020faff46b29791ffe580f5"} err="failed to get container status \"3dc2866b3fd945ad37c47f871ae3bfe476b9fd6e7020faff46b29791ffe580f5\": rpc error: code = NotFound desc = could not find container \"3dc2866b3fd945ad37c47f871ae3bfe476b9fd6e7020faff46b29791ffe580f5\": container with ID starting with 3dc2866b3fd945ad37c47f871ae3bfe476b9fd6e7020faff46b29791ffe580f5 not found: ID does not exist" Dec 02 23:08:37 crc kubenswrapper[4696]: I1202 23:08:37.452471 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a9e9802-4b41-4aec-a04a-ed54682b1307" path="/var/lib/kubelet/pods/6a9e9802-4b41-4aec-a04a-ed54682b1307/volumes" Dec 02 23:09:10 crc kubenswrapper[4696]: I1202 23:09:10.249037 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-m7r24"] Dec 02 23:09:10 crc kubenswrapper[4696]: E1202 23:09:10.250546 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a9e9802-4b41-4aec-a04a-ed54682b1307" containerName="extract-content" Dec 02 23:09:10 crc kubenswrapper[4696]: I1202 23:09:10.250568 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a9e9802-4b41-4aec-a04a-ed54682b1307" containerName="extract-content" Dec 02 23:09:10 crc kubenswrapper[4696]: E1202 23:09:10.250622 4696 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="6a9e9802-4b41-4aec-a04a-ed54682b1307" containerName="registry-server" Dec 02 23:09:10 crc kubenswrapper[4696]: I1202 23:09:10.250630 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a9e9802-4b41-4aec-a04a-ed54682b1307" containerName="registry-server" Dec 02 23:09:10 crc kubenswrapper[4696]: E1202 23:09:10.250647 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a9e9802-4b41-4aec-a04a-ed54682b1307" containerName="extract-utilities" Dec 02 23:09:10 crc kubenswrapper[4696]: I1202 23:09:10.250656 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a9e9802-4b41-4aec-a04a-ed54682b1307" containerName="extract-utilities" Dec 02 23:09:10 crc kubenswrapper[4696]: I1202 23:09:10.250967 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a9e9802-4b41-4aec-a04a-ed54682b1307" containerName="registry-server" Dec 02 23:09:10 crc kubenswrapper[4696]: I1202 23:09:10.254075 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-m7r24" Dec 02 23:09:10 crc kubenswrapper[4696]: I1202 23:09:10.259410 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-m7r24"] Dec 02 23:09:10 crc kubenswrapper[4696]: I1202 23:09:10.356416 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p78mf\" (UniqueName: \"kubernetes.io/projected/2710e5fa-e995-4fcf-b48d-30dd42225dd0-kube-api-access-p78mf\") pod \"certified-operators-m7r24\" (UID: \"2710e5fa-e995-4fcf-b48d-30dd42225dd0\") " pod="openshift-marketplace/certified-operators-m7r24" Dec 02 23:09:10 crc kubenswrapper[4696]: I1202 23:09:10.356505 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2710e5fa-e995-4fcf-b48d-30dd42225dd0-catalog-content\") pod \"certified-operators-m7r24\" (UID: 
\"2710e5fa-e995-4fcf-b48d-30dd42225dd0\") " pod="openshift-marketplace/certified-operators-m7r24" Dec 02 23:09:10 crc kubenswrapper[4696]: I1202 23:09:10.356556 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2710e5fa-e995-4fcf-b48d-30dd42225dd0-utilities\") pod \"certified-operators-m7r24\" (UID: \"2710e5fa-e995-4fcf-b48d-30dd42225dd0\") " pod="openshift-marketplace/certified-operators-m7r24" Dec 02 23:09:10 crc kubenswrapper[4696]: I1202 23:09:10.459341 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p78mf\" (UniqueName: \"kubernetes.io/projected/2710e5fa-e995-4fcf-b48d-30dd42225dd0-kube-api-access-p78mf\") pod \"certified-operators-m7r24\" (UID: \"2710e5fa-e995-4fcf-b48d-30dd42225dd0\") " pod="openshift-marketplace/certified-operators-m7r24" Dec 02 23:09:10 crc kubenswrapper[4696]: I1202 23:09:10.459864 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2710e5fa-e995-4fcf-b48d-30dd42225dd0-catalog-content\") pod \"certified-operators-m7r24\" (UID: \"2710e5fa-e995-4fcf-b48d-30dd42225dd0\") " pod="openshift-marketplace/certified-operators-m7r24" Dec 02 23:09:10 crc kubenswrapper[4696]: I1202 23:09:10.459905 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2710e5fa-e995-4fcf-b48d-30dd42225dd0-utilities\") pod \"certified-operators-m7r24\" (UID: \"2710e5fa-e995-4fcf-b48d-30dd42225dd0\") " pod="openshift-marketplace/certified-operators-m7r24" Dec 02 23:09:10 crc kubenswrapper[4696]: I1202 23:09:10.460329 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2710e5fa-e995-4fcf-b48d-30dd42225dd0-catalog-content\") pod \"certified-operators-m7r24\" (UID: 
\"2710e5fa-e995-4fcf-b48d-30dd42225dd0\") " pod="openshift-marketplace/certified-operators-m7r24" Dec 02 23:09:10 crc kubenswrapper[4696]: I1202 23:09:10.460456 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2710e5fa-e995-4fcf-b48d-30dd42225dd0-utilities\") pod \"certified-operators-m7r24\" (UID: \"2710e5fa-e995-4fcf-b48d-30dd42225dd0\") " pod="openshift-marketplace/certified-operators-m7r24" Dec 02 23:09:10 crc kubenswrapper[4696]: I1202 23:09:10.484688 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p78mf\" (UniqueName: \"kubernetes.io/projected/2710e5fa-e995-4fcf-b48d-30dd42225dd0-kube-api-access-p78mf\") pod \"certified-operators-m7r24\" (UID: \"2710e5fa-e995-4fcf-b48d-30dd42225dd0\") " pod="openshift-marketplace/certified-operators-m7r24" Dec 02 23:09:10 crc kubenswrapper[4696]: I1202 23:09:10.591264 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-m7r24" Dec 02 23:09:11 crc kubenswrapper[4696]: I1202 23:09:11.123659 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-m7r24"] Dec 02 23:09:11 crc kubenswrapper[4696]: I1202 23:09:11.161641 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m7r24" event={"ID":"2710e5fa-e995-4fcf-b48d-30dd42225dd0","Type":"ContainerStarted","Data":"2aee5266490d88a7c375d9bc8dee2741a6d0e88e11e8ae01f45c0e31ca7ca73e"} Dec 02 23:09:12 crc kubenswrapper[4696]: I1202 23:09:12.195071 4696 generic.go:334] "Generic (PLEG): container finished" podID="2710e5fa-e995-4fcf-b48d-30dd42225dd0" containerID="d3594e65bdd0ccdd9d25e7dc2a126340ad3c8383049dfc3129919c4b2f4436b8" exitCode=0 Dec 02 23:09:12 crc kubenswrapper[4696]: I1202 23:09:12.195527 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m7r24" 
event={"ID":"2710e5fa-e995-4fcf-b48d-30dd42225dd0","Type":"ContainerDied","Data":"d3594e65bdd0ccdd9d25e7dc2a126340ad3c8383049dfc3129919c4b2f4436b8"} Dec 02 23:09:12 crc kubenswrapper[4696]: I1202 23:09:12.210122 4696 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 02 23:09:13 crc kubenswrapper[4696]: I1202 23:09:13.210468 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m7r24" event={"ID":"2710e5fa-e995-4fcf-b48d-30dd42225dd0","Type":"ContainerStarted","Data":"84c52015fd31b437c81535cddc8184cea36367585fd397974fad2816bf56a69f"} Dec 02 23:09:14 crc kubenswrapper[4696]: I1202 23:09:14.228842 4696 generic.go:334] "Generic (PLEG): container finished" podID="2710e5fa-e995-4fcf-b48d-30dd42225dd0" containerID="84c52015fd31b437c81535cddc8184cea36367585fd397974fad2816bf56a69f" exitCode=0 Dec 02 23:09:14 crc kubenswrapper[4696]: I1202 23:09:14.228997 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m7r24" event={"ID":"2710e5fa-e995-4fcf-b48d-30dd42225dd0","Type":"ContainerDied","Data":"84c52015fd31b437c81535cddc8184cea36367585fd397974fad2816bf56a69f"} Dec 02 23:09:15 crc kubenswrapper[4696]: I1202 23:09:15.242247 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m7r24" event={"ID":"2710e5fa-e995-4fcf-b48d-30dd42225dd0","Type":"ContainerStarted","Data":"7d2d52569cbe53f8a460cd185908e062d2042b228e7891be95952b9aaa6b57bc"} Dec 02 23:09:15 crc kubenswrapper[4696]: I1202 23:09:15.264806 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-m7r24" podStartSLOduration=2.802327188 podStartE2EDuration="5.264783373s" podCreationTimestamp="2025-12-02 23:09:10 +0000 UTC" firstStartedPulling="2025-12-02 23:09:12.206068346 +0000 UTC m=+1615.086748397" lastFinishedPulling="2025-12-02 23:09:14.668524551 +0000 UTC 
m=+1617.549204582" observedRunningTime="2025-12-02 23:09:15.261035289 +0000 UTC m=+1618.141715290" watchObservedRunningTime="2025-12-02 23:09:15.264783373 +0000 UTC m=+1618.145463364" Dec 02 23:09:20 crc kubenswrapper[4696]: I1202 23:09:20.592407 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-m7r24" Dec 02 23:09:20 crc kubenswrapper[4696]: I1202 23:09:20.593590 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-m7r24" Dec 02 23:09:20 crc kubenswrapper[4696]: I1202 23:09:20.672848 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-m7r24" Dec 02 23:09:20 crc kubenswrapper[4696]: I1202 23:09:20.979231 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8t2zs"] Dec 02 23:09:20 crc kubenswrapper[4696]: I1202 23:09:20.981871 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8t2zs" Dec 02 23:09:20 crc kubenswrapper[4696]: I1202 23:09:20.985612 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1db226d-6e5f-486a-a57d-082d46dfd528-catalog-content\") pod \"redhat-marketplace-8t2zs\" (UID: \"a1db226d-6e5f-486a-a57d-082d46dfd528\") " pod="openshift-marketplace/redhat-marketplace-8t2zs" Dec 02 23:09:20 crc kubenswrapper[4696]: I1202 23:09:20.985784 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ld86x\" (UniqueName: \"kubernetes.io/projected/a1db226d-6e5f-486a-a57d-082d46dfd528-kube-api-access-ld86x\") pod \"redhat-marketplace-8t2zs\" (UID: \"a1db226d-6e5f-486a-a57d-082d46dfd528\") " pod="openshift-marketplace/redhat-marketplace-8t2zs" Dec 02 23:09:20 crc kubenswrapper[4696]: I1202 23:09:20.985819 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1db226d-6e5f-486a-a57d-082d46dfd528-utilities\") pod \"redhat-marketplace-8t2zs\" (UID: \"a1db226d-6e5f-486a-a57d-082d46dfd528\") " pod="openshift-marketplace/redhat-marketplace-8t2zs" Dec 02 23:09:20 crc kubenswrapper[4696]: I1202 23:09:20.995451 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8t2zs"] Dec 02 23:09:21 crc kubenswrapper[4696]: I1202 23:09:21.087015 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ld86x\" (UniqueName: \"kubernetes.io/projected/a1db226d-6e5f-486a-a57d-082d46dfd528-kube-api-access-ld86x\") pod \"redhat-marketplace-8t2zs\" (UID: \"a1db226d-6e5f-486a-a57d-082d46dfd528\") " pod="openshift-marketplace/redhat-marketplace-8t2zs" Dec 02 23:09:21 crc kubenswrapper[4696]: I1202 23:09:21.087087 4696 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1db226d-6e5f-486a-a57d-082d46dfd528-utilities\") pod \"redhat-marketplace-8t2zs\" (UID: \"a1db226d-6e5f-486a-a57d-082d46dfd528\") " pod="openshift-marketplace/redhat-marketplace-8t2zs" Dec 02 23:09:21 crc kubenswrapper[4696]: I1202 23:09:21.087193 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1db226d-6e5f-486a-a57d-082d46dfd528-catalog-content\") pod \"redhat-marketplace-8t2zs\" (UID: \"a1db226d-6e5f-486a-a57d-082d46dfd528\") " pod="openshift-marketplace/redhat-marketplace-8t2zs" Dec 02 23:09:21 crc kubenswrapper[4696]: I1202 23:09:21.088434 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1db226d-6e5f-486a-a57d-082d46dfd528-catalog-content\") pod \"redhat-marketplace-8t2zs\" (UID: \"a1db226d-6e5f-486a-a57d-082d46dfd528\") " pod="openshift-marketplace/redhat-marketplace-8t2zs" Dec 02 23:09:21 crc kubenswrapper[4696]: I1202 23:09:21.088607 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1db226d-6e5f-486a-a57d-082d46dfd528-utilities\") pod \"redhat-marketplace-8t2zs\" (UID: \"a1db226d-6e5f-486a-a57d-082d46dfd528\") " pod="openshift-marketplace/redhat-marketplace-8t2zs" Dec 02 23:09:21 crc kubenswrapper[4696]: I1202 23:09:21.112439 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ld86x\" (UniqueName: \"kubernetes.io/projected/a1db226d-6e5f-486a-a57d-082d46dfd528-kube-api-access-ld86x\") pod \"redhat-marketplace-8t2zs\" (UID: \"a1db226d-6e5f-486a-a57d-082d46dfd528\") " pod="openshift-marketplace/redhat-marketplace-8t2zs" Dec 02 23:09:21 crc kubenswrapper[4696]: I1202 23:09:21.307127 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8t2zs" Dec 02 23:09:21 crc kubenswrapper[4696]: I1202 23:09:21.392449 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-m7r24" Dec 02 23:09:21 crc kubenswrapper[4696]: I1202 23:09:21.841129 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8t2zs"] Dec 02 23:09:22 crc kubenswrapper[4696]: I1202 23:09:22.321847 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8t2zs" event={"ID":"a1db226d-6e5f-486a-a57d-082d46dfd528","Type":"ContainerStarted","Data":"2e577a6a7408859dafb411dd25afe565a53b1f436ff1bb627dc9f40a33ac9227"} Dec 02 23:09:22 crc kubenswrapper[4696]: I1202 23:09:22.973915 4696 patch_prober.go:28] interesting pod/machine-config-daemon-chq65 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 23:09:22 crc kubenswrapper[4696]: I1202 23:09:22.975118 4696 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 23:09:23 crc kubenswrapper[4696]: I1202 23:09:23.337569 4696 generic.go:334] "Generic (PLEG): container finished" podID="a1db226d-6e5f-486a-a57d-082d46dfd528" containerID="1786bbb6bfb2dc64f3e857de0cdf66dd376a4b884d099dac0445f421def29ba8" exitCode=0 Dec 02 23:09:23 crc kubenswrapper[4696]: I1202 23:09:23.338143 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8t2zs" 
event={"ID":"a1db226d-6e5f-486a-a57d-082d46dfd528","Type":"ContainerDied","Data":"1786bbb6bfb2dc64f3e857de0cdf66dd376a4b884d099dac0445f421def29ba8"} Dec 02 23:09:23 crc kubenswrapper[4696]: I1202 23:09:23.775392 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-m7r24"] Dec 02 23:09:23 crc kubenswrapper[4696]: I1202 23:09:23.775681 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-m7r24" podUID="2710e5fa-e995-4fcf-b48d-30dd42225dd0" containerName="registry-server" containerID="cri-o://7d2d52569cbe53f8a460cd185908e062d2042b228e7891be95952b9aaa6b57bc" gracePeriod=2 Dec 02 23:09:24 crc kubenswrapper[4696]: I1202 23:09:24.352104 4696 generic.go:334] "Generic (PLEG): container finished" podID="2710e5fa-e995-4fcf-b48d-30dd42225dd0" containerID="7d2d52569cbe53f8a460cd185908e062d2042b228e7891be95952b9aaa6b57bc" exitCode=0 Dec 02 23:09:24 crc kubenswrapper[4696]: I1202 23:09:24.352190 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m7r24" event={"ID":"2710e5fa-e995-4fcf-b48d-30dd42225dd0","Type":"ContainerDied","Data":"7d2d52569cbe53f8a460cd185908e062d2042b228e7891be95952b9aaa6b57bc"} Dec 02 23:09:24 crc kubenswrapper[4696]: I1202 23:09:24.353952 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m7r24" event={"ID":"2710e5fa-e995-4fcf-b48d-30dd42225dd0","Type":"ContainerDied","Data":"2aee5266490d88a7c375d9bc8dee2741a6d0e88e11e8ae01f45c0e31ca7ca73e"} Dec 02 23:09:24 crc kubenswrapper[4696]: I1202 23:09:24.353968 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2aee5266490d88a7c375d9bc8dee2741a6d0e88e11e8ae01f45c0e31ca7ca73e" Dec 02 23:09:24 crc kubenswrapper[4696]: I1202 23:09:24.356768 4696 generic.go:334] "Generic (PLEG): container finished" podID="a1db226d-6e5f-486a-a57d-082d46dfd528" 
containerID="7478e15c4d3c15163c2c22dbd982be5f35e54577009f8bcb63007916323e13d6" exitCode=0 Dec 02 23:09:24 crc kubenswrapper[4696]: I1202 23:09:24.356840 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8t2zs" event={"ID":"a1db226d-6e5f-486a-a57d-082d46dfd528","Type":"ContainerDied","Data":"7478e15c4d3c15163c2c22dbd982be5f35e54577009f8bcb63007916323e13d6"} Dec 02 23:09:24 crc kubenswrapper[4696]: I1202 23:09:24.386313 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-m7r24" Dec 02 23:09:24 crc kubenswrapper[4696]: I1202 23:09:24.576752 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p78mf\" (UniqueName: \"kubernetes.io/projected/2710e5fa-e995-4fcf-b48d-30dd42225dd0-kube-api-access-p78mf\") pod \"2710e5fa-e995-4fcf-b48d-30dd42225dd0\" (UID: \"2710e5fa-e995-4fcf-b48d-30dd42225dd0\") " Dec 02 23:09:24 crc kubenswrapper[4696]: I1202 23:09:24.577218 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2710e5fa-e995-4fcf-b48d-30dd42225dd0-catalog-content\") pod \"2710e5fa-e995-4fcf-b48d-30dd42225dd0\" (UID: \"2710e5fa-e995-4fcf-b48d-30dd42225dd0\") " Dec 02 23:09:24 crc kubenswrapper[4696]: I1202 23:09:24.577304 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2710e5fa-e995-4fcf-b48d-30dd42225dd0-utilities\") pod \"2710e5fa-e995-4fcf-b48d-30dd42225dd0\" (UID: \"2710e5fa-e995-4fcf-b48d-30dd42225dd0\") " Dec 02 23:09:24 crc kubenswrapper[4696]: I1202 23:09:24.578399 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2710e5fa-e995-4fcf-b48d-30dd42225dd0-utilities" (OuterVolumeSpecName: "utilities") pod "2710e5fa-e995-4fcf-b48d-30dd42225dd0" (UID: 
"2710e5fa-e995-4fcf-b48d-30dd42225dd0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:09:24 crc kubenswrapper[4696]: I1202 23:09:24.587551 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2710e5fa-e995-4fcf-b48d-30dd42225dd0-kube-api-access-p78mf" (OuterVolumeSpecName: "kube-api-access-p78mf") pod "2710e5fa-e995-4fcf-b48d-30dd42225dd0" (UID: "2710e5fa-e995-4fcf-b48d-30dd42225dd0"). InnerVolumeSpecName "kube-api-access-p78mf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:09:24 crc kubenswrapper[4696]: I1202 23:09:24.631485 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2710e5fa-e995-4fcf-b48d-30dd42225dd0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2710e5fa-e995-4fcf-b48d-30dd42225dd0" (UID: "2710e5fa-e995-4fcf-b48d-30dd42225dd0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:09:24 crc kubenswrapper[4696]: I1202 23:09:24.680177 4696 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2710e5fa-e995-4fcf-b48d-30dd42225dd0-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 23:09:24 crc kubenswrapper[4696]: I1202 23:09:24.680220 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p78mf\" (UniqueName: \"kubernetes.io/projected/2710e5fa-e995-4fcf-b48d-30dd42225dd0-kube-api-access-p78mf\") on node \"crc\" DevicePath \"\"" Dec 02 23:09:24 crc kubenswrapper[4696]: I1202 23:09:24.680234 4696 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2710e5fa-e995-4fcf-b48d-30dd42225dd0-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 23:09:25 crc kubenswrapper[4696]: I1202 23:09:25.371785 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-m7r24" Dec 02 23:09:25 crc kubenswrapper[4696]: I1202 23:09:25.373277 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8t2zs" event={"ID":"a1db226d-6e5f-486a-a57d-082d46dfd528","Type":"ContainerStarted","Data":"1a51b99bc328cb5775b70933f911ced6d4ece10ebf10c34006cb2213c70a63c8"} Dec 02 23:09:25 crc kubenswrapper[4696]: I1202 23:09:25.406899 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8t2zs" podStartSLOduration=3.946392966 podStartE2EDuration="5.406874328s" podCreationTimestamp="2025-12-02 23:09:20 +0000 UTC" firstStartedPulling="2025-12-02 23:09:23.341034823 +0000 UTC m=+1626.221714834" lastFinishedPulling="2025-12-02 23:09:24.801516205 +0000 UTC m=+1627.682196196" observedRunningTime="2025-12-02 23:09:25.397114917 +0000 UTC m=+1628.277794938" watchObservedRunningTime="2025-12-02 23:09:25.406874328 +0000 UTC m=+1628.287554329" Dec 02 23:09:25 crc kubenswrapper[4696]: I1202 23:09:25.456346 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-m7r24"] Dec 02 23:09:25 crc kubenswrapper[4696]: I1202 23:09:25.458201 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-m7r24"] Dec 02 23:09:27 crc kubenswrapper[4696]: I1202 23:09:27.450356 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2710e5fa-e995-4fcf-b48d-30dd42225dd0" path="/var/lib/kubelet/pods/2710e5fa-e995-4fcf-b48d-30dd42225dd0/volumes" Dec 02 23:09:31 crc kubenswrapper[4696]: I1202 23:09:31.307438 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8t2zs" Dec 02 23:09:31 crc kubenswrapper[4696]: I1202 23:09:31.308622 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-marketplace-8t2zs" Dec 02 23:09:31 crc kubenswrapper[4696]: I1202 23:09:31.359602 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8t2zs" Dec 02 23:09:31 crc kubenswrapper[4696]: I1202 23:09:31.511185 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-8t2zs" Dec 02 23:09:31 crc kubenswrapper[4696]: I1202 23:09:31.605436 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8t2zs"] Dec 02 23:09:33 crc kubenswrapper[4696]: I1202 23:09:33.471470 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-8t2zs" podUID="a1db226d-6e5f-486a-a57d-082d46dfd528" containerName="registry-server" containerID="cri-o://1a51b99bc328cb5775b70933f911ced6d4ece10ebf10c34006cb2213c70a63c8" gracePeriod=2 Dec 02 23:09:34 crc kubenswrapper[4696]: I1202 23:09:34.076603 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8t2zs" Dec 02 23:09:34 crc kubenswrapper[4696]: I1202 23:09:34.229094 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1db226d-6e5f-486a-a57d-082d46dfd528-utilities\") pod \"a1db226d-6e5f-486a-a57d-082d46dfd528\" (UID: \"a1db226d-6e5f-486a-a57d-082d46dfd528\") " Dec 02 23:09:34 crc kubenswrapper[4696]: I1202 23:09:34.229300 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ld86x\" (UniqueName: \"kubernetes.io/projected/a1db226d-6e5f-486a-a57d-082d46dfd528-kube-api-access-ld86x\") pod \"a1db226d-6e5f-486a-a57d-082d46dfd528\" (UID: \"a1db226d-6e5f-486a-a57d-082d46dfd528\") " Dec 02 23:09:34 crc kubenswrapper[4696]: I1202 23:09:34.229508 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1db226d-6e5f-486a-a57d-082d46dfd528-catalog-content\") pod \"a1db226d-6e5f-486a-a57d-082d46dfd528\" (UID: \"a1db226d-6e5f-486a-a57d-082d46dfd528\") " Dec 02 23:09:34 crc kubenswrapper[4696]: I1202 23:09:34.230445 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1db226d-6e5f-486a-a57d-082d46dfd528-utilities" (OuterVolumeSpecName: "utilities") pod "a1db226d-6e5f-486a-a57d-082d46dfd528" (UID: "a1db226d-6e5f-486a-a57d-082d46dfd528"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:09:34 crc kubenswrapper[4696]: I1202 23:09:34.237524 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1db226d-6e5f-486a-a57d-082d46dfd528-kube-api-access-ld86x" (OuterVolumeSpecName: "kube-api-access-ld86x") pod "a1db226d-6e5f-486a-a57d-082d46dfd528" (UID: "a1db226d-6e5f-486a-a57d-082d46dfd528"). InnerVolumeSpecName "kube-api-access-ld86x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:09:34 crc kubenswrapper[4696]: I1202 23:09:34.256290 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1db226d-6e5f-486a-a57d-082d46dfd528-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a1db226d-6e5f-486a-a57d-082d46dfd528" (UID: "a1db226d-6e5f-486a-a57d-082d46dfd528"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:09:34 crc kubenswrapper[4696]: I1202 23:09:34.333534 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ld86x\" (UniqueName: \"kubernetes.io/projected/a1db226d-6e5f-486a-a57d-082d46dfd528-kube-api-access-ld86x\") on node \"crc\" DevicePath \"\"" Dec 02 23:09:34 crc kubenswrapper[4696]: I1202 23:09:34.333689 4696 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1db226d-6e5f-486a-a57d-082d46dfd528-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 23:09:34 crc kubenswrapper[4696]: I1202 23:09:34.333712 4696 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1db226d-6e5f-486a-a57d-082d46dfd528-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 23:09:34 crc kubenswrapper[4696]: I1202 23:09:34.493221 4696 generic.go:334] "Generic (PLEG): container finished" podID="a1db226d-6e5f-486a-a57d-082d46dfd528" containerID="1a51b99bc328cb5775b70933f911ced6d4ece10ebf10c34006cb2213c70a63c8" exitCode=0 Dec 02 23:09:34 crc kubenswrapper[4696]: I1202 23:09:34.493278 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8t2zs" event={"ID":"a1db226d-6e5f-486a-a57d-082d46dfd528","Type":"ContainerDied","Data":"1a51b99bc328cb5775b70933f911ced6d4ece10ebf10c34006cb2213c70a63c8"} Dec 02 23:09:34 crc kubenswrapper[4696]: I1202 23:09:34.493313 4696 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-8t2zs" event={"ID":"a1db226d-6e5f-486a-a57d-082d46dfd528","Type":"ContainerDied","Data":"2e577a6a7408859dafb411dd25afe565a53b1f436ff1bb627dc9f40a33ac9227"} Dec 02 23:09:34 crc kubenswrapper[4696]: I1202 23:09:34.493345 4696 scope.go:117] "RemoveContainer" containerID="1a51b99bc328cb5775b70933f911ced6d4ece10ebf10c34006cb2213c70a63c8" Dec 02 23:09:34 crc kubenswrapper[4696]: I1202 23:09:34.493367 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8t2zs" Dec 02 23:09:34 crc kubenswrapper[4696]: I1202 23:09:34.523505 4696 scope.go:117] "RemoveContainer" containerID="7478e15c4d3c15163c2c22dbd982be5f35e54577009f8bcb63007916323e13d6" Dec 02 23:09:34 crc kubenswrapper[4696]: I1202 23:09:34.547073 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8t2zs"] Dec 02 23:09:34 crc kubenswrapper[4696]: I1202 23:09:34.557295 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-8t2zs"] Dec 02 23:09:34 crc kubenswrapper[4696]: I1202 23:09:34.561179 4696 scope.go:117] "RemoveContainer" containerID="1786bbb6bfb2dc64f3e857de0cdf66dd376a4b884d099dac0445f421def29ba8" Dec 02 23:09:34 crc kubenswrapper[4696]: I1202 23:09:34.626852 4696 scope.go:117] "RemoveContainer" containerID="1a51b99bc328cb5775b70933f911ced6d4ece10ebf10c34006cb2213c70a63c8" Dec 02 23:09:34 crc kubenswrapper[4696]: E1202 23:09:34.627859 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a51b99bc328cb5775b70933f911ced6d4ece10ebf10c34006cb2213c70a63c8\": container with ID starting with 1a51b99bc328cb5775b70933f911ced6d4ece10ebf10c34006cb2213c70a63c8 not found: ID does not exist" containerID="1a51b99bc328cb5775b70933f911ced6d4ece10ebf10c34006cb2213c70a63c8" Dec 02 23:09:34 crc kubenswrapper[4696]: I1202 23:09:34.627903 4696 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a51b99bc328cb5775b70933f911ced6d4ece10ebf10c34006cb2213c70a63c8"} err="failed to get container status \"1a51b99bc328cb5775b70933f911ced6d4ece10ebf10c34006cb2213c70a63c8\": rpc error: code = NotFound desc = could not find container \"1a51b99bc328cb5775b70933f911ced6d4ece10ebf10c34006cb2213c70a63c8\": container with ID starting with 1a51b99bc328cb5775b70933f911ced6d4ece10ebf10c34006cb2213c70a63c8 not found: ID does not exist" Dec 02 23:09:34 crc kubenswrapper[4696]: I1202 23:09:34.627934 4696 scope.go:117] "RemoveContainer" containerID="7478e15c4d3c15163c2c22dbd982be5f35e54577009f8bcb63007916323e13d6" Dec 02 23:09:34 crc kubenswrapper[4696]: E1202 23:09:34.628373 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7478e15c4d3c15163c2c22dbd982be5f35e54577009f8bcb63007916323e13d6\": container with ID starting with 7478e15c4d3c15163c2c22dbd982be5f35e54577009f8bcb63007916323e13d6 not found: ID does not exist" containerID="7478e15c4d3c15163c2c22dbd982be5f35e54577009f8bcb63007916323e13d6" Dec 02 23:09:34 crc kubenswrapper[4696]: I1202 23:09:34.628450 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7478e15c4d3c15163c2c22dbd982be5f35e54577009f8bcb63007916323e13d6"} err="failed to get container status \"7478e15c4d3c15163c2c22dbd982be5f35e54577009f8bcb63007916323e13d6\": rpc error: code = NotFound desc = could not find container \"7478e15c4d3c15163c2c22dbd982be5f35e54577009f8bcb63007916323e13d6\": container with ID starting with 7478e15c4d3c15163c2c22dbd982be5f35e54577009f8bcb63007916323e13d6 not found: ID does not exist" Dec 02 23:09:34 crc kubenswrapper[4696]: I1202 23:09:34.628500 4696 scope.go:117] "RemoveContainer" containerID="1786bbb6bfb2dc64f3e857de0cdf66dd376a4b884d099dac0445f421def29ba8" Dec 02 23:09:34 crc kubenswrapper[4696]: E1202 
23:09:34.629050 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1786bbb6bfb2dc64f3e857de0cdf66dd376a4b884d099dac0445f421def29ba8\": container with ID starting with 1786bbb6bfb2dc64f3e857de0cdf66dd376a4b884d099dac0445f421def29ba8 not found: ID does not exist" containerID="1786bbb6bfb2dc64f3e857de0cdf66dd376a4b884d099dac0445f421def29ba8" Dec 02 23:09:34 crc kubenswrapper[4696]: I1202 23:09:34.629127 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1786bbb6bfb2dc64f3e857de0cdf66dd376a4b884d099dac0445f421def29ba8"} err="failed to get container status \"1786bbb6bfb2dc64f3e857de0cdf66dd376a4b884d099dac0445f421def29ba8\": rpc error: code = NotFound desc = could not find container \"1786bbb6bfb2dc64f3e857de0cdf66dd376a4b884d099dac0445f421def29ba8\": container with ID starting with 1786bbb6bfb2dc64f3e857de0cdf66dd376a4b884d099dac0445f421def29ba8 not found: ID does not exist" Dec 02 23:09:35 crc kubenswrapper[4696]: I1202 23:09:35.445200 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1db226d-6e5f-486a-a57d-082d46dfd528" path="/var/lib/kubelet/pods/a1db226d-6e5f-486a-a57d-082d46dfd528/volumes" Dec 02 23:09:52 crc kubenswrapper[4696]: I1202 23:09:52.973649 4696 patch_prober.go:28] interesting pod/machine-config-daemon-chq65 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 23:09:52 crc kubenswrapper[4696]: I1202 23:09:52.974469 4696 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Dec 02 23:10:22 crc kubenswrapper[4696]: I1202 23:10:22.974789 4696 patch_prober.go:28] interesting pod/machine-config-daemon-chq65 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 23:10:22 crc kubenswrapper[4696]: I1202 23:10:22.976081 4696 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 23:10:22 crc kubenswrapper[4696]: I1202 23:10:22.976208 4696 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-chq65" Dec 02 23:10:22 crc kubenswrapper[4696]: I1202 23:10:22.978671 4696 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"631e32e77a29a8c8bd80f1cb8c03794b8f094b77b9ba9fa687732346c1ca36df"} pod="openshift-machine-config-operator/machine-config-daemon-chq65" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 23:10:22 crc kubenswrapper[4696]: I1202 23:10:22.978933 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" containerName="machine-config-daemon" containerID="cri-o://631e32e77a29a8c8bd80f1cb8c03794b8f094b77b9ba9fa687732346c1ca36df" gracePeriod=600 Dec 02 23:10:23 crc kubenswrapper[4696]: E1202 23:10:23.117848 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 02 23:10:23 crc kubenswrapper[4696]: I1202 23:10:23.208418 4696 generic.go:334] "Generic (PLEG): container finished" podID="53353260-c7c9-435c-91eb-3d5a1b441c4a" containerID="631e32e77a29a8c8bd80f1cb8c03794b8f094b77b9ba9fa687732346c1ca36df" exitCode=0 Dec 02 23:10:23 crc kubenswrapper[4696]: I1202 23:10:23.208488 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-chq65" event={"ID":"53353260-c7c9-435c-91eb-3d5a1b441c4a","Type":"ContainerDied","Data":"631e32e77a29a8c8bd80f1cb8c03794b8f094b77b9ba9fa687732346c1ca36df"} Dec 02 23:10:23 crc kubenswrapper[4696]: I1202 23:10:23.208568 4696 scope.go:117] "RemoveContainer" containerID="d569701d45b5d99a649219a29d07b9038d47beb7daf9fa209eda0483aa45abb9" Dec 02 23:10:23 crc kubenswrapper[4696]: I1202 23:10:23.209486 4696 scope.go:117] "RemoveContainer" containerID="631e32e77a29a8c8bd80f1cb8c03794b8f094b77b9ba9fa687732346c1ca36df" Dec 02 23:10:23 crc kubenswrapper[4696]: E1202 23:10:23.209788 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 02 23:10:36 crc kubenswrapper[4696]: I1202 23:10:36.432104 4696 scope.go:117] "RemoveContainer" containerID="631e32e77a29a8c8bd80f1cb8c03794b8f094b77b9ba9fa687732346c1ca36df" Dec 02 23:10:36 crc kubenswrapper[4696]: E1202 23:10:36.433402 4696 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 02 23:10:45 crc kubenswrapper[4696]: I1202 23:10:45.073125 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-v8bm6"] Dec 02 23:10:45 crc kubenswrapper[4696]: I1202 23:10:45.086635 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-db-create-24sjs"] Dec 02 23:10:45 crc kubenswrapper[4696]: I1202 23:10:45.102219 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-shqwd"] Dec 02 23:10:45 crc kubenswrapper[4696]: I1202 23:10:45.111368 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-ca12-account-create-update-l9pzc"] Dec 02 23:10:45 crc kubenswrapper[4696]: I1202 23:10:45.120045 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-ca12-account-create-update-l9pzc"] Dec 02 23:10:45 crc kubenswrapper[4696]: I1202 23:10:45.129840 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-shqwd"] Dec 02 23:10:45 crc kubenswrapper[4696]: I1202 23:10:45.138778 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-v8bm6"] Dec 02 23:10:45 crc kubenswrapper[4696]: I1202 23:10:45.148444 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-db-create-24sjs"] Dec 02 23:10:45 crc kubenswrapper[4696]: I1202 23:10:45.455820 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01fdd915-7b63-4733-85bb-06547a93c18a" path="/var/lib/kubelet/pods/01fdd915-7b63-4733-85bb-06547a93c18a/volumes" Dec 02 23:10:45 crc 
kubenswrapper[4696]: I1202 23:10:45.457079 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="613fe2a1-c5b2-460d-8715-040e5c6f4a4a" path="/var/lib/kubelet/pods/613fe2a1-c5b2-460d-8715-040e5c6f4a4a/volumes" Dec 02 23:10:45 crc kubenswrapper[4696]: I1202 23:10:45.458276 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99ced705-305f-4c70-80e4-4854e66dabe0" path="/var/lib/kubelet/pods/99ced705-305f-4c70-80e4-4854e66dabe0/volumes" Dec 02 23:10:45 crc kubenswrapper[4696]: I1202 23:10:45.459453 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e92140c3-85a2-4b5e-9f2a-604c46b8763f" path="/var/lib/kubelet/pods/e92140c3-85a2-4b5e-9f2a-604c46b8763f/volumes" Dec 02 23:10:46 crc kubenswrapper[4696]: I1202 23:10:46.064019 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-vfwzg"] Dec 02 23:10:46 crc kubenswrapper[4696]: I1202 23:10:46.081845 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-vfwzg"] Dec 02 23:10:46 crc kubenswrapper[4696]: I1202 23:10:46.096254 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-c4e5-account-create-update-j2qmq"] Dec 02 23:10:46 crc kubenswrapper[4696]: I1202 23:10:46.109547 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-37ae-account-create-update-28whg"] Dec 02 23:10:46 crc kubenswrapper[4696]: I1202 23:10:46.119426 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-bcff-account-create-update-zdbzv"] Dec 02 23:10:46 crc kubenswrapper[4696]: I1202 23:10:46.129082 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-c4e5-account-create-update-j2qmq"] Dec 02 23:10:46 crc kubenswrapper[4696]: I1202 23:10:46.138850 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-bcff-account-create-update-zdbzv"] Dec 02 23:10:46 crc kubenswrapper[4696]: I1202 23:10:46.148822 4696 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-37ae-account-create-update-28whg"] Dec 02 23:10:46 crc kubenswrapper[4696]: I1202 23:10:46.606180 4696 generic.go:334] "Generic (PLEG): container finished" podID="10db9578-c367-420b-ba4f-93729e4d9483" containerID="5ea5bda22a8b717be409a23093863c920c5c28c1e5590eb83df653884f600ffd" exitCode=0 Dec 02 23:10:46 crc kubenswrapper[4696]: I1202 23:10:46.606350 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rmcht" event={"ID":"10db9578-c367-420b-ba4f-93729e4d9483","Type":"ContainerDied","Data":"5ea5bda22a8b717be409a23093863c920c5c28c1e5590eb83df653884f600ffd"} Dec 02 23:10:47 crc kubenswrapper[4696]: I1202 23:10:47.441213 4696 scope.go:117] "RemoveContainer" containerID="631e32e77a29a8c8bd80f1cb8c03794b8f094b77b9ba9fa687732346c1ca36df" Dec 02 23:10:47 crc kubenswrapper[4696]: E1202 23:10:47.441557 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 02 23:10:47 crc kubenswrapper[4696]: I1202 23:10:47.445165 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="219c506b-9aa0-4926-9085-ff99b291382b" path="/var/lib/kubelet/pods/219c506b-9aa0-4926-9085-ff99b291382b/volumes" Dec 02 23:10:47 crc kubenswrapper[4696]: I1202 23:10:47.446126 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="769b3c83-63f8-4a20-b62a-6404415cb7de" path="/var/lib/kubelet/pods/769b3c83-63f8-4a20-b62a-6404415cb7de/volumes" Dec 02 23:10:47 crc kubenswrapper[4696]: I1202 23:10:47.446960 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="8b25ddeb-9306-4ed4-8bd8-b83f9e500985" path="/var/lib/kubelet/pods/8b25ddeb-9306-4ed4-8bd8-b83f9e500985/volumes" Dec 02 23:10:47 crc kubenswrapper[4696]: I1202 23:10:47.447772 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2016daf-7a4a-4f02-b75e-af8116362fe6" path="/var/lib/kubelet/pods/a2016daf-7a4a-4f02-b75e-af8116362fe6/volumes" Dec 02 23:10:48 crc kubenswrapper[4696]: I1202 23:10:48.115265 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rmcht" Dec 02 23:10:48 crc kubenswrapper[4696]: I1202 23:10:48.168036 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/10db9578-c367-420b-ba4f-93729e4d9483-ssh-key\") pod \"10db9578-c367-420b-ba4f-93729e4d9483\" (UID: \"10db9578-c367-420b-ba4f-93729e4d9483\") " Dec 02 23:10:48 crc kubenswrapper[4696]: I1202 23:10:48.168495 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mxsw4\" (UniqueName: \"kubernetes.io/projected/10db9578-c367-420b-ba4f-93729e4d9483-kube-api-access-mxsw4\") pod \"10db9578-c367-420b-ba4f-93729e4d9483\" (UID: \"10db9578-c367-420b-ba4f-93729e4d9483\") " Dec 02 23:10:48 crc kubenswrapper[4696]: I1202 23:10:48.168948 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10db9578-c367-420b-ba4f-93729e4d9483-bootstrap-combined-ca-bundle\") pod \"10db9578-c367-420b-ba4f-93729e4d9483\" (UID: \"10db9578-c367-420b-ba4f-93729e4d9483\") " Dec 02 23:10:48 crc kubenswrapper[4696]: I1202 23:10:48.169105 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/10db9578-c367-420b-ba4f-93729e4d9483-inventory\") pod \"10db9578-c367-420b-ba4f-93729e4d9483\" (UID: 
\"10db9578-c367-420b-ba4f-93729e4d9483\") " Dec 02 23:10:48 crc kubenswrapper[4696]: I1202 23:10:48.176951 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10db9578-c367-420b-ba4f-93729e4d9483-kube-api-access-mxsw4" (OuterVolumeSpecName: "kube-api-access-mxsw4") pod "10db9578-c367-420b-ba4f-93729e4d9483" (UID: "10db9578-c367-420b-ba4f-93729e4d9483"). InnerVolumeSpecName "kube-api-access-mxsw4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:10:48 crc kubenswrapper[4696]: I1202 23:10:48.178226 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10db9578-c367-420b-ba4f-93729e4d9483-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "10db9578-c367-420b-ba4f-93729e4d9483" (UID: "10db9578-c367-420b-ba4f-93729e4d9483"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:10:48 crc kubenswrapper[4696]: I1202 23:10:48.217023 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10db9578-c367-420b-ba4f-93729e4d9483-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "10db9578-c367-420b-ba4f-93729e4d9483" (UID: "10db9578-c367-420b-ba4f-93729e4d9483"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:10:48 crc kubenswrapper[4696]: I1202 23:10:48.218688 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10db9578-c367-420b-ba4f-93729e4d9483-inventory" (OuterVolumeSpecName: "inventory") pod "10db9578-c367-420b-ba4f-93729e4d9483" (UID: "10db9578-c367-420b-ba4f-93729e4d9483"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:10:48 crc kubenswrapper[4696]: I1202 23:10:48.273476 4696 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10db9578-c367-420b-ba4f-93729e4d9483-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 23:10:48 crc kubenswrapper[4696]: I1202 23:10:48.273557 4696 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/10db9578-c367-420b-ba4f-93729e4d9483-inventory\") on node \"crc\" DevicePath \"\"" Dec 02 23:10:48 crc kubenswrapper[4696]: I1202 23:10:48.273571 4696 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/10db9578-c367-420b-ba4f-93729e4d9483-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 23:10:48 crc kubenswrapper[4696]: I1202 23:10:48.273586 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mxsw4\" (UniqueName: \"kubernetes.io/projected/10db9578-c367-420b-ba4f-93729e4d9483-kube-api-access-mxsw4\") on node \"crc\" DevicePath \"\"" Dec 02 23:10:48 crc kubenswrapper[4696]: I1202 23:10:48.629365 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rmcht" event={"ID":"10db9578-c367-420b-ba4f-93729e4d9483","Type":"ContainerDied","Data":"b80ac0de928627a031f793ce6472107552566652e1ea193dc83d4c9ad6dd59c0"} Dec 02 23:10:48 crc kubenswrapper[4696]: I1202 23:10:48.629829 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b80ac0de928627a031f793ce6472107552566652e1ea193dc83d4c9ad6dd59c0" Dec 02 23:10:48 crc kubenswrapper[4696]: I1202 23:10:48.629902 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rmcht" Dec 02 23:10:48 crc kubenswrapper[4696]: I1202 23:10:48.794480 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-p4zx2"] Dec 02 23:10:48 crc kubenswrapper[4696]: E1202 23:10:48.795006 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10db9578-c367-420b-ba4f-93729e4d9483" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 02 23:10:48 crc kubenswrapper[4696]: I1202 23:10:48.795023 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="10db9578-c367-420b-ba4f-93729e4d9483" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 02 23:10:48 crc kubenswrapper[4696]: E1202 23:10:48.795045 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1db226d-6e5f-486a-a57d-082d46dfd528" containerName="extract-content" Dec 02 23:10:48 crc kubenswrapper[4696]: I1202 23:10:48.795051 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1db226d-6e5f-486a-a57d-082d46dfd528" containerName="extract-content" Dec 02 23:10:48 crc kubenswrapper[4696]: E1202 23:10:48.795063 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1db226d-6e5f-486a-a57d-082d46dfd528" containerName="registry-server" Dec 02 23:10:48 crc kubenswrapper[4696]: I1202 23:10:48.795069 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1db226d-6e5f-486a-a57d-082d46dfd528" containerName="registry-server" Dec 02 23:10:48 crc kubenswrapper[4696]: E1202 23:10:48.795097 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2710e5fa-e995-4fcf-b48d-30dd42225dd0" containerName="extract-utilities" Dec 02 23:10:48 crc kubenswrapper[4696]: I1202 23:10:48.795103 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="2710e5fa-e995-4fcf-b48d-30dd42225dd0" containerName="extract-utilities" Dec 02 23:10:48 crc kubenswrapper[4696]: E1202 23:10:48.795122 
4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2710e5fa-e995-4fcf-b48d-30dd42225dd0" containerName="registry-server" Dec 02 23:10:48 crc kubenswrapper[4696]: I1202 23:10:48.795127 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="2710e5fa-e995-4fcf-b48d-30dd42225dd0" containerName="registry-server" Dec 02 23:10:48 crc kubenswrapper[4696]: E1202 23:10:48.795133 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1db226d-6e5f-486a-a57d-082d46dfd528" containerName="extract-utilities" Dec 02 23:10:48 crc kubenswrapper[4696]: I1202 23:10:48.795139 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1db226d-6e5f-486a-a57d-082d46dfd528" containerName="extract-utilities" Dec 02 23:10:48 crc kubenswrapper[4696]: E1202 23:10:48.795151 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2710e5fa-e995-4fcf-b48d-30dd42225dd0" containerName="extract-content" Dec 02 23:10:48 crc kubenswrapper[4696]: I1202 23:10:48.795157 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="2710e5fa-e995-4fcf-b48d-30dd42225dd0" containerName="extract-content" Dec 02 23:10:48 crc kubenswrapper[4696]: I1202 23:10:48.795358 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1db226d-6e5f-486a-a57d-082d46dfd528" containerName="registry-server" Dec 02 23:10:48 crc kubenswrapper[4696]: I1202 23:10:48.795375 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="2710e5fa-e995-4fcf-b48d-30dd42225dd0" containerName="registry-server" Dec 02 23:10:48 crc kubenswrapper[4696]: I1202 23:10:48.795386 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="10db9578-c367-420b-ba4f-93729e4d9483" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 02 23:10:48 crc kubenswrapper[4696]: I1202 23:10:48.796166 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-p4zx2" Dec 02 23:10:48 crc kubenswrapper[4696]: I1202 23:10:48.800495 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 02 23:10:48 crc kubenswrapper[4696]: I1202 23:10:48.800667 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 23:10:48 crc kubenswrapper[4696]: I1202 23:10:48.800711 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 02 23:10:48 crc kubenswrapper[4696]: I1202 23:10:48.800784 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-cvzmc" Dec 02 23:10:48 crc kubenswrapper[4696]: I1202 23:10:48.822473 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-p4zx2"] Dec 02 23:10:48 crc kubenswrapper[4696]: I1202 23:10:48.885184 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a29810cf-fd6b-4021-8ae5-52612fb63cfc-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-p4zx2\" (UID: \"a29810cf-fd6b-4021-8ae5-52612fb63cfc\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-p4zx2" Dec 02 23:10:48 crc kubenswrapper[4696]: I1202 23:10:48.885270 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a29810cf-fd6b-4021-8ae5-52612fb63cfc-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-p4zx2\" (UID: \"a29810cf-fd6b-4021-8ae5-52612fb63cfc\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-p4zx2" Dec 02 23:10:48 crc kubenswrapper[4696]: I1202 23:10:48.885391 4696 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76khl\" (UniqueName: \"kubernetes.io/projected/a29810cf-fd6b-4021-8ae5-52612fb63cfc-kube-api-access-76khl\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-p4zx2\" (UID: \"a29810cf-fd6b-4021-8ae5-52612fb63cfc\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-p4zx2" Dec 02 23:10:48 crc kubenswrapper[4696]: I1202 23:10:48.987389 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a29810cf-fd6b-4021-8ae5-52612fb63cfc-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-p4zx2\" (UID: \"a29810cf-fd6b-4021-8ae5-52612fb63cfc\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-p4zx2" Dec 02 23:10:48 crc kubenswrapper[4696]: I1202 23:10:48.987502 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a29810cf-fd6b-4021-8ae5-52612fb63cfc-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-p4zx2\" (UID: \"a29810cf-fd6b-4021-8ae5-52612fb63cfc\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-p4zx2" Dec 02 23:10:48 crc kubenswrapper[4696]: I1202 23:10:48.987652 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76khl\" (UniqueName: \"kubernetes.io/projected/a29810cf-fd6b-4021-8ae5-52612fb63cfc-kube-api-access-76khl\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-p4zx2\" (UID: \"a29810cf-fd6b-4021-8ae5-52612fb63cfc\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-p4zx2" Dec 02 23:10:48 crc kubenswrapper[4696]: I1202 23:10:48.992953 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a29810cf-fd6b-4021-8ae5-52612fb63cfc-inventory\") pod 
\"download-cache-edpm-deployment-openstack-edpm-ipam-p4zx2\" (UID: \"a29810cf-fd6b-4021-8ae5-52612fb63cfc\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-p4zx2" Dec 02 23:10:48 crc kubenswrapper[4696]: I1202 23:10:48.995500 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a29810cf-fd6b-4021-8ae5-52612fb63cfc-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-p4zx2\" (UID: \"a29810cf-fd6b-4021-8ae5-52612fb63cfc\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-p4zx2" Dec 02 23:10:49 crc kubenswrapper[4696]: I1202 23:10:49.019544 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76khl\" (UniqueName: \"kubernetes.io/projected/a29810cf-fd6b-4021-8ae5-52612fb63cfc-kube-api-access-76khl\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-p4zx2\" (UID: \"a29810cf-fd6b-4021-8ae5-52612fb63cfc\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-p4zx2" Dec 02 23:10:49 crc kubenswrapper[4696]: I1202 23:10:49.118728 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-p4zx2" Dec 02 23:10:49 crc kubenswrapper[4696]: W1202 23:10:49.739038 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda29810cf_fd6b_4021_8ae5_52612fb63cfc.slice/crio-c838f4a885beb9ad5ced680407984d70ae4d81e3b9d9f08e9b76fb9ef8a1c216 WatchSource:0}: Error finding container c838f4a885beb9ad5ced680407984d70ae4d81e3b9d9f08e9b76fb9ef8a1c216: Status 404 returned error can't find the container with id c838f4a885beb9ad5ced680407984d70ae4d81e3b9d9f08e9b76fb9ef8a1c216 Dec 02 23:10:49 crc kubenswrapper[4696]: I1202 23:10:49.739110 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-p4zx2"] Dec 02 23:10:50 crc kubenswrapper[4696]: I1202 23:10:50.666503 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-p4zx2" event={"ID":"a29810cf-fd6b-4021-8ae5-52612fb63cfc","Type":"ContainerStarted","Data":"813f181b6522db642f23804b6bd748eba8920f7dc700ddeaf107882ca9e18c15"} Dec 02 23:10:50 crc kubenswrapper[4696]: I1202 23:10:50.667023 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-p4zx2" event={"ID":"a29810cf-fd6b-4021-8ae5-52612fb63cfc","Type":"ContainerStarted","Data":"c838f4a885beb9ad5ced680407984d70ae4d81e3b9d9f08e9b76fb9ef8a1c216"} Dec 02 23:10:50 crc kubenswrapper[4696]: I1202 23:10:50.692703 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-p4zx2" podStartSLOduration=2.200102243 podStartE2EDuration="2.69267814s" podCreationTimestamp="2025-12-02 23:10:48 +0000 UTC" firstStartedPulling="2025-12-02 23:10:49.742329576 +0000 UTC m=+1712.623009577" lastFinishedPulling="2025-12-02 23:10:50.234905473 +0000 UTC 
m=+1713.115585474" observedRunningTime="2025-12-02 23:10:50.684643094 +0000 UTC m=+1713.565323095" watchObservedRunningTime="2025-12-02 23:10:50.69267814 +0000 UTC m=+1713.573358161" Dec 02 23:10:59 crc kubenswrapper[4696]: I1202 23:10:59.431823 4696 scope.go:117] "RemoveContainer" containerID="631e32e77a29a8c8bd80f1cb8c03794b8f094b77b9ba9fa687732346c1ca36df" Dec 02 23:10:59 crc kubenswrapper[4696]: E1202 23:10:59.436140 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 02 23:11:12 crc kubenswrapper[4696]: I1202 23:11:12.432384 4696 scope.go:117] "RemoveContainer" containerID="631e32e77a29a8c8bd80f1cb8c03794b8f094b77b9ba9fa687732346c1ca36df" Dec 02 23:11:12 crc kubenswrapper[4696]: E1202 23:11:12.433474 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 02 23:11:13 crc kubenswrapper[4696]: I1202 23:11:13.058320 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-bdhm9"] Dec 02 23:11:13 crc kubenswrapper[4696]: I1202 23:11:13.078141 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-d886-account-create-update-6dn5p"] Dec 02 23:11:13 crc kubenswrapper[4696]: I1202 23:11:13.100802 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/neutron-db-create-47n98"] Dec 02 23:11:13 crc kubenswrapper[4696]: I1202 23:11:13.111720 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-5eb9-account-create-update-4prpl"] Dec 02 23:11:13 crc kubenswrapper[4696]: I1202 23:11:13.121428 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-vjwbb"] Dec 02 23:11:13 crc kubenswrapper[4696]: I1202 23:11:13.130076 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-b4df-account-create-update-hxxpx"] Dec 02 23:11:13 crc kubenswrapper[4696]: I1202 23:11:13.138472 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-bdhm9"] Dec 02 23:11:13 crc kubenswrapper[4696]: I1202 23:11:13.148568 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-5eb9-account-create-update-4prpl"] Dec 02 23:11:13 crc kubenswrapper[4696]: I1202 23:11:13.157687 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-47n98"] Dec 02 23:11:13 crc kubenswrapper[4696]: I1202 23:11:13.166648 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-d886-account-create-update-6dn5p"] Dec 02 23:11:13 crc kubenswrapper[4696]: I1202 23:11:13.174822 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-vjwbb"] Dec 02 23:11:13 crc kubenswrapper[4696]: I1202 23:11:13.184225 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-b4df-account-create-update-hxxpx"] Dec 02 23:11:13 crc kubenswrapper[4696]: I1202 23:11:13.449318 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3149957b-27a4-43d6-af45-b585270f4d47" path="/var/lib/kubelet/pods/3149957b-27a4-43d6-af45-b585270f4d47/volumes" Dec 02 23:11:13 crc kubenswrapper[4696]: I1202 23:11:13.450555 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4707bfe2-6206-4fac-b146-f95317884325" 
path="/var/lib/kubelet/pods/4707bfe2-6206-4fac-b146-f95317884325/volumes" Dec 02 23:11:13 crc kubenswrapper[4696]: I1202 23:11:13.451197 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a69316d7-55c1-4852-b509-2b3b995fae3a" path="/var/lib/kubelet/pods/a69316d7-55c1-4852-b509-2b3b995fae3a/volumes" Dec 02 23:11:13 crc kubenswrapper[4696]: I1202 23:11:13.451894 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa1578e8-97dc-4536-bfda-39825192b676" path="/var/lib/kubelet/pods/aa1578e8-97dc-4536-bfda-39825192b676/volumes" Dec 02 23:11:13 crc kubenswrapper[4696]: I1202 23:11:13.453798 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf1ead2b-f60b-444f-96cd-a9d6cbf89919" path="/var/lib/kubelet/pods/bf1ead2b-f60b-444f-96cd-a9d6cbf89919/volumes" Dec 02 23:11:13 crc kubenswrapper[4696]: I1202 23:11:13.454374 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5e60ece-976f-4c14-a4cc-321e29bd5826" path="/var/lib/kubelet/pods/f5e60ece-976f-4c14-a4cc-321e29bd5826/volumes" Dec 02 23:11:26 crc kubenswrapper[4696]: I1202 23:11:26.432508 4696 scope.go:117] "RemoveContainer" containerID="631e32e77a29a8c8bd80f1cb8c03794b8f094b77b9ba9fa687732346c1ca36df" Dec 02 23:11:26 crc kubenswrapper[4696]: E1202 23:11:26.433800 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 02 23:11:34 crc kubenswrapper[4696]: I1202 23:11:34.367818 4696 scope.go:117] "RemoveContainer" containerID="ebf5af6ff39872020da9e7c193e0c2efaf73efcd61c553070fe0bc08a934d16b" Dec 02 23:11:34 crc kubenswrapper[4696]: I1202 23:11:34.402658 4696 
scope.go:117] "RemoveContainer" containerID="3cd406fc581ece9308df691447996117d1226112d3f2f59cc61fee0f48b02e4c" Dec 02 23:11:34 crc kubenswrapper[4696]: I1202 23:11:34.470715 4696 scope.go:117] "RemoveContainer" containerID="5d968d24d4c94f5fc8459b593c9d34e74c7800aaae5cf2dc69e5276a23a2bf02" Dec 02 23:11:34 crc kubenswrapper[4696]: I1202 23:11:34.527818 4696 scope.go:117] "RemoveContainer" containerID="513fe9cb0db20d1e0524d1e76bbec17f611f4f100bad06222a375baccd4a89a9" Dec 02 23:11:34 crc kubenswrapper[4696]: I1202 23:11:34.580615 4696 scope.go:117] "RemoveContainer" containerID="3c522f0b519aed36cf98228fddf6a68e6aefc87af07774c130352064deda1501" Dec 02 23:11:34 crc kubenswrapper[4696]: I1202 23:11:34.614753 4696 scope.go:117] "RemoveContainer" containerID="81299c52c8dd2af04d1d83d48a41374885f052d75ba47c70ff000165f3667faa" Dec 02 23:11:34 crc kubenswrapper[4696]: I1202 23:11:34.671105 4696 scope.go:117] "RemoveContainer" containerID="14e2ff1d587d5dc37844edb204090172d212f3c8e987fb3512207512ba097cec" Dec 02 23:11:34 crc kubenswrapper[4696]: I1202 23:11:34.715475 4696 scope.go:117] "RemoveContainer" containerID="0af465ddc5fbe6eb9804f2509398ddec0355946a8e8d985ee9e5354e98d5b45c" Dec 02 23:11:34 crc kubenswrapper[4696]: I1202 23:11:34.740182 4696 scope.go:117] "RemoveContainer" containerID="b93a710a188056df2d6ab5a27117dab4d4cc535bd5a6b7f6f4f539ebc679d19d" Dec 02 23:11:34 crc kubenswrapper[4696]: I1202 23:11:34.770691 4696 scope.go:117] "RemoveContainer" containerID="30188643bb3f44c82ffd7efc35c1a01bcabd1efb5b658fe3c26e9cf176a7bd65" Dec 02 23:11:34 crc kubenswrapper[4696]: I1202 23:11:34.798274 4696 scope.go:117] "RemoveContainer" containerID="ea9374afdfb098bdf1d68509a2314eb08d0a57a55c6cb0fc98f5d0e9fc9481a4" Dec 02 23:11:34 crc kubenswrapper[4696]: I1202 23:11:34.822614 4696 scope.go:117] "RemoveContainer" containerID="5c987e2e45ceb6a881b3f8609fee075f72d0b13986a6ae7e0939be0ea398335e" Dec 02 23:11:34 crc kubenswrapper[4696]: I1202 23:11:34.849382 4696 scope.go:117] 
"RemoveContainer" containerID="be738cddc4863865924a2c01bb3ac8808315f23c435e60c56a08f046aa456f6b" Dec 02 23:11:34 crc kubenswrapper[4696]: I1202 23:11:34.878831 4696 scope.go:117] "RemoveContainer" containerID="f55bc4e52aefdaa21fc0f2e2c2ef3eb4bab6add99c1f315980e31ef8f2579758" Dec 02 23:11:34 crc kubenswrapper[4696]: I1202 23:11:34.921858 4696 scope.go:117] "RemoveContainer" containerID="002bf9c227b9a1484115b2d2342e45d25be7786dbfdec1c37bb3b0f260de7e2d" Dec 02 23:11:34 crc kubenswrapper[4696]: I1202 23:11:34.951177 4696 scope.go:117] "RemoveContainer" containerID="baf5cc6e3eed3ed647e0c95eb45b61ad1d29ec0d54fd4a87984a96e398a000b2" Dec 02 23:11:38 crc kubenswrapper[4696]: I1202 23:11:38.432059 4696 scope.go:117] "RemoveContainer" containerID="631e32e77a29a8c8bd80f1cb8c03794b8f094b77b9ba9fa687732346c1ca36df" Dec 02 23:11:38 crc kubenswrapper[4696]: E1202 23:11:38.433000 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 02 23:11:44 crc kubenswrapper[4696]: I1202 23:11:44.064381 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-lgs5l"] Dec 02 23:11:44 crc kubenswrapper[4696]: I1202 23:11:44.076312 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-lgs5l"] Dec 02 23:11:45 crc kubenswrapper[4696]: I1202 23:11:45.451425 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8842e89-ec2d-4601-9e82-b12c1982a910" path="/var/lib/kubelet/pods/a8842e89-ec2d-4601-9e82-b12c1982a910/volumes" Dec 02 23:11:50 crc kubenswrapper[4696]: I1202 23:11:50.073818 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/watcher-db-sync-vfql6"] Dec 02 23:11:50 crc kubenswrapper[4696]: I1202 23:11:50.110184 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-db-sync-vfql6"] Dec 02 23:11:51 crc kubenswrapper[4696]: I1202 23:11:51.041178 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-hf7vx"] Dec 02 23:11:51 crc kubenswrapper[4696]: I1202 23:11:51.051629 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-hf7vx"] Dec 02 23:11:51 crc kubenswrapper[4696]: I1202 23:11:51.450200 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fa922b0-7645-420b-bd91-ac4f5040d61b" path="/var/lib/kubelet/pods/5fa922b0-7645-420b-bd91-ac4f5040d61b/volumes" Dec 02 23:11:51 crc kubenswrapper[4696]: I1202 23:11:51.451670 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="709610e7-e445-4b43-9941-7c08653d3278" path="/var/lib/kubelet/pods/709610e7-e445-4b43-9941-7c08653d3278/volumes" Dec 02 23:11:53 crc kubenswrapper[4696]: I1202 23:11:53.432150 4696 scope.go:117] "RemoveContainer" containerID="631e32e77a29a8c8bd80f1cb8c03794b8f094b77b9ba9fa687732346c1ca36df" Dec 02 23:11:53 crc kubenswrapper[4696]: E1202 23:11:53.432898 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 02 23:12:04 crc kubenswrapper[4696]: I1202 23:12:04.431987 4696 scope.go:117] "RemoveContainer" containerID="631e32e77a29a8c8bd80f1cb8c03794b8f094b77b9ba9fa687732346c1ca36df" Dec 02 23:12:04 crc kubenswrapper[4696]: E1202 23:12:04.433396 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 02 23:12:18 crc kubenswrapper[4696]: I1202 23:12:18.432480 4696 scope.go:117] "RemoveContainer" containerID="631e32e77a29a8c8bd80f1cb8c03794b8f094b77b9ba9fa687732346c1ca36df" Dec 02 23:12:18 crc kubenswrapper[4696]: E1202 23:12:18.433374 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 02 23:12:31 crc kubenswrapper[4696]: I1202 23:12:31.431728 4696 scope.go:117] "RemoveContainer" containerID="631e32e77a29a8c8bd80f1cb8c03794b8f094b77b9ba9fa687732346c1ca36df" Dec 02 23:12:31 crc kubenswrapper[4696]: E1202 23:12:31.432759 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 02 23:12:35 crc kubenswrapper[4696]: I1202 23:12:35.344008 4696 scope.go:117] "RemoveContainer" containerID="8d2f68f1b9bdf446cd938c5fe7b0e9ec244490eec567e2ea2327b48515be07ba" Dec 02 23:12:35 crc kubenswrapper[4696]: I1202 23:12:35.403687 4696 scope.go:117] "RemoveContainer" 
containerID="e1686d19d55aa5c80b38161546b19b9bf89beb34c52c6a76f3fec22e31517d9c" Dec 02 23:12:35 crc kubenswrapper[4696]: I1202 23:12:35.471783 4696 scope.go:117] "RemoveContainer" containerID="2d7b553213b5397bc12195eef6f7f892e57a46158e9a38414a93fcc7866cb617" Dec 02 23:12:37 crc kubenswrapper[4696]: I1202 23:12:37.998121 4696 generic.go:334] "Generic (PLEG): container finished" podID="a29810cf-fd6b-4021-8ae5-52612fb63cfc" containerID="813f181b6522db642f23804b6bd748eba8920f7dc700ddeaf107882ca9e18c15" exitCode=0 Dec 02 23:12:37 crc kubenswrapper[4696]: I1202 23:12:37.998266 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-p4zx2" event={"ID":"a29810cf-fd6b-4021-8ae5-52612fb63cfc","Type":"ContainerDied","Data":"813f181b6522db642f23804b6bd748eba8920f7dc700ddeaf107882ca9e18c15"} Dec 02 23:12:39 crc kubenswrapper[4696]: I1202 23:12:39.057455 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-nml6c"] Dec 02 23:12:39 crc kubenswrapper[4696]: I1202 23:12:39.067212 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-nml6c"] Dec 02 23:12:39 crc kubenswrapper[4696]: I1202 23:12:39.427150 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-p4zx2" Dec 02 23:12:39 crc kubenswrapper[4696]: I1202 23:12:39.448704 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="598ec97e-43b1-4d80-866e-4106d1622140" path="/var/lib/kubelet/pods/598ec97e-43b1-4d80-866e-4106d1622140/volumes" Dec 02 23:12:39 crc kubenswrapper[4696]: I1202 23:12:39.472109 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-76khl\" (UniqueName: \"kubernetes.io/projected/a29810cf-fd6b-4021-8ae5-52612fb63cfc-kube-api-access-76khl\") pod \"a29810cf-fd6b-4021-8ae5-52612fb63cfc\" (UID: \"a29810cf-fd6b-4021-8ae5-52612fb63cfc\") " Dec 02 23:12:39 crc kubenswrapper[4696]: I1202 23:12:39.472407 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a29810cf-fd6b-4021-8ae5-52612fb63cfc-ssh-key\") pod \"a29810cf-fd6b-4021-8ae5-52612fb63cfc\" (UID: \"a29810cf-fd6b-4021-8ae5-52612fb63cfc\") " Dec 02 23:12:39 crc kubenswrapper[4696]: I1202 23:12:39.472527 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a29810cf-fd6b-4021-8ae5-52612fb63cfc-inventory\") pod \"a29810cf-fd6b-4021-8ae5-52612fb63cfc\" (UID: \"a29810cf-fd6b-4021-8ae5-52612fb63cfc\") " Dec 02 23:12:39 crc kubenswrapper[4696]: I1202 23:12:39.481393 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a29810cf-fd6b-4021-8ae5-52612fb63cfc-kube-api-access-76khl" (OuterVolumeSpecName: "kube-api-access-76khl") pod "a29810cf-fd6b-4021-8ae5-52612fb63cfc" (UID: "a29810cf-fd6b-4021-8ae5-52612fb63cfc"). InnerVolumeSpecName "kube-api-access-76khl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:12:39 crc kubenswrapper[4696]: I1202 23:12:39.508990 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a29810cf-fd6b-4021-8ae5-52612fb63cfc-inventory" (OuterVolumeSpecName: "inventory") pod "a29810cf-fd6b-4021-8ae5-52612fb63cfc" (UID: "a29810cf-fd6b-4021-8ae5-52612fb63cfc"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:12:39 crc kubenswrapper[4696]: I1202 23:12:39.516769 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a29810cf-fd6b-4021-8ae5-52612fb63cfc-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "a29810cf-fd6b-4021-8ae5-52612fb63cfc" (UID: "a29810cf-fd6b-4021-8ae5-52612fb63cfc"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:12:39 crc kubenswrapper[4696]: I1202 23:12:39.576335 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-76khl\" (UniqueName: \"kubernetes.io/projected/a29810cf-fd6b-4021-8ae5-52612fb63cfc-kube-api-access-76khl\") on node \"crc\" DevicePath \"\"" Dec 02 23:12:39 crc kubenswrapper[4696]: I1202 23:12:39.576392 4696 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a29810cf-fd6b-4021-8ae5-52612fb63cfc-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 23:12:39 crc kubenswrapper[4696]: I1202 23:12:39.576402 4696 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a29810cf-fd6b-4021-8ae5-52612fb63cfc-inventory\") on node \"crc\" DevicePath \"\"" Dec 02 23:12:40 crc kubenswrapper[4696]: I1202 23:12:40.025009 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-p4zx2" 
event={"ID":"a29810cf-fd6b-4021-8ae5-52612fb63cfc","Type":"ContainerDied","Data":"c838f4a885beb9ad5ced680407984d70ae4d81e3b9d9f08e9b76fb9ef8a1c216"} Dec 02 23:12:40 crc kubenswrapper[4696]: I1202 23:12:40.025065 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c838f4a885beb9ad5ced680407984d70ae4d81e3b9d9f08e9b76fb9ef8a1c216" Dec 02 23:12:40 crc kubenswrapper[4696]: I1202 23:12:40.025784 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-p4zx2" Dec 02 23:12:40 crc kubenswrapper[4696]: I1202 23:12:40.183939 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vlcsp"] Dec 02 23:12:40 crc kubenswrapper[4696]: E1202 23:12:40.186881 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a29810cf-fd6b-4021-8ae5-52612fb63cfc" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Dec 02 23:12:40 crc kubenswrapper[4696]: I1202 23:12:40.186908 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="a29810cf-fd6b-4021-8ae5-52612fb63cfc" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Dec 02 23:12:40 crc kubenswrapper[4696]: I1202 23:12:40.187198 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="a29810cf-fd6b-4021-8ae5-52612fb63cfc" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Dec 02 23:12:40 crc kubenswrapper[4696]: I1202 23:12:40.188164 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vlcsp" Dec 02 23:12:40 crc kubenswrapper[4696]: I1202 23:12:40.195874 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 02 23:12:40 crc kubenswrapper[4696]: I1202 23:12:40.195997 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-cvzmc" Dec 02 23:12:40 crc kubenswrapper[4696]: I1202 23:12:40.196233 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 23:12:40 crc kubenswrapper[4696]: I1202 23:12:40.196348 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 02 23:12:40 crc kubenswrapper[4696]: I1202 23:12:40.218753 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vlcsp"] Dec 02 23:12:40 crc kubenswrapper[4696]: I1202 23:12:40.304154 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dnqj\" (UniqueName: \"kubernetes.io/projected/38969cda-9a4e-4bf7-bbc0-2a6a1849bf2e-kube-api-access-4dnqj\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-vlcsp\" (UID: \"38969cda-9a4e-4bf7-bbc0-2a6a1849bf2e\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vlcsp" Dec 02 23:12:40 crc kubenswrapper[4696]: I1202 23:12:40.304237 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/38969cda-9a4e-4bf7-bbc0-2a6a1849bf2e-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-vlcsp\" (UID: \"38969cda-9a4e-4bf7-bbc0-2a6a1849bf2e\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vlcsp" Dec 02 23:12:40 crc kubenswrapper[4696]: I1202 
23:12:40.304326 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/38969cda-9a4e-4bf7-bbc0-2a6a1849bf2e-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-vlcsp\" (UID: \"38969cda-9a4e-4bf7-bbc0-2a6a1849bf2e\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vlcsp" Dec 02 23:12:40 crc kubenswrapper[4696]: I1202 23:12:40.406078 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/38969cda-9a4e-4bf7-bbc0-2a6a1849bf2e-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-vlcsp\" (UID: \"38969cda-9a4e-4bf7-bbc0-2a6a1849bf2e\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vlcsp" Dec 02 23:12:40 crc kubenswrapper[4696]: I1202 23:12:40.406232 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/38969cda-9a4e-4bf7-bbc0-2a6a1849bf2e-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-vlcsp\" (UID: \"38969cda-9a4e-4bf7-bbc0-2a6a1849bf2e\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vlcsp" Dec 02 23:12:40 crc kubenswrapper[4696]: I1202 23:12:40.406301 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dnqj\" (UniqueName: \"kubernetes.io/projected/38969cda-9a4e-4bf7-bbc0-2a6a1849bf2e-kube-api-access-4dnqj\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-vlcsp\" (UID: \"38969cda-9a4e-4bf7-bbc0-2a6a1849bf2e\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vlcsp" Dec 02 23:12:40 crc kubenswrapper[4696]: I1202 23:12:40.413891 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/38969cda-9a4e-4bf7-bbc0-2a6a1849bf2e-inventory\") pod 
\"configure-network-edpm-deployment-openstack-edpm-ipam-vlcsp\" (UID: \"38969cda-9a4e-4bf7-bbc0-2a6a1849bf2e\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vlcsp" Dec 02 23:12:40 crc kubenswrapper[4696]: I1202 23:12:40.414059 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/38969cda-9a4e-4bf7-bbc0-2a6a1849bf2e-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-vlcsp\" (UID: \"38969cda-9a4e-4bf7-bbc0-2a6a1849bf2e\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vlcsp" Dec 02 23:12:40 crc kubenswrapper[4696]: I1202 23:12:40.425518 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dnqj\" (UniqueName: \"kubernetes.io/projected/38969cda-9a4e-4bf7-bbc0-2a6a1849bf2e-kube-api-access-4dnqj\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-vlcsp\" (UID: \"38969cda-9a4e-4bf7-bbc0-2a6a1849bf2e\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vlcsp" Dec 02 23:12:40 crc kubenswrapper[4696]: I1202 23:12:40.554037 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vlcsp" Dec 02 23:12:41 crc kubenswrapper[4696]: I1202 23:12:41.186002 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vlcsp"] Dec 02 23:12:42 crc kubenswrapper[4696]: I1202 23:12:42.053505 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vlcsp" event={"ID":"38969cda-9a4e-4bf7-bbc0-2a6a1849bf2e","Type":"ContainerStarted","Data":"54aaaaa3461ec66402ae661ee9230bce5c516f39e4b9df371c5f6d4d55e228b3"} Dec 02 23:12:42 crc kubenswrapper[4696]: I1202 23:12:42.054088 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vlcsp" event={"ID":"38969cda-9a4e-4bf7-bbc0-2a6a1849bf2e","Type":"ContainerStarted","Data":"ed40013b6c45f9ae5d3ccfb9b32153f9b6e45e12b9ffd4b1b318e65243b83d0a"} Dec 02 23:12:42 crc kubenswrapper[4696]: I1202 23:12:42.076466 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vlcsp" podStartSLOduration=1.5358827860000002 podStartE2EDuration="2.076442349s" podCreationTimestamp="2025-12-02 23:12:40 +0000 UTC" firstStartedPulling="2025-12-02 23:12:41.189202955 +0000 UTC m=+1824.069882966" lastFinishedPulling="2025-12-02 23:12:41.729762528 +0000 UTC m=+1824.610442529" observedRunningTime="2025-12-02 23:12:42.07578153 +0000 UTC m=+1824.956461571" watchObservedRunningTime="2025-12-02 23:12:42.076442349 +0000 UTC m=+1824.957122360" Dec 02 23:12:43 crc kubenswrapper[4696]: I1202 23:12:43.432083 4696 scope.go:117] "RemoveContainer" containerID="631e32e77a29a8c8bd80f1cb8c03794b8f094b77b9ba9fa687732346c1ca36df" Dec 02 23:12:43 crc kubenswrapper[4696]: E1202 23:12:43.432624 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 02 23:12:46 crc kubenswrapper[4696]: I1202 23:12:46.060640 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-fnjtl"] Dec 02 23:12:46 crc kubenswrapper[4696]: I1202 23:12:46.074566 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-fnjtl"] Dec 02 23:12:47 crc kubenswrapper[4696]: I1202 23:12:47.454409 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f066d064-95ba-42b3-ba9f-5e859533c93c" path="/var/lib/kubelet/pods/f066d064-95ba-42b3-ba9f-5e859533c93c/volumes" Dec 02 23:12:50 crc kubenswrapper[4696]: I1202 23:12:50.037532 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-9cwtk"] Dec 02 23:12:50 crc kubenswrapper[4696]: I1202 23:12:50.047824 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-9cwtk"] Dec 02 23:12:51 crc kubenswrapper[4696]: I1202 23:12:51.447784 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1f5ea7d-03ba-43ab-8863-9547b016bb0a" path="/var/lib/kubelet/pods/d1f5ea7d-03ba-43ab-8863-9547b016bb0a/volumes" Dec 02 23:12:55 crc kubenswrapper[4696]: I1202 23:12:55.433729 4696 scope.go:117] "RemoveContainer" containerID="631e32e77a29a8c8bd80f1cb8c03794b8f094b77b9ba9fa687732346c1ca36df" Dec 02 23:12:55 crc kubenswrapper[4696]: E1202 23:12:55.437042 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 02 23:13:00 crc kubenswrapper[4696]: I1202 23:13:00.038859 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-hnh2k"] Dec 02 23:13:00 crc kubenswrapper[4696]: I1202 23:13:00.048867 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-hnh2k"] Dec 02 23:13:01 crc kubenswrapper[4696]: I1202 23:13:01.449681 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="359eca54-19ad-4e8d-b580-29a37d8f38c8" path="/var/lib/kubelet/pods/359eca54-19ad-4e8d-b580-29a37d8f38c8/volumes" Dec 02 23:13:04 crc kubenswrapper[4696]: I1202 23:13:04.036839 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-b2zfs"] Dec 02 23:13:04 crc kubenswrapper[4696]: I1202 23:13:04.045409 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-b2zfs"] Dec 02 23:13:05 crc kubenswrapper[4696]: I1202 23:13:05.444182 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1780f21-2d95-4fdb-9a7c-c2d5b7a9b143" path="/var/lib/kubelet/pods/b1780f21-2d95-4fdb-9a7c-c2d5b7a9b143/volumes" Dec 02 23:13:09 crc kubenswrapper[4696]: I1202 23:13:09.435105 4696 scope.go:117] "RemoveContainer" containerID="631e32e77a29a8c8bd80f1cb8c03794b8f094b77b9ba9fa687732346c1ca36df" Dec 02 23:13:09 crc kubenswrapper[4696]: E1202 23:13:09.436313 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" 
podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 02 23:13:21 crc kubenswrapper[4696]: I1202 23:13:21.432825 4696 scope.go:117] "RemoveContainer" containerID="631e32e77a29a8c8bd80f1cb8c03794b8f094b77b9ba9fa687732346c1ca36df" Dec 02 23:13:21 crc kubenswrapper[4696]: E1202 23:13:21.436086 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 02 23:13:34 crc kubenswrapper[4696]: I1202 23:13:34.056928 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-eff7-account-create-update-7ks4n"] Dec 02 23:13:34 crc kubenswrapper[4696]: I1202 23:13:34.068126 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-4f32-account-create-update-fg4wn"] Dec 02 23:13:34 crc kubenswrapper[4696]: I1202 23:13:34.079164 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-svgf5"] Dec 02 23:13:34 crc kubenswrapper[4696]: I1202 23:13:34.085824 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-111f-account-create-update-j4mlh"] Dec 02 23:13:34 crc kubenswrapper[4696]: I1202 23:13:34.098441 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-eff7-account-create-update-7ks4n"] Dec 02 23:13:34 crc kubenswrapper[4696]: I1202 23:13:34.106775 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-tkf2x"] Dec 02 23:13:34 crc kubenswrapper[4696]: I1202 23:13:34.115864 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-svgf5"] Dec 02 23:13:34 crc kubenswrapper[4696]: I1202 23:13:34.127360 4696 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-4f32-account-create-update-fg4wn"] Dec 02 23:13:34 crc kubenswrapper[4696]: I1202 23:13:34.139174 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-tkf2x"] Dec 02 23:13:34 crc kubenswrapper[4696]: I1202 23:13:34.153122 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-tc9xp"] Dec 02 23:13:34 crc kubenswrapper[4696]: I1202 23:13:34.167522 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-111f-account-create-update-j4mlh"] Dec 02 23:13:34 crc kubenswrapper[4696]: I1202 23:13:34.177197 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-tc9xp"] Dec 02 23:13:35 crc kubenswrapper[4696]: I1202 23:13:35.432401 4696 scope.go:117] "RemoveContainer" containerID="631e32e77a29a8c8bd80f1cb8c03794b8f094b77b9ba9fa687732346c1ca36df" Dec 02 23:13:35 crc kubenswrapper[4696]: E1202 23:13:35.433376 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 02 23:13:35 crc kubenswrapper[4696]: I1202 23:13:35.449207 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1172a4a0-f1c5-49f2-b91a-e691b431c471" path="/var/lib/kubelet/pods/1172a4a0-f1c5-49f2-b91a-e691b431c471/volumes" Dec 02 23:13:35 crc kubenswrapper[4696]: I1202 23:13:35.450137 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d3c7135-a850-4cd0-b5b6-2561b75cd09b" path="/var/lib/kubelet/pods/5d3c7135-a850-4cd0-b5b6-2561b75cd09b/volumes" Dec 02 23:13:35 crc kubenswrapper[4696]: I1202 
23:13:35.450943 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f4f26b7-e810-4c1e-98b1-57fc3b417e60" path="/var/lib/kubelet/pods/9f4f26b7-e810-4c1e-98b1-57fc3b417e60/volumes" Dec 02 23:13:35 crc kubenswrapper[4696]: I1202 23:13:35.451717 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4756827-862e-446f-a149-b3a541a656b5" path="/var/lib/kubelet/pods/a4756827-862e-446f-a149-b3a541a656b5/volumes" Dec 02 23:13:35 crc kubenswrapper[4696]: I1202 23:13:35.453253 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9d71614-bd9c-4b09-b813-d9d6f01fdc92" path="/var/lib/kubelet/pods/b9d71614-bd9c-4b09-b813-d9d6f01fdc92/volumes" Dec 02 23:13:35 crc kubenswrapper[4696]: I1202 23:13:35.454025 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec5ac1a2-65d0-4222-8e6e-cce2d0fe6d5a" path="/var/lib/kubelet/pods/ec5ac1a2-65d0-4222-8e6e-cce2d0fe6d5a/volumes" Dec 02 23:13:35 crc kubenswrapper[4696]: I1202 23:13:35.614955 4696 scope.go:117] "RemoveContainer" containerID="171777d49f633d478ff4c9cd39f56f6babed8df8c66ecb007794fd2897eff333" Dec 02 23:13:35 crc kubenswrapper[4696]: I1202 23:13:35.672581 4696 scope.go:117] "RemoveContainer" containerID="92ccdcd4cebba044da3501499bb42eb368c1a67fbdbca8e88c12c3bdc628b413" Dec 02 23:13:35 crc kubenswrapper[4696]: I1202 23:13:35.707164 4696 scope.go:117] "RemoveContainer" containerID="ad01d3112109f96a6d9cd82bace3c2485a8f8a27bf776427a1996a19ea50e8ee" Dec 02 23:13:35 crc kubenswrapper[4696]: I1202 23:13:35.779787 4696 scope.go:117] "RemoveContainer" containerID="17296d467a9aa446ef6f6eb8a02686b348027b45542b32c459fae79438f20459" Dec 02 23:13:35 crc kubenswrapper[4696]: I1202 23:13:35.827692 4696 scope.go:117] "RemoveContainer" containerID="4ca47c290ac0c7eaea73a12196ebd3a5d3997faa9cdc63ad37d6e2326b0492a7" Dec 02 23:13:35 crc kubenswrapper[4696]: I1202 23:13:35.875306 4696 scope.go:117] "RemoveContainer" 
containerID="dffc07df9b78ef87e3c8ee7f5071ef38e9dd2d9b298f7b929cd9b62013f4afab" Dec 02 23:13:35 crc kubenswrapper[4696]: I1202 23:13:35.937267 4696 scope.go:117] "RemoveContainer" containerID="2466eab4b8fe1cd3b46530dc9514800eb7a59e245135327216bd1f748df79e1d" Dec 02 23:13:35 crc kubenswrapper[4696]: I1202 23:13:35.961262 4696 scope.go:117] "RemoveContainer" containerID="13b6ba914c44051cdda92cf0f314254cff50d93b66d42259c8ace70da7096401" Dec 02 23:13:36 crc kubenswrapper[4696]: I1202 23:13:36.002902 4696 scope.go:117] "RemoveContainer" containerID="35ee60b6b7516dbeb7b9afc433010c0563b80e80aa7e5ad5dfe21a670e0e4290" Dec 02 23:13:36 crc kubenswrapper[4696]: I1202 23:13:36.056337 4696 scope.go:117] "RemoveContainer" containerID="8a7494df1d3e76227818c7a36e371393352e7e15f56ad38eb20d721912e42181" Dec 02 23:13:36 crc kubenswrapper[4696]: I1202 23:13:36.097843 4696 scope.go:117] "RemoveContainer" containerID="0f20a6b2e9ef1d274da099a3b52a5fee338af3dcd01d0138a163ada2bde5cfe0" Dec 02 23:13:49 crc kubenswrapper[4696]: I1202 23:13:49.434446 4696 scope.go:117] "RemoveContainer" containerID="631e32e77a29a8c8bd80f1cb8c03794b8f094b77b9ba9fa687732346c1ca36df" Dec 02 23:13:49 crc kubenswrapper[4696]: E1202 23:13:49.436860 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 02 23:14:01 crc kubenswrapper[4696]: I1202 23:14:01.432083 4696 scope.go:117] "RemoveContainer" containerID="631e32e77a29a8c8bd80f1cb8c03794b8f094b77b9ba9fa687732346c1ca36df" Dec 02 23:14:01 crc kubenswrapper[4696]: E1202 23:14:01.433354 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 02 23:14:03 crc kubenswrapper[4696]: I1202 23:14:03.107367 4696 generic.go:334] "Generic (PLEG): container finished" podID="38969cda-9a4e-4bf7-bbc0-2a6a1849bf2e" containerID="54aaaaa3461ec66402ae661ee9230bce5c516f39e4b9df371c5f6d4d55e228b3" exitCode=0 Dec 02 23:14:03 crc kubenswrapper[4696]: I1202 23:14:03.107418 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vlcsp" event={"ID":"38969cda-9a4e-4bf7-bbc0-2a6a1849bf2e","Type":"ContainerDied","Data":"54aaaaa3461ec66402ae661ee9230bce5c516f39e4b9df371c5f6d4d55e228b3"} Dec 02 23:14:04 crc kubenswrapper[4696]: I1202 23:14:04.051658 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-89vr8"] Dec 02 23:14:04 crc kubenswrapper[4696]: I1202 23:14:04.070772 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-89vr8"] Dec 02 23:14:04 crc kubenswrapper[4696]: I1202 23:14:04.682586 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vlcsp" Dec 02 23:14:04 crc kubenswrapper[4696]: I1202 23:14:04.865369 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4dnqj\" (UniqueName: \"kubernetes.io/projected/38969cda-9a4e-4bf7-bbc0-2a6a1849bf2e-kube-api-access-4dnqj\") pod \"38969cda-9a4e-4bf7-bbc0-2a6a1849bf2e\" (UID: \"38969cda-9a4e-4bf7-bbc0-2a6a1849bf2e\") " Dec 02 23:14:04 crc kubenswrapper[4696]: I1202 23:14:04.865759 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/38969cda-9a4e-4bf7-bbc0-2a6a1849bf2e-inventory\") pod \"38969cda-9a4e-4bf7-bbc0-2a6a1849bf2e\" (UID: \"38969cda-9a4e-4bf7-bbc0-2a6a1849bf2e\") " Dec 02 23:14:04 crc kubenswrapper[4696]: I1202 23:14:04.865804 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/38969cda-9a4e-4bf7-bbc0-2a6a1849bf2e-ssh-key\") pod \"38969cda-9a4e-4bf7-bbc0-2a6a1849bf2e\" (UID: \"38969cda-9a4e-4bf7-bbc0-2a6a1849bf2e\") " Dec 02 23:14:04 crc kubenswrapper[4696]: I1202 23:14:04.874341 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38969cda-9a4e-4bf7-bbc0-2a6a1849bf2e-kube-api-access-4dnqj" (OuterVolumeSpecName: "kube-api-access-4dnqj") pod "38969cda-9a4e-4bf7-bbc0-2a6a1849bf2e" (UID: "38969cda-9a4e-4bf7-bbc0-2a6a1849bf2e"). InnerVolumeSpecName "kube-api-access-4dnqj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:14:04 crc kubenswrapper[4696]: I1202 23:14:04.903577 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38969cda-9a4e-4bf7-bbc0-2a6a1849bf2e-inventory" (OuterVolumeSpecName: "inventory") pod "38969cda-9a4e-4bf7-bbc0-2a6a1849bf2e" (UID: "38969cda-9a4e-4bf7-bbc0-2a6a1849bf2e"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:14:04 crc kubenswrapper[4696]: I1202 23:14:04.915291 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38969cda-9a4e-4bf7-bbc0-2a6a1849bf2e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "38969cda-9a4e-4bf7-bbc0-2a6a1849bf2e" (UID: "38969cda-9a4e-4bf7-bbc0-2a6a1849bf2e"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:14:04 crc kubenswrapper[4696]: I1202 23:14:04.967960 4696 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/38969cda-9a4e-4bf7-bbc0-2a6a1849bf2e-inventory\") on node \"crc\" DevicePath \"\"" Dec 02 23:14:04 crc kubenswrapper[4696]: I1202 23:14:04.968000 4696 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/38969cda-9a4e-4bf7-bbc0-2a6a1849bf2e-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 23:14:04 crc kubenswrapper[4696]: I1202 23:14:04.968014 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4dnqj\" (UniqueName: \"kubernetes.io/projected/38969cda-9a4e-4bf7-bbc0-2a6a1849bf2e-kube-api-access-4dnqj\") on node \"crc\" DevicePath \"\"" Dec 02 23:14:05 crc kubenswrapper[4696]: I1202 23:14:05.138562 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vlcsp" event={"ID":"38969cda-9a4e-4bf7-bbc0-2a6a1849bf2e","Type":"ContainerDied","Data":"ed40013b6c45f9ae5d3ccfb9b32153f9b6e45e12b9ffd4b1b318e65243b83d0a"} Dec 02 23:14:05 crc kubenswrapper[4696]: I1202 23:14:05.138644 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ed40013b6c45f9ae5d3ccfb9b32153f9b6e45e12b9ffd4b1b318e65243b83d0a" Dec 02 23:14:05 crc kubenswrapper[4696]: I1202 23:14:05.138699 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vlcsp" Dec 02 23:14:05 crc kubenswrapper[4696]: I1202 23:14:05.255727 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tfsrw"] Dec 02 23:14:05 crc kubenswrapper[4696]: E1202 23:14:05.256858 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38969cda-9a4e-4bf7-bbc0-2a6a1849bf2e" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 02 23:14:05 crc kubenswrapper[4696]: I1202 23:14:05.256899 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="38969cda-9a4e-4bf7-bbc0-2a6a1849bf2e" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 02 23:14:05 crc kubenswrapper[4696]: I1202 23:14:05.257331 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="38969cda-9a4e-4bf7-bbc0-2a6a1849bf2e" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 02 23:14:05 crc kubenswrapper[4696]: I1202 23:14:05.258999 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tfsrw" Dec 02 23:14:05 crc kubenswrapper[4696]: I1202 23:14:05.262422 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 23:14:05 crc kubenswrapper[4696]: I1202 23:14:05.262579 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 02 23:14:05 crc kubenswrapper[4696]: I1202 23:14:05.264161 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 02 23:14:05 crc kubenswrapper[4696]: I1202 23:14:05.264434 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-cvzmc" Dec 02 23:14:05 crc kubenswrapper[4696]: I1202 23:14:05.267273 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tfsrw"] Dec 02 23:14:05 crc kubenswrapper[4696]: I1202 23:14:05.378488 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/130877a2-12e6-4731-9f64-675fcfd8a1ce-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-tfsrw\" (UID: \"130877a2-12e6-4731-9f64-675fcfd8a1ce\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tfsrw" Dec 02 23:14:05 crc kubenswrapper[4696]: I1202 23:14:05.378616 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/130877a2-12e6-4731-9f64-675fcfd8a1ce-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-tfsrw\" (UID: \"130877a2-12e6-4731-9f64-675fcfd8a1ce\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tfsrw" Dec 02 23:14:05 crc kubenswrapper[4696]: I1202 23:14:05.378866 4696 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6hqg\" (UniqueName: \"kubernetes.io/projected/130877a2-12e6-4731-9f64-675fcfd8a1ce-kube-api-access-q6hqg\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-tfsrw\" (UID: \"130877a2-12e6-4731-9f64-675fcfd8a1ce\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tfsrw" Dec 02 23:14:05 crc kubenswrapper[4696]: I1202 23:14:05.450290 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8677bbf5-15c6-4e35-98ce-ba3b8ccd5a27" path="/var/lib/kubelet/pods/8677bbf5-15c6-4e35-98ce-ba3b8ccd5a27/volumes" Dec 02 23:14:05 crc kubenswrapper[4696]: I1202 23:14:05.482331 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/130877a2-12e6-4731-9f64-675fcfd8a1ce-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-tfsrw\" (UID: \"130877a2-12e6-4731-9f64-675fcfd8a1ce\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tfsrw" Dec 02 23:14:05 crc kubenswrapper[4696]: I1202 23:14:05.482950 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/130877a2-12e6-4731-9f64-675fcfd8a1ce-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-tfsrw\" (UID: \"130877a2-12e6-4731-9f64-675fcfd8a1ce\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tfsrw" Dec 02 23:14:05 crc kubenswrapper[4696]: I1202 23:14:05.483145 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6hqg\" (UniqueName: \"kubernetes.io/projected/130877a2-12e6-4731-9f64-675fcfd8a1ce-kube-api-access-q6hqg\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-tfsrw\" (UID: \"130877a2-12e6-4731-9f64-675fcfd8a1ce\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tfsrw" Dec 
02 23:14:05 crc kubenswrapper[4696]: I1202 23:14:05.488810 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/130877a2-12e6-4731-9f64-675fcfd8a1ce-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-tfsrw\" (UID: \"130877a2-12e6-4731-9f64-675fcfd8a1ce\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tfsrw" Dec 02 23:14:05 crc kubenswrapper[4696]: I1202 23:14:05.488896 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/130877a2-12e6-4731-9f64-675fcfd8a1ce-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-tfsrw\" (UID: \"130877a2-12e6-4731-9f64-675fcfd8a1ce\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tfsrw" Dec 02 23:14:05 crc kubenswrapper[4696]: I1202 23:14:05.505280 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6hqg\" (UniqueName: \"kubernetes.io/projected/130877a2-12e6-4731-9f64-675fcfd8a1ce-kube-api-access-q6hqg\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-tfsrw\" (UID: \"130877a2-12e6-4731-9f64-675fcfd8a1ce\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tfsrw" Dec 02 23:14:05 crc kubenswrapper[4696]: I1202 23:14:05.600404 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tfsrw" Dec 02 23:14:06 crc kubenswrapper[4696]: I1202 23:14:06.192661 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tfsrw"] Dec 02 23:14:07 crc kubenswrapper[4696]: I1202 23:14:07.167123 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tfsrw" event={"ID":"130877a2-12e6-4731-9f64-675fcfd8a1ce","Type":"ContainerStarted","Data":"8ab84a3f968330fb60e5c03e2e9640990228ca5ae6f474f5478e6e7b84b5c461"} Dec 02 23:14:07 crc kubenswrapper[4696]: I1202 23:14:07.167711 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tfsrw" event={"ID":"130877a2-12e6-4731-9f64-675fcfd8a1ce","Type":"ContainerStarted","Data":"28d616b53c9d6509618ef906069688c011fb6b1066fc00e2c4faa9b952ff6161"} Dec 02 23:14:07 crc kubenswrapper[4696]: I1202 23:14:07.194698 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tfsrw" podStartSLOduration=1.771744906 podStartE2EDuration="2.194671519s" podCreationTimestamp="2025-12-02 23:14:05 +0000 UTC" firstStartedPulling="2025-12-02 23:14:06.213148424 +0000 UTC m=+1909.093828415" lastFinishedPulling="2025-12-02 23:14:06.636075027 +0000 UTC m=+1909.516755028" observedRunningTime="2025-12-02 23:14:07.18761284 +0000 UTC m=+1910.068292841" watchObservedRunningTime="2025-12-02 23:14:07.194671519 +0000 UTC m=+1910.075351520" Dec 02 23:14:12 crc kubenswrapper[4696]: I1202 23:14:12.431723 4696 scope.go:117] "RemoveContainer" containerID="631e32e77a29a8c8bd80f1cb8c03794b8f094b77b9ba9fa687732346c1ca36df" Dec 02 23:14:12 crc kubenswrapper[4696]: E1202 23:14:12.432975 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 02 23:14:13 crc kubenswrapper[4696]: I1202 23:14:13.239421 4696 generic.go:334] "Generic (PLEG): container finished" podID="130877a2-12e6-4731-9f64-675fcfd8a1ce" containerID="8ab84a3f968330fb60e5c03e2e9640990228ca5ae6f474f5478e6e7b84b5c461" exitCode=0 Dec 02 23:14:13 crc kubenswrapper[4696]: I1202 23:14:13.239544 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tfsrw" event={"ID":"130877a2-12e6-4731-9f64-675fcfd8a1ce","Type":"ContainerDied","Data":"8ab84a3f968330fb60e5c03e2e9640990228ca5ae6f474f5478e6e7b84b5c461"} Dec 02 23:14:14 crc kubenswrapper[4696]: I1202 23:14:14.730042 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tfsrw" Dec 02 23:14:14 crc kubenswrapper[4696]: I1202 23:14:14.913249 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q6hqg\" (UniqueName: \"kubernetes.io/projected/130877a2-12e6-4731-9f64-675fcfd8a1ce-kube-api-access-q6hqg\") pod \"130877a2-12e6-4731-9f64-675fcfd8a1ce\" (UID: \"130877a2-12e6-4731-9f64-675fcfd8a1ce\") " Dec 02 23:14:14 crc kubenswrapper[4696]: I1202 23:14:14.913367 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/130877a2-12e6-4731-9f64-675fcfd8a1ce-inventory\") pod \"130877a2-12e6-4731-9f64-675fcfd8a1ce\" (UID: \"130877a2-12e6-4731-9f64-675fcfd8a1ce\") " Dec 02 23:14:14 crc kubenswrapper[4696]: I1202 23:14:14.913486 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/130877a2-12e6-4731-9f64-675fcfd8a1ce-ssh-key\") pod \"130877a2-12e6-4731-9f64-675fcfd8a1ce\" (UID: \"130877a2-12e6-4731-9f64-675fcfd8a1ce\") " Dec 02 23:14:14 crc kubenswrapper[4696]: I1202 23:14:14.920290 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/130877a2-12e6-4731-9f64-675fcfd8a1ce-kube-api-access-q6hqg" (OuterVolumeSpecName: "kube-api-access-q6hqg") pod "130877a2-12e6-4731-9f64-675fcfd8a1ce" (UID: "130877a2-12e6-4731-9f64-675fcfd8a1ce"). InnerVolumeSpecName "kube-api-access-q6hqg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:14:14 crc kubenswrapper[4696]: I1202 23:14:14.965628 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/130877a2-12e6-4731-9f64-675fcfd8a1ce-inventory" (OuterVolumeSpecName: "inventory") pod "130877a2-12e6-4731-9f64-675fcfd8a1ce" (UID: "130877a2-12e6-4731-9f64-675fcfd8a1ce"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:14:14 crc kubenswrapper[4696]: I1202 23:14:14.969358 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/130877a2-12e6-4731-9f64-675fcfd8a1ce-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "130877a2-12e6-4731-9f64-675fcfd8a1ce" (UID: "130877a2-12e6-4731-9f64-675fcfd8a1ce"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:14:15 crc kubenswrapper[4696]: I1202 23:14:15.016642 4696 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/130877a2-12e6-4731-9f64-675fcfd8a1ce-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 23:14:15 crc kubenswrapper[4696]: I1202 23:14:15.016686 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q6hqg\" (UniqueName: \"kubernetes.io/projected/130877a2-12e6-4731-9f64-675fcfd8a1ce-kube-api-access-q6hqg\") on node \"crc\" DevicePath \"\"" Dec 02 23:14:15 crc kubenswrapper[4696]: I1202 23:14:15.016705 4696 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/130877a2-12e6-4731-9f64-675fcfd8a1ce-inventory\") on node \"crc\" DevicePath \"\"" Dec 02 23:14:15 crc kubenswrapper[4696]: I1202 23:14:15.265019 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tfsrw" event={"ID":"130877a2-12e6-4731-9f64-675fcfd8a1ce","Type":"ContainerDied","Data":"28d616b53c9d6509618ef906069688c011fb6b1066fc00e2c4faa9b952ff6161"} Dec 02 23:14:15 crc kubenswrapper[4696]: I1202 23:14:15.265067 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="28d616b53c9d6509618ef906069688c011fb6b1066fc00e2c4faa9b952ff6161" Dec 02 23:14:15 crc kubenswrapper[4696]: I1202 23:14:15.265162 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tfsrw" Dec 02 23:14:15 crc kubenswrapper[4696]: I1202 23:14:15.381900 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-2sw6r"] Dec 02 23:14:15 crc kubenswrapper[4696]: E1202 23:14:15.382511 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="130877a2-12e6-4731-9f64-675fcfd8a1ce" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 02 23:14:15 crc kubenswrapper[4696]: I1202 23:14:15.382535 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="130877a2-12e6-4731-9f64-675fcfd8a1ce" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 02 23:14:15 crc kubenswrapper[4696]: I1202 23:14:15.382867 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="130877a2-12e6-4731-9f64-675fcfd8a1ce" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 02 23:14:15 crc kubenswrapper[4696]: I1202 23:14:15.383935 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-2sw6r" Dec 02 23:14:15 crc kubenswrapper[4696]: I1202 23:14:15.387571 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-cvzmc" Dec 02 23:14:15 crc kubenswrapper[4696]: I1202 23:14:15.387880 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 02 23:14:15 crc kubenswrapper[4696]: I1202 23:14:15.390331 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 02 23:14:15 crc kubenswrapper[4696]: I1202 23:14:15.390713 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 23:14:15 crc kubenswrapper[4696]: I1202 23:14:15.399702 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-2sw6r"] Dec 02 23:14:15 crc kubenswrapper[4696]: I1202 23:14:15.425222 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkr9m\" (UniqueName: \"kubernetes.io/projected/f616e70d-4131-4ed5-b891-33dcad6a8827-kube-api-access-fkr9m\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-2sw6r\" (UID: \"f616e70d-4131-4ed5-b891-33dcad6a8827\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-2sw6r" Dec 02 23:14:15 crc kubenswrapper[4696]: I1202 23:14:15.425293 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f616e70d-4131-4ed5-b891-33dcad6a8827-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-2sw6r\" (UID: \"f616e70d-4131-4ed5-b891-33dcad6a8827\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-2sw6r" Dec 02 23:14:15 crc kubenswrapper[4696]: I1202 23:14:15.425505 4696 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f616e70d-4131-4ed5-b891-33dcad6a8827-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-2sw6r\" (UID: \"f616e70d-4131-4ed5-b891-33dcad6a8827\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-2sw6r" Dec 02 23:14:15 crc kubenswrapper[4696]: I1202 23:14:15.527692 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f616e70d-4131-4ed5-b891-33dcad6a8827-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-2sw6r\" (UID: \"f616e70d-4131-4ed5-b891-33dcad6a8827\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-2sw6r" Dec 02 23:14:15 crc kubenswrapper[4696]: I1202 23:14:15.527801 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fkr9m\" (UniqueName: \"kubernetes.io/projected/f616e70d-4131-4ed5-b891-33dcad6a8827-kube-api-access-fkr9m\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-2sw6r\" (UID: \"f616e70d-4131-4ed5-b891-33dcad6a8827\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-2sw6r" Dec 02 23:14:15 crc kubenswrapper[4696]: I1202 23:14:15.527826 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f616e70d-4131-4ed5-b891-33dcad6a8827-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-2sw6r\" (UID: \"f616e70d-4131-4ed5-b891-33dcad6a8827\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-2sw6r" Dec 02 23:14:15 crc kubenswrapper[4696]: I1202 23:14:15.532923 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f616e70d-4131-4ed5-b891-33dcad6a8827-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-2sw6r\" (UID: 
\"f616e70d-4131-4ed5-b891-33dcad6a8827\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-2sw6r" Dec 02 23:14:15 crc kubenswrapper[4696]: I1202 23:14:15.533115 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f616e70d-4131-4ed5-b891-33dcad6a8827-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-2sw6r\" (UID: \"f616e70d-4131-4ed5-b891-33dcad6a8827\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-2sw6r" Dec 02 23:14:15 crc kubenswrapper[4696]: I1202 23:14:15.549175 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkr9m\" (UniqueName: \"kubernetes.io/projected/f616e70d-4131-4ed5-b891-33dcad6a8827-kube-api-access-fkr9m\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-2sw6r\" (UID: \"f616e70d-4131-4ed5-b891-33dcad6a8827\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-2sw6r" Dec 02 23:14:15 crc kubenswrapper[4696]: I1202 23:14:15.706407 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-2sw6r" Dec 02 23:14:16 crc kubenswrapper[4696]: I1202 23:14:16.256890 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-2sw6r"] Dec 02 23:14:16 crc kubenswrapper[4696]: I1202 23:14:16.267285 4696 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 02 23:14:16 crc kubenswrapper[4696]: I1202 23:14:16.278124 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-2sw6r" event={"ID":"f616e70d-4131-4ed5-b891-33dcad6a8827","Type":"ContainerStarted","Data":"be8a0d79e287ac65e4d8bcbed10ddb93875ff7d1edb4c218b9e14aa3133b2e46"} Dec 02 23:14:17 crc kubenswrapper[4696]: I1202 23:14:17.317269 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-2sw6r" event={"ID":"f616e70d-4131-4ed5-b891-33dcad6a8827","Type":"ContainerStarted","Data":"bb538c23311ac50dbed21a28140cbb36a23539e2083c47889276a17e1edb3c23"} Dec 02 23:14:17 crc kubenswrapper[4696]: I1202 23:14:17.350084 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-2sw6r" podStartSLOduration=1.957487457 podStartE2EDuration="2.350047864s" podCreationTimestamp="2025-12-02 23:14:15 +0000 UTC" firstStartedPulling="2025-12-02 23:14:16.266964273 +0000 UTC m=+1919.147644264" lastFinishedPulling="2025-12-02 23:14:16.65952466 +0000 UTC m=+1919.540204671" observedRunningTime="2025-12-02 23:14:17.336992176 +0000 UTC m=+1920.217672187" watchObservedRunningTime="2025-12-02 23:14:17.350047864 +0000 UTC m=+1920.230727875" Dec 02 23:14:27 crc kubenswrapper[4696]: I1202 23:14:27.441574 4696 scope.go:117] "RemoveContainer" containerID="631e32e77a29a8c8bd80f1cb8c03794b8f094b77b9ba9fa687732346c1ca36df" Dec 02 23:14:27 crc kubenswrapper[4696]: E1202 
23:14:27.442971 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 02 23:14:31 crc kubenswrapper[4696]: I1202 23:14:31.045676 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-gfgfp"] Dec 02 23:14:31 crc kubenswrapper[4696]: I1202 23:14:31.057017 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-gfgfp"] Dec 02 23:14:31 crc kubenswrapper[4696]: I1202 23:14:31.444286 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c71f81a-6ed2-41fa-9600-f5afbeee2653" path="/var/lib/kubelet/pods/3c71f81a-6ed2-41fa-9600-f5afbeee2653/volumes" Dec 02 23:14:33 crc kubenswrapper[4696]: I1202 23:14:33.045554 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-7fk6v"] Dec 02 23:14:33 crc kubenswrapper[4696]: I1202 23:14:33.060674 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-7fk6v"] Dec 02 23:14:33 crc kubenswrapper[4696]: I1202 23:14:33.448609 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95634dd9-11ed-4c9d-b0e3-b7240ff94ac0" path="/var/lib/kubelet/pods/95634dd9-11ed-4c9d-b0e3-b7240ff94ac0/volumes" Dec 02 23:14:36 crc kubenswrapper[4696]: I1202 23:14:36.429088 4696 scope.go:117] "RemoveContainer" containerID="ebbd9aa7d3c954796178286243dd7c13228fce4d1dddc62cbd72b83aa1198d04" Dec 02 23:14:36 crc kubenswrapper[4696]: I1202 23:14:36.503480 4696 scope.go:117] "RemoveContainer" containerID="83c9852211162805e83ff9321f542ffdd13012ef0a1e8f10ee979a73dc9fa17e" Dec 02 23:14:36 crc 
kubenswrapper[4696]: I1202 23:14:36.581763 4696 scope.go:117] "RemoveContainer" containerID="e3925fad130fa205634297b3447a41f37b659cd7947b3e48e6dd4fd618123680" Dec 02 23:14:38 crc kubenswrapper[4696]: I1202 23:14:38.432049 4696 scope.go:117] "RemoveContainer" containerID="631e32e77a29a8c8bd80f1cb8c03794b8f094b77b9ba9fa687732346c1ca36df" Dec 02 23:14:38 crc kubenswrapper[4696]: E1202 23:14:38.433231 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 02 23:14:51 crc kubenswrapper[4696]: I1202 23:14:51.432279 4696 scope.go:117] "RemoveContainer" containerID="631e32e77a29a8c8bd80f1cb8c03794b8f094b77b9ba9fa687732346c1ca36df" Dec 02 23:14:51 crc kubenswrapper[4696]: E1202 23:14:51.434677 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 02 23:15:00 crc kubenswrapper[4696]: I1202 23:15:00.179986 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411955-pvdn7"] Dec 02 23:15:00 crc kubenswrapper[4696]: I1202 23:15:00.182763 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411955-pvdn7" Dec 02 23:15:00 crc kubenswrapper[4696]: I1202 23:15:00.191542 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 02 23:15:00 crc kubenswrapper[4696]: I1202 23:15:00.194247 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 02 23:15:00 crc kubenswrapper[4696]: I1202 23:15:00.198293 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411955-pvdn7"] Dec 02 23:15:00 crc kubenswrapper[4696]: I1202 23:15:00.305949 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/70cd65fe-f070-4d40-aa5b-dc5568eca34e-config-volume\") pod \"collect-profiles-29411955-pvdn7\" (UID: \"70cd65fe-f070-4d40-aa5b-dc5568eca34e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411955-pvdn7" Dec 02 23:15:00 crc kubenswrapper[4696]: I1202 23:15:00.306672 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/70cd65fe-f070-4d40-aa5b-dc5568eca34e-secret-volume\") pod \"collect-profiles-29411955-pvdn7\" (UID: \"70cd65fe-f070-4d40-aa5b-dc5568eca34e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411955-pvdn7" Dec 02 23:15:00 crc kubenswrapper[4696]: I1202 23:15:00.306803 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdjnf\" (UniqueName: \"kubernetes.io/projected/70cd65fe-f070-4d40-aa5b-dc5568eca34e-kube-api-access-wdjnf\") pod \"collect-profiles-29411955-pvdn7\" (UID: \"70cd65fe-f070-4d40-aa5b-dc5568eca34e\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29411955-pvdn7" Dec 02 23:15:00 crc kubenswrapper[4696]: I1202 23:15:00.409717 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/70cd65fe-f070-4d40-aa5b-dc5568eca34e-config-volume\") pod \"collect-profiles-29411955-pvdn7\" (UID: \"70cd65fe-f070-4d40-aa5b-dc5568eca34e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411955-pvdn7" Dec 02 23:15:00 crc kubenswrapper[4696]: I1202 23:15:00.409847 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/70cd65fe-f070-4d40-aa5b-dc5568eca34e-secret-volume\") pod \"collect-profiles-29411955-pvdn7\" (UID: \"70cd65fe-f070-4d40-aa5b-dc5568eca34e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411955-pvdn7" Dec 02 23:15:00 crc kubenswrapper[4696]: I1202 23:15:00.409920 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdjnf\" (UniqueName: \"kubernetes.io/projected/70cd65fe-f070-4d40-aa5b-dc5568eca34e-kube-api-access-wdjnf\") pod \"collect-profiles-29411955-pvdn7\" (UID: \"70cd65fe-f070-4d40-aa5b-dc5568eca34e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411955-pvdn7" Dec 02 23:15:00 crc kubenswrapper[4696]: I1202 23:15:00.411463 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/70cd65fe-f070-4d40-aa5b-dc5568eca34e-config-volume\") pod \"collect-profiles-29411955-pvdn7\" (UID: \"70cd65fe-f070-4d40-aa5b-dc5568eca34e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411955-pvdn7" Dec 02 23:15:00 crc kubenswrapper[4696]: I1202 23:15:00.421465 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/70cd65fe-f070-4d40-aa5b-dc5568eca34e-secret-volume\") pod \"collect-profiles-29411955-pvdn7\" (UID: \"70cd65fe-f070-4d40-aa5b-dc5568eca34e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411955-pvdn7" Dec 02 23:15:00 crc kubenswrapper[4696]: I1202 23:15:00.444813 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdjnf\" (UniqueName: \"kubernetes.io/projected/70cd65fe-f070-4d40-aa5b-dc5568eca34e-kube-api-access-wdjnf\") pod \"collect-profiles-29411955-pvdn7\" (UID: \"70cd65fe-f070-4d40-aa5b-dc5568eca34e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411955-pvdn7" Dec 02 23:15:00 crc kubenswrapper[4696]: I1202 23:15:00.508799 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411955-pvdn7" Dec 02 23:15:01 crc kubenswrapper[4696]: I1202 23:15:01.044497 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411955-pvdn7"] Dec 02 23:15:01 crc kubenswrapper[4696]: I1202 23:15:01.827544 4696 generic.go:334] "Generic (PLEG): container finished" podID="70cd65fe-f070-4d40-aa5b-dc5568eca34e" containerID="5a6192be7462a74e57388633d4a3a462044a84efd0e4091053ff6e1973bc0fab" exitCode=0 Dec 02 23:15:01 crc kubenswrapper[4696]: I1202 23:15:01.827640 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411955-pvdn7" event={"ID":"70cd65fe-f070-4d40-aa5b-dc5568eca34e","Type":"ContainerDied","Data":"5a6192be7462a74e57388633d4a3a462044a84efd0e4091053ff6e1973bc0fab"} Dec 02 23:15:01 crc kubenswrapper[4696]: I1202 23:15:01.828012 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411955-pvdn7" 
event={"ID":"70cd65fe-f070-4d40-aa5b-dc5568eca34e","Type":"ContainerStarted","Data":"56a519e736352f1aafb374ebb1f5fb541b29a99c44c2b8b0fb82900979eff26c"} Dec 02 23:15:03 crc kubenswrapper[4696]: I1202 23:15:03.176298 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411955-pvdn7" Dec 02 23:15:03 crc kubenswrapper[4696]: I1202 23:15:03.278296 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/70cd65fe-f070-4d40-aa5b-dc5568eca34e-secret-volume\") pod \"70cd65fe-f070-4d40-aa5b-dc5568eca34e\" (UID: \"70cd65fe-f070-4d40-aa5b-dc5568eca34e\") " Dec 02 23:15:03 crc kubenswrapper[4696]: I1202 23:15:03.278594 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wdjnf\" (UniqueName: \"kubernetes.io/projected/70cd65fe-f070-4d40-aa5b-dc5568eca34e-kube-api-access-wdjnf\") pod \"70cd65fe-f070-4d40-aa5b-dc5568eca34e\" (UID: \"70cd65fe-f070-4d40-aa5b-dc5568eca34e\") " Dec 02 23:15:03 crc kubenswrapper[4696]: I1202 23:15:03.278699 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/70cd65fe-f070-4d40-aa5b-dc5568eca34e-config-volume\") pod \"70cd65fe-f070-4d40-aa5b-dc5568eca34e\" (UID: \"70cd65fe-f070-4d40-aa5b-dc5568eca34e\") " Dec 02 23:15:03 crc kubenswrapper[4696]: I1202 23:15:03.279275 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70cd65fe-f070-4d40-aa5b-dc5568eca34e-config-volume" (OuterVolumeSpecName: "config-volume") pod "70cd65fe-f070-4d40-aa5b-dc5568eca34e" (UID: "70cd65fe-f070-4d40-aa5b-dc5568eca34e"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:15:03 crc kubenswrapper[4696]: I1202 23:15:03.279392 4696 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/70cd65fe-f070-4d40-aa5b-dc5568eca34e-config-volume\") on node \"crc\" DevicePath \"\"" Dec 02 23:15:03 crc kubenswrapper[4696]: I1202 23:15:03.286679 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70cd65fe-f070-4d40-aa5b-dc5568eca34e-kube-api-access-wdjnf" (OuterVolumeSpecName: "kube-api-access-wdjnf") pod "70cd65fe-f070-4d40-aa5b-dc5568eca34e" (UID: "70cd65fe-f070-4d40-aa5b-dc5568eca34e"). InnerVolumeSpecName "kube-api-access-wdjnf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:15:03 crc kubenswrapper[4696]: I1202 23:15:03.287068 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70cd65fe-f070-4d40-aa5b-dc5568eca34e-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "70cd65fe-f070-4d40-aa5b-dc5568eca34e" (UID: "70cd65fe-f070-4d40-aa5b-dc5568eca34e"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:15:03 crc kubenswrapper[4696]: I1202 23:15:03.382567 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wdjnf\" (UniqueName: \"kubernetes.io/projected/70cd65fe-f070-4d40-aa5b-dc5568eca34e-kube-api-access-wdjnf\") on node \"crc\" DevicePath \"\"" Dec 02 23:15:03 crc kubenswrapper[4696]: I1202 23:15:03.382628 4696 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/70cd65fe-f070-4d40-aa5b-dc5568eca34e-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 02 23:15:03 crc kubenswrapper[4696]: I1202 23:15:03.432228 4696 scope.go:117] "RemoveContainer" containerID="631e32e77a29a8c8bd80f1cb8c03794b8f094b77b9ba9fa687732346c1ca36df" Dec 02 23:15:03 crc kubenswrapper[4696]: E1202 23:15:03.432585 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 02 23:15:03 crc kubenswrapper[4696]: I1202 23:15:03.852865 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411955-pvdn7" event={"ID":"70cd65fe-f070-4d40-aa5b-dc5568eca34e","Type":"ContainerDied","Data":"56a519e736352f1aafb374ebb1f5fb541b29a99c44c2b8b0fb82900979eff26c"} Dec 02 23:15:03 crc kubenswrapper[4696]: I1202 23:15:03.852918 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="56a519e736352f1aafb374ebb1f5fb541b29a99c44c2b8b0fb82900979eff26c" Dec 02 23:15:03 crc kubenswrapper[4696]: I1202 23:15:03.852945 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411955-pvdn7" Dec 02 23:15:04 crc kubenswrapper[4696]: I1202 23:15:04.280379 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411910-cc46z"] Dec 02 23:15:04 crc kubenswrapper[4696]: I1202 23:15:04.293135 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411910-cc46z"] Dec 02 23:15:04 crc kubenswrapper[4696]: I1202 23:15:04.870900 4696 generic.go:334] "Generic (PLEG): container finished" podID="f616e70d-4131-4ed5-b891-33dcad6a8827" containerID="bb538c23311ac50dbed21a28140cbb36a23539e2083c47889276a17e1edb3c23" exitCode=0 Dec 02 23:15:04 crc kubenswrapper[4696]: I1202 23:15:04.871003 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-2sw6r" event={"ID":"f616e70d-4131-4ed5-b891-33dcad6a8827","Type":"ContainerDied","Data":"bb538c23311ac50dbed21a28140cbb36a23539e2083c47889276a17e1edb3c23"} Dec 02 23:15:05 crc kubenswrapper[4696]: I1202 23:15:05.483924 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65e8a28b-6bd6-4100-a3e7-80faf9aaeeef" path="/var/lib/kubelet/pods/65e8a28b-6bd6-4100-a3e7-80faf9aaeeef/volumes" Dec 02 23:15:06 crc kubenswrapper[4696]: I1202 23:15:06.311625 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-2sw6r"
Dec 02 23:15:06 crc kubenswrapper[4696]: I1202 23:15:06.358780 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f616e70d-4131-4ed5-b891-33dcad6a8827-ssh-key\") pod \"f616e70d-4131-4ed5-b891-33dcad6a8827\" (UID: \"f616e70d-4131-4ed5-b891-33dcad6a8827\") "
Dec 02 23:15:06 crc kubenswrapper[4696]: I1202 23:15:06.359000 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f616e70d-4131-4ed5-b891-33dcad6a8827-inventory\") pod \"f616e70d-4131-4ed5-b891-33dcad6a8827\" (UID: \"f616e70d-4131-4ed5-b891-33dcad6a8827\") "
Dec 02 23:15:06 crc kubenswrapper[4696]: I1202 23:15:06.359165 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fkr9m\" (UniqueName: \"kubernetes.io/projected/f616e70d-4131-4ed5-b891-33dcad6a8827-kube-api-access-fkr9m\") pod \"f616e70d-4131-4ed5-b891-33dcad6a8827\" (UID: \"f616e70d-4131-4ed5-b891-33dcad6a8827\") "
Dec 02 23:15:06 crc kubenswrapper[4696]: I1202 23:15:06.367098 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f616e70d-4131-4ed5-b891-33dcad6a8827-kube-api-access-fkr9m" (OuterVolumeSpecName: "kube-api-access-fkr9m") pod "f616e70d-4131-4ed5-b891-33dcad6a8827" (UID: "f616e70d-4131-4ed5-b891-33dcad6a8827"). InnerVolumeSpecName "kube-api-access-fkr9m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 23:15:06 crc kubenswrapper[4696]: I1202 23:15:06.399283 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f616e70d-4131-4ed5-b891-33dcad6a8827-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "f616e70d-4131-4ed5-b891-33dcad6a8827" (UID: "f616e70d-4131-4ed5-b891-33dcad6a8827"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 23:15:06 crc kubenswrapper[4696]: I1202 23:15:06.402200 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f616e70d-4131-4ed5-b891-33dcad6a8827-inventory" (OuterVolumeSpecName: "inventory") pod "f616e70d-4131-4ed5-b891-33dcad6a8827" (UID: "f616e70d-4131-4ed5-b891-33dcad6a8827"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 23:15:06 crc kubenswrapper[4696]: I1202 23:15:06.462054 4696 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f616e70d-4131-4ed5-b891-33dcad6a8827-inventory\") on node \"crc\" DevicePath \"\""
Dec 02 23:15:06 crc kubenswrapper[4696]: I1202 23:15:06.462092 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fkr9m\" (UniqueName: \"kubernetes.io/projected/f616e70d-4131-4ed5-b891-33dcad6a8827-kube-api-access-fkr9m\") on node \"crc\" DevicePath \"\""
Dec 02 23:15:06 crc kubenswrapper[4696]: I1202 23:15:06.462105 4696 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f616e70d-4131-4ed5-b891-33dcad6a8827-ssh-key\") on node \"crc\" DevicePath \"\""
Dec 02 23:15:06 crc kubenswrapper[4696]: I1202 23:15:06.893029 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-2sw6r" event={"ID":"f616e70d-4131-4ed5-b891-33dcad6a8827","Type":"ContainerDied","Data":"be8a0d79e287ac65e4d8bcbed10ddb93875ff7d1edb4c218b9e14aa3133b2e46"}
Dec 02 23:15:06 crc kubenswrapper[4696]: I1202 23:15:06.893472 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="be8a0d79e287ac65e4d8bcbed10ddb93875ff7d1edb4c218b9e14aa3133b2e46"
Dec 02 23:15:06 crc kubenswrapper[4696]: I1202 23:15:06.893093 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-2sw6r"
Dec 02 23:15:06 crc kubenswrapper[4696]: I1202 23:15:06.993355 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-d8wqz"]
Dec 02 23:15:06 crc kubenswrapper[4696]: E1202 23:15:06.993863 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f616e70d-4131-4ed5-b891-33dcad6a8827" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Dec 02 23:15:06 crc kubenswrapper[4696]: I1202 23:15:06.993891 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="f616e70d-4131-4ed5-b891-33dcad6a8827" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Dec 02 23:15:06 crc kubenswrapper[4696]: E1202 23:15:06.993917 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70cd65fe-f070-4d40-aa5b-dc5568eca34e" containerName="collect-profiles"
Dec 02 23:15:06 crc kubenswrapper[4696]: I1202 23:15:06.993926 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="70cd65fe-f070-4d40-aa5b-dc5568eca34e" containerName="collect-profiles"
Dec 02 23:15:06 crc kubenswrapper[4696]: I1202 23:15:06.994200 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="70cd65fe-f070-4d40-aa5b-dc5568eca34e" containerName="collect-profiles"
Dec 02 23:15:06 crc kubenswrapper[4696]: I1202 23:15:06.994244 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="f616e70d-4131-4ed5-b891-33dcad6a8827" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Dec 02 23:15:06 crc kubenswrapper[4696]: I1202 23:15:06.995199 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-d8wqz"
Dec 02 23:15:07 crc kubenswrapper[4696]: I1202 23:15:07.001343 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Dec 02 23:15:07 crc kubenswrapper[4696]: I1202 23:15:07.002173 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-cvzmc"
Dec 02 23:15:07 crc kubenswrapper[4696]: I1202 23:15:07.003136 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Dec 02 23:15:07 crc kubenswrapper[4696]: I1202 23:15:07.003433 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Dec 02 23:15:07 crc kubenswrapper[4696]: I1202 23:15:07.014962 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-d8wqz"]
Dec 02 23:15:07 crc kubenswrapper[4696]: I1202 23:15:07.078904 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4e57d59f-2b48-457e-92dd-d0585bab85b5-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-d8wqz\" (UID: \"4e57d59f-2b48-457e-92dd-d0585bab85b5\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-d8wqz"
Dec 02 23:15:07 crc kubenswrapper[4696]: I1202 23:15:07.078969 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rrst\" (UniqueName: \"kubernetes.io/projected/4e57d59f-2b48-457e-92dd-d0585bab85b5-kube-api-access-7rrst\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-d8wqz\" (UID: \"4e57d59f-2b48-457e-92dd-d0585bab85b5\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-d8wqz"
Dec 02 23:15:07 crc kubenswrapper[4696]: I1202 23:15:07.079437 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4e57d59f-2b48-457e-92dd-d0585bab85b5-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-d8wqz\" (UID: \"4e57d59f-2b48-457e-92dd-d0585bab85b5\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-d8wqz"
Dec 02 23:15:07 crc kubenswrapper[4696]: I1202 23:15:07.182423 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4e57d59f-2b48-457e-92dd-d0585bab85b5-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-d8wqz\" (UID: \"4e57d59f-2b48-457e-92dd-d0585bab85b5\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-d8wqz"
Dec 02 23:15:07 crc kubenswrapper[4696]: I1202 23:15:07.182653 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rrst\" (UniqueName: \"kubernetes.io/projected/4e57d59f-2b48-457e-92dd-d0585bab85b5-kube-api-access-7rrst\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-d8wqz\" (UID: \"4e57d59f-2b48-457e-92dd-d0585bab85b5\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-d8wqz"
Dec 02 23:15:07 crc kubenswrapper[4696]: I1202 23:15:07.182687 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4e57d59f-2b48-457e-92dd-d0585bab85b5-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-d8wqz\" (UID: \"4e57d59f-2b48-457e-92dd-d0585bab85b5\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-d8wqz"
Dec 02 23:15:07 crc kubenswrapper[4696]: I1202 23:15:07.189876 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4e57d59f-2b48-457e-92dd-d0585bab85b5-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-d8wqz\" (UID: \"4e57d59f-2b48-457e-92dd-d0585bab85b5\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-d8wqz"
Dec 02 23:15:07 crc kubenswrapper[4696]: I1202 23:15:07.190525 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4e57d59f-2b48-457e-92dd-d0585bab85b5-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-d8wqz\" (UID: \"4e57d59f-2b48-457e-92dd-d0585bab85b5\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-d8wqz"
Dec 02 23:15:07 crc kubenswrapper[4696]: I1202 23:15:07.205999 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rrst\" (UniqueName: \"kubernetes.io/projected/4e57d59f-2b48-457e-92dd-d0585bab85b5-kube-api-access-7rrst\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-d8wqz\" (UID: \"4e57d59f-2b48-457e-92dd-d0585bab85b5\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-d8wqz"
Dec 02 23:15:07 crc kubenswrapper[4696]: I1202 23:15:07.316849 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-d8wqz"
Dec 02 23:15:07 crc kubenswrapper[4696]: I1202 23:15:07.878039 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-d8wqz"]
Dec 02 23:15:07 crc kubenswrapper[4696]: I1202 23:15:07.908941 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-d8wqz" event={"ID":"4e57d59f-2b48-457e-92dd-d0585bab85b5","Type":"ContainerStarted","Data":"fd44388efc92d4a0811a4326f8bf9bd932981f58e6dc20ac281570e773091526"}
Dec 02 23:15:08 crc kubenswrapper[4696]: I1202 23:15:08.925966 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-d8wqz" event={"ID":"4e57d59f-2b48-457e-92dd-d0585bab85b5","Type":"ContainerStarted","Data":"9424c6ad1328256280c00aa8e1abacd4ea45ddbd147af842afd1a2ecef45f2df"}
Dec 02 23:15:08 crc kubenswrapper[4696]: I1202 23:15:08.950113 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-d8wqz" podStartSLOduration=2.437664325 podStartE2EDuration="2.950087244s" podCreationTimestamp="2025-12-02 23:15:06 +0000 UTC" firstStartedPulling="2025-12-02 23:15:07.883002716 +0000 UTC m=+1970.763682767" lastFinishedPulling="2025-12-02 23:15:08.395425675 +0000 UTC m=+1971.276105686" observedRunningTime="2025-12-02 23:15:08.943295443 +0000 UTC m=+1971.823975454" watchObservedRunningTime="2025-12-02 23:15:08.950087244 +0000 UTC m=+1971.830767255"
Dec 02 23:15:18 crc kubenswrapper[4696]: I1202 23:15:18.056646 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-r7rj2"]
Dec 02 23:15:18 crc kubenswrapper[4696]: I1202 23:15:18.076215 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-r7rj2"]
Dec 02 23:15:18 crc kubenswrapper[4696]: I1202 23:15:18.432429 4696 scope.go:117] "RemoveContainer" containerID="631e32e77a29a8c8bd80f1cb8c03794b8f094b77b9ba9fa687732346c1ca36df"
Dec 02 23:15:18 crc kubenswrapper[4696]: E1202 23:15:18.432828 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a"
Dec 02 23:15:19 crc kubenswrapper[4696]: I1202 23:15:19.454103 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0bb6c8ad-6b5e-4f55-8495-c2c30699ab91" path="/var/lib/kubelet/pods/0bb6c8ad-6b5e-4f55-8495-c2c30699ab91/volumes"
Dec 02 23:15:31 crc kubenswrapper[4696]: I1202 23:15:31.448969 4696 scope.go:117] "RemoveContainer" containerID="631e32e77a29a8c8bd80f1cb8c03794b8f094b77b9ba9fa687732346c1ca36df"
Dec 02 23:15:32 crc kubenswrapper[4696]: I1202 23:15:32.228991 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-chq65" event={"ID":"53353260-c7c9-435c-91eb-3d5a1b441c4a","Type":"ContainerStarted","Data":"174d09adecaf1c340446fa66c2f7c4a5200987c0bf9dc9ef1ddf0654d92e86ea"}
Dec 02 23:15:36 crc kubenswrapper[4696]: I1202 23:15:36.716654 4696 scope.go:117] "RemoveContainer" containerID="84c52015fd31b437c81535cddc8184cea36367585fd397974fad2816bf56a69f"
Dec 02 23:15:36 crc kubenswrapper[4696]: I1202 23:15:36.764878 4696 scope.go:117] "RemoveContainer" containerID="d3594e65bdd0ccdd9d25e7dc2a126340ad3c8383049dfc3129919c4b2f4436b8"
Dec 02 23:15:36 crc kubenswrapper[4696]: I1202 23:15:36.798678 4696 scope.go:117] "RemoveContainer" containerID="d578a50080f50ef7f54dce9d91a6f8b7b76cae522bce55baacb541cdf8b17f9a"
Dec 02 23:15:36 crc kubenswrapper[4696]: I1202 23:15:36.879122 4696 scope.go:117] "RemoveContainer" containerID="7d2d52569cbe53f8a460cd185908e062d2042b228e7891be95952b9aaa6b57bc"
Dec 02 23:15:36 crc kubenswrapper[4696]: I1202 23:15:36.904251 4696 scope.go:117] "RemoveContainer" containerID="ba2992bf060ed354ca9bfa75b836d171ec07c0c17e73ae1445d0b89f9aacb65b"
Dec 02 23:15:56 crc kubenswrapper[4696]: I1202 23:15:56.868913 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2qqwv"]
Dec 02 23:15:56 crc kubenswrapper[4696]: I1202 23:15:56.876472 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2qqwv"
Dec 02 23:15:56 crc kubenswrapper[4696]: I1202 23:15:56.892138 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2qqwv"]
Dec 02 23:15:57 crc kubenswrapper[4696]: I1202 23:15:57.024233 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45b532a6-a149-4e88-894c-173548cfcdc6-utilities\") pod \"redhat-operators-2qqwv\" (UID: \"45b532a6-a149-4e88-894c-173548cfcdc6\") " pod="openshift-marketplace/redhat-operators-2qqwv"
Dec 02 23:15:57 crc kubenswrapper[4696]: I1202 23:15:57.024727 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45b532a6-a149-4e88-894c-173548cfcdc6-catalog-content\") pod \"redhat-operators-2qqwv\" (UID: \"45b532a6-a149-4e88-894c-173548cfcdc6\") " pod="openshift-marketplace/redhat-operators-2qqwv"
Dec 02 23:15:57 crc kubenswrapper[4696]: I1202 23:15:57.024934 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svcrt\" (UniqueName: \"kubernetes.io/projected/45b532a6-a149-4e88-894c-173548cfcdc6-kube-api-access-svcrt\") pod \"redhat-operators-2qqwv\" (UID: \"45b532a6-a149-4e88-894c-173548cfcdc6\") " pod="openshift-marketplace/redhat-operators-2qqwv"
Dec 02 23:15:57 crc kubenswrapper[4696]: I1202 23:15:57.127354 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45b532a6-a149-4e88-894c-173548cfcdc6-utilities\") pod \"redhat-operators-2qqwv\" (UID: \"45b532a6-a149-4e88-894c-173548cfcdc6\") " pod="openshift-marketplace/redhat-operators-2qqwv"
Dec 02 23:15:57 crc kubenswrapper[4696]: I1202 23:15:57.127456 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45b532a6-a149-4e88-894c-173548cfcdc6-catalog-content\") pod \"redhat-operators-2qqwv\" (UID: \"45b532a6-a149-4e88-894c-173548cfcdc6\") " pod="openshift-marketplace/redhat-operators-2qqwv"
Dec 02 23:15:57 crc kubenswrapper[4696]: I1202 23:15:57.127495 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svcrt\" (UniqueName: \"kubernetes.io/projected/45b532a6-a149-4e88-894c-173548cfcdc6-kube-api-access-svcrt\") pod \"redhat-operators-2qqwv\" (UID: \"45b532a6-a149-4e88-894c-173548cfcdc6\") " pod="openshift-marketplace/redhat-operators-2qqwv"
Dec 02 23:15:57 crc kubenswrapper[4696]: I1202 23:15:57.128600 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45b532a6-a149-4e88-894c-173548cfcdc6-utilities\") pod \"redhat-operators-2qqwv\" (UID: \"45b532a6-a149-4e88-894c-173548cfcdc6\") " pod="openshift-marketplace/redhat-operators-2qqwv"
Dec 02 23:15:57 crc kubenswrapper[4696]: I1202 23:15:57.128668 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45b532a6-a149-4e88-894c-173548cfcdc6-catalog-content\") pod \"redhat-operators-2qqwv\" (UID: \"45b532a6-a149-4e88-894c-173548cfcdc6\") " pod="openshift-marketplace/redhat-operators-2qqwv"
Dec 02 23:15:57 crc kubenswrapper[4696]: I1202 23:15:57.149031 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svcrt\" (UniqueName: \"kubernetes.io/projected/45b532a6-a149-4e88-894c-173548cfcdc6-kube-api-access-svcrt\") pod \"redhat-operators-2qqwv\" (UID: \"45b532a6-a149-4e88-894c-173548cfcdc6\") " pod="openshift-marketplace/redhat-operators-2qqwv"
Dec 02 23:15:57 crc kubenswrapper[4696]: I1202 23:15:57.208263 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2qqwv"
Dec 02 23:15:57 crc kubenswrapper[4696]: I1202 23:15:57.774600 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2qqwv"]
Dec 02 23:15:58 crc kubenswrapper[4696]: I1202 23:15:58.579696 4696 generic.go:334] "Generic (PLEG): container finished" podID="45b532a6-a149-4e88-894c-173548cfcdc6" containerID="f31d2bedd49b76afd25d059b3c31d6f61857fccbb4a55d06d553b1350b3ecd29" exitCode=0
Dec 02 23:15:58 crc kubenswrapper[4696]: I1202 23:15:58.579768 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2qqwv" event={"ID":"45b532a6-a149-4e88-894c-173548cfcdc6","Type":"ContainerDied","Data":"f31d2bedd49b76afd25d059b3c31d6f61857fccbb4a55d06d553b1350b3ecd29"}
Dec 02 23:15:58 crc kubenswrapper[4696]: I1202 23:15:58.580223 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2qqwv" event={"ID":"45b532a6-a149-4e88-894c-173548cfcdc6","Type":"ContainerStarted","Data":"fab31766c95fd3d1c636946a2aa5dd39f577fc7bbc2e93de13faba807ce13035"}
Dec 02 23:15:59 crc kubenswrapper[4696]: I1202 23:15:59.591044 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2qqwv" event={"ID":"45b532a6-a149-4e88-894c-173548cfcdc6","Type":"ContainerStarted","Data":"8e4b81c5af2578d555cd205c444ca90f16afb64d4f59a6b79fe6d9c9bd162c9a"}
Dec 02 23:16:00 crc kubenswrapper[4696]: I1202 23:16:00.607620 4696 generic.go:334] "Generic (PLEG): container finished" podID="45b532a6-a149-4e88-894c-173548cfcdc6" containerID="8e4b81c5af2578d555cd205c444ca90f16afb64d4f59a6b79fe6d9c9bd162c9a" exitCode=0
Dec 02 23:16:00 crc kubenswrapper[4696]: I1202 23:16:00.607732 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2qqwv" event={"ID":"45b532a6-a149-4e88-894c-173548cfcdc6","Type":"ContainerDied","Data":"8e4b81c5af2578d555cd205c444ca90f16afb64d4f59a6b79fe6d9c9bd162c9a"}
Dec 02 23:16:01 crc kubenswrapper[4696]: I1202 23:16:01.620992 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2qqwv" event={"ID":"45b532a6-a149-4e88-894c-173548cfcdc6","Type":"ContainerStarted","Data":"83c061e495d7c260d6486e86c081d4afc49e74e00a296d865094f54ca42508c7"}
Dec 02 23:16:01 crc kubenswrapper[4696]: I1202 23:16:01.646873 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-2qqwv" podStartSLOduration=3.225780995 podStartE2EDuration="5.646852328s" podCreationTimestamp="2025-12-02 23:15:56 +0000 UTC" firstStartedPulling="2025-12-02 23:15:58.584541752 +0000 UTC m=+2021.465221753" lastFinishedPulling="2025-12-02 23:16:01.005613085 +0000 UTC m=+2023.886293086" observedRunningTime="2025-12-02 23:16:01.640824028 +0000 UTC m=+2024.521504029" watchObservedRunningTime="2025-12-02 23:16:01.646852328 +0000 UTC m=+2024.527532329"
Dec 02 23:16:07 crc kubenswrapper[4696]: I1202 23:16:07.208872 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-2qqwv"
Dec 02 23:16:07 crc kubenswrapper[4696]: I1202 23:16:07.209976 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-2qqwv"
Dec 02 23:16:07 crc kubenswrapper[4696]: I1202 23:16:07.264614 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-2qqwv"
Dec 02 23:16:07 crc kubenswrapper[4696]: I1202 23:16:07.752323 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-2qqwv"
Dec 02 23:16:07 crc kubenswrapper[4696]: I1202 23:16:07.813852 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2qqwv"]
Dec 02 23:16:09 crc kubenswrapper[4696]: I1202 23:16:09.718437 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-2qqwv" podUID="45b532a6-a149-4e88-894c-173548cfcdc6" containerName="registry-server" containerID="cri-o://83c061e495d7c260d6486e86c081d4afc49e74e00a296d865094f54ca42508c7" gracePeriod=2
Dec 02 23:16:10 crc kubenswrapper[4696]: I1202 23:16:10.240252 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2qqwv"
Dec 02 23:16:10 crc kubenswrapper[4696]: I1202 23:16:10.358440 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-svcrt\" (UniqueName: \"kubernetes.io/projected/45b532a6-a149-4e88-894c-173548cfcdc6-kube-api-access-svcrt\") pod \"45b532a6-a149-4e88-894c-173548cfcdc6\" (UID: \"45b532a6-a149-4e88-894c-173548cfcdc6\") "
Dec 02 23:16:10 crc kubenswrapper[4696]: I1202 23:16:10.358506 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45b532a6-a149-4e88-894c-173548cfcdc6-catalog-content\") pod \"45b532a6-a149-4e88-894c-173548cfcdc6\" (UID: \"45b532a6-a149-4e88-894c-173548cfcdc6\") "
Dec 02 23:16:10 crc kubenswrapper[4696]: I1202 23:16:10.358528 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45b532a6-a149-4e88-894c-173548cfcdc6-utilities\") pod \"45b532a6-a149-4e88-894c-173548cfcdc6\" (UID: \"45b532a6-a149-4e88-894c-173548cfcdc6\") "
Dec 02 23:16:10 crc kubenswrapper[4696]: I1202 23:16:10.359824 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45b532a6-a149-4e88-894c-173548cfcdc6-utilities" (OuterVolumeSpecName: "utilities") pod "45b532a6-a149-4e88-894c-173548cfcdc6" (UID: "45b532a6-a149-4e88-894c-173548cfcdc6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 02 23:16:10 crc kubenswrapper[4696]: I1202 23:16:10.361013 4696 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45b532a6-a149-4e88-894c-173548cfcdc6-utilities\") on node \"crc\" DevicePath \"\""
Dec 02 23:16:10 crc kubenswrapper[4696]: I1202 23:16:10.368158 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45b532a6-a149-4e88-894c-173548cfcdc6-kube-api-access-svcrt" (OuterVolumeSpecName: "kube-api-access-svcrt") pod "45b532a6-a149-4e88-894c-173548cfcdc6" (UID: "45b532a6-a149-4e88-894c-173548cfcdc6"). InnerVolumeSpecName "kube-api-access-svcrt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 23:16:10 crc kubenswrapper[4696]: I1202 23:16:10.463641 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-svcrt\" (UniqueName: \"kubernetes.io/projected/45b532a6-a149-4e88-894c-173548cfcdc6-kube-api-access-svcrt\") on node \"crc\" DevicePath \"\""
Dec 02 23:16:10 crc kubenswrapper[4696]: I1202 23:16:10.501911 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45b532a6-a149-4e88-894c-173548cfcdc6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "45b532a6-a149-4e88-894c-173548cfcdc6" (UID: "45b532a6-a149-4e88-894c-173548cfcdc6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 02 23:16:10 crc kubenswrapper[4696]: I1202 23:16:10.567048 4696 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45b532a6-a149-4e88-894c-173548cfcdc6-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 02 23:16:10 crc kubenswrapper[4696]: I1202 23:16:10.743845 4696 generic.go:334] "Generic (PLEG): container finished" podID="45b532a6-a149-4e88-894c-173548cfcdc6" containerID="83c061e495d7c260d6486e86c081d4afc49e74e00a296d865094f54ca42508c7" exitCode=0
Dec 02 23:16:10 crc kubenswrapper[4696]: I1202 23:16:10.743910 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2qqwv" event={"ID":"45b532a6-a149-4e88-894c-173548cfcdc6","Type":"ContainerDied","Data":"83c061e495d7c260d6486e86c081d4afc49e74e00a296d865094f54ca42508c7"}
Dec 02 23:16:10 crc kubenswrapper[4696]: I1202 23:16:10.744002 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2qqwv" event={"ID":"45b532a6-a149-4e88-894c-173548cfcdc6","Type":"ContainerDied","Data":"fab31766c95fd3d1c636946a2aa5dd39f577fc7bbc2e93de13faba807ce13035"}
Dec 02 23:16:10 crc kubenswrapper[4696]: I1202 23:16:10.744028 4696 scope.go:117] "RemoveContainer" containerID="83c061e495d7c260d6486e86c081d4afc49e74e00a296d865094f54ca42508c7"
Dec 02 23:16:10 crc kubenswrapper[4696]: I1202 23:16:10.743921 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2qqwv"
Dec 02 23:16:10 crc kubenswrapper[4696]: I1202 23:16:10.790102 4696 scope.go:117] "RemoveContainer" containerID="8e4b81c5af2578d555cd205c444ca90f16afb64d4f59a6b79fe6d9c9bd162c9a"
Dec 02 23:16:10 crc kubenswrapper[4696]: I1202 23:16:10.807723 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2qqwv"]
Dec 02 23:16:10 crc kubenswrapper[4696]: I1202 23:16:10.824338 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-2qqwv"]
Dec 02 23:16:10 crc kubenswrapper[4696]: I1202 23:16:10.830441 4696 scope.go:117] "RemoveContainer" containerID="f31d2bedd49b76afd25d059b3c31d6f61857fccbb4a55d06d553b1350b3ecd29"
Dec 02 23:16:10 crc kubenswrapper[4696]: I1202 23:16:10.864124 4696 scope.go:117] "RemoveContainer" containerID="83c061e495d7c260d6486e86c081d4afc49e74e00a296d865094f54ca42508c7"
Dec 02 23:16:10 crc kubenswrapper[4696]: E1202 23:16:10.864866 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83c061e495d7c260d6486e86c081d4afc49e74e00a296d865094f54ca42508c7\": container with ID starting with 83c061e495d7c260d6486e86c081d4afc49e74e00a296d865094f54ca42508c7 not found: ID does not exist" containerID="83c061e495d7c260d6486e86c081d4afc49e74e00a296d865094f54ca42508c7"
Dec 02 23:16:10 crc kubenswrapper[4696]: I1202 23:16:10.864911 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83c061e495d7c260d6486e86c081d4afc49e74e00a296d865094f54ca42508c7"} err="failed to get container status \"83c061e495d7c260d6486e86c081d4afc49e74e00a296d865094f54ca42508c7\": rpc error: code = NotFound desc = could not find container \"83c061e495d7c260d6486e86c081d4afc49e74e00a296d865094f54ca42508c7\": container with ID starting with 83c061e495d7c260d6486e86c081d4afc49e74e00a296d865094f54ca42508c7 not found: ID does not exist"
Dec 02 23:16:10 crc kubenswrapper[4696]: I1202 23:16:10.864941 4696 scope.go:117] "RemoveContainer" containerID="8e4b81c5af2578d555cd205c444ca90f16afb64d4f59a6b79fe6d9c9bd162c9a"
Dec 02 23:16:10 crc kubenswrapper[4696]: E1202 23:16:10.865477 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e4b81c5af2578d555cd205c444ca90f16afb64d4f59a6b79fe6d9c9bd162c9a\": container with ID starting with 8e4b81c5af2578d555cd205c444ca90f16afb64d4f59a6b79fe6d9c9bd162c9a not found: ID does not exist" containerID="8e4b81c5af2578d555cd205c444ca90f16afb64d4f59a6b79fe6d9c9bd162c9a"
Dec 02 23:16:10 crc kubenswrapper[4696]: I1202 23:16:10.865506 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e4b81c5af2578d555cd205c444ca90f16afb64d4f59a6b79fe6d9c9bd162c9a"} err="failed to get container status \"8e4b81c5af2578d555cd205c444ca90f16afb64d4f59a6b79fe6d9c9bd162c9a\": rpc error: code = NotFound desc = could not find container \"8e4b81c5af2578d555cd205c444ca90f16afb64d4f59a6b79fe6d9c9bd162c9a\": container with ID starting with 8e4b81c5af2578d555cd205c444ca90f16afb64d4f59a6b79fe6d9c9bd162c9a not found: ID does not exist"
Dec 02 23:16:10 crc kubenswrapper[4696]: I1202 23:16:10.865528 4696 scope.go:117] "RemoveContainer" containerID="f31d2bedd49b76afd25d059b3c31d6f61857fccbb4a55d06d553b1350b3ecd29"
Dec 02 23:16:10 crc kubenswrapper[4696]: E1202 23:16:10.865863 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f31d2bedd49b76afd25d059b3c31d6f61857fccbb4a55d06d553b1350b3ecd29\": container with ID starting with f31d2bedd49b76afd25d059b3c31d6f61857fccbb4a55d06d553b1350b3ecd29 not found: ID does not exist" containerID="f31d2bedd49b76afd25d059b3c31d6f61857fccbb4a55d06d553b1350b3ecd29"
Dec 02 23:16:10 crc kubenswrapper[4696]: I1202 23:16:10.865919 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f31d2bedd49b76afd25d059b3c31d6f61857fccbb4a55d06d553b1350b3ecd29"} err="failed to get container status \"f31d2bedd49b76afd25d059b3c31d6f61857fccbb4a55d06d553b1350b3ecd29\": rpc error: code = NotFound desc = could not find container \"f31d2bedd49b76afd25d059b3c31d6f61857fccbb4a55d06d553b1350b3ecd29\": container with ID starting with f31d2bedd49b76afd25d059b3c31d6f61857fccbb4a55d06d553b1350b3ecd29 not found: ID does not exist"
Dec 02 23:16:11 crc kubenswrapper[4696]: I1202 23:16:11.447031 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45b532a6-a149-4e88-894c-173548cfcdc6" path="/var/lib/kubelet/pods/45b532a6-a149-4e88-894c-173548cfcdc6/volumes"
Dec 02 23:16:12 crc kubenswrapper[4696]: I1202 23:16:12.770105 4696 generic.go:334] "Generic (PLEG): container finished" podID="4e57d59f-2b48-457e-92dd-d0585bab85b5" containerID="9424c6ad1328256280c00aa8e1abacd4ea45ddbd147af842afd1a2ecef45f2df" exitCode=0
Dec 02 23:16:12 crc kubenswrapper[4696]: I1202 23:16:12.770159 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-d8wqz" event={"ID":"4e57d59f-2b48-457e-92dd-d0585bab85b5","Type":"ContainerDied","Data":"9424c6ad1328256280c00aa8e1abacd4ea45ddbd147af842afd1a2ecef45f2df"}
Dec 02 23:16:14 crc kubenswrapper[4696]: I1202 23:16:14.231157 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-d8wqz"
Dec 02 23:16:14 crc kubenswrapper[4696]: I1202 23:16:14.384796 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4e57d59f-2b48-457e-92dd-d0585bab85b5-ssh-key\") pod \"4e57d59f-2b48-457e-92dd-d0585bab85b5\" (UID: \"4e57d59f-2b48-457e-92dd-d0585bab85b5\") "
Dec 02 23:16:14 crc kubenswrapper[4696]: I1202 23:16:14.384917 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7rrst\" (UniqueName: \"kubernetes.io/projected/4e57d59f-2b48-457e-92dd-d0585bab85b5-kube-api-access-7rrst\") pod \"4e57d59f-2b48-457e-92dd-d0585bab85b5\" (UID: \"4e57d59f-2b48-457e-92dd-d0585bab85b5\") "
Dec 02 23:16:14 crc kubenswrapper[4696]: I1202 23:16:14.385013 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4e57d59f-2b48-457e-92dd-d0585bab85b5-inventory\") pod \"4e57d59f-2b48-457e-92dd-d0585bab85b5\" (UID: \"4e57d59f-2b48-457e-92dd-d0585bab85b5\") "
Dec 02 23:16:14 crc kubenswrapper[4696]: I1202 23:16:14.393592 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e57d59f-2b48-457e-92dd-d0585bab85b5-kube-api-access-7rrst" (OuterVolumeSpecName: "kube-api-access-7rrst") pod "4e57d59f-2b48-457e-92dd-d0585bab85b5" (UID: "4e57d59f-2b48-457e-92dd-d0585bab85b5"). InnerVolumeSpecName "kube-api-access-7rrst". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 23:16:14 crc kubenswrapper[4696]: I1202 23:16:14.436028 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e57d59f-2b48-457e-92dd-d0585bab85b5-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "4e57d59f-2b48-457e-92dd-d0585bab85b5" (UID: "4e57d59f-2b48-457e-92dd-d0585bab85b5"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 23:16:14 crc kubenswrapper[4696]: I1202 23:16:14.436957 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e57d59f-2b48-457e-92dd-d0585bab85b5-inventory" (OuterVolumeSpecName: "inventory") pod "4e57d59f-2b48-457e-92dd-d0585bab85b5" (UID: "4e57d59f-2b48-457e-92dd-d0585bab85b5"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 23:16:14 crc kubenswrapper[4696]: I1202 23:16:14.490814 4696 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4e57d59f-2b48-457e-92dd-d0585bab85b5-inventory\") on node \"crc\" DevicePath \"\""
Dec 02 23:16:14 crc kubenswrapper[4696]: I1202 23:16:14.490872 4696 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4e57d59f-2b48-457e-92dd-d0585bab85b5-ssh-key\") on node \"crc\" DevicePath \"\""
Dec 02 23:16:14 crc kubenswrapper[4696]: I1202 23:16:14.490884 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7rrst\" (UniqueName: \"kubernetes.io/projected/4e57d59f-2b48-457e-92dd-d0585bab85b5-kube-api-access-7rrst\") on node \"crc\" DevicePath \"\""
Dec 02 23:16:14 crc kubenswrapper[4696]: I1202 23:16:14.798483 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-d8wqz" event={"ID":"4e57d59f-2b48-457e-92dd-d0585bab85b5","Type":"ContainerDied","Data":"fd44388efc92d4a0811a4326f8bf9bd932981f58e6dc20ac281570e773091526"}
Dec 02 23:16:14 crc kubenswrapper[4696]: I1202 23:16:14.798538 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fd44388efc92d4a0811a4326f8bf9bd932981f58e6dc20ac281570e773091526"
Dec 02 23:16:14 crc kubenswrapper[4696]: I1202 23:16:14.798593 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-d8wqz"
Dec 02 23:16:14 crc kubenswrapper[4696]: I1202 23:16:14.920280 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-dkdq9"]
Dec 02 23:16:14 crc kubenswrapper[4696]: E1202 23:16:14.920883 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e57d59f-2b48-457e-92dd-d0585bab85b5" containerName="configure-os-edpm-deployment-openstack-edpm-ipam"
Dec 02 23:16:14 crc kubenswrapper[4696]: I1202 23:16:14.920911 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e57d59f-2b48-457e-92dd-d0585bab85b5" containerName="configure-os-edpm-deployment-openstack-edpm-ipam"
Dec 02 23:16:14 crc kubenswrapper[4696]: E1202 23:16:14.920937 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45b532a6-a149-4e88-894c-173548cfcdc6" containerName="extract-utilities"
Dec 02 23:16:14 crc kubenswrapper[4696]: I1202 23:16:14.920947 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="45b532a6-a149-4e88-894c-173548cfcdc6" containerName="extract-utilities"
Dec 02 23:16:14 crc kubenswrapper[4696]: E1202 23:16:14.920981 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45b532a6-a149-4e88-894c-173548cfcdc6" containerName="extract-content"
Dec 02 23:16:14 crc kubenswrapper[4696]: I1202 23:16:14.920989 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="45b532a6-a149-4e88-894c-173548cfcdc6" containerName="extract-content"
Dec 02 23:16:14 crc kubenswrapper[4696]: E1202 23:16:14.921021 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45b532a6-a149-4e88-894c-173548cfcdc6" containerName="registry-server"
Dec 02 23:16:14 crc kubenswrapper[4696]: I1202 23:16:14.921031 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="45b532a6-a149-4e88-894c-173548cfcdc6" containerName="registry-server"
Dec 02 23:16:14 crc kubenswrapper[4696]: I1202 23:16:14.921288 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e57d59f-2b48-457e-92dd-d0585bab85b5" containerName="configure-os-edpm-deployment-openstack-edpm-ipam"
Dec 02 23:16:14 crc kubenswrapper[4696]: I1202 23:16:14.921313 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="45b532a6-a149-4e88-894c-173548cfcdc6" containerName="registry-server"
Dec 02 23:16:14 crc kubenswrapper[4696]: I1202 23:16:14.922341 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-dkdq9"
Dec 02 23:16:14 crc kubenswrapper[4696]: I1202 23:16:14.930646 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-cvzmc"
Dec 02 23:16:14 crc kubenswrapper[4696]: I1202 23:16:14.930757 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Dec 02 23:16:14 crc kubenswrapper[4696]: I1202 23:16:14.930830 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Dec 02 23:16:14 crc kubenswrapper[4696]: I1202 23:16:14.932889 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Dec 02 23:16:14 crc kubenswrapper[4696]: I1202 23:16:14.942664 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-dkdq9"]
Dec 02 23:16:15 crc kubenswrapper[4696]: I1202 23:16:15.104809 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhcrx\" (UniqueName: \"kubernetes.io/projected/540fd942-5964-4e7f-a40f-66102876bd8c-kube-api-access-rhcrx\") pod \"ssh-known-hosts-edpm-deployment-dkdq9\" (UID: \"540fd942-5964-4e7f-a40f-66102876bd8c\") " pod="openstack/ssh-known-hosts-edpm-deployment-dkdq9"
Dec 02 23:16:15 crc kubenswrapper[4696]: I1202 23:16:15.105106 4696 reconciler_common.go:245]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/540fd942-5964-4e7f-a40f-66102876bd8c-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-dkdq9\" (UID: \"540fd942-5964-4e7f-a40f-66102876bd8c\") " pod="openstack/ssh-known-hosts-edpm-deployment-dkdq9" Dec 02 23:16:15 crc kubenswrapper[4696]: I1202 23:16:15.105157 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/540fd942-5964-4e7f-a40f-66102876bd8c-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-dkdq9\" (UID: \"540fd942-5964-4e7f-a40f-66102876bd8c\") " pod="openstack/ssh-known-hosts-edpm-deployment-dkdq9" Dec 02 23:16:15 crc kubenswrapper[4696]: I1202 23:16:15.207803 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhcrx\" (UniqueName: \"kubernetes.io/projected/540fd942-5964-4e7f-a40f-66102876bd8c-kube-api-access-rhcrx\") pod \"ssh-known-hosts-edpm-deployment-dkdq9\" (UID: \"540fd942-5964-4e7f-a40f-66102876bd8c\") " pod="openstack/ssh-known-hosts-edpm-deployment-dkdq9" Dec 02 23:16:15 crc kubenswrapper[4696]: I1202 23:16:15.208102 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/540fd942-5964-4e7f-a40f-66102876bd8c-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-dkdq9\" (UID: \"540fd942-5964-4e7f-a40f-66102876bd8c\") " pod="openstack/ssh-known-hosts-edpm-deployment-dkdq9" Dec 02 23:16:15 crc kubenswrapper[4696]: I1202 23:16:15.209013 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/540fd942-5964-4e7f-a40f-66102876bd8c-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-dkdq9\" (UID: \"540fd942-5964-4e7f-a40f-66102876bd8c\") " 
pod="openstack/ssh-known-hosts-edpm-deployment-dkdq9" Dec 02 23:16:15 crc kubenswrapper[4696]: I1202 23:16:15.215201 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/540fd942-5964-4e7f-a40f-66102876bd8c-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-dkdq9\" (UID: \"540fd942-5964-4e7f-a40f-66102876bd8c\") " pod="openstack/ssh-known-hosts-edpm-deployment-dkdq9" Dec 02 23:16:15 crc kubenswrapper[4696]: I1202 23:16:15.216051 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/540fd942-5964-4e7f-a40f-66102876bd8c-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-dkdq9\" (UID: \"540fd942-5964-4e7f-a40f-66102876bd8c\") " pod="openstack/ssh-known-hosts-edpm-deployment-dkdq9" Dec 02 23:16:15 crc kubenswrapper[4696]: I1202 23:16:15.239089 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhcrx\" (UniqueName: \"kubernetes.io/projected/540fd942-5964-4e7f-a40f-66102876bd8c-kube-api-access-rhcrx\") pod \"ssh-known-hosts-edpm-deployment-dkdq9\" (UID: \"540fd942-5964-4e7f-a40f-66102876bd8c\") " pod="openstack/ssh-known-hosts-edpm-deployment-dkdq9" Dec 02 23:16:15 crc kubenswrapper[4696]: I1202 23:16:15.246687 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-dkdq9" Dec 02 23:16:15 crc kubenswrapper[4696]: I1202 23:16:15.865682 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-dkdq9"] Dec 02 23:16:16 crc kubenswrapper[4696]: I1202 23:16:16.822767 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-dkdq9" event={"ID":"540fd942-5964-4e7f-a40f-66102876bd8c","Type":"ContainerStarted","Data":"8ab32addafad59023105b338a0dcb5293051f76a3617c2071637205c87c679a8"} Dec 02 23:16:16 crc kubenswrapper[4696]: I1202 23:16:16.823316 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-dkdq9" event={"ID":"540fd942-5964-4e7f-a40f-66102876bd8c","Type":"ContainerStarted","Data":"c39e51da8861b087c31cca4e0a11bb28bfed376b29d12b4699fb4bc3dc38678e"} Dec 02 23:16:16 crc kubenswrapper[4696]: I1202 23:16:16.863105 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-dkdq9" podStartSLOduration=2.446380012 podStartE2EDuration="2.86307601s" podCreationTimestamp="2025-12-02 23:16:14 +0000 UTC" firstStartedPulling="2025-12-02 23:16:15.87885087 +0000 UTC m=+2038.759530911" lastFinishedPulling="2025-12-02 23:16:16.295546878 +0000 UTC m=+2039.176226909" observedRunningTime="2025-12-02 23:16:16.85705229 +0000 UTC m=+2039.737732301" watchObservedRunningTime="2025-12-02 23:16:16.86307601 +0000 UTC m=+2039.743756021" Dec 02 23:16:24 crc kubenswrapper[4696]: I1202 23:16:24.934925 4696 generic.go:334] "Generic (PLEG): container finished" podID="540fd942-5964-4e7f-a40f-66102876bd8c" containerID="8ab32addafad59023105b338a0dcb5293051f76a3617c2071637205c87c679a8" exitCode=0 Dec 02 23:16:24 crc kubenswrapper[4696]: I1202 23:16:24.935068 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-dkdq9" 
event={"ID":"540fd942-5964-4e7f-a40f-66102876bd8c","Type":"ContainerDied","Data":"8ab32addafad59023105b338a0dcb5293051f76a3617c2071637205c87c679a8"} Dec 02 23:16:26 crc kubenswrapper[4696]: I1202 23:16:26.552833 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-dkdq9" Dec 02 23:16:26 crc kubenswrapper[4696]: I1202 23:16:26.636166 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/540fd942-5964-4e7f-a40f-66102876bd8c-inventory-0\") pod \"540fd942-5964-4e7f-a40f-66102876bd8c\" (UID: \"540fd942-5964-4e7f-a40f-66102876bd8c\") " Dec 02 23:16:26 crc kubenswrapper[4696]: I1202 23:16:26.636251 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/540fd942-5964-4e7f-a40f-66102876bd8c-ssh-key-openstack-edpm-ipam\") pod \"540fd942-5964-4e7f-a40f-66102876bd8c\" (UID: \"540fd942-5964-4e7f-a40f-66102876bd8c\") " Dec 02 23:16:26 crc kubenswrapper[4696]: I1202 23:16:26.636291 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rhcrx\" (UniqueName: \"kubernetes.io/projected/540fd942-5964-4e7f-a40f-66102876bd8c-kube-api-access-rhcrx\") pod \"540fd942-5964-4e7f-a40f-66102876bd8c\" (UID: \"540fd942-5964-4e7f-a40f-66102876bd8c\") " Dec 02 23:16:26 crc kubenswrapper[4696]: I1202 23:16:26.643313 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/540fd942-5964-4e7f-a40f-66102876bd8c-kube-api-access-rhcrx" (OuterVolumeSpecName: "kube-api-access-rhcrx") pod "540fd942-5964-4e7f-a40f-66102876bd8c" (UID: "540fd942-5964-4e7f-a40f-66102876bd8c"). InnerVolumeSpecName "kube-api-access-rhcrx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:16:26 crc kubenswrapper[4696]: I1202 23:16:26.670763 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/540fd942-5964-4e7f-a40f-66102876bd8c-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "540fd942-5964-4e7f-a40f-66102876bd8c" (UID: "540fd942-5964-4e7f-a40f-66102876bd8c"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:16:26 crc kubenswrapper[4696]: I1202 23:16:26.671665 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/540fd942-5964-4e7f-a40f-66102876bd8c-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "540fd942-5964-4e7f-a40f-66102876bd8c" (UID: "540fd942-5964-4e7f-a40f-66102876bd8c"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:16:26 crc kubenswrapper[4696]: I1202 23:16:26.738341 4696 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/540fd942-5964-4e7f-a40f-66102876bd8c-inventory-0\") on node \"crc\" DevicePath \"\"" Dec 02 23:16:26 crc kubenswrapper[4696]: I1202 23:16:26.738407 4696 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/540fd942-5964-4e7f-a40f-66102876bd8c-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Dec 02 23:16:26 crc kubenswrapper[4696]: I1202 23:16:26.738419 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rhcrx\" (UniqueName: \"kubernetes.io/projected/540fd942-5964-4e7f-a40f-66102876bd8c-kube-api-access-rhcrx\") on node \"crc\" DevicePath \"\"" Dec 02 23:16:26 crc kubenswrapper[4696]: I1202 23:16:26.957952 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-dkdq9" 
event={"ID":"540fd942-5964-4e7f-a40f-66102876bd8c","Type":"ContainerDied","Data":"c39e51da8861b087c31cca4e0a11bb28bfed376b29d12b4699fb4bc3dc38678e"} Dec 02 23:16:26 crc kubenswrapper[4696]: I1202 23:16:26.958009 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-dkdq9" Dec 02 23:16:26 crc kubenswrapper[4696]: I1202 23:16:26.958012 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c39e51da8861b087c31cca4e0a11bb28bfed376b29d12b4699fb4bc3dc38678e" Dec 02 23:16:27 crc kubenswrapper[4696]: I1202 23:16:27.088116 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-6qlp7"] Dec 02 23:16:27 crc kubenswrapper[4696]: E1202 23:16:27.089379 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="540fd942-5964-4e7f-a40f-66102876bd8c" containerName="ssh-known-hosts-edpm-deployment" Dec 02 23:16:27 crc kubenswrapper[4696]: I1202 23:16:27.089693 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="540fd942-5964-4e7f-a40f-66102876bd8c" containerName="ssh-known-hosts-edpm-deployment" Dec 02 23:16:27 crc kubenswrapper[4696]: I1202 23:16:27.090173 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="540fd942-5964-4e7f-a40f-66102876bd8c" containerName="ssh-known-hosts-edpm-deployment" Dec 02 23:16:27 crc kubenswrapper[4696]: I1202 23:16:27.091670 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-6qlp7" Dec 02 23:16:27 crc kubenswrapper[4696]: I1202 23:16:27.094790 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-cvzmc" Dec 02 23:16:27 crc kubenswrapper[4696]: I1202 23:16:27.095274 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 02 23:16:27 crc kubenswrapper[4696]: I1202 23:16:27.095810 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 02 23:16:27 crc kubenswrapper[4696]: I1202 23:16:27.095829 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 23:16:27 crc kubenswrapper[4696]: I1202 23:16:27.101807 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-6qlp7"] Dec 02 23:16:27 crc kubenswrapper[4696]: I1202 23:16:27.146393 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/96d33f70-c859-4df1-9e0c-94fa64d60a41-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-6qlp7\" (UID: \"96d33f70-c859-4df1-9e0c-94fa64d60a41\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-6qlp7" Dec 02 23:16:27 crc kubenswrapper[4696]: I1202 23:16:27.146561 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6w9b\" (UniqueName: \"kubernetes.io/projected/96d33f70-c859-4df1-9e0c-94fa64d60a41-kube-api-access-k6w9b\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-6qlp7\" (UID: \"96d33f70-c859-4df1-9e0c-94fa64d60a41\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-6qlp7" Dec 02 23:16:27 crc kubenswrapper[4696]: I1202 23:16:27.146618 4696 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/96d33f70-c859-4df1-9e0c-94fa64d60a41-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-6qlp7\" (UID: \"96d33f70-c859-4df1-9e0c-94fa64d60a41\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-6qlp7" Dec 02 23:16:27 crc kubenswrapper[4696]: I1202 23:16:27.247500 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6w9b\" (UniqueName: \"kubernetes.io/projected/96d33f70-c859-4df1-9e0c-94fa64d60a41-kube-api-access-k6w9b\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-6qlp7\" (UID: \"96d33f70-c859-4df1-9e0c-94fa64d60a41\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-6qlp7" Dec 02 23:16:27 crc kubenswrapper[4696]: I1202 23:16:27.247577 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/96d33f70-c859-4df1-9e0c-94fa64d60a41-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-6qlp7\" (UID: \"96d33f70-c859-4df1-9e0c-94fa64d60a41\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-6qlp7" Dec 02 23:16:27 crc kubenswrapper[4696]: I1202 23:16:27.247648 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/96d33f70-c859-4df1-9e0c-94fa64d60a41-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-6qlp7\" (UID: \"96d33f70-c859-4df1-9e0c-94fa64d60a41\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-6qlp7" Dec 02 23:16:27 crc kubenswrapper[4696]: I1202 23:16:27.252948 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/96d33f70-c859-4df1-9e0c-94fa64d60a41-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-6qlp7\" (UID: \"96d33f70-c859-4df1-9e0c-94fa64d60a41\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-6qlp7" Dec 02 23:16:27 crc kubenswrapper[4696]: I1202 23:16:27.258303 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/96d33f70-c859-4df1-9e0c-94fa64d60a41-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-6qlp7\" (UID: \"96d33f70-c859-4df1-9e0c-94fa64d60a41\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-6qlp7" Dec 02 23:16:27 crc kubenswrapper[4696]: I1202 23:16:27.276405 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6w9b\" (UniqueName: \"kubernetes.io/projected/96d33f70-c859-4df1-9e0c-94fa64d60a41-kube-api-access-k6w9b\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-6qlp7\" (UID: \"96d33f70-c859-4df1-9e0c-94fa64d60a41\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-6qlp7" Dec 02 23:16:27 crc kubenswrapper[4696]: I1202 23:16:27.413157 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-6qlp7" Dec 02 23:16:28 crc kubenswrapper[4696]: I1202 23:16:28.026346 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-6qlp7"] Dec 02 23:16:29 crc kubenswrapper[4696]: I1202 23:16:29.001204 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-6qlp7" event={"ID":"96d33f70-c859-4df1-9e0c-94fa64d60a41","Type":"ContainerStarted","Data":"812ea4678f779d8fe4eabd0f45a2582405f7bd93ef549b200c216f52bdbf5e32"} Dec 02 23:16:29 crc kubenswrapper[4696]: I1202 23:16:29.001813 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-6qlp7" event={"ID":"96d33f70-c859-4df1-9e0c-94fa64d60a41","Type":"ContainerStarted","Data":"7776edab7fa8cb6a0cf5d184c5e43e352745d6552217b24326203fcaf5da3e18"} Dec 02 23:16:29 crc kubenswrapper[4696]: I1202 23:16:29.033646 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-6qlp7" podStartSLOduration=1.638056584 podStartE2EDuration="2.033628326s" podCreationTimestamp="2025-12-02 23:16:27 +0000 UTC" firstStartedPulling="2025-12-02 23:16:28.034872215 +0000 UTC m=+2050.915552216" lastFinishedPulling="2025-12-02 23:16:28.430443947 +0000 UTC m=+2051.311123958" observedRunningTime="2025-12-02 23:16:29.027950016 +0000 UTC m=+2051.908630017" watchObservedRunningTime="2025-12-02 23:16:29.033628326 +0000 UTC m=+2051.914308317" Dec 02 23:16:40 crc kubenswrapper[4696]: I1202 23:16:40.147953 4696 generic.go:334] "Generic (PLEG): container finished" podID="96d33f70-c859-4df1-9e0c-94fa64d60a41" containerID="812ea4678f779d8fe4eabd0f45a2582405f7bd93ef549b200c216f52bdbf5e32" exitCode=0 Dec 02 23:16:40 crc kubenswrapper[4696]: I1202 23:16:40.148106 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-6qlp7" event={"ID":"96d33f70-c859-4df1-9e0c-94fa64d60a41","Type":"ContainerDied","Data":"812ea4678f779d8fe4eabd0f45a2582405f7bd93ef549b200c216f52bdbf5e32"} Dec 02 23:16:41 crc kubenswrapper[4696]: I1202 23:16:41.614438 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-6qlp7" Dec 02 23:16:41 crc kubenswrapper[4696]: I1202 23:16:41.724547 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/96d33f70-c859-4df1-9e0c-94fa64d60a41-ssh-key\") pod \"96d33f70-c859-4df1-9e0c-94fa64d60a41\" (UID: \"96d33f70-c859-4df1-9e0c-94fa64d60a41\") " Dec 02 23:16:41 crc kubenswrapper[4696]: I1202 23:16:41.724635 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/96d33f70-c859-4df1-9e0c-94fa64d60a41-inventory\") pod \"96d33f70-c859-4df1-9e0c-94fa64d60a41\" (UID: \"96d33f70-c859-4df1-9e0c-94fa64d60a41\") " Dec 02 23:16:41 crc kubenswrapper[4696]: I1202 23:16:41.724683 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k6w9b\" (UniqueName: \"kubernetes.io/projected/96d33f70-c859-4df1-9e0c-94fa64d60a41-kube-api-access-k6w9b\") pod \"96d33f70-c859-4df1-9e0c-94fa64d60a41\" (UID: \"96d33f70-c859-4df1-9e0c-94fa64d60a41\") " Dec 02 23:16:41 crc kubenswrapper[4696]: I1202 23:16:41.732726 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96d33f70-c859-4df1-9e0c-94fa64d60a41-kube-api-access-k6w9b" (OuterVolumeSpecName: "kube-api-access-k6w9b") pod "96d33f70-c859-4df1-9e0c-94fa64d60a41" (UID: "96d33f70-c859-4df1-9e0c-94fa64d60a41"). InnerVolumeSpecName "kube-api-access-k6w9b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:16:41 crc kubenswrapper[4696]: I1202 23:16:41.757903 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96d33f70-c859-4df1-9e0c-94fa64d60a41-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "96d33f70-c859-4df1-9e0c-94fa64d60a41" (UID: "96d33f70-c859-4df1-9e0c-94fa64d60a41"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:16:41 crc kubenswrapper[4696]: I1202 23:16:41.771319 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96d33f70-c859-4df1-9e0c-94fa64d60a41-inventory" (OuterVolumeSpecName: "inventory") pod "96d33f70-c859-4df1-9e0c-94fa64d60a41" (UID: "96d33f70-c859-4df1-9e0c-94fa64d60a41"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:16:41 crc kubenswrapper[4696]: I1202 23:16:41.826944 4696 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/96d33f70-c859-4df1-9e0c-94fa64d60a41-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 23:16:41 crc kubenswrapper[4696]: I1202 23:16:41.826983 4696 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/96d33f70-c859-4df1-9e0c-94fa64d60a41-inventory\") on node \"crc\" DevicePath \"\"" Dec 02 23:16:41 crc kubenswrapper[4696]: I1202 23:16:41.826993 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k6w9b\" (UniqueName: \"kubernetes.io/projected/96d33f70-c859-4df1-9e0c-94fa64d60a41-kube-api-access-k6w9b\") on node \"crc\" DevicePath \"\"" Dec 02 23:16:42 crc kubenswrapper[4696]: I1202 23:16:42.187598 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-6qlp7" 
event={"ID":"96d33f70-c859-4df1-9e0c-94fa64d60a41","Type":"ContainerDied","Data":"7776edab7fa8cb6a0cf5d184c5e43e352745d6552217b24326203fcaf5da3e18"} Dec 02 23:16:42 crc kubenswrapper[4696]: I1202 23:16:42.187671 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7776edab7fa8cb6a0cf5d184c5e43e352745d6552217b24326203fcaf5da3e18" Dec 02 23:16:42 crc kubenswrapper[4696]: I1202 23:16:42.187780 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-6qlp7" Dec 02 23:16:42 crc kubenswrapper[4696]: I1202 23:16:42.269132 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-sttwb"] Dec 02 23:16:42 crc kubenswrapper[4696]: E1202 23:16:42.269637 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96d33f70-c859-4df1-9e0c-94fa64d60a41" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 02 23:16:42 crc kubenswrapper[4696]: I1202 23:16:42.269659 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="96d33f70-c859-4df1-9e0c-94fa64d60a41" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 02 23:16:42 crc kubenswrapper[4696]: I1202 23:16:42.269876 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="96d33f70-c859-4df1-9e0c-94fa64d60a41" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 02 23:16:42 crc kubenswrapper[4696]: I1202 23:16:42.272405 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-sttwb" Dec 02 23:16:42 crc kubenswrapper[4696]: I1202 23:16:42.280302 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-cvzmc" Dec 02 23:16:42 crc kubenswrapper[4696]: I1202 23:16:42.280441 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 02 23:16:42 crc kubenswrapper[4696]: I1202 23:16:42.280703 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 02 23:16:42 crc kubenswrapper[4696]: I1202 23:16:42.280945 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 23:16:42 crc kubenswrapper[4696]: I1202 23:16:42.288461 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-sttwb"] Dec 02 23:16:42 crc kubenswrapper[4696]: I1202 23:16:42.338439 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvx87\" (UniqueName: \"kubernetes.io/projected/00408801-09ea-4d50-a657-b01117a2f51b-kube-api-access-mvx87\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-sttwb\" (UID: \"00408801-09ea-4d50-a657-b01117a2f51b\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-sttwb" Dec 02 23:16:42 crc kubenswrapper[4696]: I1202 23:16:42.338624 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/00408801-09ea-4d50-a657-b01117a2f51b-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-sttwb\" (UID: \"00408801-09ea-4d50-a657-b01117a2f51b\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-sttwb" Dec 02 23:16:42 crc kubenswrapper[4696]: I1202 23:16:42.338674 4696 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/00408801-09ea-4d50-a657-b01117a2f51b-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-sttwb\" (UID: \"00408801-09ea-4d50-a657-b01117a2f51b\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-sttwb" Dec 02 23:16:42 crc kubenswrapper[4696]: I1202 23:16:42.441297 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvx87\" (UniqueName: \"kubernetes.io/projected/00408801-09ea-4d50-a657-b01117a2f51b-kube-api-access-mvx87\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-sttwb\" (UID: \"00408801-09ea-4d50-a657-b01117a2f51b\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-sttwb" Dec 02 23:16:42 crc kubenswrapper[4696]: I1202 23:16:42.441962 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/00408801-09ea-4d50-a657-b01117a2f51b-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-sttwb\" (UID: \"00408801-09ea-4d50-a657-b01117a2f51b\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-sttwb" Dec 02 23:16:42 crc kubenswrapper[4696]: I1202 23:16:42.442764 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/00408801-09ea-4d50-a657-b01117a2f51b-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-sttwb\" (UID: \"00408801-09ea-4d50-a657-b01117a2f51b\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-sttwb" Dec 02 23:16:42 crc kubenswrapper[4696]: I1202 23:16:42.446553 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/00408801-09ea-4d50-a657-b01117a2f51b-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-sttwb\" (UID: \"00408801-09ea-4d50-a657-b01117a2f51b\") " 
pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-sttwb" Dec 02 23:16:42 crc kubenswrapper[4696]: I1202 23:16:42.446582 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/00408801-09ea-4d50-a657-b01117a2f51b-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-sttwb\" (UID: \"00408801-09ea-4d50-a657-b01117a2f51b\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-sttwb" Dec 02 23:16:42 crc kubenswrapper[4696]: I1202 23:16:42.460379 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvx87\" (UniqueName: \"kubernetes.io/projected/00408801-09ea-4d50-a657-b01117a2f51b-kube-api-access-mvx87\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-sttwb\" (UID: \"00408801-09ea-4d50-a657-b01117a2f51b\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-sttwb" Dec 02 23:16:42 crc kubenswrapper[4696]: I1202 23:16:42.600809 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-sttwb" Dec 02 23:16:43 crc kubenswrapper[4696]: I1202 23:16:43.342309 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-sttwb"] Dec 02 23:16:44 crc kubenswrapper[4696]: I1202 23:16:44.214078 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-sttwb" event={"ID":"00408801-09ea-4d50-a657-b01117a2f51b","Type":"ContainerStarted","Data":"bed0e4e258a72372497655dd47aa1a98e4845a79b6513dcc481318454a4b69b9"} Dec 02 23:16:44 crc kubenswrapper[4696]: I1202 23:16:44.214544 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-sttwb" event={"ID":"00408801-09ea-4d50-a657-b01117a2f51b","Type":"ContainerStarted","Data":"2d481e286d4baad74df871705fe8a7e2de93bc997225154eaf384df684fa0441"} Dec 02 23:16:44 crc kubenswrapper[4696]: I1202 23:16:44.244797 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-sttwb" podStartSLOduration=1.7210179079999999 podStartE2EDuration="2.244764696s" podCreationTimestamp="2025-12-02 23:16:42 +0000 UTC" firstStartedPulling="2025-12-02 23:16:43.350565715 +0000 UTC m=+2066.231245716" lastFinishedPulling="2025-12-02 23:16:43.874312503 +0000 UTC m=+2066.754992504" observedRunningTime="2025-12-02 23:16:44.234592319 +0000 UTC m=+2067.115272330" watchObservedRunningTime="2025-12-02 23:16:44.244764696 +0000 UTC m=+2067.125444737" Dec 02 23:16:55 crc kubenswrapper[4696]: I1202 23:16:55.352758 4696 generic.go:334] "Generic (PLEG): container finished" podID="00408801-09ea-4d50-a657-b01117a2f51b" containerID="bed0e4e258a72372497655dd47aa1a98e4845a79b6513dcc481318454a4b69b9" exitCode=0 Dec 02 23:16:55 crc kubenswrapper[4696]: I1202 23:16:55.352834 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-sttwb" event={"ID":"00408801-09ea-4d50-a657-b01117a2f51b","Type":"ContainerDied","Data":"bed0e4e258a72372497655dd47aa1a98e4845a79b6513dcc481318454a4b69b9"} Dec 02 23:16:56 crc kubenswrapper[4696]: I1202 23:16:56.884271 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-sttwb" Dec 02 23:16:57 crc kubenswrapper[4696]: I1202 23:16:57.010683 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/00408801-09ea-4d50-a657-b01117a2f51b-inventory\") pod \"00408801-09ea-4d50-a657-b01117a2f51b\" (UID: \"00408801-09ea-4d50-a657-b01117a2f51b\") " Dec 02 23:16:57 crc kubenswrapper[4696]: I1202 23:16:57.010752 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/00408801-09ea-4d50-a657-b01117a2f51b-ssh-key\") pod \"00408801-09ea-4d50-a657-b01117a2f51b\" (UID: \"00408801-09ea-4d50-a657-b01117a2f51b\") " Dec 02 23:16:57 crc kubenswrapper[4696]: I1202 23:16:57.011124 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mvx87\" (UniqueName: \"kubernetes.io/projected/00408801-09ea-4d50-a657-b01117a2f51b-kube-api-access-mvx87\") pod \"00408801-09ea-4d50-a657-b01117a2f51b\" (UID: \"00408801-09ea-4d50-a657-b01117a2f51b\") " Dec 02 23:16:57 crc kubenswrapper[4696]: I1202 23:16:57.019043 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00408801-09ea-4d50-a657-b01117a2f51b-kube-api-access-mvx87" (OuterVolumeSpecName: "kube-api-access-mvx87") pod "00408801-09ea-4d50-a657-b01117a2f51b" (UID: "00408801-09ea-4d50-a657-b01117a2f51b"). InnerVolumeSpecName "kube-api-access-mvx87". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:16:57 crc kubenswrapper[4696]: I1202 23:16:57.042948 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00408801-09ea-4d50-a657-b01117a2f51b-inventory" (OuterVolumeSpecName: "inventory") pod "00408801-09ea-4d50-a657-b01117a2f51b" (UID: "00408801-09ea-4d50-a657-b01117a2f51b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:16:57 crc kubenswrapper[4696]: I1202 23:16:57.059785 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00408801-09ea-4d50-a657-b01117a2f51b-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "00408801-09ea-4d50-a657-b01117a2f51b" (UID: "00408801-09ea-4d50-a657-b01117a2f51b"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:16:57 crc kubenswrapper[4696]: I1202 23:16:57.114149 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mvx87\" (UniqueName: \"kubernetes.io/projected/00408801-09ea-4d50-a657-b01117a2f51b-kube-api-access-mvx87\") on node \"crc\" DevicePath \"\"" Dec 02 23:16:57 crc kubenswrapper[4696]: I1202 23:16:57.114205 4696 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/00408801-09ea-4d50-a657-b01117a2f51b-inventory\") on node \"crc\" DevicePath \"\"" Dec 02 23:16:57 crc kubenswrapper[4696]: I1202 23:16:57.114219 4696 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/00408801-09ea-4d50-a657-b01117a2f51b-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 23:16:57 crc kubenswrapper[4696]: I1202 23:16:57.373782 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-sttwb" 
event={"ID":"00408801-09ea-4d50-a657-b01117a2f51b","Type":"ContainerDied","Data":"2d481e286d4baad74df871705fe8a7e2de93bc997225154eaf384df684fa0441"} Dec 02 23:16:57 crc kubenswrapper[4696]: I1202 23:16:57.374279 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2d481e286d4baad74df871705fe8a7e2de93bc997225154eaf384df684fa0441" Dec 02 23:16:57 crc kubenswrapper[4696]: I1202 23:16:57.374189 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-sttwb" Dec 02 23:16:57 crc kubenswrapper[4696]: I1202 23:16:57.511602 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-l8j4b"] Dec 02 23:16:57 crc kubenswrapper[4696]: E1202 23:16:57.512507 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00408801-09ea-4d50-a657-b01117a2f51b" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 02 23:16:57 crc kubenswrapper[4696]: I1202 23:16:57.512559 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="00408801-09ea-4d50-a657-b01117a2f51b" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 02 23:16:57 crc kubenswrapper[4696]: I1202 23:16:57.514010 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="00408801-09ea-4d50-a657-b01117a2f51b" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 02 23:16:57 crc kubenswrapper[4696]: I1202 23:16:57.514941 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-l8j4b" Dec 02 23:16:57 crc kubenswrapper[4696]: I1202 23:16:57.518831 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 02 23:16:57 crc kubenswrapper[4696]: I1202 23:16:57.519379 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Dec 02 23:16:57 crc kubenswrapper[4696]: I1202 23:16:57.520038 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Dec 02 23:16:57 crc kubenswrapper[4696]: I1202 23:16:57.520177 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Dec 02 23:16:57 crc kubenswrapper[4696]: I1202 23:16:57.520243 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 23:16:57 crc kubenswrapper[4696]: I1202 23:16:57.520460 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Dec 02 23:16:57 crc kubenswrapper[4696]: I1202 23:16:57.520605 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 02 23:16:57 crc kubenswrapper[4696]: I1202 23:16:57.529391 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-cvzmc" Dec 02 23:16:57 crc kubenswrapper[4696]: I1202 23:16:57.529484 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-l8j4b"] Dec 02 23:16:57 crc kubenswrapper[4696]: I1202 23:16:57.624816 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d3d600ee-2200-406f-8d8b-f093851161fd-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-l8j4b\" (UID: \"d3d600ee-2200-406f-8d8b-f093851161fd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-l8j4b" Dec 02 23:16:57 crc kubenswrapper[4696]: I1202 23:16:57.624919 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3d600ee-2200-406f-8d8b-f093851161fd-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-l8j4b\" (UID: \"d3d600ee-2200-406f-8d8b-f093851161fd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-l8j4b" Dec 02 23:16:57 crc kubenswrapper[4696]: I1202 23:16:57.624972 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3d600ee-2200-406f-8d8b-f093851161fd-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-l8j4b\" (UID: \"d3d600ee-2200-406f-8d8b-f093851161fd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-l8j4b" Dec 02 23:16:57 crc kubenswrapper[4696]: I1202 23:16:57.624993 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvmf8\" (UniqueName: \"kubernetes.io/projected/d3d600ee-2200-406f-8d8b-f093851161fd-kube-api-access-lvmf8\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-l8j4b\" (UID: \"d3d600ee-2200-406f-8d8b-f093851161fd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-l8j4b" Dec 02 23:16:57 crc kubenswrapper[4696]: I1202 23:16:57.625017 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/d3d600ee-2200-406f-8d8b-f093851161fd-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-l8j4b\" (UID: \"d3d600ee-2200-406f-8d8b-f093851161fd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-l8j4b" Dec 02 23:16:57 crc kubenswrapper[4696]: I1202 23:16:57.625067 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d3d600ee-2200-406f-8d8b-f093851161fd-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-l8j4b\" (UID: \"d3d600ee-2200-406f-8d8b-f093851161fd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-l8j4b" Dec 02 23:16:57 crc kubenswrapper[4696]: I1202 23:16:57.625100 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d3d600ee-2200-406f-8d8b-f093851161fd-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-l8j4b\" (UID: \"d3d600ee-2200-406f-8d8b-f093851161fd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-l8j4b" Dec 02 23:16:57 crc kubenswrapper[4696]: I1202 23:16:57.625168 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3d600ee-2200-406f-8d8b-f093851161fd-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-l8j4b\" (UID: \"d3d600ee-2200-406f-8d8b-f093851161fd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-l8j4b" Dec 02 23:16:57 crc kubenswrapper[4696]: I1202 23:16:57.625204 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d3d600ee-2200-406f-8d8b-f093851161fd-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-l8j4b\" (UID: \"d3d600ee-2200-406f-8d8b-f093851161fd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-l8j4b" Dec 02 23:16:57 crc kubenswrapper[4696]: I1202 23:16:57.625234 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3d600ee-2200-406f-8d8b-f093851161fd-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-l8j4b\" (UID: \"d3d600ee-2200-406f-8d8b-f093851161fd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-l8j4b" Dec 02 23:16:57 crc kubenswrapper[4696]: I1202 23:16:57.625297 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d3d600ee-2200-406f-8d8b-f093851161fd-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-l8j4b\" (UID: \"d3d600ee-2200-406f-8d8b-f093851161fd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-l8j4b" Dec 02 23:16:57 crc kubenswrapper[4696]: I1202 23:16:57.625317 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d3d600ee-2200-406f-8d8b-f093851161fd-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-l8j4b\" (UID: \"d3d600ee-2200-406f-8d8b-f093851161fd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-l8j4b" Dec 02 23:16:57 crc kubenswrapper[4696]: I1202 23:16:57.625336 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d3d600ee-2200-406f-8d8b-f093851161fd-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-l8j4b\" (UID: \"d3d600ee-2200-406f-8d8b-f093851161fd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-l8j4b" Dec 02 23:16:57 crc kubenswrapper[4696]: I1202 23:16:57.625395 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d3d600ee-2200-406f-8d8b-f093851161fd-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-l8j4b\" (UID: \"d3d600ee-2200-406f-8d8b-f093851161fd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-l8j4b" Dec 02 23:16:57 crc kubenswrapper[4696]: I1202 23:16:57.727684 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d3d600ee-2200-406f-8d8b-f093851161fd-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-l8j4b\" (UID: \"d3d600ee-2200-406f-8d8b-f093851161fd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-l8j4b" Dec 02 23:16:57 crc kubenswrapper[4696]: I1202 23:16:57.727768 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d3d600ee-2200-406f-8d8b-f093851161fd-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-l8j4b\" (UID: \"d3d600ee-2200-406f-8d8b-f093851161fd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-l8j4b" Dec 02 23:16:57 crc kubenswrapper[4696]: I1202 23:16:57.727797 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d3d600ee-2200-406f-8d8b-f093851161fd-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-l8j4b\" (UID: \"d3d600ee-2200-406f-8d8b-f093851161fd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-l8j4b" Dec 02 23:16:57 crc kubenswrapper[4696]: I1202 23:16:57.727834 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d3d600ee-2200-406f-8d8b-f093851161fd-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-l8j4b\" (UID: \"d3d600ee-2200-406f-8d8b-f093851161fd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-l8j4b" Dec 02 23:16:57 crc kubenswrapper[4696]: I1202 23:16:57.727876 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3d600ee-2200-406f-8d8b-f093851161fd-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-l8j4b\" (UID: \"d3d600ee-2200-406f-8d8b-f093851161fd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-l8j4b" Dec 02 23:16:57 crc kubenswrapper[4696]: I1202 23:16:57.727936 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3d600ee-2200-406f-8d8b-f093851161fd-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-l8j4b\" (UID: \"d3d600ee-2200-406f-8d8b-f093851161fd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-l8j4b" Dec 02 23:16:57 crc kubenswrapper[4696]: I1202 23:16:57.727995 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d3d600ee-2200-406f-8d8b-f093851161fd-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-l8j4b\" (UID: \"d3d600ee-2200-406f-8d8b-f093851161fd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-l8j4b" Dec 02 23:16:57 crc kubenswrapper[4696]: I1202 23:16:57.728016 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvmf8\" (UniqueName: \"kubernetes.io/projected/d3d600ee-2200-406f-8d8b-f093851161fd-kube-api-access-lvmf8\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-l8j4b\" (UID: \"d3d600ee-2200-406f-8d8b-f093851161fd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-l8j4b" Dec 02 23:16:57 crc kubenswrapper[4696]: I1202 23:16:57.728051 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d3d600ee-2200-406f-8d8b-f093851161fd-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-l8j4b\" (UID: \"d3d600ee-2200-406f-8d8b-f093851161fd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-l8j4b" Dec 02 23:16:57 crc kubenswrapper[4696]: I1202 23:16:57.728077 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d3d600ee-2200-406f-8d8b-f093851161fd-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-l8j4b\" (UID: \"d3d600ee-2200-406f-8d8b-f093851161fd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-l8j4b" Dec 02 23:16:57 crc kubenswrapper[4696]: I1202 23:16:57.728117 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d3d600ee-2200-406f-8d8b-f093851161fd-inventory\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-l8j4b\" (UID: \"d3d600ee-2200-406f-8d8b-f093851161fd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-l8j4b" Dec 02 23:16:57 crc kubenswrapper[4696]: I1202 23:16:57.728178 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3d600ee-2200-406f-8d8b-f093851161fd-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-l8j4b\" (UID: \"d3d600ee-2200-406f-8d8b-f093851161fd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-l8j4b" Dec 02 23:16:57 crc kubenswrapper[4696]: I1202 23:16:57.728209 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3d600ee-2200-406f-8d8b-f093851161fd-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-l8j4b\" (UID: \"d3d600ee-2200-406f-8d8b-f093851161fd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-l8j4b" Dec 02 23:16:57 crc kubenswrapper[4696]: I1202 23:16:57.728249 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3d600ee-2200-406f-8d8b-f093851161fd-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-l8j4b\" (UID: \"d3d600ee-2200-406f-8d8b-f093851161fd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-l8j4b" Dec 02 23:16:57 crc kubenswrapper[4696]: I1202 23:16:57.733707 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3d600ee-2200-406f-8d8b-f093851161fd-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-l8j4b\" (UID: \"d3d600ee-2200-406f-8d8b-f093851161fd\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-l8j4b" Dec 02 23:16:57 crc kubenswrapper[4696]: I1202 23:16:57.734128 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d3d600ee-2200-406f-8d8b-f093851161fd-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-l8j4b\" (UID: \"d3d600ee-2200-406f-8d8b-f093851161fd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-l8j4b" Dec 02 23:16:57 crc kubenswrapper[4696]: I1202 23:16:57.735833 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d3d600ee-2200-406f-8d8b-f093851161fd-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-l8j4b\" (UID: \"d3d600ee-2200-406f-8d8b-f093851161fd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-l8j4b" Dec 02 23:16:57 crc kubenswrapper[4696]: I1202 23:16:57.737198 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d3d600ee-2200-406f-8d8b-f093851161fd-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-l8j4b\" (UID: \"d3d600ee-2200-406f-8d8b-f093851161fd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-l8j4b" Dec 02 23:16:57 crc kubenswrapper[4696]: I1202 23:16:57.737610 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3d600ee-2200-406f-8d8b-f093851161fd-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-l8j4b\" (UID: \"d3d600ee-2200-406f-8d8b-f093851161fd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-l8j4b" Dec 02 23:16:57 crc kubenswrapper[4696]: I1202 23:16:57.737781 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d3d600ee-2200-406f-8d8b-f093851161fd-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-l8j4b\" (UID: \"d3d600ee-2200-406f-8d8b-f093851161fd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-l8j4b" Dec 02 23:16:57 crc kubenswrapper[4696]: I1202 23:16:57.737806 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d3d600ee-2200-406f-8d8b-f093851161fd-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-l8j4b\" (UID: \"d3d600ee-2200-406f-8d8b-f093851161fd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-l8j4b" Dec 02 23:16:57 crc kubenswrapper[4696]: I1202 23:16:57.739106 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3d600ee-2200-406f-8d8b-f093851161fd-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-l8j4b\" (UID: \"d3d600ee-2200-406f-8d8b-f093851161fd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-l8j4b" Dec 02 23:16:57 crc kubenswrapper[4696]: I1202 23:16:57.739980 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3d600ee-2200-406f-8d8b-f093851161fd-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-l8j4b\" (UID: \"d3d600ee-2200-406f-8d8b-f093851161fd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-l8j4b" Dec 02 23:16:57 crc kubenswrapper[4696]: I1202 23:16:57.741870 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d3d600ee-2200-406f-8d8b-f093851161fd-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-l8j4b\" (UID: \"d3d600ee-2200-406f-8d8b-f093851161fd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-l8j4b" Dec 02 23:16:57 crc kubenswrapper[4696]: I1202 23:16:57.742391 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3d600ee-2200-406f-8d8b-f093851161fd-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-l8j4b\" (UID: \"d3d600ee-2200-406f-8d8b-f093851161fd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-l8j4b" Dec 02 23:16:57 crc kubenswrapper[4696]: I1202 23:16:57.745044 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d3d600ee-2200-406f-8d8b-f093851161fd-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-l8j4b\" (UID: \"d3d600ee-2200-406f-8d8b-f093851161fd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-l8j4b" Dec 02 23:16:57 crc kubenswrapper[4696]: I1202 23:16:57.740155 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3d600ee-2200-406f-8d8b-f093851161fd-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-l8j4b\" (UID: \"d3d600ee-2200-406f-8d8b-f093851161fd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-l8j4b" Dec 02 23:16:57 crc kubenswrapper[4696]: I1202 23:16:57.755693 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvmf8\" (UniqueName: \"kubernetes.io/projected/d3d600ee-2200-406f-8d8b-f093851161fd-kube-api-access-lvmf8\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-l8j4b\" (UID: \"d3d600ee-2200-406f-8d8b-f093851161fd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-l8j4b" Dec 02 23:16:57 crc kubenswrapper[4696]: I1202 23:16:57.846506 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-l8j4b" Dec 02 23:16:58 crc kubenswrapper[4696]: I1202 23:16:58.438828 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-l8j4b"] Dec 02 23:16:59 crc kubenswrapper[4696]: I1202 23:16:59.395622 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-l8j4b" event={"ID":"d3d600ee-2200-406f-8d8b-f093851161fd","Type":"ContainerStarted","Data":"7d9a6fb692d65318b2cb870cd306d3ef7d4fbc88cff2697cf42b9ec95f45bb0d"} Dec 02 23:16:59 crc kubenswrapper[4696]: I1202 23:16:59.396136 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-l8j4b" event={"ID":"d3d600ee-2200-406f-8d8b-f093851161fd","Type":"ContainerStarted","Data":"3c5aa47a1e76fb01f263df5017aaa3b85b4562dd227717a4518b52f45172ad99"} Dec 02 23:17:46 crc kubenswrapper[4696]: I1202 23:17:46.908147 4696 generic.go:334] "Generic (PLEG): container finished" podID="d3d600ee-2200-406f-8d8b-f093851161fd" containerID="7d9a6fb692d65318b2cb870cd306d3ef7d4fbc88cff2697cf42b9ec95f45bb0d" exitCode=0 Dec 02 23:17:46 crc kubenswrapper[4696]: I1202 23:17:46.908244 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-l8j4b" event={"ID":"d3d600ee-2200-406f-8d8b-f093851161fd","Type":"ContainerDied","Data":"7d9a6fb692d65318b2cb870cd306d3ef7d4fbc88cff2697cf42b9ec95f45bb0d"} Dec 02 23:17:48 crc kubenswrapper[4696]: I1202 23:17:48.376481 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-l8j4b" Dec 02 23:17:48 crc kubenswrapper[4696]: I1202 23:17:48.442792 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d3d600ee-2200-406f-8d8b-f093851161fd-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"d3d600ee-2200-406f-8d8b-f093851161fd\" (UID: \"d3d600ee-2200-406f-8d8b-f093851161fd\") " Dec 02 23:17:48 crc kubenswrapper[4696]: I1202 23:17:48.443565 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3d600ee-2200-406f-8d8b-f093851161fd-nova-combined-ca-bundle\") pod \"d3d600ee-2200-406f-8d8b-f093851161fd\" (UID: \"d3d600ee-2200-406f-8d8b-f093851161fd\") " Dec 02 23:17:48 crc kubenswrapper[4696]: I1202 23:17:48.443672 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d3d600ee-2200-406f-8d8b-f093851161fd-ssh-key\") pod \"d3d600ee-2200-406f-8d8b-f093851161fd\" (UID: \"d3d600ee-2200-406f-8d8b-f093851161fd\") " Dec 02 23:17:48 crc kubenswrapper[4696]: I1202 23:17:48.443801 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d3d600ee-2200-406f-8d8b-f093851161fd-openstack-edpm-ipam-ovn-default-certs-0\") pod \"d3d600ee-2200-406f-8d8b-f093851161fd\" (UID: \"d3d600ee-2200-406f-8d8b-f093851161fd\") " Dec 02 23:17:48 crc kubenswrapper[4696]: I1202 23:17:48.443877 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lvmf8\" (UniqueName: \"kubernetes.io/projected/d3d600ee-2200-406f-8d8b-f093851161fd-kube-api-access-lvmf8\") pod \"d3d600ee-2200-406f-8d8b-f093851161fd\" (UID: 
\"d3d600ee-2200-406f-8d8b-f093851161fd\") " Dec 02 23:17:48 crc kubenswrapper[4696]: I1202 23:17:48.443939 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3d600ee-2200-406f-8d8b-f093851161fd-libvirt-combined-ca-bundle\") pod \"d3d600ee-2200-406f-8d8b-f093851161fd\" (UID: \"d3d600ee-2200-406f-8d8b-f093851161fd\") " Dec 02 23:17:48 crc kubenswrapper[4696]: I1202 23:17:48.444088 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3d600ee-2200-406f-8d8b-f093851161fd-repo-setup-combined-ca-bundle\") pod \"d3d600ee-2200-406f-8d8b-f093851161fd\" (UID: \"d3d600ee-2200-406f-8d8b-f093851161fd\") " Dec 02 23:17:48 crc kubenswrapper[4696]: I1202 23:17:48.444148 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3d600ee-2200-406f-8d8b-f093851161fd-bootstrap-combined-ca-bundle\") pod \"d3d600ee-2200-406f-8d8b-f093851161fd\" (UID: \"d3d600ee-2200-406f-8d8b-f093851161fd\") " Dec 02 23:17:48 crc kubenswrapper[4696]: I1202 23:17:48.444271 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3d600ee-2200-406f-8d8b-f093851161fd-telemetry-combined-ca-bundle\") pod \"d3d600ee-2200-406f-8d8b-f093851161fd\" (UID: \"d3d600ee-2200-406f-8d8b-f093851161fd\") " Dec 02 23:17:48 crc kubenswrapper[4696]: I1202 23:17:48.444456 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d3d600ee-2200-406f-8d8b-f093851161fd-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"d3d600ee-2200-406f-8d8b-f093851161fd\" (UID: \"d3d600ee-2200-406f-8d8b-f093851161fd\") " Dec 02 23:17:48 crc 
kubenswrapper[4696]: I1202 23:17:48.444562 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3d600ee-2200-406f-8d8b-f093851161fd-ovn-combined-ca-bundle\") pod \"d3d600ee-2200-406f-8d8b-f093851161fd\" (UID: \"d3d600ee-2200-406f-8d8b-f093851161fd\") " Dec 02 23:17:48 crc kubenswrapper[4696]: I1202 23:17:48.444667 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3d600ee-2200-406f-8d8b-f093851161fd-neutron-metadata-combined-ca-bundle\") pod \"d3d600ee-2200-406f-8d8b-f093851161fd\" (UID: \"d3d600ee-2200-406f-8d8b-f093851161fd\") " Dec 02 23:17:48 crc kubenswrapper[4696]: I1202 23:17:48.446038 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d3d600ee-2200-406f-8d8b-f093851161fd-inventory\") pod \"d3d600ee-2200-406f-8d8b-f093851161fd\" (UID: \"d3d600ee-2200-406f-8d8b-f093851161fd\") " Dec 02 23:17:48 crc kubenswrapper[4696]: I1202 23:17:48.446110 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d3d600ee-2200-406f-8d8b-f093851161fd-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"d3d600ee-2200-406f-8d8b-f093851161fd\" (UID: \"d3d600ee-2200-406f-8d8b-f093851161fd\") " Dec 02 23:17:48 crc kubenswrapper[4696]: I1202 23:17:48.452837 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3d600ee-2200-406f-8d8b-f093851161fd-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "d3d600ee-2200-406f-8d8b-f093851161fd" (UID: "d3d600ee-2200-406f-8d8b-f093851161fd"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:17:48 crc kubenswrapper[4696]: I1202 23:17:48.454657 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3d600ee-2200-406f-8d8b-f093851161fd-kube-api-access-lvmf8" (OuterVolumeSpecName: "kube-api-access-lvmf8") pod "d3d600ee-2200-406f-8d8b-f093851161fd" (UID: "d3d600ee-2200-406f-8d8b-f093851161fd"). InnerVolumeSpecName "kube-api-access-lvmf8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:17:48 crc kubenswrapper[4696]: I1202 23:17:48.455095 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3d600ee-2200-406f-8d8b-f093851161fd-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "d3d600ee-2200-406f-8d8b-f093851161fd" (UID: "d3d600ee-2200-406f-8d8b-f093851161fd"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:17:48 crc kubenswrapper[4696]: I1202 23:17:48.456387 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3d600ee-2200-406f-8d8b-f093851161fd-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "d3d600ee-2200-406f-8d8b-f093851161fd" (UID: "d3d600ee-2200-406f-8d8b-f093851161fd"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:17:48 crc kubenswrapper[4696]: I1202 23:17:48.456740 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3d600ee-2200-406f-8d8b-f093851161fd-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "d3d600ee-2200-406f-8d8b-f093851161fd" (UID: "d3d600ee-2200-406f-8d8b-f093851161fd"). 
InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:17:48 crc kubenswrapper[4696]: I1202 23:17:48.458324 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3d600ee-2200-406f-8d8b-f093851161fd-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "d3d600ee-2200-406f-8d8b-f093851161fd" (UID: "d3d600ee-2200-406f-8d8b-f093851161fd"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:17:48 crc kubenswrapper[4696]: I1202 23:17:48.459581 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3d600ee-2200-406f-8d8b-f093851161fd-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "d3d600ee-2200-406f-8d8b-f093851161fd" (UID: "d3d600ee-2200-406f-8d8b-f093851161fd"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:17:48 crc kubenswrapper[4696]: I1202 23:17:48.460187 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3d600ee-2200-406f-8d8b-f093851161fd-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "d3d600ee-2200-406f-8d8b-f093851161fd" (UID: "d3d600ee-2200-406f-8d8b-f093851161fd"). InnerVolumeSpecName "ovn-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:17:48 crc kubenswrapper[4696]: I1202 23:17:48.460801 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3d600ee-2200-406f-8d8b-f093851161fd-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "d3d600ee-2200-406f-8d8b-f093851161fd" (UID: "d3d600ee-2200-406f-8d8b-f093851161fd"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:17:48 crc kubenswrapper[4696]: I1202 23:17:48.461304 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3d600ee-2200-406f-8d8b-f093851161fd-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "d3d600ee-2200-406f-8d8b-f093851161fd" (UID: "d3d600ee-2200-406f-8d8b-f093851161fd"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:17:48 crc kubenswrapper[4696]: I1202 23:17:48.461970 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3d600ee-2200-406f-8d8b-f093851161fd-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "d3d600ee-2200-406f-8d8b-f093851161fd" (UID: "d3d600ee-2200-406f-8d8b-f093851161fd"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:17:48 crc kubenswrapper[4696]: I1202 23:17:48.463312 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3d600ee-2200-406f-8d8b-f093851161fd-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "d3d600ee-2200-406f-8d8b-f093851161fd" (UID: "d3d600ee-2200-406f-8d8b-f093851161fd"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:17:48 crc kubenswrapper[4696]: I1202 23:17:48.486708 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3d600ee-2200-406f-8d8b-f093851161fd-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d3d600ee-2200-406f-8d8b-f093851161fd" (UID: "d3d600ee-2200-406f-8d8b-f093851161fd"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:17:48 crc kubenswrapper[4696]: I1202 23:17:48.493359 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3d600ee-2200-406f-8d8b-f093851161fd-inventory" (OuterVolumeSpecName: "inventory") pod "d3d600ee-2200-406f-8d8b-f093851161fd" (UID: "d3d600ee-2200-406f-8d8b-f093851161fd"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:17:48 crc kubenswrapper[4696]: I1202 23:17:48.551411 4696 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d3d600ee-2200-406f-8d8b-f093851161fd-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:48 crc kubenswrapper[4696]: I1202 23:17:48.551457 4696 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3d600ee-2200-406f-8d8b-f093851161fd-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:48 crc kubenswrapper[4696]: I1202 23:17:48.551475 4696 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3d600ee-2200-406f-8d8b-f093851161fd-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:48 crc kubenswrapper[4696]: I1202 23:17:48.551494 4696 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/d3d600ee-2200-406f-8d8b-f093851161fd-inventory\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:48 crc kubenswrapper[4696]: I1202 23:17:48.551508 4696 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d3d600ee-2200-406f-8d8b-f093851161fd-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:48 crc kubenswrapper[4696]: I1202 23:17:48.551523 4696 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d3d600ee-2200-406f-8d8b-f093851161fd-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:48 crc kubenswrapper[4696]: I1202 23:17:48.551539 4696 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3d600ee-2200-406f-8d8b-f093851161fd-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:48 crc kubenswrapper[4696]: I1202 23:17:48.551553 4696 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d3d600ee-2200-406f-8d8b-f093851161fd-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:48 crc kubenswrapper[4696]: I1202 23:17:48.551568 4696 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d3d600ee-2200-406f-8d8b-f093851161fd-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:48 crc kubenswrapper[4696]: I1202 23:17:48.551585 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lvmf8\" (UniqueName: \"kubernetes.io/projected/d3d600ee-2200-406f-8d8b-f093851161fd-kube-api-access-lvmf8\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:48 crc kubenswrapper[4696]: I1202 23:17:48.551598 4696 
reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3d600ee-2200-406f-8d8b-f093851161fd-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:48 crc kubenswrapper[4696]: I1202 23:17:48.551611 4696 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3d600ee-2200-406f-8d8b-f093851161fd-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:48 crc kubenswrapper[4696]: I1202 23:17:48.551623 4696 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3d600ee-2200-406f-8d8b-f093851161fd-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:48 crc kubenswrapper[4696]: I1202 23:17:48.551637 4696 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3d600ee-2200-406f-8d8b-f093851161fd-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:48 crc kubenswrapper[4696]: I1202 23:17:48.938177 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-l8j4b" event={"ID":"d3d600ee-2200-406f-8d8b-f093851161fd","Type":"ContainerDied","Data":"3c5aa47a1e76fb01f263df5017aaa3b85b4562dd227717a4518b52f45172ad99"} Dec 02 23:17:48 crc kubenswrapper[4696]: I1202 23:17:48.938242 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c5aa47a1e76fb01f263df5017aaa3b85b4562dd227717a4518b52f45172ad99" Dec 02 23:17:48 crc kubenswrapper[4696]: I1202 23:17:48.938327 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-l8j4b" Dec 02 23:17:49 crc kubenswrapper[4696]: I1202 23:17:49.083295 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-d7r49"] Dec 02 23:17:49 crc kubenswrapper[4696]: E1202 23:17:49.084133 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3d600ee-2200-406f-8d8b-f093851161fd" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 02 23:17:49 crc kubenswrapper[4696]: I1202 23:17:49.084160 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3d600ee-2200-406f-8d8b-f093851161fd" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 02 23:17:49 crc kubenswrapper[4696]: I1202 23:17:49.084422 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3d600ee-2200-406f-8d8b-f093851161fd" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 02 23:17:49 crc kubenswrapper[4696]: I1202 23:17:49.085266 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-d7r49" Dec 02 23:17:49 crc kubenswrapper[4696]: I1202 23:17:49.093341 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Dec 02 23:17:49 crc kubenswrapper[4696]: I1202 23:17:49.094313 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 23:17:49 crc kubenswrapper[4696]: I1202 23:17:49.094904 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-cvzmc" Dec 02 23:17:49 crc kubenswrapper[4696]: I1202 23:17:49.094952 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 02 23:17:49 crc kubenswrapper[4696]: I1202 23:17:49.095326 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 02 23:17:49 crc kubenswrapper[4696]: I1202 23:17:49.118807 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-d7r49"] Dec 02 23:17:49 crc kubenswrapper[4696]: I1202 23:17:49.167648 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9ac484be-e201-4b74-a21e-502131efc1e3-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-d7r49\" (UID: \"9ac484be-e201-4b74-a21e-502131efc1e3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-d7r49" Dec 02 23:17:49 crc kubenswrapper[4696]: I1202 23:17:49.167693 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9ac484be-e201-4b74-a21e-502131efc1e3-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-d7r49\" (UID: \"9ac484be-e201-4b74-a21e-502131efc1e3\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-d7r49" Dec 02 23:17:49 crc kubenswrapper[4696]: I1202 23:17:49.167751 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ac484be-e201-4b74-a21e-502131efc1e3-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-d7r49\" (UID: \"9ac484be-e201-4b74-a21e-502131efc1e3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-d7r49" Dec 02 23:17:49 crc kubenswrapper[4696]: I1202 23:17:49.167866 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/9ac484be-e201-4b74-a21e-502131efc1e3-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-d7r49\" (UID: \"9ac484be-e201-4b74-a21e-502131efc1e3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-d7r49" Dec 02 23:17:49 crc kubenswrapper[4696]: I1202 23:17:49.167938 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qb2k\" (UniqueName: \"kubernetes.io/projected/9ac484be-e201-4b74-a21e-502131efc1e3-kube-api-access-4qb2k\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-d7r49\" (UID: \"9ac484be-e201-4b74-a21e-502131efc1e3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-d7r49" Dec 02 23:17:49 crc kubenswrapper[4696]: I1202 23:17:49.269670 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/9ac484be-e201-4b74-a21e-502131efc1e3-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-d7r49\" (UID: \"9ac484be-e201-4b74-a21e-502131efc1e3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-d7r49" Dec 02 23:17:49 crc kubenswrapper[4696]: I1202 23:17:49.269823 4696 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-4qb2k\" (UniqueName: \"kubernetes.io/projected/9ac484be-e201-4b74-a21e-502131efc1e3-kube-api-access-4qb2k\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-d7r49\" (UID: \"9ac484be-e201-4b74-a21e-502131efc1e3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-d7r49" Dec 02 23:17:49 crc kubenswrapper[4696]: I1202 23:17:49.269891 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9ac484be-e201-4b74-a21e-502131efc1e3-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-d7r49\" (UID: \"9ac484be-e201-4b74-a21e-502131efc1e3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-d7r49" Dec 02 23:17:49 crc kubenswrapper[4696]: I1202 23:17:49.269920 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9ac484be-e201-4b74-a21e-502131efc1e3-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-d7r49\" (UID: \"9ac484be-e201-4b74-a21e-502131efc1e3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-d7r49" Dec 02 23:17:49 crc kubenswrapper[4696]: I1202 23:17:49.269962 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ac484be-e201-4b74-a21e-502131efc1e3-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-d7r49\" (UID: \"9ac484be-e201-4b74-a21e-502131efc1e3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-d7r49" Dec 02 23:17:49 crc kubenswrapper[4696]: I1202 23:17:49.271386 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/9ac484be-e201-4b74-a21e-502131efc1e3-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-d7r49\" (UID: \"9ac484be-e201-4b74-a21e-502131efc1e3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-d7r49" Dec 
02 23:17:49 crc kubenswrapper[4696]: I1202 23:17:49.278799 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ac484be-e201-4b74-a21e-502131efc1e3-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-d7r49\" (UID: \"9ac484be-e201-4b74-a21e-502131efc1e3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-d7r49" Dec 02 23:17:49 crc kubenswrapper[4696]: I1202 23:17:49.279665 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9ac484be-e201-4b74-a21e-502131efc1e3-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-d7r49\" (UID: \"9ac484be-e201-4b74-a21e-502131efc1e3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-d7r49" Dec 02 23:17:49 crc kubenswrapper[4696]: I1202 23:17:49.280514 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9ac484be-e201-4b74-a21e-502131efc1e3-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-d7r49\" (UID: \"9ac484be-e201-4b74-a21e-502131efc1e3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-d7r49" Dec 02 23:17:49 crc kubenswrapper[4696]: I1202 23:17:49.301151 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qb2k\" (UniqueName: \"kubernetes.io/projected/9ac484be-e201-4b74-a21e-502131efc1e3-kube-api-access-4qb2k\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-d7r49\" (UID: \"9ac484be-e201-4b74-a21e-502131efc1e3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-d7r49" Dec 02 23:17:49 crc kubenswrapper[4696]: I1202 23:17:49.427299 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-d7r49" Dec 02 23:17:49 crc kubenswrapper[4696]: I1202 23:17:49.822325 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-d7r49"] Dec 02 23:17:49 crc kubenswrapper[4696]: I1202 23:17:49.962092 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-d7r49" event={"ID":"9ac484be-e201-4b74-a21e-502131efc1e3","Type":"ContainerStarted","Data":"5091221d1ef8dbb54779a748f9ae65920c821d2cf0359ee627e9f2104ce28b15"} Dec 02 23:17:50 crc kubenswrapper[4696]: I1202 23:17:50.983167 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-d7r49" event={"ID":"9ac484be-e201-4b74-a21e-502131efc1e3","Type":"ContainerStarted","Data":"c6fdcf544eee544d81180d5d2716eb0f242c2beebb89e550509b8ac1795feaf6"} Dec 02 23:17:51 crc kubenswrapper[4696]: I1202 23:17:51.021982 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-d7r49" podStartSLOduration=1.4410337979999999 podStartE2EDuration="2.021946203s" podCreationTimestamp="2025-12-02 23:17:49 +0000 UTC" firstStartedPulling="2025-12-02 23:17:49.829918877 +0000 UTC m=+2132.710598878" lastFinishedPulling="2025-12-02 23:17:50.410831272 +0000 UTC m=+2133.291511283" observedRunningTime="2025-12-02 23:17:51.009367146 +0000 UTC m=+2133.890047157" watchObservedRunningTime="2025-12-02 23:17:51.021946203 +0000 UTC m=+2133.902626214" Dec 02 23:17:52 crc kubenswrapper[4696]: I1202 23:17:52.974095 4696 patch_prober.go:28] interesting pod/machine-config-daemon-chq65 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 23:17:52 crc kubenswrapper[4696]: I1202 23:17:52.974513 
4696 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 23:18:22 crc kubenswrapper[4696]: I1202 23:18:22.974064 4696 patch_prober.go:28] interesting pod/machine-config-daemon-chq65 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 23:18:22 crc kubenswrapper[4696]: I1202 23:18:22.974968 4696 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 23:18:52 crc kubenswrapper[4696]: I1202 23:18:52.973730 4696 patch_prober.go:28] interesting pod/machine-config-daemon-chq65 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 23:18:52 crc kubenswrapper[4696]: I1202 23:18:52.974483 4696 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 23:18:52 crc kubenswrapper[4696]: I1202 23:18:52.974540 4696 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-chq65" Dec 02 23:18:52 crc kubenswrapper[4696]: I1202 23:18:52.975062 4696 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"174d09adecaf1c340446fa66c2f7c4a5200987c0bf9dc9ef1ddf0654d92e86ea"} pod="openshift-machine-config-operator/machine-config-daemon-chq65" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 23:18:52 crc kubenswrapper[4696]: I1202 23:18:52.975117 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" containerName="machine-config-daemon" containerID="cri-o://174d09adecaf1c340446fa66c2f7c4a5200987c0bf9dc9ef1ddf0654d92e86ea" gracePeriod=600 Dec 02 23:18:53 crc kubenswrapper[4696]: I1202 23:18:53.736858 4696 generic.go:334] "Generic (PLEG): container finished" podID="53353260-c7c9-435c-91eb-3d5a1b441c4a" containerID="174d09adecaf1c340446fa66c2f7c4a5200987c0bf9dc9ef1ddf0654d92e86ea" exitCode=0 Dec 02 23:18:53 crc kubenswrapper[4696]: I1202 23:18:53.738101 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-chq65" event={"ID":"53353260-c7c9-435c-91eb-3d5a1b441c4a","Type":"ContainerDied","Data":"174d09adecaf1c340446fa66c2f7c4a5200987c0bf9dc9ef1ddf0654d92e86ea"} Dec 02 23:18:53 crc kubenswrapper[4696]: I1202 23:18:53.738159 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-chq65" event={"ID":"53353260-c7c9-435c-91eb-3d5a1b441c4a","Type":"ContainerStarted","Data":"c60305cde64a9129d2cf1a8f8c830e468e22e0b6c2bcea72c16bb3a3f13d7f23"} Dec 02 23:18:53 crc kubenswrapper[4696]: I1202 23:18:53.738203 4696 scope.go:117] "RemoveContainer" 
containerID="631e32e77a29a8c8bd80f1cb8c03794b8f094b77b9ba9fa687732346c1ca36df" Dec 02 23:19:07 crc kubenswrapper[4696]: E1202 23:19:07.588345 4696 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9ac484be_e201_4b74_a21e_502131efc1e3.slice/crio-conmon-c6fdcf544eee544d81180d5d2716eb0f242c2beebb89e550509b8ac1795feaf6.scope\": RecentStats: unable to find data in memory cache]" Dec 02 23:19:07 crc kubenswrapper[4696]: I1202 23:19:07.910030 4696 generic.go:334] "Generic (PLEG): container finished" podID="9ac484be-e201-4b74-a21e-502131efc1e3" containerID="c6fdcf544eee544d81180d5d2716eb0f242c2beebb89e550509b8ac1795feaf6" exitCode=0 Dec 02 23:19:07 crc kubenswrapper[4696]: I1202 23:19:07.910118 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-d7r49" event={"ID":"9ac484be-e201-4b74-a21e-502131efc1e3","Type":"ContainerDied","Data":"c6fdcf544eee544d81180d5d2716eb0f242c2beebb89e550509b8ac1795feaf6"} Dec 02 23:19:08 crc kubenswrapper[4696]: I1202 23:19:08.373958 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-kfbjb"] Dec 02 23:19:08 crc kubenswrapper[4696]: I1202 23:19:08.377307 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kfbjb" Dec 02 23:19:08 crc kubenswrapper[4696]: I1202 23:19:08.402945 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kfbjb"] Dec 02 23:19:08 crc kubenswrapper[4696]: I1202 23:19:08.509343 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f0a2c58-bcd7-4052-af4b-d70614271129-utilities\") pod \"community-operators-kfbjb\" (UID: \"8f0a2c58-bcd7-4052-af4b-d70614271129\") " pod="openshift-marketplace/community-operators-kfbjb" Dec 02 23:19:08 crc kubenswrapper[4696]: I1202 23:19:08.509468 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mjns\" (UniqueName: \"kubernetes.io/projected/8f0a2c58-bcd7-4052-af4b-d70614271129-kube-api-access-8mjns\") pod \"community-operators-kfbjb\" (UID: \"8f0a2c58-bcd7-4052-af4b-d70614271129\") " pod="openshift-marketplace/community-operators-kfbjb" Dec 02 23:19:08 crc kubenswrapper[4696]: I1202 23:19:08.509547 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f0a2c58-bcd7-4052-af4b-d70614271129-catalog-content\") pod \"community-operators-kfbjb\" (UID: \"8f0a2c58-bcd7-4052-af4b-d70614271129\") " pod="openshift-marketplace/community-operators-kfbjb" Dec 02 23:19:08 crc kubenswrapper[4696]: I1202 23:19:08.611246 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f0a2c58-bcd7-4052-af4b-d70614271129-catalog-content\") pod \"community-operators-kfbjb\" (UID: \"8f0a2c58-bcd7-4052-af4b-d70614271129\") " pod="openshift-marketplace/community-operators-kfbjb" Dec 02 23:19:08 crc kubenswrapper[4696]: I1202 23:19:08.611377 4696 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f0a2c58-bcd7-4052-af4b-d70614271129-utilities\") pod \"community-operators-kfbjb\" (UID: \"8f0a2c58-bcd7-4052-af4b-d70614271129\") " pod="openshift-marketplace/community-operators-kfbjb" Dec 02 23:19:08 crc kubenswrapper[4696]: I1202 23:19:08.611482 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mjns\" (UniqueName: \"kubernetes.io/projected/8f0a2c58-bcd7-4052-af4b-d70614271129-kube-api-access-8mjns\") pod \"community-operators-kfbjb\" (UID: \"8f0a2c58-bcd7-4052-af4b-d70614271129\") " pod="openshift-marketplace/community-operators-kfbjb" Dec 02 23:19:08 crc kubenswrapper[4696]: I1202 23:19:08.613939 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f0a2c58-bcd7-4052-af4b-d70614271129-catalog-content\") pod \"community-operators-kfbjb\" (UID: \"8f0a2c58-bcd7-4052-af4b-d70614271129\") " pod="openshift-marketplace/community-operators-kfbjb" Dec 02 23:19:08 crc kubenswrapper[4696]: I1202 23:19:08.614508 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f0a2c58-bcd7-4052-af4b-d70614271129-utilities\") pod \"community-operators-kfbjb\" (UID: \"8f0a2c58-bcd7-4052-af4b-d70614271129\") " pod="openshift-marketplace/community-operators-kfbjb" Dec 02 23:19:08 crc kubenswrapper[4696]: I1202 23:19:08.639061 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mjns\" (UniqueName: \"kubernetes.io/projected/8f0a2c58-bcd7-4052-af4b-d70614271129-kube-api-access-8mjns\") pod \"community-operators-kfbjb\" (UID: \"8f0a2c58-bcd7-4052-af4b-d70614271129\") " pod="openshift-marketplace/community-operators-kfbjb" Dec 02 23:19:08 crc kubenswrapper[4696]: I1202 23:19:08.711187 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kfbjb" Dec 02 23:19:09 crc kubenswrapper[4696]: I1202 23:19:09.314200 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kfbjb"] Dec 02 23:19:09 crc kubenswrapper[4696]: I1202 23:19:09.538879 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-d7r49" Dec 02 23:19:09 crc kubenswrapper[4696]: I1202 23:19:09.637064 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9ac484be-e201-4b74-a21e-502131efc1e3-ssh-key\") pod \"9ac484be-e201-4b74-a21e-502131efc1e3\" (UID: \"9ac484be-e201-4b74-a21e-502131efc1e3\") " Dec 02 23:19:09 crc kubenswrapper[4696]: I1202 23:19:09.637190 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ac484be-e201-4b74-a21e-502131efc1e3-ovn-combined-ca-bundle\") pod \"9ac484be-e201-4b74-a21e-502131efc1e3\" (UID: \"9ac484be-e201-4b74-a21e-502131efc1e3\") " Dec 02 23:19:09 crc kubenswrapper[4696]: I1202 23:19:09.637264 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/9ac484be-e201-4b74-a21e-502131efc1e3-ovncontroller-config-0\") pod \"9ac484be-e201-4b74-a21e-502131efc1e3\" (UID: \"9ac484be-e201-4b74-a21e-502131efc1e3\") " Dec 02 23:19:09 crc kubenswrapper[4696]: I1202 23:19:09.637371 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9ac484be-e201-4b74-a21e-502131efc1e3-inventory\") pod \"9ac484be-e201-4b74-a21e-502131efc1e3\" (UID: \"9ac484be-e201-4b74-a21e-502131efc1e3\") " Dec 02 23:19:09 crc kubenswrapper[4696]: I1202 23:19:09.637419 4696 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-4qb2k\" (UniqueName: \"kubernetes.io/projected/9ac484be-e201-4b74-a21e-502131efc1e3-kube-api-access-4qb2k\") pod \"9ac484be-e201-4b74-a21e-502131efc1e3\" (UID: \"9ac484be-e201-4b74-a21e-502131efc1e3\") " Dec 02 23:19:09 crc kubenswrapper[4696]: I1202 23:19:09.645810 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ac484be-e201-4b74-a21e-502131efc1e3-kube-api-access-4qb2k" (OuterVolumeSpecName: "kube-api-access-4qb2k") pod "9ac484be-e201-4b74-a21e-502131efc1e3" (UID: "9ac484be-e201-4b74-a21e-502131efc1e3"). InnerVolumeSpecName "kube-api-access-4qb2k". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:19:09 crc kubenswrapper[4696]: I1202 23:19:09.645953 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ac484be-e201-4b74-a21e-502131efc1e3-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "9ac484be-e201-4b74-a21e-502131efc1e3" (UID: "9ac484be-e201-4b74-a21e-502131efc1e3"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:19:09 crc kubenswrapper[4696]: I1202 23:19:09.671761 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ac484be-e201-4b74-a21e-502131efc1e3-inventory" (OuterVolumeSpecName: "inventory") pod "9ac484be-e201-4b74-a21e-502131efc1e3" (UID: "9ac484be-e201-4b74-a21e-502131efc1e3"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:19:09 crc kubenswrapper[4696]: I1202 23:19:09.672311 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ac484be-e201-4b74-a21e-502131efc1e3-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "9ac484be-e201-4b74-a21e-502131efc1e3" (UID: "9ac484be-e201-4b74-a21e-502131efc1e3"). 
InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:19:09 crc kubenswrapper[4696]: I1202 23:19:09.689397 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ac484be-e201-4b74-a21e-502131efc1e3-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "9ac484be-e201-4b74-a21e-502131efc1e3" (UID: "9ac484be-e201-4b74-a21e-502131efc1e3"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:19:09 crc kubenswrapper[4696]: I1202 23:19:09.741572 4696 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9ac484be-e201-4b74-a21e-502131efc1e3-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 23:19:09 crc kubenswrapper[4696]: I1202 23:19:09.741627 4696 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ac484be-e201-4b74-a21e-502131efc1e3-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 23:19:09 crc kubenswrapper[4696]: I1202 23:19:09.741652 4696 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/9ac484be-e201-4b74-a21e-502131efc1e3-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Dec 02 23:19:09 crc kubenswrapper[4696]: I1202 23:19:09.741671 4696 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9ac484be-e201-4b74-a21e-502131efc1e3-inventory\") on node \"crc\" DevicePath \"\"" Dec 02 23:19:09 crc kubenswrapper[4696]: I1202 23:19:09.741691 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4qb2k\" (UniqueName: \"kubernetes.io/projected/9ac484be-e201-4b74-a21e-502131efc1e3-kube-api-access-4qb2k\") on node \"crc\" DevicePath \"\"" Dec 02 23:19:09 crc kubenswrapper[4696]: I1202 23:19:09.984008 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-d7r49" event={"ID":"9ac484be-e201-4b74-a21e-502131efc1e3","Type":"ContainerDied","Data":"5091221d1ef8dbb54779a748f9ae65920c821d2cf0359ee627e9f2104ce28b15"} Dec 02 23:19:09 crc kubenswrapper[4696]: I1202 23:19:09.984070 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5091221d1ef8dbb54779a748f9ae65920c821d2cf0359ee627e9f2104ce28b15" Dec 02 23:19:09 crc kubenswrapper[4696]: I1202 23:19:09.984155 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-d7r49" Dec 02 23:19:09 crc kubenswrapper[4696]: I1202 23:19:09.991996 4696 generic.go:334] "Generic (PLEG): container finished" podID="8f0a2c58-bcd7-4052-af4b-d70614271129" containerID="e30ebf5377e7ec86b268f16e32ab4dcd862420ddcfef62610f95957ddf87e275" exitCode=0 Dec 02 23:19:09 crc kubenswrapper[4696]: I1202 23:19:09.992158 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kfbjb" event={"ID":"8f0a2c58-bcd7-4052-af4b-d70614271129","Type":"ContainerDied","Data":"e30ebf5377e7ec86b268f16e32ab4dcd862420ddcfef62610f95957ddf87e275"} Dec 02 23:19:09 crc kubenswrapper[4696]: I1202 23:19:09.992253 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kfbjb" event={"ID":"8f0a2c58-bcd7-4052-af4b-d70614271129","Type":"ContainerStarted","Data":"f78022cfc900520890a679898bb1f6669eb104791fc89e573c38e2524506d021"} Dec 02 23:19:10 crc kubenswrapper[4696]: I1202 23:19:10.054260 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-f94wg"] Dec 02 23:19:10 crc kubenswrapper[4696]: E1202 23:19:10.054872 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ac484be-e201-4b74-a21e-502131efc1e3" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 02 23:19:10 crc kubenswrapper[4696]: 
I1202 23:19:10.054894 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ac484be-e201-4b74-a21e-502131efc1e3" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 02 23:19:10 crc kubenswrapper[4696]: I1202 23:19:10.055139 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ac484be-e201-4b74-a21e-502131efc1e3" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 02 23:19:10 crc kubenswrapper[4696]: I1202 23:19:10.056241 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-f94wg" Dec 02 23:19:10 crc kubenswrapper[4696]: I1202 23:19:10.062460 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 02 23:19:10 crc kubenswrapper[4696]: I1202 23:19:10.062702 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Dec 02 23:19:10 crc kubenswrapper[4696]: I1202 23:19:10.063049 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-cvzmc" Dec 02 23:19:10 crc kubenswrapper[4696]: I1202 23:19:10.063213 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Dec 02 23:19:10 crc kubenswrapper[4696]: I1202 23:19:10.063327 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 02 23:19:10 crc kubenswrapper[4696]: I1202 23:19:10.063468 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 23:19:10 crc kubenswrapper[4696]: I1202 23:19:10.073657 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-f94wg"] Dec 02 23:19:10 crc kubenswrapper[4696]: I1202 23:19:10.152421 4696 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/81a46783-f2f6-464b-a1cd-d859d59e0c99-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-f94wg\" (UID: \"81a46783-f2f6-464b-a1cd-d859d59e0c99\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-f94wg" Dec 02 23:19:10 crc kubenswrapper[4696]: I1202 23:19:10.152492 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sztbh\" (UniqueName: \"kubernetes.io/projected/81a46783-f2f6-464b-a1cd-d859d59e0c99-kube-api-access-sztbh\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-f94wg\" (UID: \"81a46783-f2f6-464b-a1cd-d859d59e0c99\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-f94wg" Dec 02 23:19:10 crc kubenswrapper[4696]: I1202 23:19:10.152533 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/81a46783-f2f6-464b-a1cd-d859d59e0c99-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-f94wg\" (UID: \"81a46783-f2f6-464b-a1cd-d859d59e0c99\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-f94wg" Dec 02 23:19:10 crc kubenswrapper[4696]: I1202 23:19:10.152653 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81a46783-f2f6-464b-a1cd-d859d59e0c99-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-f94wg\" (UID: \"81a46783-f2f6-464b-a1cd-d859d59e0c99\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-f94wg" Dec 02 23:19:10 crc kubenswrapper[4696]: I1202 23:19:10.152712 4696 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/81a46783-f2f6-464b-a1cd-d859d59e0c99-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-f94wg\" (UID: \"81a46783-f2f6-464b-a1cd-d859d59e0c99\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-f94wg" Dec 02 23:19:10 crc kubenswrapper[4696]: I1202 23:19:10.153025 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/81a46783-f2f6-464b-a1cd-d859d59e0c99-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-f94wg\" (UID: \"81a46783-f2f6-464b-a1cd-d859d59e0c99\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-f94wg" Dec 02 23:19:10 crc kubenswrapper[4696]: I1202 23:19:10.255861 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/81a46783-f2f6-464b-a1cd-d859d59e0c99-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-f94wg\" (UID: \"81a46783-f2f6-464b-a1cd-d859d59e0c99\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-f94wg" Dec 02 23:19:10 crc kubenswrapper[4696]: I1202 23:19:10.256001 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/81a46783-f2f6-464b-a1cd-d859d59e0c99-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-f94wg\" (UID: \"81a46783-f2f6-464b-a1cd-d859d59e0c99\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-f94wg" Dec 02 23:19:10 crc kubenswrapper[4696]: I1202 23:19:10.256034 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sztbh\" (UniqueName: 
\"kubernetes.io/projected/81a46783-f2f6-464b-a1cd-d859d59e0c99-kube-api-access-sztbh\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-f94wg\" (UID: \"81a46783-f2f6-464b-a1cd-d859d59e0c99\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-f94wg" Dec 02 23:19:10 crc kubenswrapper[4696]: I1202 23:19:10.256064 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/81a46783-f2f6-464b-a1cd-d859d59e0c99-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-f94wg\" (UID: \"81a46783-f2f6-464b-a1cd-d859d59e0c99\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-f94wg" Dec 02 23:19:10 crc kubenswrapper[4696]: I1202 23:19:10.256123 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81a46783-f2f6-464b-a1cd-d859d59e0c99-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-f94wg\" (UID: \"81a46783-f2f6-464b-a1cd-d859d59e0c99\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-f94wg" Dec 02 23:19:10 crc kubenswrapper[4696]: I1202 23:19:10.256164 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/81a46783-f2f6-464b-a1cd-d859d59e0c99-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-f94wg\" (UID: \"81a46783-f2f6-464b-a1cd-d859d59e0c99\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-f94wg" Dec 02 23:19:10 crc kubenswrapper[4696]: I1202 23:19:10.270450 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/81a46783-f2f6-464b-a1cd-d859d59e0c99-nova-metadata-neutron-config-0\") pod 
\"neutron-metadata-edpm-deployment-openstack-edpm-ipam-f94wg\" (UID: \"81a46783-f2f6-464b-a1cd-d859d59e0c99\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-f94wg" Dec 02 23:19:10 crc kubenswrapper[4696]: I1202 23:19:10.270635 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81a46783-f2f6-464b-a1cd-d859d59e0c99-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-f94wg\" (UID: \"81a46783-f2f6-464b-a1cd-d859d59e0c99\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-f94wg" Dec 02 23:19:10 crc kubenswrapper[4696]: I1202 23:19:10.271588 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/81a46783-f2f6-464b-a1cd-d859d59e0c99-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-f94wg\" (UID: \"81a46783-f2f6-464b-a1cd-d859d59e0c99\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-f94wg" Dec 02 23:19:10 crc kubenswrapper[4696]: I1202 23:19:10.271627 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/81a46783-f2f6-464b-a1cd-d859d59e0c99-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-f94wg\" (UID: \"81a46783-f2f6-464b-a1cd-d859d59e0c99\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-f94wg" Dec 02 23:19:10 crc kubenswrapper[4696]: I1202 23:19:10.271802 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/81a46783-f2f6-464b-a1cd-d859d59e0c99-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-f94wg\" (UID: \"81a46783-f2f6-464b-a1cd-d859d59e0c99\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-f94wg" Dec 02 23:19:10 crc kubenswrapper[4696]: I1202 23:19:10.274882 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sztbh\" (UniqueName: \"kubernetes.io/projected/81a46783-f2f6-464b-a1cd-d859d59e0c99-kube-api-access-sztbh\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-f94wg\" (UID: \"81a46783-f2f6-464b-a1cd-d859d59e0c99\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-f94wg" Dec 02 23:19:10 crc kubenswrapper[4696]: I1202 23:19:10.391471 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-f94wg" Dec 02 23:19:11 crc kubenswrapper[4696]: I1202 23:19:11.027680 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-f94wg"] Dec 02 23:19:11 crc kubenswrapper[4696]: W1202 23:19:11.030514 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod81a46783_f2f6_464b_a1cd_d859d59e0c99.slice/crio-96306e861d0266bfe57934cdc0aa93b3655fe6fafc6ca7bc2aa31c8c82dfe6ac WatchSource:0}: Error finding container 96306e861d0266bfe57934cdc0aa93b3655fe6fafc6ca7bc2aa31c8c82dfe6ac: Status 404 returned error can't find the container with id 96306e861d0266bfe57934cdc0aa93b3655fe6fafc6ca7bc2aa31c8c82dfe6ac Dec 02 23:19:12 crc kubenswrapper[4696]: I1202 23:19:12.018825 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-f94wg" event={"ID":"81a46783-f2f6-464b-a1cd-d859d59e0c99","Type":"ContainerStarted","Data":"5cd2d4fce9aadcdbfe9ce0f0506f7138aaecb095c45b3849f401ba5cd7c74bc6"} Dec 02 23:19:12 crc kubenswrapper[4696]: I1202 23:19:12.019266 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-f94wg" event={"ID":"81a46783-f2f6-464b-a1cd-d859d59e0c99","Type":"ContainerStarted","Data":"96306e861d0266bfe57934cdc0aa93b3655fe6fafc6ca7bc2aa31c8c82dfe6ac"} Dec 02 23:19:12 crc kubenswrapper[4696]: I1202 23:19:12.023628 4696 generic.go:334] "Generic (PLEG): container finished" podID="8f0a2c58-bcd7-4052-af4b-d70614271129" containerID="03b086e1b8961921212fcef92017fa06774e99326f3458b57b4e2098cfca542b" exitCode=0 Dec 02 23:19:12 crc kubenswrapper[4696]: I1202 23:19:12.023720 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kfbjb" event={"ID":"8f0a2c58-bcd7-4052-af4b-d70614271129","Type":"ContainerDied","Data":"03b086e1b8961921212fcef92017fa06774e99326f3458b57b4e2098cfca542b"} Dec 02 23:19:12 crc kubenswrapper[4696]: I1202 23:19:12.056031 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-f94wg" podStartSLOduration=1.594077121 podStartE2EDuration="2.056008558s" podCreationTimestamp="2025-12-02 23:19:10 +0000 UTC" firstStartedPulling="2025-12-02 23:19:11.033523693 +0000 UTC m=+2213.914203694" lastFinishedPulling="2025-12-02 23:19:11.49545512 +0000 UTC m=+2214.376135131" observedRunningTime="2025-12-02 23:19:12.043395271 +0000 UTC m=+2214.924075272" watchObservedRunningTime="2025-12-02 23:19:12.056008558 +0000 UTC m=+2214.936688549" Dec 02 23:19:13 crc kubenswrapper[4696]: I1202 23:19:13.035710 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kfbjb" event={"ID":"8f0a2c58-bcd7-4052-af4b-d70614271129","Type":"ContainerStarted","Data":"68058f42ae9052b44b901869cf641bb6884477f235d6ee954b0e6ce6a3c94a69"} Dec 02 23:19:13 crc kubenswrapper[4696]: I1202 23:19:13.069175 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-kfbjb" 
podStartSLOduration=2.580984711 podStartE2EDuration="5.069138659s" podCreationTimestamp="2025-12-02 23:19:08 +0000 UTC" firstStartedPulling="2025-12-02 23:19:09.995297991 +0000 UTC m=+2212.875978002" lastFinishedPulling="2025-12-02 23:19:12.483451949 +0000 UTC m=+2215.364131950" observedRunningTime="2025-12-02 23:19:13.057615073 +0000 UTC m=+2215.938295094" watchObservedRunningTime="2025-12-02 23:19:13.069138659 +0000 UTC m=+2215.949818660" Dec 02 23:19:18 crc kubenswrapper[4696]: I1202 23:19:18.712047 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-kfbjb" Dec 02 23:19:18 crc kubenswrapper[4696]: I1202 23:19:18.712809 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-kfbjb" Dec 02 23:19:18 crc kubenswrapper[4696]: I1202 23:19:18.800434 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-kfbjb" Dec 02 23:19:19 crc kubenswrapper[4696]: I1202 23:19:19.171549 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-kfbjb" Dec 02 23:19:19 crc kubenswrapper[4696]: I1202 23:19:19.229092 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kfbjb"] Dec 02 23:19:21 crc kubenswrapper[4696]: I1202 23:19:21.155894 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-kfbjb" podUID="8f0a2c58-bcd7-4052-af4b-d70614271129" containerName="registry-server" containerID="cri-o://68058f42ae9052b44b901869cf641bb6884477f235d6ee954b0e6ce6a3c94a69" gracePeriod=2 Dec 02 23:19:21 crc kubenswrapper[4696]: I1202 23:19:21.714883 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kfbjb" Dec 02 23:19:21 crc kubenswrapper[4696]: I1202 23:19:21.852823 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f0a2c58-bcd7-4052-af4b-d70614271129-utilities\") pod \"8f0a2c58-bcd7-4052-af4b-d70614271129\" (UID: \"8f0a2c58-bcd7-4052-af4b-d70614271129\") " Dec 02 23:19:21 crc kubenswrapper[4696]: I1202 23:19:21.853119 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f0a2c58-bcd7-4052-af4b-d70614271129-catalog-content\") pod \"8f0a2c58-bcd7-4052-af4b-d70614271129\" (UID: \"8f0a2c58-bcd7-4052-af4b-d70614271129\") " Dec 02 23:19:21 crc kubenswrapper[4696]: I1202 23:19:21.853283 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8mjns\" (UniqueName: \"kubernetes.io/projected/8f0a2c58-bcd7-4052-af4b-d70614271129-kube-api-access-8mjns\") pod \"8f0a2c58-bcd7-4052-af4b-d70614271129\" (UID: \"8f0a2c58-bcd7-4052-af4b-d70614271129\") " Dec 02 23:19:21 crc kubenswrapper[4696]: I1202 23:19:21.854912 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f0a2c58-bcd7-4052-af4b-d70614271129-utilities" (OuterVolumeSpecName: "utilities") pod "8f0a2c58-bcd7-4052-af4b-d70614271129" (UID: "8f0a2c58-bcd7-4052-af4b-d70614271129"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:19:21 crc kubenswrapper[4696]: I1202 23:19:21.861469 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f0a2c58-bcd7-4052-af4b-d70614271129-kube-api-access-8mjns" (OuterVolumeSpecName: "kube-api-access-8mjns") pod "8f0a2c58-bcd7-4052-af4b-d70614271129" (UID: "8f0a2c58-bcd7-4052-af4b-d70614271129"). InnerVolumeSpecName "kube-api-access-8mjns". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:19:21 crc kubenswrapper[4696]: I1202 23:19:21.955877 4696 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f0a2c58-bcd7-4052-af4b-d70614271129-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 23:19:21 crc kubenswrapper[4696]: I1202 23:19:21.955915 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8mjns\" (UniqueName: \"kubernetes.io/projected/8f0a2c58-bcd7-4052-af4b-d70614271129-kube-api-access-8mjns\") on node \"crc\" DevicePath \"\"" Dec 02 23:19:21 crc kubenswrapper[4696]: I1202 23:19:21.964605 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f0a2c58-bcd7-4052-af4b-d70614271129-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8f0a2c58-bcd7-4052-af4b-d70614271129" (UID: "8f0a2c58-bcd7-4052-af4b-d70614271129"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:19:22 crc kubenswrapper[4696]: I1202 23:19:22.058781 4696 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f0a2c58-bcd7-4052-af4b-d70614271129-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 23:19:22 crc kubenswrapper[4696]: I1202 23:19:22.173042 4696 generic.go:334] "Generic (PLEG): container finished" podID="8f0a2c58-bcd7-4052-af4b-d70614271129" containerID="68058f42ae9052b44b901869cf641bb6884477f235d6ee954b0e6ce6a3c94a69" exitCode=0 Dec 02 23:19:22 crc kubenswrapper[4696]: I1202 23:19:22.173125 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kfbjb" Dec 02 23:19:22 crc kubenswrapper[4696]: I1202 23:19:22.173137 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kfbjb" event={"ID":"8f0a2c58-bcd7-4052-af4b-d70614271129","Type":"ContainerDied","Data":"68058f42ae9052b44b901869cf641bb6884477f235d6ee954b0e6ce6a3c94a69"} Dec 02 23:19:22 crc kubenswrapper[4696]: I1202 23:19:22.173684 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kfbjb" event={"ID":"8f0a2c58-bcd7-4052-af4b-d70614271129","Type":"ContainerDied","Data":"f78022cfc900520890a679898bb1f6669eb104791fc89e573c38e2524506d021"} Dec 02 23:19:22 crc kubenswrapper[4696]: I1202 23:19:22.173706 4696 scope.go:117] "RemoveContainer" containerID="68058f42ae9052b44b901869cf641bb6884477f235d6ee954b0e6ce6a3c94a69" Dec 02 23:19:22 crc kubenswrapper[4696]: I1202 23:19:22.215806 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kfbjb"] Dec 02 23:19:22 crc kubenswrapper[4696]: I1202 23:19:22.221581 4696 scope.go:117] "RemoveContainer" containerID="03b086e1b8961921212fcef92017fa06774e99326f3458b57b4e2098cfca542b" Dec 02 23:19:22 crc kubenswrapper[4696]: I1202 23:19:22.226477 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-kfbjb"] Dec 02 23:19:22 crc kubenswrapper[4696]: I1202 23:19:22.248257 4696 scope.go:117] "RemoveContainer" containerID="e30ebf5377e7ec86b268f16e32ab4dcd862420ddcfef62610f95957ddf87e275" Dec 02 23:19:22 crc kubenswrapper[4696]: I1202 23:19:22.297619 4696 scope.go:117] "RemoveContainer" containerID="68058f42ae9052b44b901869cf641bb6884477f235d6ee954b0e6ce6a3c94a69" Dec 02 23:19:22 crc kubenswrapper[4696]: E1202 23:19:22.298205 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"68058f42ae9052b44b901869cf641bb6884477f235d6ee954b0e6ce6a3c94a69\": container with ID starting with 68058f42ae9052b44b901869cf641bb6884477f235d6ee954b0e6ce6a3c94a69 not found: ID does not exist" containerID="68058f42ae9052b44b901869cf641bb6884477f235d6ee954b0e6ce6a3c94a69" Dec 02 23:19:22 crc kubenswrapper[4696]: I1202 23:19:22.298256 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68058f42ae9052b44b901869cf641bb6884477f235d6ee954b0e6ce6a3c94a69"} err="failed to get container status \"68058f42ae9052b44b901869cf641bb6884477f235d6ee954b0e6ce6a3c94a69\": rpc error: code = NotFound desc = could not find container \"68058f42ae9052b44b901869cf641bb6884477f235d6ee954b0e6ce6a3c94a69\": container with ID starting with 68058f42ae9052b44b901869cf641bb6884477f235d6ee954b0e6ce6a3c94a69 not found: ID does not exist" Dec 02 23:19:22 crc kubenswrapper[4696]: I1202 23:19:22.298291 4696 scope.go:117] "RemoveContainer" containerID="03b086e1b8961921212fcef92017fa06774e99326f3458b57b4e2098cfca542b" Dec 02 23:19:22 crc kubenswrapper[4696]: E1202 23:19:22.298696 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03b086e1b8961921212fcef92017fa06774e99326f3458b57b4e2098cfca542b\": container with ID starting with 03b086e1b8961921212fcef92017fa06774e99326f3458b57b4e2098cfca542b not found: ID does not exist" containerID="03b086e1b8961921212fcef92017fa06774e99326f3458b57b4e2098cfca542b" Dec 02 23:19:22 crc kubenswrapper[4696]: I1202 23:19:22.298723 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03b086e1b8961921212fcef92017fa06774e99326f3458b57b4e2098cfca542b"} err="failed to get container status \"03b086e1b8961921212fcef92017fa06774e99326f3458b57b4e2098cfca542b\": rpc error: code = NotFound desc = could not find container \"03b086e1b8961921212fcef92017fa06774e99326f3458b57b4e2098cfca542b\": container with ID 
starting with 03b086e1b8961921212fcef92017fa06774e99326f3458b57b4e2098cfca542b not found: ID does not exist" Dec 02 23:19:22 crc kubenswrapper[4696]: I1202 23:19:22.298757 4696 scope.go:117] "RemoveContainer" containerID="e30ebf5377e7ec86b268f16e32ab4dcd862420ddcfef62610f95957ddf87e275" Dec 02 23:19:22 crc kubenswrapper[4696]: E1202 23:19:22.298974 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e30ebf5377e7ec86b268f16e32ab4dcd862420ddcfef62610f95957ddf87e275\": container with ID starting with e30ebf5377e7ec86b268f16e32ab4dcd862420ddcfef62610f95957ddf87e275 not found: ID does not exist" containerID="e30ebf5377e7ec86b268f16e32ab4dcd862420ddcfef62610f95957ddf87e275" Dec 02 23:19:22 crc kubenswrapper[4696]: I1202 23:19:22.299010 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e30ebf5377e7ec86b268f16e32ab4dcd862420ddcfef62610f95957ddf87e275"} err="failed to get container status \"e30ebf5377e7ec86b268f16e32ab4dcd862420ddcfef62610f95957ddf87e275\": rpc error: code = NotFound desc = could not find container \"e30ebf5377e7ec86b268f16e32ab4dcd862420ddcfef62610f95957ddf87e275\": container with ID starting with e30ebf5377e7ec86b268f16e32ab4dcd862420ddcfef62610f95957ddf87e275 not found: ID does not exist" Dec 02 23:19:23 crc kubenswrapper[4696]: I1202 23:19:23.446423 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f0a2c58-bcd7-4052-af4b-d70614271129" path="/var/lib/kubelet/pods/8f0a2c58-bcd7-4052-af4b-d70614271129/volumes" Dec 02 23:19:56 crc kubenswrapper[4696]: I1202 23:19:56.271566 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-5s5p4"] Dec 02 23:19:56 crc kubenswrapper[4696]: E1202 23:19:56.273484 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f0a2c58-bcd7-4052-af4b-d70614271129" containerName="registry-server" Dec 02 23:19:56 crc 
kubenswrapper[4696]: I1202 23:19:56.273515 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f0a2c58-bcd7-4052-af4b-d70614271129" containerName="registry-server" Dec 02 23:19:56 crc kubenswrapper[4696]: E1202 23:19:56.273588 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f0a2c58-bcd7-4052-af4b-d70614271129" containerName="extract-content" Dec 02 23:19:56 crc kubenswrapper[4696]: I1202 23:19:56.273602 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f0a2c58-bcd7-4052-af4b-d70614271129" containerName="extract-content" Dec 02 23:19:56 crc kubenswrapper[4696]: E1202 23:19:56.273644 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f0a2c58-bcd7-4052-af4b-d70614271129" containerName="extract-utilities" Dec 02 23:19:56 crc kubenswrapper[4696]: I1202 23:19:56.273659 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f0a2c58-bcd7-4052-af4b-d70614271129" containerName="extract-utilities" Dec 02 23:19:56 crc kubenswrapper[4696]: I1202 23:19:56.274080 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f0a2c58-bcd7-4052-af4b-d70614271129" containerName="registry-server" Dec 02 23:19:56 crc kubenswrapper[4696]: I1202 23:19:56.279634 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5s5p4" Dec 02 23:19:56 crc kubenswrapper[4696]: I1202 23:19:56.283337 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5s5p4"] Dec 02 23:19:56 crc kubenswrapper[4696]: I1202 23:19:56.425075 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d1db5ac-d24b-4c89-9463-bdd62b70f629-catalog-content\") pod \"certified-operators-5s5p4\" (UID: \"3d1db5ac-d24b-4c89-9463-bdd62b70f629\") " pod="openshift-marketplace/certified-operators-5s5p4" Dec 02 23:19:56 crc kubenswrapper[4696]: I1202 23:19:56.425228 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srmdd\" (UniqueName: \"kubernetes.io/projected/3d1db5ac-d24b-4c89-9463-bdd62b70f629-kube-api-access-srmdd\") pod \"certified-operators-5s5p4\" (UID: \"3d1db5ac-d24b-4c89-9463-bdd62b70f629\") " pod="openshift-marketplace/certified-operators-5s5p4" Dec 02 23:19:56 crc kubenswrapper[4696]: I1202 23:19:56.425379 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d1db5ac-d24b-4c89-9463-bdd62b70f629-utilities\") pod \"certified-operators-5s5p4\" (UID: \"3d1db5ac-d24b-4c89-9463-bdd62b70f629\") " pod="openshift-marketplace/certified-operators-5s5p4" Dec 02 23:19:56 crc kubenswrapper[4696]: I1202 23:19:56.527925 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srmdd\" (UniqueName: \"kubernetes.io/projected/3d1db5ac-d24b-4c89-9463-bdd62b70f629-kube-api-access-srmdd\") pod \"certified-operators-5s5p4\" (UID: \"3d1db5ac-d24b-4c89-9463-bdd62b70f629\") " pod="openshift-marketplace/certified-operators-5s5p4" Dec 02 23:19:56 crc kubenswrapper[4696]: I1202 23:19:56.528153 4696 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d1db5ac-d24b-4c89-9463-bdd62b70f629-utilities\") pod \"certified-operators-5s5p4\" (UID: \"3d1db5ac-d24b-4c89-9463-bdd62b70f629\") " pod="openshift-marketplace/certified-operators-5s5p4" Dec 02 23:19:56 crc kubenswrapper[4696]: I1202 23:19:56.528301 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d1db5ac-d24b-4c89-9463-bdd62b70f629-catalog-content\") pod \"certified-operators-5s5p4\" (UID: \"3d1db5ac-d24b-4c89-9463-bdd62b70f629\") " pod="openshift-marketplace/certified-operators-5s5p4" Dec 02 23:19:56 crc kubenswrapper[4696]: I1202 23:19:56.528903 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d1db5ac-d24b-4c89-9463-bdd62b70f629-catalog-content\") pod \"certified-operators-5s5p4\" (UID: \"3d1db5ac-d24b-4c89-9463-bdd62b70f629\") " pod="openshift-marketplace/certified-operators-5s5p4" Dec 02 23:19:56 crc kubenswrapper[4696]: I1202 23:19:56.529952 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d1db5ac-d24b-4c89-9463-bdd62b70f629-utilities\") pod \"certified-operators-5s5p4\" (UID: \"3d1db5ac-d24b-4c89-9463-bdd62b70f629\") " pod="openshift-marketplace/certified-operators-5s5p4" Dec 02 23:19:56 crc kubenswrapper[4696]: I1202 23:19:56.560581 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srmdd\" (UniqueName: \"kubernetes.io/projected/3d1db5ac-d24b-4c89-9463-bdd62b70f629-kube-api-access-srmdd\") pod \"certified-operators-5s5p4\" (UID: \"3d1db5ac-d24b-4c89-9463-bdd62b70f629\") " pod="openshift-marketplace/certified-operators-5s5p4" Dec 02 23:19:56 crc kubenswrapper[4696]: I1202 23:19:56.610946 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5s5p4" Dec 02 23:19:57 crc kubenswrapper[4696]: I1202 23:19:57.217412 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5s5p4"] Dec 02 23:19:57 crc kubenswrapper[4696]: I1202 23:19:57.616617 4696 generic.go:334] "Generic (PLEG): container finished" podID="3d1db5ac-d24b-4c89-9463-bdd62b70f629" containerID="aa997bbaa8c96f02ebb659caf23bdf57a7cc927dcfd968790ef878c290052ef8" exitCode=0 Dec 02 23:19:57 crc kubenswrapper[4696]: I1202 23:19:57.616726 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5s5p4" event={"ID":"3d1db5ac-d24b-4c89-9463-bdd62b70f629","Type":"ContainerDied","Data":"aa997bbaa8c96f02ebb659caf23bdf57a7cc927dcfd968790ef878c290052ef8"} Dec 02 23:19:57 crc kubenswrapper[4696]: I1202 23:19:57.617217 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5s5p4" event={"ID":"3d1db5ac-d24b-4c89-9463-bdd62b70f629","Type":"ContainerStarted","Data":"54ca325c8c94653ad570e77befdb8b3b9facecff7cbbb82109c808907f9ba204"} Dec 02 23:19:57 crc kubenswrapper[4696]: I1202 23:19:57.619123 4696 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 02 23:19:58 crc kubenswrapper[4696]: I1202 23:19:58.630783 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5s5p4" event={"ID":"3d1db5ac-d24b-4c89-9463-bdd62b70f629","Type":"ContainerStarted","Data":"0a6eaa87afd1963ebc046b3b36b182ee42fe6a84a092ca816243d68dec9e2956"} Dec 02 23:19:59 crc kubenswrapper[4696]: I1202 23:19:59.661067 4696 generic.go:334] "Generic (PLEG): container finished" podID="3d1db5ac-d24b-4c89-9463-bdd62b70f629" containerID="0a6eaa87afd1963ebc046b3b36b182ee42fe6a84a092ca816243d68dec9e2956" exitCode=0 Dec 02 23:19:59 crc kubenswrapper[4696]: I1202 23:19:59.661523 4696 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/certified-operators-5s5p4" event={"ID":"3d1db5ac-d24b-4c89-9463-bdd62b70f629","Type":"ContainerDied","Data":"0a6eaa87afd1963ebc046b3b36b182ee42fe6a84a092ca816243d68dec9e2956"} Dec 02 23:20:00 crc kubenswrapper[4696]: I1202 23:20:00.676893 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5s5p4" event={"ID":"3d1db5ac-d24b-4c89-9463-bdd62b70f629","Type":"ContainerStarted","Data":"d6ba218bbf320a0ff14f3ea9894fc01ae3f5738cf9676c3616754c68549fa5ca"} Dec 02 23:20:00 crc kubenswrapper[4696]: I1202 23:20:00.710768 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-5s5p4" podStartSLOduration=2.199776445 podStartE2EDuration="4.710720229s" podCreationTimestamp="2025-12-02 23:19:56 +0000 UTC" firstStartedPulling="2025-12-02 23:19:57.618830619 +0000 UTC m=+2260.499510620" lastFinishedPulling="2025-12-02 23:20:00.129774393 +0000 UTC m=+2263.010454404" observedRunningTime="2025-12-02 23:20:00.703317229 +0000 UTC m=+2263.583997240" watchObservedRunningTime="2025-12-02 23:20:00.710720229 +0000 UTC m=+2263.591400240" Dec 02 23:20:06 crc kubenswrapper[4696]: I1202 23:20:06.612044 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-5s5p4" Dec 02 23:20:06 crc kubenswrapper[4696]: I1202 23:20:06.612758 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-5s5p4" Dec 02 23:20:06 crc kubenswrapper[4696]: I1202 23:20:06.699672 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-5s5p4" Dec 02 23:20:06 crc kubenswrapper[4696]: I1202 23:20:06.798444 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-5s5p4" Dec 02 23:20:06 crc kubenswrapper[4696]: I1202 
23:20:06.947919 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5s5p4"] Dec 02 23:20:08 crc kubenswrapper[4696]: I1202 23:20:08.768192 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-5s5p4" podUID="3d1db5ac-d24b-4c89-9463-bdd62b70f629" containerName="registry-server" containerID="cri-o://d6ba218bbf320a0ff14f3ea9894fc01ae3f5738cf9676c3616754c68549fa5ca" gracePeriod=2 Dec 02 23:20:09 crc kubenswrapper[4696]: E1202 23:20:09.408882 4696 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d1db5ac_d24b_4c89_9463_bdd62b70f629.slice/crio-conmon-d6ba218bbf320a0ff14f3ea9894fc01ae3f5738cf9676c3616754c68549fa5ca.scope\": RecentStats: unable to find data in memory cache]" Dec 02 23:20:09 crc kubenswrapper[4696]: I1202 23:20:09.759254 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5s5p4" Dec 02 23:20:09 crc kubenswrapper[4696]: I1202 23:20:09.779014 4696 generic.go:334] "Generic (PLEG): container finished" podID="81a46783-f2f6-464b-a1cd-d859d59e0c99" containerID="5cd2d4fce9aadcdbfe9ce0f0506f7138aaecb095c45b3849f401ba5cd7c74bc6" exitCode=0 Dec 02 23:20:09 crc kubenswrapper[4696]: I1202 23:20:09.779107 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-f94wg" event={"ID":"81a46783-f2f6-464b-a1cd-d859d59e0c99","Type":"ContainerDied","Data":"5cd2d4fce9aadcdbfe9ce0f0506f7138aaecb095c45b3849f401ba5cd7c74bc6"} Dec 02 23:20:09 crc kubenswrapper[4696]: I1202 23:20:09.784791 4696 generic.go:334] "Generic (PLEG): container finished" podID="3d1db5ac-d24b-4c89-9463-bdd62b70f629" containerID="d6ba218bbf320a0ff14f3ea9894fc01ae3f5738cf9676c3616754c68549fa5ca" exitCode=0 Dec 02 23:20:09 crc kubenswrapper[4696]: I1202 23:20:09.784874 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5s5p4" event={"ID":"3d1db5ac-d24b-4c89-9463-bdd62b70f629","Type":"ContainerDied","Data":"d6ba218bbf320a0ff14f3ea9894fc01ae3f5738cf9676c3616754c68549fa5ca"} Dec 02 23:20:09 crc kubenswrapper[4696]: I1202 23:20:09.784939 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5s5p4" event={"ID":"3d1db5ac-d24b-4c89-9463-bdd62b70f629","Type":"ContainerDied","Data":"54ca325c8c94653ad570e77befdb8b3b9facecff7cbbb82109c808907f9ba204"} Dec 02 23:20:09 crc kubenswrapper[4696]: I1202 23:20:09.784962 4696 scope.go:117] "RemoveContainer" containerID="d6ba218bbf320a0ff14f3ea9894fc01ae3f5738cf9676c3616754c68549fa5ca" Dec 02 23:20:09 crc kubenswrapper[4696]: I1202 23:20:09.785046 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5s5p4" Dec 02 23:20:09 crc kubenswrapper[4696]: I1202 23:20:09.818206 4696 scope.go:117] "RemoveContainer" containerID="0a6eaa87afd1963ebc046b3b36b182ee42fe6a84a092ca816243d68dec9e2956" Dec 02 23:20:09 crc kubenswrapper[4696]: I1202 23:20:09.846102 4696 scope.go:117] "RemoveContainer" containerID="aa997bbaa8c96f02ebb659caf23bdf57a7cc927dcfd968790ef878c290052ef8" Dec 02 23:20:09 crc kubenswrapper[4696]: I1202 23:20:09.859286 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d1db5ac-d24b-4c89-9463-bdd62b70f629-utilities\") pod \"3d1db5ac-d24b-4c89-9463-bdd62b70f629\" (UID: \"3d1db5ac-d24b-4c89-9463-bdd62b70f629\") " Dec 02 23:20:09 crc kubenswrapper[4696]: I1202 23:20:09.859386 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-srmdd\" (UniqueName: \"kubernetes.io/projected/3d1db5ac-d24b-4c89-9463-bdd62b70f629-kube-api-access-srmdd\") pod \"3d1db5ac-d24b-4c89-9463-bdd62b70f629\" (UID: \"3d1db5ac-d24b-4c89-9463-bdd62b70f629\") " Dec 02 23:20:09 crc kubenswrapper[4696]: I1202 23:20:09.859446 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d1db5ac-d24b-4c89-9463-bdd62b70f629-catalog-content\") pod \"3d1db5ac-d24b-4c89-9463-bdd62b70f629\" (UID: \"3d1db5ac-d24b-4c89-9463-bdd62b70f629\") " Dec 02 23:20:09 crc kubenswrapper[4696]: I1202 23:20:09.860494 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d1db5ac-d24b-4c89-9463-bdd62b70f629-utilities" (OuterVolumeSpecName: "utilities") pod "3d1db5ac-d24b-4c89-9463-bdd62b70f629" (UID: "3d1db5ac-d24b-4c89-9463-bdd62b70f629"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:20:09 crc kubenswrapper[4696]: I1202 23:20:09.866497 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d1db5ac-d24b-4c89-9463-bdd62b70f629-kube-api-access-srmdd" (OuterVolumeSpecName: "kube-api-access-srmdd") pod "3d1db5ac-d24b-4c89-9463-bdd62b70f629" (UID: "3d1db5ac-d24b-4c89-9463-bdd62b70f629"). InnerVolumeSpecName "kube-api-access-srmdd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:20:09 crc kubenswrapper[4696]: I1202 23:20:09.912716 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d1db5ac-d24b-4c89-9463-bdd62b70f629-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3d1db5ac-d24b-4c89-9463-bdd62b70f629" (UID: "3d1db5ac-d24b-4c89-9463-bdd62b70f629"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:20:09 crc kubenswrapper[4696]: I1202 23:20:09.938118 4696 scope.go:117] "RemoveContainer" containerID="d6ba218bbf320a0ff14f3ea9894fc01ae3f5738cf9676c3616754c68549fa5ca" Dec 02 23:20:09 crc kubenswrapper[4696]: E1202 23:20:09.938980 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6ba218bbf320a0ff14f3ea9894fc01ae3f5738cf9676c3616754c68549fa5ca\": container with ID starting with d6ba218bbf320a0ff14f3ea9894fc01ae3f5738cf9676c3616754c68549fa5ca not found: ID does not exist" containerID="d6ba218bbf320a0ff14f3ea9894fc01ae3f5738cf9676c3616754c68549fa5ca" Dec 02 23:20:09 crc kubenswrapper[4696]: I1202 23:20:09.939066 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6ba218bbf320a0ff14f3ea9894fc01ae3f5738cf9676c3616754c68549fa5ca"} err="failed to get container status \"d6ba218bbf320a0ff14f3ea9894fc01ae3f5738cf9676c3616754c68549fa5ca\": rpc error: code = NotFound desc = could not find 
container \"d6ba218bbf320a0ff14f3ea9894fc01ae3f5738cf9676c3616754c68549fa5ca\": container with ID starting with d6ba218bbf320a0ff14f3ea9894fc01ae3f5738cf9676c3616754c68549fa5ca not found: ID does not exist" Dec 02 23:20:09 crc kubenswrapper[4696]: I1202 23:20:09.939122 4696 scope.go:117] "RemoveContainer" containerID="0a6eaa87afd1963ebc046b3b36b182ee42fe6a84a092ca816243d68dec9e2956" Dec 02 23:20:09 crc kubenswrapper[4696]: E1202 23:20:09.940014 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a6eaa87afd1963ebc046b3b36b182ee42fe6a84a092ca816243d68dec9e2956\": container with ID starting with 0a6eaa87afd1963ebc046b3b36b182ee42fe6a84a092ca816243d68dec9e2956 not found: ID does not exist" containerID="0a6eaa87afd1963ebc046b3b36b182ee42fe6a84a092ca816243d68dec9e2956" Dec 02 23:20:09 crc kubenswrapper[4696]: I1202 23:20:09.940066 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a6eaa87afd1963ebc046b3b36b182ee42fe6a84a092ca816243d68dec9e2956"} err="failed to get container status \"0a6eaa87afd1963ebc046b3b36b182ee42fe6a84a092ca816243d68dec9e2956\": rpc error: code = NotFound desc = could not find container \"0a6eaa87afd1963ebc046b3b36b182ee42fe6a84a092ca816243d68dec9e2956\": container with ID starting with 0a6eaa87afd1963ebc046b3b36b182ee42fe6a84a092ca816243d68dec9e2956 not found: ID does not exist" Dec 02 23:20:09 crc kubenswrapper[4696]: I1202 23:20:09.940106 4696 scope.go:117] "RemoveContainer" containerID="aa997bbaa8c96f02ebb659caf23bdf57a7cc927dcfd968790ef878c290052ef8" Dec 02 23:20:09 crc kubenswrapper[4696]: E1202 23:20:09.941838 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa997bbaa8c96f02ebb659caf23bdf57a7cc927dcfd968790ef878c290052ef8\": container with ID starting with aa997bbaa8c96f02ebb659caf23bdf57a7cc927dcfd968790ef878c290052ef8 not found: ID does 
not exist" containerID="aa997bbaa8c96f02ebb659caf23bdf57a7cc927dcfd968790ef878c290052ef8" Dec 02 23:20:09 crc kubenswrapper[4696]: I1202 23:20:09.941904 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa997bbaa8c96f02ebb659caf23bdf57a7cc927dcfd968790ef878c290052ef8"} err="failed to get container status \"aa997bbaa8c96f02ebb659caf23bdf57a7cc927dcfd968790ef878c290052ef8\": rpc error: code = NotFound desc = could not find container \"aa997bbaa8c96f02ebb659caf23bdf57a7cc927dcfd968790ef878c290052ef8\": container with ID starting with aa997bbaa8c96f02ebb659caf23bdf57a7cc927dcfd968790ef878c290052ef8 not found: ID does not exist" Dec 02 23:20:09 crc kubenswrapper[4696]: I1202 23:20:09.963027 4696 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d1db5ac-d24b-4c89-9463-bdd62b70f629-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 23:20:09 crc kubenswrapper[4696]: I1202 23:20:09.963099 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-srmdd\" (UniqueName: \"kubernetes.io/projected/3d1db5ac-d24b-4c89-9463-bdd62b70f629-kube-api-access-srmdd\") on node \"crc\" DevicePath \"\"" Dec 02 23:20:09 crc kubenswrapper[4696]: I1202 23:20:09.963115 4696 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d1db5ac-d24b-4c89-9463-bdd62b70f629-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 23:20:10 crc kubenswrapper[4696]: I1202 23:20:10.134317 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5s5p4"] Dec 02 23:20:10 crc kubenswrapper[4696]: I1202 23:20:10.144185 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-5s5p4"] Dec 02 23:20:11 crc kubenswrapper[4696]: I1202 23:20:11.234912 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-f94wg" Dec 02 23:20:11 crc kubenswrapper[4696]: I1202 23:20:11.395217 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sztbh\" (UniqueName: \"kubernetes.io/projected/81a46783-f2f6-464b-a1cd-d859d59e0c99-kube-api-access-sztbh\") pod \"81a46783-f2f6-464b-a1cd-d859d59e0c99\" (UID: \"81a46783-f2f6-464b-a1cd-d859d59e0c99\") " Dec 02 23:20:11 crc kubenswrapper[4696]: I1202 23:20:11.395306 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/81a46783-f2f6-464b-a1cd-d859d59e0c99-nova-metadata-neutron-config-0\") pod \"81a46783-f2f6-464b-a1cd-d859d59e0c99\" (UID: \"81a46783-f2f6-464b-a1cd-d859d59e0c99\") " Dec 02 23:20:11 crc kubenswrapper[4696]: I1202 23:20:11.395350 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/81a46783-f2f6-464b-a1cd-d859d59e0c99-inventory\") pod \"81a46783-f2f6-464b-a1cd-d859d59e0c99\" (UID: \"81a46783-f2f6-464b-a1cd-d859d59e0c99\") " Dec 02 23:20:11 crc kubenswrapper[4696]: I1202 23:20:11.395447 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81a46783-f2f6-464b-a1cd-d859d59e0c99-neutron-metadata-combined-ca-bundle\") pod \"81a46783-f2f6-464b-a1cd-d859d59e0c99\" (UID: \"81a46783-f2f6-464b-a1cd-d859d59e0c99\") " Dec 02 23:20:11 crc kubenswrapper[4696]: I1202 23:20:11.395800 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/81a46783-f2f6-464b-a1cd-d859d59e0c99-ssh-key\") pod \"81a46783-f2f6-464b-a1cd-d859d59e0c99\" (UID: \"81a46783-f2f6-464b-a1cd-d859d59e0c99\") " Dec 02 23:20:11 crc kubenswrapper[4696]: I1202 23:20:11.395902 4696 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/81a46783-f2f6-464b-a1cd-d859d59e0c99-neutron-ovn-metadata-agent-neutron-config-0\") pod \"81a46783-f2f6-464b-a1cd-d859d59e0c99\" (UID: \"81a46783-f2f6-464b-a1cd-d859d59e0c99\") " Dec 02 23:20:11 crc kubenswrapper[4696]: I1202 23:20:11.404673 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81a46783-f2f6-464b-a1cd-d859d59e0c99-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "81a46783-f2f6-464b-a1cd-d859d59e0c99" (UID: "81a46783-f2f6-464b-a1cd-d859d59e0c99"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:20:11 crc kubenswrapper[4696]: I1202 23:20:11.407046 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81a46783-f2f6-464b-a1cd-d859d59e0c99-kube-api-access-sztbh" (OuterVolumeSpecName: "kube-api-access-sztbh") pod "81a46783-f2f6-464b-a1cd-d859d59e0c99" (UID: "81a46783-f2f6-464b-a1cd-d859d59e0c99"). InnerVolumeSpecName "kube-api-access-sztbh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:20:11 crc kubenswrapper[4696]: I1202 23:20:11.437767 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81a46783-f2f6-464b-a1cd-d859d59e0c99-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "81a46783-f2f6-464b-a1cd-d859d59e0c99" (UID: "81a46783-f2f6-464b-a1cd-d859d59e0c99"). InnerVolumeSpecName "nova-metadata-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:20:11 crc kubenswrapper[4696]: I1202 23:20:11.439220 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81a46783-f2f6-464b-a1cd-d859d59e0c99-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "81a46783-f2f6-464b-a1cd-d859d59e0c99" (UID: "81a46783-f2f6-464b-a1cd-d859d59e0c99"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:20:11 crc kubenswrapper[4696]: I1202 23:20:11.443598 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81a46783-f2f6-464b-a1cd-d859d59e0c99-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "81a46783-f2f6-464b-a1cd-d859d59e0c99" (UID: "81a46783-f2f6-464b-a1cd-d859d59e0c99"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:20:11 crc kubenswrapper[4696]: I1202 23:20:11.448098 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d1db5ac-d24b-4c89-9463-bdd62b70f629" path="/var/lib/kubelet/pods/3d1db5ac-d24b-4c89-9463-bdd62b70f629/volumes" Dec 02 23:20:11 crc kubenswrapper[4696]: I1202 23:20:11.457683 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81a46783-f2f6-464b-a1cd-d859d59e0c99-inventory" (OuterVolumeSpecName: "inventory") pod "81a46783-f2f6-464b-a1cd-d859d59e0c99" (UID: "81a46783-f2f6-464b-a1cd-d859d59e0c99"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:20:11 crc kubenswrapper[4696]: I1202 23:20:11.499189 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sztbh\" (UniqueName: \"kubernetes.io/projected/81a46783-f2f6-464b-a1cd-d859d59e0c99-kube-api-access-sztbh\") on node \"crc\" DevicePath \"\"" Dec 02 23:20:11 crc kubenswrapper[4696]: I1202 23:20:11.499496 4696 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/81a46783-f2f6-464b-a1cd-d859d59e0c99-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 02 23:20:11 crc kubenswrapper[4696]: I1202 23:20:11.499677 4696 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/81a46783-f2f6-464b-a1cd-d859d59e0c99-inventory\") on node \"crc\" DevicePath \"\"" Dec 02 23:20:11 crc kubenswrapper[4696]: I1202 23:20:11.499853 4696 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81a46783-f2f6-464b-a1cd-d859d59e0c99-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 23:20:11 crc kubenswrapper[4696]: I1202 23:20:11.500170 4696 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/81a46783-f2f6-464b-a1cd-d859d59e0c99-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 23:20:11 crc kubenswrapper[4696]: I1202 23:20:11.500382 4696 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/81a46783-f2f6-464b-a1cd-d859d59e0c99-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 02 23:20:11 crc kubenswrapper[4696]: I1202 23:20:11.817304 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-f94wg" 
event={"ID":"81a46783-f2f6-464b-a1cd-d859d59e0c99","Type":"ContainerDied","Data":"96306e861d0266bfe57934cdc0aa93b3655fe6fafc6ca7bc2aa31c8c82dfe6ac"} Dec 02 23:20:11 crc kubenswrapper[4696]: I1202 23:20:11.817379 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="96306e861d0266bfe57934cdc0aa93b3655fe6fafc6ca7bc2aa31c8c82dfe6ac" Dec 02 23:20:11 crc kubenswrapper[4696]: I1202 23:20:11.817432 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-f94wg" Dec 02 23:20:12 crc kubenswrapper[4696]: I1202 23:20:12.056065 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-526g5"] Dec 02 23:20:12 crc kubenswrapper[4696]: E1202 23:20:12.057048 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d1db5ac-d24b-4c89-9463-bdd62b70f629" containerName="extract-utilities" Dec 02 23:20:12 crc kubenswrapper[4696]: I1202 23:20:12.057083 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d1db5ac-d24b-4c89-9463-bdd62b70f629" containerName="extract-utilities" Dec 02 23:20:12 crc kubenswrapper[4696]: E1202 23:20:12.057133 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d1db5ac-d24b-4c89-9463-bdd62b70f629" containerName="extract-content" Dec 02 23:20:12 crc kubenswrapper[4696]: I1202 23:20:12.057148 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d1db5ac-d24b-4c89-9463-bdd62b70f629" containerName="extract-content" Dec 02 23:20:12 crc kubenswrapper[4696]: E1202 23:20:12.057179 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d1db5ac-d24b-4c89-9463-bdd62b70f629" containerName="registry-server" Dec 02 23:20:12 crc kubenswrapper[4696]: I1202 23:20:12.057197 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d1db5ac-d24b-4c89-9463-bdd62b70f629" containerName="registry-server" Dec 02 23:20:12 crc 
kubenswrapper[4696]: E1202 23:20:12.057253 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81a46783-f2f6-464b-a1cd-d859d59e0c99" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 02 23:20:12 crc kubenswrapper[4696]: I1202 23:20:12.057270 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="81a46783-f2f6-464b-a1cd-d859d59e0c99" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 02 23:20:12 crc kubenswrapper[4696]: I1202 23:20:12.057954 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="81a46783-f2f6-464b-a1cd-d859d59e0c99" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 02 23:20:12 crc kubenswrapper[4696]: I1202 23:20:12.058009 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d1db5ac-d24b-4c89-9463-bdd62b70f629" containerName="registry-server" Dec 02 23:20:12 crc kubenswrapper[4696]: I1202 23:20:12.059347 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-526g5" Dec 02 23:20:12 crc kubenswrapper[4696]: I1202 23:20:12.064900 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Dec 02 23:20:12 crc kubenswrapper[4696]: I1202 23:20:12.065369 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 02 23:20:12 crc kubenswrapper[4696]: I1202 23:20:12.065706 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 02 23:20:12 crc kubenswrapper[4696]: I1202 23:20:12.066819 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 23:20:12 crc kubenswrapper[4696]: I1202 23:20:12.068180 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-cvzmc" Dec 02 23:20:12 crc kubenswrapper[4696]: I1202 23:20:12.079114 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-526g5"] Dec 02 23:20:12 crc kubenswrapper[4696]: I1202 23:20:12.218321 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5697ae7a-9589-4939-a3a5-5613ee6094ab-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-526g5\" (UID: \"5697ae7a-9589-4939-a3a5-5613ee6094ab\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-526g5" Dec 02 23:20:12 crc kubenswrapper[4696]: I1202 23:20:12.218833 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5697ae7a-9589-4939-a3a5-5613ee6094ab-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-526g5\" (UID: \"5697ae7a-9589-4939-a3a5-5613ee6094ab\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-526g5" Dec 02 23:20:12 crc kubenswrapper[4696]: I1202 23:20:12.218874 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5697ae7a-9589-4939-a3a5-5613ee6094ab-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-526g5\" (UID: \"5697ae7a-9589-4939-a3a5-5613ee6094ab\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-526g5" Dec 02 23:20:12 crc kubenswrapper[4696]: I1202 23:20:12.218895 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/5697ae7a-9589-4939-a3a5-5613ee6094ab-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-526g5\" (UID: \"5697ae7a-9589-4939-a3a5-5613ee6094ab\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-526g5" Dec 02 23:20:12 crc kubenswrapper[4696]: I1202 23:20:12.218919 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zc4h\" (UniqueName: \"kubernetes.io/projected/5697ae7a-9589-4939-a3a5-5613ee6094ab-kube-api-access-6zc4h\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-526g5\" (UID: \"5697ae7a-9589-4939-a3a5-5613ee6094ab\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-526g5" Dec 02 23:20:12 crc kubenswrapper[4696]: I1202 23:20:12.320550 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5697ae7a-9589-4939-a3a5-5613ee6094ab-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-526g5\" (UID: \"5697ae7a-9589-4939-a3a5-5613ee6094ab\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-526g5" Dec 02 23:20:12 crc kubenswrapper[4696]: I1202 23:20:12.320692 4696 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5697ae7a-9589-4939-a3a5-5613ee6094ab-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-526g5\" (UID: \"5697ae7a-9589-4939-a3a5-5613ee6094ab\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-526g5" Dec 02 23:20:12 crc kubenswrapper[4696]: I1202 23:20:12.320723 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5697ae7a-9589-4939-a3a5-5613ee6094ab-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-526g5\" (UID: \"5697ae7a-9589-4939-a3a5-5613ee6094ab\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-526g5" Dec 02 23:20:12 crc kubenswrapper[4696]: I1202 23:20:12.320756 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/5697ae7a-9589-4939-a3a5-5613ee6094ab-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-526g5\" (UID: \"5697ae7a-9589-4939-a3a5-5613ee6094ab\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-526g5" Dec 02 23:20:12 crc kubenswrapper[4696]: I1202 23:20:12.320774 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6zc4h\" (UniqueName: \"kubernetes.io/projected/5697ae7a-9589-4939-a3a5-5613ee6094ab-kube-api-access-6zc4h\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-526g5\" (UID: \"5697ae7a-9589-4939-a3a5-5613ee6094ab\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-526g5" Dec 02 23:20:12 crc kubenswrapper[4696]: I1202 23:20:12.325514 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5697ae7a-9589-4939-a3a5-5613ee6094ab-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-526g5\" (UID: \"5697ae7a-9589-4939-a3a5-5613ee6094ab\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-526g5" Dec 02 
23:20:12 crc kubenswrapper[4696]: I1202 23:20:12.327076 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5697ae7a-9589-4939-a3a5-5613ee6094ab-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-526g5\" (UID: \"5697ae7a-9589-4939-a3a5-5613ee6094ab\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-526g5" Dec 02 23:20:12 crc kubenswrapper[4696]: I1202 23:20:12.327633 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/5697ae7a-9589-4939-a3a5-5613ee6094ab-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-526g5\" (UID: \"5697ae7a-9589-4939-a3a5-5613ee6094ab\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-526g5" Dec 02 23:20:12 crc kubenswrapper[4696]: I1202 23:20:12.328274 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5697ae7a-9589-4939-a3a5-5613ee6094ab-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-526g5\" (UID: \"5697ae7a-9589-4939-a3a5-5613ee6094ab\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-526g5" Dec 02 23:20:12 crc kubenswrapper[4696]: I1202 23:20:12.342371 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zc4h\" (UniqueName: \"kubernetes.io/projected/5697ae7a-9589-4939-a3a5-5613ee6094ab-kube-api-access-6zc4h\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-526g5\" (UID: \"5697ae7a-9589-4939-a3a5-5613ee6094ab\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-526g5" Dec 02 23:20:12 crc kubenswrapper[4696]: I1202 23:20:12.388385 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-526g5" Dec 02 23:20:12 crc kubenswrapper[4696]: I1202 23:20:12.994178 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-526g5"] Dec 02 23:20:13 crc kubenswrapper[4696]: I1202 23:20:13.847665 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-526g5" event={"ID":"5697ae7a-9589-4939-a3a5-5613ee6094ab","Type":"ContainerStarted","Data":"b4d854a3e9b4852642b715fdbf1188a88287d5d14e24b0af33f808efbd2dffe0"} Dec 02 23:20:13 crc kubenswrapper[4696]: I1202 23:20:13.848387 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-526g5" event={"ID":"5697ae7a-9589-4939-a3a5-5613ee6094ab","Type":"ContainerStarted","Data":"8a9e4635ab65ff2b5d001c2f12d5e75e4f5ccaf606f8192c64505c448a854b8c"} Dec 02 23:20:13 crc kubenswrapper[4696]: I1202 23:20:13.873994 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-526g5" podStartSLOduration=1.472703624 podStartE2EDuration="1.873967344s" podCreationTimestamp="2025-12-02 23:20:12 +0000 UTC" firstStartedPulling="2025-12-02 23:20:13.000971609 +0000 UTC m=+2275.881651620" lastFinishedPulling="2025-12-02 23:20:13.402235329 +0000 UTC m=+2276.282915340" observedRunningTime="2025-12-02 23:20:13.863277001 +0000 UTC m=+2276.743957022" watchObservedRunningTime="2025-12-02 23:20:13.873967344 +0000 UTC m=+2276.754647345" Dec 02 23:20:18 crc kubenswrapper[4696]: I1202 23:20:18.126255 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-5l7zb"] Dec 02 23:20:18 crc kubenswrapper[4696]: I1202 23:20:18.129627 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5l7zb" Dec 02 23:20:18 crc kubenswrapper[4696]: I1202 23:20:18.152825 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5l7zb"] Dec 02 23:20:18 crc kubenswrapper[4696]: I1202 23:20:18.175577 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76658e1e-2226-4d9f-87e2-ad7c08c776cf-catalog-content\") pod \"redhat-marketplace-5l7zb\" (UID: \"76658e1e-2226-4d9f-87e2-ad7c08c776cf\") " pod="openshift-marketplace/redhat-marketplace-5l7zb" Dec 02 23:20:18 crc kubenswrapper[4696]: I1202 23:20:18.175662 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5vf4\" (UniqueName: \"kubernetes.io/projected/76658e1e-2226-4d9f-87e2-ad7c08c776cf-kube-api-access-k5vf4\") pod \"redhat-marketplace-5l7zb\" (UID: \"76658e1e-2226-4d9f-87e2-ad7c08c776cf\") " pod="openshift-marketplace/redhat-marketplace-5l7zb" Dec 02 23:20:18 crc kubenswrapper[4696]: I1202 23:20:18.175713 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76658e1e-2226-4d9f-87e2-ad7c08c776cf-utilities\") pod \"redhat-marketplace-5l7zb\" (UID: \"76658e1e-2226-4d9f-87e2-ad7c08c776cf\") " pod="openshift-marketplace/redhat-marketplace-5l7zb" Dec 02 23:20:18 crc kubenswrapper[4696]: I1202 23:20:18.278419 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76658e1e-2226-4d9f-87e2-ad7c08c776cf-catalog-content\") pod \"redhat-marketplace-5l7zb\" (UID: \"76658e1e-2226-4d9f-87e2-ad7c08c776cf\") " pod="openshift-marketplace/redhat-marketplace-5l7zb" Dec 02 23:20:18 crc kubenswrapper[4696]: I1202 23:20:18.278801 4696 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-k5vf4\" (UniqueName: \"kubernetes.io/projected/76658e1e-2226-4d9f-87e2-ad7c08c776cf-kube-api-access-k5vf4\") pod \"redhat-marketplace-5l7zb\" (UID: \"76658e1e-2226-4d9f-87e2-ad7c08c776cf\") " pod="openshift-marketplace/redhat-marketplace-5l7zb" Dec 02 23:20:18 crc kubenswrapper[4696]: I1202 23:20:18.278949 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76658e1e-2226-4d9f-87e2-ad7c08c776cf-utilities\") pod \"redhat-marketplace-5l7zb\" (UID: \"76658e1e-2226-4d9f-87e2-ad7c08c776cf\") " pod="openshift-marketplace/redhat-marketplace-5l7zb" Dec 02 23:20:18 crc kubenswrapper[4696]: I1202 23:20:18.279291 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76658e1e-2226-4d9f-87e2-ad7c08c776cf-catalog-content\") pod \"redhat-marketplace-5l7zb\" (UID: \"76658e1e-2226-4d9f-87e2-ad7c08c776cf\") " pod="openshift-marketplace/redhat-marketplace-5l7zb" Dec 02 23:20:18 crc kubenswrapper[4696]: I1202 23:20:18.279547 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76658e1e-2226-4d9f-87e2-ad7c08c776cf-utilities\") pod \"redhat-marketplace-5l7zb\" (UID: \"76658e1e-2226-4d9f-87e2-ad7c08c776cf\") " pod="openshift-marketplace/redhat-marketplace-5l7zb" Dec 02 23:20:18 crc kubenswrapper[4696]: I1202 23:20:18.302808 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5vf4\" (UniqueName: \"kubernetes.io/projected/76658e1e-2226-4d9f-87e2-ad7c08c776cf-kube-api-access-k5vf4\") pod \"redhat-marketplace-5l7zb\" (UID: \"76658e1e-2226-4d9f-87e2-ad7c08c776cf\") " pod="openshift-marketplace/redhat-marketplace-5l7zb" Dec 02 23:20:18 crc kubenswrapper[4696]: I1202 23:20:18.452036 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5l7zb" Dec 02 23:20:18 crc kubenswrapper[4696]: I1202 23:20:18.950611 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5l7zb"] Dec 02 23:20:19 crc kubenswrapper[4696]: I1202 23:20:19.918823 4696 generic.go:334] "Generic (PLEG): container finished" podID="76658e1e-2226-4d9f-87e2-ad7c08c776cf" containerID="d50e57caf93995966c422e0e21ceed0ced4c4a22b7301351e2c8dd76d4a9d5ca" exitCode=0 Dec 02 23:20:19 crc kubenswrapper[4696]: I1202 23:20:19.918868 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5l7zb" event={"ID":"76658e1e-2226-4d9f-87e2-ad7c08c776cf","Type":"ContainerDied","Data":"d50e57caf93995966c422e0e21ceed0ced4c4a22b7301351e2c8dd76d4a9d5ca"} Dec 02 23:20:19 crc kubenswrapper[4696]: I1202 23:20:19.919385 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5l7zb" event={"ID":"76658e1e-2226-4d9f-87e2-ad7c08c776cf","Type":"ContainerStarted","Data":"1f27772e6e44d4def8b57716848d2479062457fe567a838a797e7138906dbc85"} Dec 02 23:20:21 crc kubenswrapper[4696]: I1202 23:20:21.944490 4696 generic.go:334] "Generic (PLEG): container finished" podID="76658e1e-2226-4d9f-87e2-ad7c08c776cf" containerID="396189b2e3b25a4fda95dbb21a12dba9250fd8fa743e54e490d9e710af47dd4a" exitCode=0 Dec 02 23:20:21 crc kubenswrapper[4696]: I1202 23:20:21.944621 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5l7zb" event={"ID":"76658e1e-2226-4d9f-87e2-ad7c08c776cf","Type":"ContainerDied","Data":"396189b2e3b25a4fda95dbb21a12dba9250fd8fa743e54e490d9e710af47dd4a"} Dec 02 23:20:22 crc kubenswrapper[4696]: I1202 23:20:22.964717 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5l7zb" 
event={"ID":"76658e1e-2226-4d9f-87e2-ad7c08c776cf","Type":"ContainerStarted","Data":"5dbd90aa6276aea239ec01c34b2416a3e2304a56a2efabd37749270f7dc1a2fd"} Dec 02 23:20:22 crc kubenswrapper[4696]: I1202 23:20:22.990954 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-5l7zb" podStartSLOduration=2.506080947 podStartE2EDuration="4.990930781s" podCreationTimestamp="2025-12-02 23:20:18 +0000 UTC" firstStartedPulling="2025-12-02 23:20:19.922018392 +0000 UTC m=+2282.802698383" lastFinishedPulling="2025-12-02 23:20:22.406868186 +0000 UTC m=+2285.287548217" observedRunningTime="2025-12-02 23:20:22.988558754 +0000 UTC m=+2285.869238755" watchObservedRunningTime="2025-12-02 23:20:22.990930781 +0000 UTC m=+2285.871610792" Dec 02 23:20:28 crc kubenswrapper[4696]: I1202 23:20:28.452659 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-5l7zb" Dec 02 23:20:28 crc kubenswrapper[4696]: I1202 23:20:28.453468 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-5l7zb" Dec 02 23:20:28 crc kubenswrapper[4696]: I1202 23:20:28.511038 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-5l7zb" Dec 02 23:20:29 crc kubenswrapper[4696]: I1202 23:20:29.091780 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-5l7zb" Dec 02 23:20:29 crc kubenswrapper[4696]: I1202 23:20:29.163995 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5l7zb"] Dec 02 23:20:31 crc kubenswrapper[4696]: I1202 23:20:31.051113 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-5l7zb" podUID="76658e1e-2226-4d9f-87e2-ad7c08c776cf" containerName="registry-server" 
containerID="cri-o://5dbd90aa6276aea239ec01c34b2416a3e2304a56a2efabd37749270f7dc1a2fd" gracePeriod=2 Dec 02 23:20:31 crc kubenswrapper[4696]: I1202 23:20:31.922446 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5l7zb" Dec 02 23:20:32 crc kubenswrapper[4696]: I1202 23:20:32.063546 4696 generic.go:334] "Generic (PLEG): container finished" podID="76658e1e-2226-4d9f-87e2-ad7c08c776cf" containerID="5dbd90aa6276aea239ec01c34b2416a3e2304a56a2efabd37749270f7dc1a2fd" exitCode=0 Dec 02 23:20:32 crc kubenswrapper[4696]: I1202 23:20:32.063606 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5l7zb" event={"ID":"76658e1e-2226-4d9f-87e2-ad7c08c776cf","Type":"ContainerDied","Data":"5dbd90aa6276aea239ec01c34b2416a3e2304a56a2efabd37749270f7dc1a2fd"} Dec 02 23:20:32 crc kubenswrapper[4696]: I1202 23:20:32.063626 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5l7zb" Dec 02 23:20:32 crc kubenswrapper[4696]: I1202 23:20:32.063650 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5l7zb" event={"ID":"76658e1e-2226-4d9f-87e2-ad7c08c776cf","Type":"ContainerDied","Data":"1f27772e6e44d4def8b57716848d2479062457fe567a838a797e7138906dbc85"} Dec 02 23:20:32 crc kubenswrapper[4696]: I1202 23:20:32.063694 4696 scope.go:117] "RemoveContainer" containerID="5dbd90aa6276aea239ec01c34b2416a3e2304a56a2efabd37749270f7dc1a2fd" Dec 02 23:20:32 crc kubenswrapper[4696]: I1202 23:20:32.088571 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76658e1e-2226-4d9f-87e2-ad7c08c776cf-utilities\") pod \"76658e1e-2226-4d9f-87e2-ad7c08c776cf\" (UID: \"76658e1e-2226-4d9f-87e2-ad7c08c776cf\") " Dec 02 23:20:32 crc kubenswrapper[4696]: I1202 23:20:32.089077 4696 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76658e1e-2226-4d9f-87e2-ad7c08c776cf-catalog-content\") pod \"76658e1e-2226-4d9f-87e2-ad7c08c776cf\" (UID: \"76658e1e-2226-4d9f-87e2-ad7c08c776cf\") " Dec 02 23:20:32 crc kubenswrapper[4696]: I1202 23:20:32.089202 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k5vf4\" (UniqueName: \"kubernetes.io/projected/76658e1e-2226-4d9f-87e2-ad7c08c776cf-kube-api-access-k5vf4\") pod \"76658e1e-2226-4d9f-87e2-ad7c08c776cf\" (UID: \"76658e1e-2226-4d9f-87e2-ad7c08c776cf\") " Dec 02 23:20:32 crc kubenswrapper[4696]: I1202 23:20:32.090131 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76658e1e-2226-4d9f-87e2-ad7c08c776cf-utilities" (OuterVolumeSpecName: "utilities") pod "76658e1e-2226-4d9f-87e2-ad7c08c776cf" (UID: "76658e1e-2226-4d9f-87e2-ad7c08c776cf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:20:32 crc kubenswrapper[4696]: I1202 23:20:32.101491 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76658e1e-2226-4d9f-87e2-ad7c08c776cf-kube-api-access-k5vf4" (OuterVolumeSpecName: "kube-api-access-k5vf4") pod "76658e1e-2226-4d9f-87e2-ad7c08c776cf" (UID: "76658e1e-2226-4d9f-87e2-ad7c08c776cf"). InnerVolumeSpecName "kube-api-access-k5vf4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:20:32 crc kubenswrapper[4696]: I1202 23:20:32.105782 4696 scope.go:117] "RemoveContainer" containerID="396189b2e3b25a4fda95dbb21a12dba9250fd8fa743e54e490d9e710af47dd4a" Dec 02 23:20:32 crc kubenswrapper[4696]: I1202 23:20:32.123069 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76658e1e-2226-4d9f-87e2-ad7c08c776cf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "76658e1e-2226-4d9f-87e2-ad7c08c776cf" (UID: "76658e1e-2226-4d9f-87e2-ad7c08c776cf"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:20:32 crc kubenswrapper[4696]: I1202 23:20:32.166129 4696 scope.go:117] "RemoveContainer" containerID="d50e57caf93995966c422e0e21ceed0ced4c4a22b7301351e2c8dd76d4a9d5ca" Dec 02 23:20:32 crc kubenswrapper[4696]: I1202 23:20:32.191561 4696 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76658e1e-2226-4d9f-87e2-ad7c08c776cf-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 23:20:32 crc kubenswrapper[4696]: I1202 23:20:32.191621 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k5vf4\" (UniqueName: \"kubernetes.io/projected/76658e1e-2226-4d9f-87e2-ad7c08c776cf-kube-api-access-k5vf4\") on node \"crc\" DevicePath \"\"" Dec 02 23:20:32 crc kubenswrapper[4696]: I1202 23:20:32.191633 4696 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76658e1e-2226-4d9f-87e2-ad7c08c776cf-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 23:20:32 crc kubenswrapper[4696]: I1202 23:20:32.219595 4696 scope.go:117] "RemoveContainer" containerID="5dbd90aa6276aea239ec01c34b2416a3e2304a56a2efabd37749270f7dc1a2fd" Dec 02 23:20:32 crc kubenswrapper[4696]: E1202 23:20:32.220340 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"5dbd90aa6276aea239ec01c34b2416a3e2304a56a2efabd37749270f7dc1a2fd\": container with ID starting with 5dbd90aa6276aea239ec01c34b2416a3e2304a56a2efabd37749270f7dc1a2fd not found: ID does not exist" containerID="5dbd90aa6276aea239ec01c34b2416a3e2304a56a2efabd37749270f7dc1a2fd" Dec 02 23:20:32 crc kubenswrapper[4696]: I1202 23:20:32.220401 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5dbd90aa6276aea239ec01c34b2416a3e2304a56a2efabd37749270f7dc1a2fd"} err="failed to get container status \"5dbd90aa6276aea239ec01c34b2416a3e2304a56a2efabd37749270f7dc1a2fd\": rpc error: code = NotFound desc = could not find container \"5dbd90aa6276aea239ec01c34b2416a3e2304a56a2efabd37749270f7dc1a2fd\": container with ID starting with 5dbd90aa6276aea239ec01c34b2416a3e2304a56a2efabd37749270f7dc1a2fd not found: ID does not exist" Dec 02 23:20:32 crc kubenswrapper[4696]: I1202 23:20:32.220437 4696 scope.go:117] "RemoveContainer" containerID="396189b2e3b25a4fda95dbb21a12dba9250fd8fa743e54e490d9e710af47dd4a" Dec 02 23:20:32 crc kubenswrapper[4696]: E1202 23:20:32.221616 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"396189b2e3b25a4fda95dbb21a12dba9250fd8fa743e54e490d9e710af47dd4a\": container with ID starting with 396189b2e3b25a4fda95dbb21a12dba9250fd8fa743e54e490d9e710af47dd4a not found: ID does not exist" containerID="396189b2e3b25a4fda95dbb21a12dba9250fd8fa743e54e490d9e710af47dd4a" Dec 02 23:20:32 crc kubenswrapper[4696]: I1202 23:20:32.221667 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"396189b2e3b25a4fda95dbb21a12dba9250fd8fa743e54e490d9e710af47dd4a"} err="failed to get container status \"396189b2e3b25a4fda95dbb21a12dba9250fd8fa743e54e490d9e710af47dd4a\": rpc error: code = NotFound desc = could not find container 
\"396189b2e3b25a4fda95dbb21a12dba9250fd8fa743e54e490d9e710af47dd4a\": container with ID starting with 396189b2e3b25a4fda95dbb21a12dba9250fd8fa743e54e490d9e710af47dd4a not found: ID does not exist" Dec 02 23:20:32 crc kubenswrapper[4696]: I1202 23:20:32.221701 4696 scope.go:117] "RemoveContainer" containerID="d50e57caf93995966c422e0e21ceed0ced4c4a22b7301351e2c8dd76d4a9d5ca" Dec 02 23:20:32 crc kubenswrapper[4696]: E1202 23:20:32.222561 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d50e57caf93995966c422e0e21ceed0ced4c4a22b7301351e2c8dd76d4a9d5ca\": container with ID starting with d50e57caf93995966c422e0e21ceed0ced4c4a22b7301351e2c8dd76d4a9d5ca not found: ID does not exist" containerID="d50e57caf93995966c422e0e21ceed0ced4c4a22b7301351e2c8dd76d4a9d5ca" Dec 02 23:20:32 crc kubenswrapper[4696]: I1202 23:20:32.222602 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d50e57caf93995966c422e0e21ceed0ced4c4a22b7301351e2c8dd76d4a9d5ca"} err="failed to get container status \"d50e57caf93995966c422e0e21ceed0ced4c4a22b7301351e2c8dd76d4a9d5ca\": rpc error: code = NotFound desc = could not find container \"d50e57caf93995966c422e0e21ceed0ced4c4a22b7301351e2c8dd76d4a9d5ca\": container with ID starting with d50e57caf93995966c422e0e21ceed0ced4c4a22b7301351e2c8dd76d4a9d5ca not found: ID does not exist" Dec 02 23:20:32 crc kubenswrapper[4696]: I1202 23:20:32.411260 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5l7zb"] Dec 02 23:20:32 crc kubenswrapper[4696]: I1202 23:20:32.422245 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-5l7zb"] Dec 02 23:20:33 crc kubenswrapper[4696]: I1202 23:20:33.449175 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76658e1e-2226-4d9f-87e2-ad7c08c776cf" 
path="/var/lib/kubelet/pods/76658e1e-2226-4d9f-87e2-ad7c08c776cf/volumes" Dec 02 23:21:22 crc kubenswrapper[4696]: I1202 23:21:22.974375 4696 patch_prober.go:28] interesting pod/machine-config-daemon-chq65 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 23:21:22 crc kubenswrapper[4696]: I1202 23:21:22.975065 4696 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 23:21:52 crc kubenswrapper[4696]: I1202 23:21:52.974416 4696 patch_prober.go:28] interesting pod/machine-config-daemon-chq65 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 23:21:52 crc kubenswrapper[4696]: I1202 23:21:52.975396 4696 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 23:22:22 crc kubenswrapper[4696]: I1202 23:22:22.973707 4696 patch_prober.go:28] interesting pod/machine-config-daemon-chq65 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 23:22:22 crc kubenswrapper[4696]: I1202 23:22:22.974824 4696 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 23:22:22 crc kubenswrapper[4696]: I1202 23:22:22.974896 4696 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-chq65" Dec 02 23:22:22 crc kubenswrapper[4696]: I1202 23:22:22.975985 4696 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c60305cde64a9129d2cf1a8f8c830e468e22e0b6c2bcea72c16bb3a3f13d7f23"} pod="openshift-machine-config-operator/machine-config-daemon-chq65" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 23:22:22 crc kubenswrapper[4696]: I1202 23:22:22.976083 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" containerName="machine-config-daemon" containerID="cri-o://c60305cde64a9129d2cf1a8f8c830e468e22e0b6c2bcea72c16bb3a3f13d7f23" gracePeriod=600 Dec 02 23:22:23 crc kubenswrapper[4696]: E1202 23:22:23.112630 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 02 23:22:23 crc kubenswrapper[4696]: I1202 23:22:23.444951 4696 generic.go:334] "Generic (PLEG): container finished" 
podID="53353260-c7c9-435c-91eb-3d5a1b441c4a" containerID="c60305cde64a9129d2cf1a8f8c830e468e22e0b6c2bcea72c16bb3a3f13d7f23" exitCode=0 Dec 02 23:22:23 crc kubenswrapper[4696]: I1202 23:22:23.445066 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-chq65" event={"ID":"53353260-c7c9-435c-91eb-3d5a1b441c4a","Type":"ContainerDied","Data":"c60305cde64a9129d2cf1a8f8c830e468e22e0b6c2bcea72c16bb3a3f13d7f23"} Dec 02 23:22:23 crc kubenswrapper[4696]: I1202 23:22:23.445575 4696 scope.go:117] "RemoveContainer" containerID="174d09adecaf1c340446fa66c2f7c4a5200987c0bf9dc9ef1ddf0654d92e86ea" Dec 02 23:22:23 crc kubenswrapper[4696]: I1202 23:22:23.446253 4696 scope.go:117] "RemoveContainer" containerID="c60305cde64a9129d2cf1a8f8c830e468e22e0b6c2bcea72c16bb3a3f13d7f23" Dec 02 23:22:23 crc kubenswrapper[4696]: E1202 23:22:23.446732 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 02 23:22:34 crc kubenswrapper[4696]: I1202 23:22:34.432456 4696 scope.go:117] "RemoveContainer" containerID="c60305cde64a9129d2cf1a8f8c830e468e22e0b6c2bcea72c16bb3a3f13d7f23" Dec 02 23:22:34 crc kubenswrapper[4696]: E1202 23:22:34.433356 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 02 
23:22:46 crc kubenswrapper[4696]: I1202 23:22:46.433251 4696 scope.go:117] "RemoveContainer" containerID="c60305cde64a9129d2cf1a8f8c830e468e22e0b6c2bcea72c16bb3a3f13d7f23" Dec 02 23:22:46 crc kubenswrapper[4696]: E1202 23:22:46.434397 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 02 23:23:02 crc kubenswrapper[4696]: I1202 23:23:02.432250 4696 scope.go:117] "RemoveContainer" containerID="c60305cde64a9129d2cf1a8f8c830e468e22e0b6c2bcea72c16bb3a3f13d7f23" Dec 02 23:23:02 crc kubenswrapper[4696]: E1202 23:23:02.433387 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 02 23:23:16 crc kubenswrapper[4696]: I1202 23:23:16.433652 4696 scope.go:117] "RemoveContainer" containerID="c60305cde64a9129d2cf1a8f8c830e468e22e0b6c2bcea72c16bb3a3f13d7f23" Dec 02 23:23:16 crc kubenswrapper[4696]: E1202 23:23:16.435672 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" 
podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 02 23:23:31 crc kubenswrapper[4696]: I1202 23:23:31.432323 4696 scope.go:117] "RemoveContainer" containerID="c60305cde64a9129d2cf1a8f8c830e468e22e0b6c2bcea72c16bb3a3f13d7f23" Dec 02 23:23:31 crc kubenswrapper[4696]: E1202 23:23:31.433422 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 02 23:23:43 crc kubenswrapper[4696]: I1202 23:23:43.432080 4696 scope.go:117] "RemoveContainer" containerID="c60305cde64a9129d2cf1a8f8c830e468e22e0b6c2bcea72c16bb3a3f13d7f23" Dec 02 23:23:43 crc kubenswrapper[4696]: E1202 23:23:43.433564 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 02 23:23:58 crc kubenswrapper[4696]: I1202 23:23:58.432613 4696 scope.go:117] "RemoveContainer" containerID="c60305cde64a9129d2cf1a8f8c830e468e22e0b6c2bcea72c16bb3a3f13d7f23" Dec 02 23:23:58 crc kubenswrapper[4696]: E1202 23:23:58.435134 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 02 23:24:10 crc kubenswrapper[4696]: I1202 23:24:10.433282 4696 scope.go:117] "RemoveContainer" containerID="c60305cde64a9129d2cf1a8f8c830e468e22e0b6c2bcea72c16bb3a3f13d7f23" Dec 02 23:24:10 crc kubenswrapper[4696]: E1202 23:24:10.434140 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 02 23:24:23 crc kubenswrapper[4696]: I1202 23:24:23.431897 4696 scope.go:117] "RemoveContainer" containerID="c60305cde64a9129d2cf1a8f8c830e468e22e0b6c2bcea72c16bb3a3f13d7f23" Dec 02 23:24:23 crc kubenswrapper[4696]: E1202 23:24:23.432988 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 02 23:24:38 crc kubenswrapper[4696]: I1202 23:24:38.432790 4696 scope.go:117] "RemoveContainer" containerID="c60305cde64a9129d2cf1a8f8c830e468e22e0b6c2bcea72c16bb3a3f13d7f23" Dec 02 23:24:38 crc kubenswrapper[4696]: E1202 23:24:38.434457 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 02 23:24:50 crc kubenswrapper[4696]: I1202 23:24:50.432601 4696 scope.go:117] "RemoveContainer" containerID="c60305cde64a9129d2cf1a8f8c830e468e22e0b6c2bcea72c16bb3a3f13d7f23" Dec 02 23:24:50 crc kubenswrapper[4696]: E1202 23:24:50.433578 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 02 23:25:03 crc kubenswrapper[4696]: I1202 23:25:03.431922 4696 scope.go:117] "RemoveContainer" containerID="c60305cde64a9129d2cf1a8f8c830e468e22e0b6c2bcea72c16bb3a3f13d7f23" Dec 02 23:25:03 crc kubenswrapper[4696]: E1202 23:25:03.433003 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 02 23:25:06 crc kubenswrapper[4696]: I1202 23:25:06.321962 4696 generic.go:334] "Generic (PLEG): container finished" podID="5697ae7a-9589-4939-a3a5-5613ee6094ab" containerID="b4d854a3e9b4852642b715fdbf1188a88287d5d14e24b0af33f808efbd2dffe0" exitCode=0 Dec 02 23:25:06 crc kubenswrapper[4696]: I1202 23:25:06.322040 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-526g5" event={"ID":"5697ae7a-9589-4939-a3a5-5613ee6094ab","Type":"ContainerDied","Data":"b4d854a3e9b4852642b715fdbf1188a88287d5d14e24b0af33f808efbd2dffe0"} Dec 02 23:25:07 crc kubenswrapper[4696]: I1202 23:25:07.775256 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-526g5" Dec 02 23:25:07 crc kubenswrapper[4696]: I1202 23:25:07.937066 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5697ae7a-9589-4939-a3a5-5613ee6094ab-libvirt-combined-ca-bundle\") pod \"5697ae7a-9589-4939-a3a5-5613ee6094ab\" (UID: \"5697ae7a-9589-4939-a3a5-5613ee6094ab\") " Dec 02 23:25:07 crc kubenswrapper[4696]: I1202 23:25:07.937263 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/5697ae7a-9589-4939-a3a5-5613ee6094ab-libvirt-secret-0\") pod \"5697ae7a-9589-4939-a3a5-5613ee6094ab\" (UID: \"5697ae7a-9589-4939-a3a5-5613ee6094ab\") " Dec 02 23:25:07 crc kubenswrapper[4696]: I1202 23:25:07.937444 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6zc4h\" (UniqueName: \"kubernetes.io/projected/5697ae7a-9589-4939-a3a5-5613ee6094ab-kube-api-access-6zc4h\") pod \"5697ae7a-9589-4939-a3a5-5613ee6094ab\" (UID: \"5697ae7a-9589-4939-a3a5-5613ee6094ab\") " Dec 02 23:25:07 crc kubenswrapper[4696]: I1202 23:25:07.937486 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5697ae7a-9589-4939-a3a5-5613ee6094ab-ssh-key\") pod \"5697ae7a-9589-4939-a3a5-5613ee6094ab\" (UID: \"5697ae7a-9589-4939-a3a5-5613ee6094ab\") " Dec 02 23:25:07 crc kubenswrapper[4696]: I1202 23:25:07.937527 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5697ae7a-9589-4939-a3a5-5613ee6094ab-inventory\") pod \"5697ae7a-9589-4939-a3a5-5613ee6094ab\" (UID: \"5697ae7a-9589-4939-a3a5-5613ee6094ab\") " Dec 02 23:25:07 crc kubenswrapper[4696]: I1202 23:25:07.944503 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5697ae7a-9589-4939-a3a5-5613ee6094ab-kube-api-access-6zc4h" (OuterVolumeSpecName: "kube-api-access-6zc4h") pod "5697ae7a-9589-4939-a3a5-5613ee6094ab" (UID: "5697ae7a-9589-4939-a3a5-5613ee6094ab"). InnerVolumeSpecName "kube-api-access-6zc4h". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:25:07 crc kubenswrapper[4696]: I1202 23:25:07.951286 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5697ae7a-9589-4939-a3a5-5613ee6094ab-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "5697ae7a-9589-4939-a3a5-5613ee6094ab" (UID: "5697ae7a-9589-4939-a3a5-5613ee6094ab"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:25:07 crc kubenswrapper[4696]: I1202 23:25:07.980405 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5697ae7a-9589-4939-a3a5-5613ee6094ab-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "5697ae7a-9589-4939-a3a5-5613ee6094ab" (UID: "5697ae7a-9589-4939-a3a5-5613ee6094ab"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:25:07 crc kubenswrapper[4696]: I1202 23:25:07.984099 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5697ae7a-9589-4939-a3a5-5613ee6094ab-inventory" (OuterVolumeSpecName: "inventory") pod "5697ae7a-9589-4939-a3a5-5613ee6094ab" (UID: "5697ae7a-9589-4939-a3a5-5613ee6094ab"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:25:07 crc kubenswrapper[4696]: I1202 23:25:07.987893 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5697ae7a-9589-4939-a3a5-5613ee6094ab-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "5697ae7a-9589-4939-a3a5-5613ee6094ab" (UID: "5697ae7a-9589-4939-a3a5-5613ee6094ab"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:25:08 crc kubenswrapper[4696]: I1202 23:25:08.041409 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6zc4h\" (UniqueName: \"kubernetes.io/projected/5697ae7a-9589-4939-a3a5-5613ee6094ab-kube-api-access-6zc4h\") on node \"crc\" DevicePath \"\"" Dec 02 23:25:08 crc kubenswrapper[4696]: I1202 23:25:08.041488 4696 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5697ae7a-9589-4939-a3a5-5613ee6094ab-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 23:25:08 crc kubenswrapper[4696]: I1202 23:25:08.041515 4696 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5697ae7a-9589-4939-a3a5-5613ee6094ab-inventory\") on node \"crc\" DevicePath \"\"" Dec 02 23:25:08 crc kubenswrapper[4696]: I1202 23:25:08.041528 4696 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5697ae7a-9589-4939-a3a5-5613ee6094ab-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 23:25:08 crc kubenswrapper[4696]: I1202 23:25:08.041615 4696 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/5697ae7a-9589-4939-a3a5-5613ee6094ab-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Dec 02 23:25:08 crc kubenswrapper[4696]: I1202 23:25:08.365024 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-526g5" 
event={"ID":"5697ae7a-9589-4939-a3a5-5613ee6094ab","Type":"ContainerDied","Data":"8a9e4635ab65ff2b5d001c2f12d5e75e4f5ccaf606f8192c64505c448a854b8c"} Dec 02 23:25:08 crc kubenswrapper[4696]: I1202 23:25:08.365661 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8a9e4635ab65ff2b5d001c2f12d5e75e4f5ccaf606f8192c64505c448a854b8c" Dec 02 23:25:08 crc kubenswrapper[4696]: I1202 23:25:08.365149 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-526g5" Dec 02 23:25:08 crc kubenswrapper[4696]: I1202 23:25:08.461623 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-2dd9x"] Dec 02 23:25:08 crc kubenswrapper[4696]: E1202 23:25:08.462155 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76658e1e-2226-4d9f-87e2-ad7c08c776cf" containerName="extract-utilities" Dec 02 23:25:08 crc kubenswrapper[4696]: I1202 23:25:08.462177 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="76658e1e-2226-4d9f-87e2-ad7c08c776cf" containerName="extract-utilities" Dec 02 23:25:08 crc kubenswrapper[4696]: E1202 23:25:08.462188 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76658e1e-2226-4d9f-87e2-ad7c08c776cf" containerName="registry-server" Dec 02 23:25:08 crc kubenswrapper[4696]: I1202 23:25:08.462194 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="76658e1e-2226-4d9f-87e2-ad7c08c776cf" containerName="registry-server" Dec 02 23:25:08 crc kubenswrapper[4696]: E1202 23:25:08.462218 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5697ae7a-9589-4939-a3a5-5613ee6094ab" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 02 23:25:08 crc kubenswrapper[4696]: I1202 23:25:08.462227 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="5697ae7a-9589-4939-a3a5-5613ee6094ab" 
containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 02 23:25:08 crc kubenswrapper[4696]: E1202 23:25:08.462267 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76658e1e-2226-4d9f-87e2-ad7c08c776cf" containerName="extract-content" Dec 02 23:25:08 crc kubenswrapper[4696]: I1202 23:25:08.462275 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="76658e1e-2226-4d9f-87e2-ad7c08c776cf" containerName="extract-content" Dec 02 23:25:08 crc kubenswrapper[4696]: I1202 23:25:08.462469 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="76658e1e-2226-4d9f-87e2-ad7c08c776cf" containerName="registry-server" Dec 02 23:25:08 crc kubenswrapper[4696]: I1202 23:25:08.462489 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="5697ae7a-9589-4939-a3a5-5613ee6094ab" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 02 23:25:08 crc kubenswrapper[4696]: I1202 23:25:08.463342 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-2dd9x" Dec 02 23:25:08 crc kubenswrapper[4696]: I1202 23:25:08.470226 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 02 23:25:08 crc kubenswrapper[4696]: I1202 23:25:08.470403 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Dec 02 23:25:08 crc kubenswrapper[4696]: I1202 23:25:08.470682 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-cvzmc" Dec 02 23:25:08 crc kubenswrapper[4696]: I1202 23:25:08.470896 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Dec 02 23:25:08 crc kubenswrapper[4696]: I1202 23:25:08.470693 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 23:25:08 crc kubenswrapper[4696]: I1202 23:25:08.477495 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Dec 02 23:25:08 crc kubenswrapper[4696]: I1202 23:25:08.477570 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 02 23:25:08 crc kubenswrapper[4696]: I1202 23:25:08.496310 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-2dd9x"] Dec 02 23:25:08 crc kubenswrapper[4696]: I1202 23:25:08.556947 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/7b36c51f-9889-4191-a7ea-b54a79542e0b-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-2dd9x\" (UID: \"7b36c51f-9889-4191-a7ea-b54a79542e0b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-2dd9x" Dec 02 23:25:08 crc kubenswrapper[4696]: I1202 
23:25:08.557396 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/7b36c51f-9889-4191-a7ea-b54a79542e0b-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-2dd9x\" (UID: \"7b36c51f-9889-4191-a7ea-b54a79542e0b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-2dd9x" Dec 02 23:25:08 crc kubenswrapper[4696]: I1202 23:25:08.557615 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nw9cn\" (UniqueName: \"kubernetes.io/projected/7b36c51f-9889-4191-a7ea-b54a79542e0b-kube-api-access-nw9cn\") pod \"nova-edpm-deployment-openstack-edpm-ipam-2dd9x\" (UID: \"7b36c51f-9889-4191-a7ea-b54a79542e0b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-2dd9x" Dec 02 23:25:08 crc kubenswrapper[4696]: I1202 23:25:08.557889 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b36c51f-9889-4191-a7ea-b54a79542e0b-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-2dd9x\" (UID: \"7b36c51f-9889-4191-a7ea-b54a79542e0b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-2dd9x" Dec 02 23:25:08 crc kubenswrapper[4696]: I1202 23:25:08.558078 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7b36c51f-9889-4191-a7ea-b54a79542e0b-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-2dd9x\" (UID: \"7b36c51f-9889-4191-a7ea-b54a79542e0b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-2dd9x" Dec 02 23:25:08 crc kubenswrapper[4696]: I1202 23:25:08.558313 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: 
\"kubernetes.io/secret/7b36c51f-9889-4191-a7ea-b54a79542e0b-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-2dd9x\" (UID: \"7b36c51f-9889-4191-a7ea-b54a79542e0b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-2dd9x" Dec 02 23:25:08 crc kubenswrapper[4696]: I1202 23:25:08.558589 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/7b36c51f-9889-4191-a7ea-b54a79542e0b-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-2dd9x\" (UID: \"7b36c51f-9889-4191-a7ea-b54a79542e0b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-2dd9x" Dec 02 23:25:08 crc kubenswrapper[4696]: I1202 23:25:08.558933 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/7b36c51f-9889-4191-a7ea-b54a79542e0b-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-2dd9x\" (UID: \"7b36c51f-9889-4191-a7ea-b54a79542e0b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-2dd9x" Dec 02 23:25:08 crc kubenswrapper[4696]: I1202 23:25:08.559205 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7b36c51f-9889-4191-a7ea-b54a79542e0b-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-2dd9x\" (UID: \"7b36c51f-9889-4191-a7ea-b54a79542e0b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-2dd9x" Dec 02 23:25:08 crc kubenswrapper[4696]: I1202 23:25:08.661619 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/7b36c51f-9889-4191-a7ea-b54a79542e0b-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-2dd9x\" (UID: \"7b36c51f-9889-4191-a7ea-b54a79542e0b\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-2dd9x" Dec 02 23:25:08 crc kubenswrapper[4696]: I1202 23:25:08.661818 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/7b36c51f-9889-4191-a7ea-b54a79542e0b-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-2dd9x\" (UID: \"7b36c51f-9889-4191-a7ea-b54a79542e0b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-2dd9x" Dec 02 23:25:08 crc kubenswrapper[4696]: I1202 23:25:08.661863 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nw9cn\" (UniqueName: \"kubernetes.io/projected/7b36c51f-9889-4191-a7ea-b54a79542e0b-kube-api-access-nw9cn\") pod \"nova-edpm-deployment-openstack-edpm-ipam-2dd9x\" (UID: \"7b36c51f-9889-4191-a7ea-b54a79542e0b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-2dd9x" Dec 02 23:25:08 crc kubenswrapper[4696]: I1202 23:25:08.661921 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b36c51f-9889-4191-a7ea-b54a79542e0b-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-2dd9x\" (UID: \"7b36c51f-9889-4191-a7ea-b54a79542e0b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-2dd9x" Dec 02 23:25:08 crc kubenswrapper[4696]: I1202 23:25:08.661966 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7b36c51f-9889-4191-a7ea-b54a79542e0b-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-2dd9x\" (UID: \"7b36c51f-9889-4191-a7ea-b54a79542e0b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-2dd9x" Dec 02 23:25:08 crc kubenswrapper[4696]: I1202 23:25:08.662002 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: 
\"kubernetes.io/secret/7b36c51f-9889-4191-a7ea-b54a79542e0b-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-2dd9x\" (UID: \"7b36c51f-9889-4191-a7ea-b54a79542e0b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-2dd9x" Dec 02 23:25:08 crc kubenswrapper[4696]: I1202 23:25:08.662084 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/7b36c51f-9889-4191-a7ea-b54a79542e0b-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-2dd9x\" (UID: \"7b36c51f-9889-4191-a7ea-b54a79542e0b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-2dd9x" Dec 02 23:25:08 crc kubenswrapper[4696]: I1202 23:25:08.662117 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/7b36c51f-9889-4191-a7ea-b54a79542e0b-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-2dd9x\" (UID: \"7b36c51f-9889-4191-a7ea-b54a79542e0b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-2dd9x" Dec 02 23:25:08 crc kubenswrapper[4696]: I1202 23:25:08.662201 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7b36c51f-9889-4191-a7ea-b54a79542e0b-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-2dd9x\" (UID: \"7b36c51f-9889-4191-a7ea-b54a79542e0b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-2dd9x" Dec 02 23:25:08 crc kubenswrapper[4696]: I1202 23:25:08.663229 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/7b36c51f-9889-4191-a7ea-b54a79542e0b-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-2dd9x\" (UID: \"7b36c51f-9889-4191-a7ea-b54a79542e0b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-2dd9x" Dec 02 
23:25:08 crc kubenswrapper[4696]: I1202 23:25:08.668127 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7b36c51f-9889-4191-a7ea-b54a79542e0b-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-2dd9x\" (UID: \"7b36c51f-9889-4191-a7ea-b54a79542e0b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-2dd9x" Dec 02 23:25:08 crc kubenswrapper[4696]: I1202 23:25:08.668653 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/7b36c51f-9889-4191-a7ea-b54a79542e0b-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-2dd9x\" (UID: \"7b36c51f-9889-4191-a7ea-b54a79542e0b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-2dd9x" Dec 02 23:25:08 crc kubenswrapper[4696]: I1202 23:25:08.669144 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/7b36c51f-9889-4191-a7ea-b54a79542e0b-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-2dd9x\" (UID: \"7b36c51f-9889-4191-a7ea-b54a79542e0b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-2dd9x" Dec 02 23:25:08 crc kubenswrapper[4696]: I1202 23:25:08.669378 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7b36c51f-9889-4191-a7ea-b54a79542e0b-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-2dd9x\" (UID: \"7b36c51f-9889-4191-a7ea-b54a79542e0b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-2dd9x" Dec 02 23:25:08 crc kubenswrapper[4696]: I1202 23:25:08.671270 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/7b36c51f-9889-4191-a7ea-b54a79542e0b-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-2dd9x\" (UID: 
\"7b36c51f-9889-4191-a7ea-b54a79542e0b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-2dd9x" Dec 02 23:25:08 crc kubenswrapper[4696]: I1202 23:25:08.671491 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/7b36c51f-9889-4191-a7ea-b54a79542e0b-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-2dd9x\" (UID: \"7b36c51f-9889-4191-a7ea-b54a79542e0b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-2dd9x" Dec 02 23:25:08 crc kubenswrapper[4696]: I1202 23:25:08.672432 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b36c51f-9889-4191-a7ea-b54a79542e0b-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-2dd9x\" (UID: \"7b36c51f-9889-4191-a7ea-b54a79542e0b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-2dd9x" Dec 02 23:25:08 crc kubenswrapper[4696]: I1202 23:25:08.685592 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nw9cn\" (UniqueName: \"kubernetes.io/projected/7b36c51f-9889-4191-a7ea-b54a79542e0b-kube-api-access-nw9cn\") pod \"nova-edpm-deployment-openstack-edpm-ipam-2dd9x\" (UID: \"7b36c51f-9889-4191-a7ea-b54a79542e0b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-2dd9x" Dec 02 23:25:08 crc kubenswrapper[4696]: I1202 23:25:08.794315 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-2dd9x" Dec 02 23:25:09 crc kubenswrapper[4696]: I1202 23:25:09.387101 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-2dd9x"] Dec 02 23:25:09 crc kubenswrapper[4696]: I1202 23:25:09.397380 4696 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 02 23:25:10 crc kubenswrapper[4696]: I1202 23:25:10.389219 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-2dd9x" event={"ID":"7b36c51f-9889-4191-a7ea-b54a79542e0b","Type":"ContainerStarted","Data":"51e8c59f33426ad7bcbe51c43e2606776746c1ffcc366aa45786aa967fb631bf"} Dec 02 23:25:10 crc kubenswrapper[4696]: I1202 23:25:10.390098 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-2dd9x" event={"ID":"7b36c51f-9889-4191-a7ea-b54a79542e0b","Type":"ContainerStarted","Data":"5aabd8647084368f7d3f4924356dc9b014f7b4df8cab3198541f0c55d0690ab6"} Dec 02 23:25:10 crc kubenswrapper[4696]: I1202 23:25:10.413190 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-2dd9x" podStartSLOduration=1.8943222880000001 podStartE2EDuration="2.413169345s" podCreationTimestamp="2025-12-02 23:25:08 +0000 UTC" firstStartedPulling="2025-12-02 23:25:09.397153302 +0000 UTC m=+2572.277833303" lastFinishedPulling="2025-12-02 23:25:09.916000359 +0000 UTC m=+2572.796680360" observedRunningTime="2025-12-02 23:25:10.40980025 +0000 UTC m=+2573.290480261" watchObservedRunningTime="2025-12-02 23:25:10.413169345 +0000 UTC m=+2573.293849346" Dec 02 23:25:18 crc kubenswrapper[4696]: I1202 23:25:18.432335 4696 scope.go:117] "RemoveContainer" containerID="c60305cde64a9129d2cf1a8f8c830e468e22e0b6c2bcea72c16bb3a3f13d7f23" Dec 02 23:25:18 crc kubenswrapper[4696]: E1202 23:25:18.433474 4696 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 02 23:25:30 crc kubenswrapper[4696]: I1202 23:25:30.433143 4696 scope.go:117] "RemoveContainer" containerID="c60305cde64a9129d2cf1a8f8c830e468e22e0b6c2bcea72c16bb3a3f13d7f23" Dec 02 23:25:30 crc kubenswrapper[4696]: E1202 23:25:30.434821 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 02 23:25:43 crc kubenswrapper[4696]: I1202 23:25:43.434030 4696 scope.go:117] "RemoveContainer" containerID="c60305cde64a9129d2cf1a8f8c830e468e22e0b6c2bcea72c16bb3a3f13d7f23" Dec 02 23:25:43 crc kubenswrapper[4696]: E1202 23:25:43.437037 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 02 23:25:54 crc kubenswrapper[4696]: I1202 23:25:54.432388 4696 scope.go:117] "RemoveContainer" containerID="c60305cde64a9129d2cf1a8f8c830e468e22e0b6c2bcea72c16bb3a3f13d7f23" Dec 02 23:25:54 crc kubenswrapper[4696]: E1202 
23:25:54.433856 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 02 23:26:06 crc kubenswrapper[4696]: I1202 23:26:06.431926 4696 scope.go:117] "RemoveContainer" containerID="c60305cde64a9129d2cf1a8f8c830e468e22e0b6c2bcea72c16bb3a3f13d7f23" Dec 02 23:26:06 crc kubenswrapper[4696]: E1202 23:26:06.433721 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 02 23:26:18 crc kubenswrapper[4696]: I1202 23:26:18.432111 4696 scope.go:117] "RemoveContainer" containerID="c60305cde64a9129d2cf1a8f8c830e468e22e0b6c2bcea72c16bb3a3f13d7f23" Dec 02 23:26:18 crc kubenswrapper[4696]: E1202 23:26:18.433142 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 02 23:26:33 crc kubenswrapper[4696]: I1202 23:26:33.432327 4696 scope.go:117] "RemoveContainer" containerID="c60305cde64a9129d2cf1a8f8c830e468e22e0b6c2bcea72c16bb3a3f13d7f23" Dec 02 23:26:33 crc 
kubenswrapper[4696]: E1202 23:26:33.433384 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 02 23:26:48 crc kubenswrapper[4696]: I1202 23:26:48.432813 4696 scope.go:117] "RemoveContainer" containerID="c60305cde64a9129d2cf1a8f8c830e468e22e0b6c2bcea72c16bb3a3f13d7f23" Dec 02 23:26:48 crc kubenswrapper[4696]: E1202 23:26:48.434095 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 02 23:27:00 crc kubenswrapper[4696]: I1202 23:27:00.432635 4696 scope.go:117] "RemoveContainer" containerID="c60305cde64a9129d2cf1a8f8c830e468e22e0b6c2bcea72c16bb3a3f13d7f23" Dec 02 23:27:00 crc kubenswrapper[4696]: E1202 23:27:00.433783 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 02 23:27:14 crc kubenswrapper[4696]: I1202 23:27:14.431759 4696 scope.go:117] "RemoveContainer" containerID="c60305cde64a9129d2cf1a8f8c830e468e22e0b6c2bcea72c16bb3a3f13d7f23" Dec 
02 23:27:14 crc kubenswrapper[4696]: E1202 23:27:14.432591 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 02 23:27:27 crc kubenswrapper[4696]: I1202 23:27:27.441283 4696 scope.go:117] "RemoveContainer" containerID="c60305cde64a9129d2cf1a8f8c830e468e22e0b6c2bcea72c16bb3a3f13d7f23" Dec 02 23:27:28 crc kubenswrapper[4696]: I1202 23:27:28.406169 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-chq65" event={"ID":"53353260-c7c9-435c-91eb-3d5a1b441c4a","Type":"ContainerStarted","Data":"8065f2d7bf333b48d27aa170233dbb7ad2ef7a547f6e30836fa968a411dbfeae"} Dec 02 23:28:29 crc kubenswrapper[4696]: I1202 23:28:29.192353 4696 generic.go:334] "Generic (PLEG): container finished" podID="7b36c51f-9889-4191-a7ea-b54a79542e0b" containerID="51e8c59f33426ad7bcbe51c43e2606776746c1ffcc366aa45786aa967fb631bf" exitCode=0 Dec 02 23:28:29 crc kubenswrapper[4696]: I1202 23:28:29.192531 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-2dd9x" event={"ID":"7b36c51f-9889-4191-a7ea-b54a79542e0b","Type":"ContainerDied","Data":"51e8c59f33426ad7bcbe51c43e2606776746c1ffcc366aa45786aa967fb631bf"} Dec 02 23:28:30 crc kubenswrapper[4696]: I1202 23:28:30.759107 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-2dd9x" Dec 02 23:28:30 crc kubenswrapper[4696]: I1202 23:28:30.847600 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/7b36c51f-9889-4191-a7ea-b54a79542e0b-nova-migration-ssh-key-1\") pod \"7b36c51f-9889-4191-a7ea-b54a79542e0b\" (UID: \"7b36c51f-9889-4191-a7ea-b54a79542e0b\") " Dec 02 23:28:30 crc kubenswrapper[4696]: I1202 23:28:30.847669 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/7b36c51f-9889-4191-a7ea-b54a79542e0b-nova-cell1-compute-config-1\") pod \"7b36c51f-9889-4191-a7ea-b54a79542e0b\" (UID: \"7b36c51f-9889-4191-a7ea-b54a79542e0b\") " Dec 02 23:28:30 crc kubenswrapper[4696]: I1202 23:28:30.847710 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nw9cn\" (UniqueName: \"kubernetes.io/projected/7b36c51f-9889-4191-a7ea-b54a79542e0b-kube-api-access-nw9cn\") pod \"7b36c51f-9889-4191-a7ea-b54a79542e0b\" (UID: \"7b36c51f-9889-4191-a7ea-b54a79542e0b\") " Dec 02 23:28:30 crc kubenswrapper[4696]: I1202 23:28:30.847765 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/7b36c51f-9889-4191-a7ea-b54a79542e0b-nova-extra-config-0\") pod \"7b36c51f-9889-4191-a7ea-b54a79542e0b\" (UID: \"7b36c51f-9889-4191-a7ea-b54a79542e0b\") " Dec 02 23:28:30 crc kubenswrapper[4696]: I1202 23:28:30.847970 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7b36c51f-9889-4191-a7ea-b54a79542e0b-inventory\") pod \"7b36c51f-9889-4191-a7ea-b54a79542e0b\" (UID: \"7b36c51f-9889-4191-a7ea-b54a79542e0b\") " Dec 02 23:28:30 crc kubenswrapper[4696]: I1202 23:28:30.848003 4696 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/7b36c51f-9889-4191-a7ea-b54a79542e0b-nova-migration-ssh-key-0\") pod \"7b36c51f-9889-4191-a7ea-b54a79542e0b\" (UID: \"7b36c51f-9889-4191-a7ea-b54a79542e0b\") " Dec 02 23:28:30 crc kubenswrapper[4696]: I1202 23:28:30.848021 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7b36c51f-9889-4191-a7ea-b54a79542e0b-ssh-key\") pod \"7b36c51f-9889-4191-a7ea-b54a79542e0b\" (UID: \"7b36c51f-9889-4191-a7ea-b54a79542e0b\") " Dec 02 23:28:30 crc kubenswrapper[4696]: I1202 23:28:30.848072 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b36c51f-9889-4191-a7ea-b54a79542e0b-nova-combined-ca-bundle\") pod \"7b36c51f-9889-4191-a7ea-b54a79542e0b\" (UID: \"7b36c51f-9889-4191-a7ea-b54a79542e0b\") " Dec 02 23:28:30 crc kubenswrapper[4696]: I1202 23:28:30.848156 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/7b36c51f-9889-4191-a7ea-b54a79542e0b-nova-cell1-compute-config-0\") pod \"7b36c51f-9889-4191-a7ea-b54a79542e0b\" (UID: \"7b36c51f-9889-4191-a7ea-b54a79542e0b\") " Dec 02 23:28:30 crc kubenswrapper[4696]: I1202 23:28:30.857283 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b36c51f-9889-4191-a7ea-b54a79542e0b-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "7b36c51f-9889-4191-a7ea-b54a79542e0b" (UID: "7b36c51f-9889-4191-a7ea-b54a79542e0b"). InnerVolumeSpecName "nova-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:28:30 crc kubenswrapper[4696]: I1202 23:28:30.878901 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b36c51f-9889-4191-a7ea-b54a79542e0b-kube-api-access-nw9cn" (OuterVolumeSpecName: "kube-api-access-nw9cn") pod "7b36c51f-9889-4191-a7ea-b54a79542e0b" (UID: "7b36c51f-9889-4191-a7ea-b54a79542e0b"). InnerVolumeSpecName "kube-api-access-nw9cn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:28:30 crc kubenswrapper[4696]: I1202 23:28:30.888168 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b36c51f-9889-4191-a7ea-b54a79542e0b-inventory" (OuterVolumeSpecName: "inventory") pod "7b36c51f-9889-4191-a7ea-b54a79542e0b" (UID: "7b36c51f-9889-4191-a7ea-b54a79542e0b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:28:30 crc kubenswrapper[4696]: I1202 23:28:30.895018 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b36c51f-9889-4191-a7ea-b54a79542e0b-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "7b36c51f-9889-4191-a7ea-b54a79542e0b" (UID: "7b36c51f-9889-4191-a7ea-b54a79542e0b"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:28:30 crc kubenswrapper[4696]: I1202 23:28:30.897866 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b36c51f-9889-4191-a7ea-b54a79542e0b-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "7b36c51f-9889-4191-a7ea-b54a79542e0b" (UID: "7b36c51f-9889-4191-a7ea-b54a79542e0b"). InnerVolumeSpecName "nova-cell1-compute-config-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:28:30 crc kubenswrapper[4696]: I1202 23:28:30.898456 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b36c51f-9889-4191-a7ea-b54a79542e0b-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "7b36c51f-9889-4191-a7ea-b54a79542e0b" (UID: "7b36c51f-9889-4191-a7ea-b54a79542e0b"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:28:30 crc kubenswrapper[4696]: I1202 23:28:30.898575 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b36c51f-9889-4191-a7ea-b54a79542e0b-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "7b36c51f-9889-4191-a7ea-b54a79542e0b" (UID: "7b36c51f-9889-4191-a7ea-b54a79542e0b"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:28:30 crc kubenswrapper[4696]: I1202 23:28:30.903272 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b36c51f-9889-4191-a7ea-b54a79542e0b-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "7b36c51f-9889-4191-a7ea-b54a79542e0b" (UID: "7b36c51f-9889-4191-a7ea-b54a79542e0b"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:28:30 crc kubenswrapper[4696]: I1202 23:28:30.905563 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b36c51f-9889-4191-a7ea-b54a79542e0b-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "7b36c51f-9889-4191-a7ea-b54a79542e0b" (UID: "7b36c51f-9889-4191-a7ea-b54a79542e0b"). InnerVolumeSpecName "nova-migration-ssh-key-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:28:30 crc kubenswrapper[4696]: I1202 23:28:30.951032 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nw9cn\" (UniqueName: \"kubernetes.io/projected/7b36c51f-9889-4191-a7ea-b54a79542e0b-kube-api-access-nw9cn\") on node \"crc\" DevicePath \"\"" Dec 02 23:28:30 crc kubenswrapper[4696]: I1202 23:28:30.951087 4696 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/7b36c51f-9889-4191-a7ea-b54a79542e0b-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Dec 02 23:28:30 crc kubenswrapper[4696]: I1202 23:28:30.951100 4696 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7b36c51f-9889-4191-a7ea-b54a79542e0b-inventory\") on node \"crc\" DevicePath \"\"" Dec 02 23:28:30 crc kubenswrapper[4696]: I1202 23:28:30.951111 4696 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/7b36c51f-9889-4191-a7ea-b54a79542e0b-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Dec 02 23:28:30 crc kubenswrapper[4696]: I1202 23:28:30.951124 4696 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7b36c51f-9889-4191-a7ea-b54a79542e0b-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 23:28:30 crc kubenswrapper[4696]: I1202 23:28:30.951134 4696 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b36c51f-9889-4191-a7ea-b54a79542e0b-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 23:28:30 crc kubenswrapper[4696]: I1202 23:28:30.951147 4696 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/7b36c51f-9889-4191-a7ea-b54a79542e0b-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Dec 02 23:28:30 crc 
kubenswrapper[4696]: I1202 23:28:30.951157 4696 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/7b36c51f-9889-4191-a7ea-b54a79542e0b-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Dec 02 23:28:30 crc kubenswrapper[4696]: I1202 23:28:30.951166 4696 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/7b36c51f-9889-4191-a7ea-b54a79542e0b-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Dec 02 23:28:31 crc kubenswrapper[4696]: I1202 23:28:31.218499 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-2dd9x" event={"ID":"7b36c51f-9889-4191-a7ea-b54a79542e0b","Type":"ContainerDied","Data":"5aabd8647084368f7d3f4924356dc9b014f7b4df8cab3198541f0c55d0690ab6"} Dec 02 23:28:31 crc kubenswrapper[4696]: I1202 23:28:31.218896 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5aabd8647084368f7d3f4924356dc9b014f7b4df8cab3198541f0c55d0690ab6" Dec 02 23:28:31 crc kubenswrapper[4696]: I1202 23:28:31.218564 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-2dd9x" Dec 02 23:28:31 crc kubenswrapper[4696]: I1202 23:28:31.344243 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4ljbq"] Dec 02 23:28:31 crc kubenswrapper[4696]: E1202 23:28:31.345059 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b36c51f-9889-4191-a7ea-b54a79542e0b" containerName="nova-edpm-deployment-openstack-edpm-ipam" Dec 02 23:28:31 crc kubenswrapper[4696]: I1202 23:28:31.345188 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b36c51f-9889-4191-a7ea-b54a79542e0b" containerName="nova-edpm-deployment-openstack-edpm-ipam" Dec 02 23:28:31 crc kubenswrapper[4696]: I1202 23:28:31.345435 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b36c51f-9889-4191-a7ea-b54a79542e0b" containerName="nova-edpm-deployment-openstack-edpm-ipam" Dec 02 23:28:31 crc kubenswrapper[4696]: I1202 23:28:31.346399 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4ljbq" Dec 02 23:28:31 crc kubenswrapper[4696]: I1202 23:28:31.349109 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 02 23:28:31 crc kubenswrapper[4696]: I1202 23:28:31.349391 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 02 23:28:31 crc kubenswrapper[4696]: I1202 23:28:31.349551 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-cvzmc" Dec 02 23:28:31 crc kubenswrapper[4696]: I1202 23:28:31.349677 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 23:28:31 crc kubenswrapper[4696]: I1202 23:28:31.357835 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Dec 02 23:28:31 crc kubenswrapper[4696]: I1202 23:28:31.360266 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4ljbq"] Dec 02 23:28:31 crc kubenswrapper[4696]: I1202 23:28:31.463358 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3c9ec356-4712-4484-9b78-9e5d4831dac1-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4ljbq\" (UID: \"3c9ec356-4712-4484-9b78-9e5d4831dac1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4ljbq" Dec 02 23:28:31 crc kubenswrapper[4696]: I1202 23:28:31.463450 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c9ec356-4712-4484-9b78-9e5d4831dac1-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4ljbq\" (UID: 
\"3c9ec356-4712-4484-9b78-9e5d4831dac1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4ljbq" Dec 02 23:28:31 crc kubenswrapper[4696]: I1202 23:28:31.463474 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/3c9ec356-4712-4484-9b78-9e5d4831dac1-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4ljbq\" (UID: \"3c9ec356-4712-4484-9b78-9e5d4831dac1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4ljbq" Dec 02 23:28:31 crc kubenswrapper[4696]: I1202 23:28:31.463496 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/3c9ec356-4712-4484-9b78-9e5d4831dac1-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4ljbq\" (UID: \"3c9ec356-4712-4484-9b78-9e5d4831dac1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4ljbq" Dec 02 23:28:31 crc kubenswrapper[4696]: I1202 23:28:31.463548 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/3c9ec356-4712-4484-9b78-9e5d4831dac1-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4ljbq\" (UID: \"3c9ec356-4712-4484-9b78-9e5d4831dac1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4ljbq" Dec 02 23:28:31 crc kubenswrapper[4696]: I1202 23:28:31.463587 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3c9ec356-4712-4484-9b78-9e5d4831dac1-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4ljbq\" (UID: \"3c9ec356-4712-4484-9b78-9e5d4831dac1\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4ljbq" Dec 02 23:28:31 crc kubenswrapper[4696]: I1202 23:28:31.463642 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhfzl\" (UniqueName: \"kubernetes.io/projected/3c9ec356-4712-4484-9b78-9e5d4831dac1-kube-api-access-qhfzl\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4ljbq\" (UID: \"3c9ec356-4712-4484-9b78-9e5d4831dac1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4ljbq" Dec 02 23:28:31 crc kubenswrapper[4696]: I1202 23:28:31.566130 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/3c9ec356-4712-4484-9b78-9e5d4831dac1-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4ljbq\" (UID: \"3c9ec356-4712-4484-9b78-9e5d4831dac1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4ljbq" Dec 02 23:28:31 crc kubenswrapper[4696]: I1202 23:28:31.566211 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3c9ec356-4712-4484-9b78-9e5d4831dac1-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4ljbq\" (UID: \"3c9ec356-4712-4484-9b78-9e5d4831dac1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4ljbq" Dec 02 23:28:31 crc kubenswrapper[4696]: I1202 23:28:31.566281 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qhfzl\" (UniqueName: \"kubernetes.io/projected/3c9ec356-4712-4484-9b78-9e5d4831dac1-kube-api-access-qhfzl\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4ljbq\" (UID: \"3c9ec356-4712-4484-9b78-9e5d4831dac1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4ljbq" Dec 02 23:28:31 crc kubenswrapper[4696]: I1202 23:28:31.566313 4696 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3c9ec356-4712-4484-9b78-9e5d4831dac1-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4ljbq\" (UID: \"3c9ec356-4712-4484-9b78-9e5d4831dac1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4ljbq" Dec 02 23:28:31 crc kubenswrapper[4696]: I1202 23:28:31.566368 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c9ec356-4712-4484-9b78-9e5d4831dac1-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4ljbq\" (UID: \"3c9ec356-4712-4484-9b78-9e5d4831dac1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4ljbq" Dec 02 23:28:31 crc kubenswrapper[4696]: I1202 23:28:31.566389 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/3c9ec356-4712-4484-9b78-9e5d4831dac1-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4ljbq\" (UID: \"3c9ec356-4712-4484-9b78-9e5d4831dac1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4ljbq" Dec 02 23:28:31 crc kubenswrapper[4696]: I1202 23:28:31.566412 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/3c9ec356-4712-4484-9b78-9e5d4831dac1-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4ljbq\" (UID: \"3c9ec356-4712-4484-9b78-9e5d4831dac1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4ljbq" Dec 02 23:28:31 crc kubenswrapper[4696]: I1202 23:28:31.573756 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3c9ec356-4712-4484-9b78-9e5d4831dac1-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4ljbq\" (UID: 
\"3c9ec356-4712-4484-9b78-9e5d4831dac1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4ljbq" Dec 02 23:28:31 crc kubenswrapper[4696]: I1202 23:28:31.573756 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/3c9ec356-4712-4484-9b78-9e5d4831dac1-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4ljbq\" (UID: \"3c9ec356-4712-4484-9b78-9e5d4831dac1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4ljbq" Dec 02 23:28:31 crc kubenswrapper[4696]: I1202 23:28:31.575353 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/3c9ec356-4712-4484-9b78-9e5d4831dac1-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4ljbq\" (UID: \"3c9ec356-4712-4484-9b78-9e5d4831dac1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4ljbq" Dec 02 23:28:31 crc kubenswrapper[4696]: I1202 23:28:31.575390 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/3c9ec356-4712-4484-9b78-9e5d4831dac1-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4ljbq\" (UID: \"3c9ec356-4712-4484-9b78-9e5d4831dac1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4ljbq" Dec 02 23:28:31 crc kubenswrapper[4696]: I1202 23:28:31.576802 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3c9ec356-4712-4484-9b78-9e5d4831dac1-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4ljbq\" (UID: \"3c9ec356-4712-4484-9b78-9e5d4831dac1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4ljbq" Dec 02 23:28:31 crc kubenswrapper[4696]: I1202 23:28:31.576903 4696 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c9ec356-4712-4484-9b78-9e5d4831dac1-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4ljbq\" (UID: \"3c9ec356-4712-4484-9b78-9e5d4831dac1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4ljbq" Dec 02 23:28:31 crc kubenswrapper[4696]: I1202 23:28:31.587935 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhfzl\" (UniqueName: \"kubernetes.io/projected/3c9ec356-4712-4484-9b78-9e5d4831dac1-kube-api-access-qhfzl\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4ljbq\" (UID: \"3c9ec356-4712-4484-9b78-9e5d4831dac1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4ljbq" Dec 02 23:28:31 crc kubenswrapper[4696]: I1202 23:28:31.668258 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4ljbq" Dec 02 23:28:32 crc kubenswrapper[4696]: I1202 23:28:32.265016 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4ljbq"] Dec 02 23:28:32 crc kubenswrapper[4696]: W1202 23:28:32.275510 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3c9ec356_4712_4484_9b78_9e5d4831dac1.slice/crio-8242ab860b09deaa7b43d5879e0b5c8ef3564aacdb070d1f3194a91c687bab75 WatchSource:0}: Error finding container 8242ab860b09deaa7b43d5879e0b5c8ef3564aacdb070d1f3194a91c687bab75: Status 404 returned error can't find the container with id 8242ab860b09deaa7b43d5879e0b5c8ef3564aacdb070d1f3194a91c687bab75 Dec 02 23:28:33 crc kubenswrapper[4696]: I1202 23:28:33.245090 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4ljbq" 
event={"ID":"3c9ec356-4712-4484-9b78-9e5d4831dac1","Type":"ContainerStarted","Data":"89f070705d5b9ae666806f4c5972c77bddc06cadfbd14757677a3573b98cbc44"} Dec 02 23:28:33 crc kubenswrapper[4696]: I1202 23:28:33.245613 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4ljbq" event={"ID":"3c9ec356-4712-4484-9b78-9e5d4831dac1","Type":"ContainerStarted","Data":"8242ab860b09deaa7b43d5879e0b5c8ef3564aacdb070d1f3194a91c687bab75"} Dec 02 23:28:33 crc kubenswrapper[4696]: I1202 23:28:33.273591 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4ljbq" podStartSLOduration=1.839035832 podStartE2EDuration="2.273564135s" podCreationTimestamp="2025-12-02 23:28:31 +0000 UTC" firstStartedPulling="2025-12-02 23:28:32.280881946 +0000 UTC m=+2775.161561957" lastFinishedPulling="2025-12-02 23:28:32.715410219 +0000 UTC m=+2775.596090260" observedRunningTime="2025-12-02 23:28:33.264282123 +0000 UTC m=+2776.144962134" watchObservedRunningTime="2025-12-02 23:28:33.273564135 +0000 UTC m=+2776.154244136" Dec 02 23:29:52 crc kubenswrapper[4696]: I1202 23:29:52.974181 4696 patch_prober.go:28] interesting pod/machine-config-daemon-chq65 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 23:29:52 crc kubenswrapper[4696]: I1202 23:29:52.975042 4696 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 23:30:00 crc kubenswrapper[4696]: I1202 23:30:00.166374 4696 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-operator-lifecycle-manager/collect-profiles-29411970-d2zkl"] Dec 02 23:30:00 crc kubenswrapper[4696]: I1202 23:30:00.168997 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411970-d2zkl" Dec 02 23:30:00 crc kubenswrapper[4696]: I1202 23:30:00.172258 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 02 23:30:00 crc kubenswrapper[4696]: I1202 23:30:00.172449 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 02 23:30:00 crc kubenswrapper[4696]: I1202 23:30:00.179927 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411970-d2zkl"] Dec 02 23:30:00 crc kubenswrapper[4696]: I1202 23:30:00.321050 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9deb9e1b-418a-413b-8329-fbbfadd5660d-secret-volume\") pod \"collect-profiles-29411970-d2zkl\" (UID: \"9deb9e1b-418a-413b-8329-fbbfadd5660d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411970-d2zkl" Dec 02 23:30:00 crc kubenswrapper[4696]: I1202 23:30:00.321163 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9deb9e1b-418a-413b-8329-fbbfadd5660d-config-volume\") pod \"collect-profiles-29411970-d2zkl\" (UID: \"9deb9e1b-418a-413b-8329-fbbfadd5660d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411970-d2zkl" Dec 02 23:30:00 crc kubenswrapper[4696]: I1202 23:30:00.321245 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zp5hf\" (UniqueName: 
\"kubernetes.io/projected/9deb9e1b-418a-413b-8329-fbbfadd5660d-kube-api-access-zp5hf\") pod \"collect-profiles-29411970-d2zkl\" (UID: \"9deb9e1b-418a-413b-8329-fbbfadd5660d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411970-d2zkl" Dec 02 23:30:00 crc kubenswrapper[4696]: I1202 23:30:00.423175 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9deb9e1b-418a-413b-8329-fbbfadd5660d-secret-volume\") pod \"collect-profiles-29411970-d2zkl\" (UID: \"9deb9e1b-418a-413b-8329-fbbfadd5660d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411970-d2zkl" Dec 02 23:30:00 crc kubenswrapper[4696]: I1202 23:30:00.423861 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9deb9e1b-418a-413b-8329-fbbfadd5660d-config-volume\") pod \"collect-profiles-29411970-d2zkl\" (UID: \"9deb9e1b-418a-413b-8329-fbbfadd5660d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411970-d2zkl" Dec 02 23:30:00 crc kubenswrapper[4696]: I1202 23:30:00.423981 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zp5hf\" (UniqueName: \"kubernetes.io/projected/9deb9e1b-418a-413b-8329-fbbfadd5660d-kube-api-access-zp5hf\") pod \"collect-profiles-29411970-d2zkl\" (UID: \"9deb9e1b-418a-413b-8329-fbbfadd5660d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411970-d2zkl" Dec 02 23:30:00 crc kubenswrapper[4696]: I1202 23:30:00.425645 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9deb9e1b-418a-413b-8329-fbbfadd5660d-config-volume\") pod \"collect-profiles-29411970-d2zkl\" (UID: \"9deb9e1b-418a-413b-8329-fbbfadd5660d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411970-d2zkl" Dec 02 23:30:00 crc kubenswrapper[4696]: I1202 
23:30:00.432238 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9deb9e1b-418a-413b-8329-fbbfadd5660d-secret-volume\") pod \"collect-profiles-29411970-d2zkl\" (UID: \"9deb9e1b-418a-413b-8329-fbbfadd5660d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411970-d2zkl" Dec 02 23:30:00 crc kubenswrapper[4696]: I1202 23:30:00.445429 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zp5hf\" (UniqueName: \"kubernetes.io/projected/9deb9e1b-418a-413b-8329-fbbfadd5660d-kube-api-access-zp5hf\") pod \"collect-profiles-29411970-d2zkl\" (UID: \"9deb9e1b-418a-413b-8329-fbbfadd5660d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411970-d2zkl" Dec 02 23:30:00 crc kubenswrapper[4696]: I1202 23:30:00.508116 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411970-d2zkl" Dec 02 23:30:00 crc kubenswrapper[4696]: I1202 23:30:00.984222 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411970-d2zkl"] Dec 02 23:30:01 crc kubenswrapper[4696]: I1202 23:30:01.294282 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411970-d2zkl" event={"ID":"9deb9e1b-418a-413b-8329-fbbfadd5660d","Type":"ContainerStarted","Data":"c91076e2a94cdd9aedf04eb78e7256e8304d3460a835d5d53f35d30170a06b1f"} Dec 02 23:30:01 crc kubenswrapper[4696]: I1202 23:30:01.294826 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411970-d2zkl" event={"ID":"9deb9e1b-418a-413b-8329-fbbfadd5660d","Type":"ContainerStarted","Data":"a933caaea4a76de64e7f24c24407d9a41670236cdf1a2cec0320354060deb0cb"} Dec 02 23:30:01 crc kubenswrapper[4696]: I1202 23:30:01.326883 4696 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29411970-d2zkl" podStartSLOduration=1.326857294 podStartE2EDuration="1.326857294s" podCreationTimestamp="2025-12-02 23:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:30:01.316391319 +0000 UTC m=+2864.197071330" watchObservedRunningTime="2025-12-02 23:30:01.326857294 +0000 UTC m=+2864.207537295" Dec 02 23:30:02 crc kubenswrapper[4696]: I1202 23:30:02.306020 4696 generic.go:334] "Generic (PLEG): container finished" podID="9deb9e1b-418a-413b-8329-fbbfadd5660d" containerID="c91076e2a94cdd9aedf04eb78e7256e8304d3460a835d5d53f35d30170a06b1f" exitCode=0 Dec 02 23:30:02 crc kubenswrapper[4696]: I1202 23:30:02.306081 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411970-d2zkl" event={"ID":"9deb9e1b-418a-413b-8329-fbbfadd5660d","Type":"ContainerDied","Data":"c91076e2a94cdd9aedf04eb78e7256e8304d3460a835d5d53f35d30170a06b1f"} Dec 02 23:30:03 crc kubenswrapper[4696]: I1202 23:30:03.708402 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411970-d2zkl" Dec 02 23:30:03 crc kubenswrapper[4696]: I1202 23:30:03.810409 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9deb9e1b-418a-413b-8329-fbbfadd5660d-secret-volume\") pod \"9deb9e1b-418a-413b-8329-fbbfadd5660d\" (UID: \"9deb9e1b-418a-413b-8329-fbbfadd5660d\") " Dec 02 23:30:03 crc kubenswrapper[4696]: I1202 23:30:03.810809 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9deb9e1b-418a-413b-8329-fbbfadd5660d-config-volume\") pod \"9deb9e1b-418a-413b-8329-fbbfadd5660d\" (UID: \"9deb9e1b-418a-413b-8329-fbbfadd5660d\") " Dec 02 23:30:03 crc kubenswrapper[4696]: I1202 23:30:03.811021 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zp5hf\" (UniqueName: \"kubernetes.io/projected/9deb9e1b-418a-413b-8329-fbbfadd5660d-kube-api-access-zp5hf\") pod \"9deb9e1b-418a-413b-8329-fbbfadd5660d\" (UID: \"9deb9e1b-418a-413b-8329-fbbfadd5660d\") " Dec 02 23:30:03 crc kubenswrapper[4696]: I1202 23:30:03.811893 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9deb9e1b-418a-413b-8329-fbbfadd5660d-config-volume" (OuterVolumeSpecName: "config-volume") pod "9deb9e1b-418a-413b-8329-fbbfadd5660d" (UID: "9deb9e1b-418a-413b-8329-fbbfadd5660d"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:30:03 crc kubenswrapper[4696]: I1202 23:30:03.821495 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9deb9e1b-418a-413b-8329-fbbfadd5660d-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "9deb9e1b-418a-413b-8329-fbbfadd5660d" (UID: "9deb9e1b-418a-413b-8329-fbbfadd5660d"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:30:03 crc kubenswrapper[4696]: I1202 23:30:03.821824 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9deb9e1b-418a-413b-8329-fbbfadd5660d-kube-api-access-zp5hf" (OuterVolumeSpecName: "kube-api-access-zp5hf") pod "9deb9e1b-418a-413b-8329-fbbfadd5660d" (UID: "9deb9e1b-418a-413b-8329-fbbfadd5660d"). InnerVolumeSpecName "kube-api-access-zp5hf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:30:03 crc kubenswrapper[4696]: I1202 23:30:03.913827 4696 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9deb9e1b-418a-413b-8329-fbbfadd5660d-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 02 23:30:03 crc kubenswrapper[4696]: I1202 23:30:03.913875 4696 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9deb9e1b-418a-413b-8329-fbbfadd5660d-config-volume\") on node \"crc\" DevicePath \"\"" Dec 02 23:30:03 crc kubenswrapper[4696]: I1202 23:30:03.913895 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zp5hf\" (UniqueName: \"kubernetes.io/projected/9deb9e1b-418a-413b-8329-fbbfadd5660d-kube-api-access-zp5hf\") on node \"crc\" DevicePath \"\"" Dec 02 23:30:04 crc kubenswrapper[4696]: I1202 23:30:04.327921 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411970-d2zkl" event={"ID":"9deb9e1b-418a-413b-8329-fbbfadd5660d","Type":"ContainerDied","Data":"a933caaea4a76de64e7f24c24407d9a41670236cdf1a2cec0320354060deb0cb"} Dec 02 23:30:04 crc kubenswrapper[4696]: I1202 23:30:04.328413 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a933caaea4a76de64e7f24c24407d9a41670236cdf1a2cec0320354060deb0cb" Dec 02 23:30:04 crc kubenswrapper[4696]: I1202 23:30:04.328196 4696 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411970-d2zkl" Dec 02 23:30:04 crc kubenswrapper[4696]: I1202 23:30:04.393003 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411925-w4hb7"] Dec 02 23:30:04 crc kubenswrapper[4696]: I1202 23:30:04.404564 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411925-w4hb7"] Dec 02 23:30:05 crc kubenswrapper[4696]: I1202 23:30:05.449810 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60748893-12de-4ee9-9a98-d2e6117d2247" path="/var/lib/kubelet/pods/60748893-12de-4ee9-9a98-d2e6117d2247/volumes" Dec 02 23:30:14 crc kubenswrapper[4696]: I1202 23:30:14.544441 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-hv9kv"] Dec 02 23:30:14 crc kubenswrapper[4696]: E1202 23:30:14.546274 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9deb9e1b-418a-413b-8329-fbbfadd5660d" containerName="collect-profiles" Dec 02 23:30:14 crc kubenswrapper[4696]: I1202 23:30:14.546302 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="9deb9e1b-418a-413b-8329-fbbfadd5660d" containerName="collect-profiles" Dec 02 23:30:14 crc kubenswrapper[4696]: I1202 23:30:14.547833 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="9deb9e1b-418a-413b-8329-fbbfadd5660d" containerName="collect-profiles" Dec 02 23:30:14 crc kubenswrapper[4696]: I1202 23:30:14.571772 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hv9kv" Dec 02 23:30:14 crc kubenswrapper[4696]: I1202 23:30:14.626258 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hv9kv"] Dec 02 23:30:14 crc kubenswrapper[4696]: I1202 23:30:14.696099 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51588196-0173-4037-b165-e1d9e8d2ae2d-utilities\") pod \"community-operators-hv9kv\" (UID: \"51588196-0173-4037-b165-e1d9e8d2ae2d\") " pod="openshift-marketplace/community-operators-hv9kv" Dec 02 23:30:14 crc kubenswrapper[4696]: I1202 23:30:14.696301 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jnt2\" (UniqueName: \"kubernetes.io/projected/51588196-0173-4037-b165-e1d9e8d2ae2d-kube-api-access-5jnt2\") pod \"community-operators-hv9kv\" (UID: \"51588196-0173-4037-b165-e1d9e8d2ae2d\") " pod="openshift-marketplace/community-operators-hv9kv" Dec 02 23:30:14 crc kubenswrapper[4696]: I1202 23:30:14.696378 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51588196-0173-4037-b165-e1d9e8d2ae2d-catalog-content\") pod \"community-operators-hv9kv\" (UID: \"51588196-0173-4037-b165-e1d9e8d2ae2d\") " pod="openshift-marketplace/community-operators-hv9kv" Dec 02 23:30:14 crc kubenswrapper[4696]: I1202 23:30:14.798483 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51588196-0173-4037-b165-e1d9e8d2ae2d-utilities\") pod \"community-operators-hv9kv\" (UID: \"51588196-0173-4037-b165-e1d9e8d2ae2d\") " pod="openshift-marketplace/community-operators-hv9kv" Dec 02 23:30:14 crc kubenswrapper[4696]: I1202 23:30:14.798724 4696 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-5jnt2\" (UniqueName: \"kubernetes.io/projected/51588196-0173-4037-b165-e1d9e8d2ae2d-kube-api-access-5jnt2\") pod \"community-operators-hv9kv\" (UID: \"51588196-0173-4037-b165-e1d9e8d2ae2d\") " pod="openshift-marketplace/community-operators-hv9kv" Dec 02 23:30:14 crc kubenswrapper[4696]: I1202 23:30:14.798825 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51588196-0173-4037-b165-e1d9e8d2ae2d-catalog-content\") pod \"community-operators-hv9kv\" (UID: \"51588196-0173-4037-b165-e1d9e8d2ae2d\") " pod="openshift-marketplace/community-operators-hv9kv" Dec 02 23:30:14 crc kubenswrapper[4696]: I1202 23:30:14.799217 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51588196-0173-4037-b165-e1d9e8d2ae2d-utilities\") pod \"community-operators-hv9kv\" (UID: \"51588196-0173-4037-b165-e1d9e8d2ae2d\") " pod="openshift-marketplace/community-operators-hv9kv" Dec 02 23:30:14 crc kubenswrapper[4696]: I1202 23:30:14.799503 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51588196-0173-4037-b165-e1d9e8d2ae2d-catalog-content\") pod \"community-operators-hv9kv\" (UID: \"51588196-0173-4037-b165-e1d9e8d2ae2d\") " pod="openshift-marketplace/community-operators-hv9kv" Dec 02 23:30:14 crc kubenswrapper[4696]: I1202 23:30:14.822975 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jnt2\" (UniqueName: \"kubernetes.io/projected/51588196-0173-4037-b165-e1d9e8d2ae2d-kube-api-access-5jnt2\") pod \"community-operators-hv9kv\" (UID: \"51588196-0173-4037-b165-e1d9e8d2ae2d\") " pod="openshift-marketplace/community-operators-hv9kv" Dec 02 23:30:14 crc kubenswrapper[4696]: I1202 23:30:14.917194 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hv9kv" Dec 02 23:30:15 crc kubenswrapper[4696]: I1202 23:30:15.464002 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hv9kv"] Dec 02 23:30:16 crc kubenswrapper[4696]: I1202 23:30:16.471450 4696 generic.go:334] "Generic (PLEG): container finished" podID="51588196-0173-4037-b165-e1d9e8d2ae2d" containerID="5b1ba0155e3f7ff69cb759991754127531218459ffec5d330233161f7790d482" exitCode=0 Dec 02 23:30:16 crc kubenswrapper[4696]: I1202 23:30:16.471598 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hv9kv" event={"ID":"51588196-0173-4037-b165-e1d9e8d2ae2d","Type":"ContainerDied","Data":"5b1ba0155e3f7ff69cb759991754127531218459ffec5d330233161f7790d482"} Dec 02 23:30:16 crc kubenswrapper[4696]: I1202 23:30:16.472159 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hv9kv" event={"ID":"51588196-0173-4037-b165-e1d9e8d2ae2d","Type":"ContainerStarted","Data":"4a070dce8d8a62a215f412d68ab7d436d3fa6f0e92187b5dbf9669628af072ef"} Dec 02 23:30:16 crc kubenswrapper[4696]: I1202 23:30:16.477177 4696 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 02 23:30:17 crc kubenswrapper[4696]: I1202 23:30:17.492042 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hv9kv" event={"ID":"51588196-0173-4037-b165-e1d9e8d2ae2d","Type":"ContainerStarted","Data":"2f7300d32ed79d7dfb42c987a642c425fe038e8ac04ba4e6268b46406c140bf5"} Dec 02 23:30:18 crc kubenswrapper[4696]: I1202 23:30:18.511983 4696 generic.go:334] "Generic (PLEG): container finished" podID="51588196-0173-4037-b165-e1d9e8d2ae2d" containerID="2f7300d32ed79d7dfb42c987a642c425fe038e8ac04ba4e6268b46406c140bf5" exitCode=0 Dec 02 23:30:18 crc kubenswrapper[4696]: I1202 23:30:18.512126 4696 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-hv9kv" event={"ID":"51588196-0173-4037-b165-e1d9e8d2ae2d","Type":"ContainerDied","Data":"2f7300d32ed79d7dfb42c987a642c425fe038e8ac04ba4e6268b46406c140bf5"} Dec 02 23:30:19 crc kubenswrapper[4696]: I1202 23:30:19.525329 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hv9kv" event={"ID":"51588196-0173-4037-b165-e1d9e8d2ae2d","Type":"ContainerStarted","Data":"f01d77489e61accb36e40e7a154d5dabca59491ba2b38d2eb50c9ecb462e7972"} Dec 02 23:30:19 crc kubenswrapper[4696]: I1202 23:30:19.562621 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hv9kv" podStartSLOduration=2.876062988 podStartE2EDuration="5.562593478s" podCreationTimestamp="2025-12-02 23:30:14 +0000 UTC" firstStartedPulling="2025-12-02 23:30:16.475797229 +0000 UTC m=+2879.356477250" lastFinishedPulling="2025-12-02 23:30:19.162327749 +0000 UTC m=+2882.043007740" observedRunningTime="2025-12-02 23:30:19.552441642 +0000 UTC m=+2882.433121693" watchObservedRunningTime="2025-12-02 23:30:19.562593478 +0000 UTC m=+2882.443273489" Dec 02 23:30:22 crc kubenswrapper[4696]: I1202 23:30:22.973869 4696 patch_prober.go:28] interesting pod/machine-config-daemon-chq65 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 23:30:22 crc kubenswrapper[4696]: I1202 23:30:22.974858 4696 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 23:30:24 crc kubenswrapper[4696]: I1202 23:30:24.917967 4696 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-hv9kv" Dec 02 23:30:24 crc kubenswrapper[4696]: I1202 23:30:24.918038 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-hv9kv" Dec 02 23:30:25 crc kubenswrapper[4696]: I1202 23:30:25.000083 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hv9kv" Dec 02 23:30:25 crc kubenswrapper[4696]: I1202 23:30:25.660718 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hv9kv" Dec 02 23:30:25 crc kubenswrapper[4696]: I1202 23:30:25.727964 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hv9kv"] Dec 02 23:30:27 crc kubenswrapper[4696]: I1202 23:30:27.627248 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-hv9kv" podUID="51588196-0173-4037-b165-e1d9e8d2ae2d" containerName="registry-server" containerID="cri-o://f01d77489e61accb36e40e7a154d5dabca59491ba2b38d2eb50c9ecb462e7972" gracePeriod=2 Dec 02 23:30:28 crc kubenswrapper[4696]: I1202 23:30:28.114979 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hv9kv" Dec 02 23:30:28 crc kubenswrapper[4696]: I1202 23:30:28.232793 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51588196-0173-4037-b165-e1d9e8d2ae2d-catalog-content\") pod \"51588196-0173-4037-b165-e1d9e8d2ae2d\" (UID: \"51588196-0173-4037-b165-e1d9e8d2ae2d\") " Dec 02 23:30:28 crc kubenswrapper[4696]: I1202 23:30:28.232877 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5jnt2\" (UniqueName: \"kubernetes.io/projected/51588196-0173-4037-b165-e1d9e8d2ae2d-kube-api-access-5jnt2\") pod \"51588196-0173-4037-b165-e1d9e8d2ae2d\" (UID: \"51588196-0173-4037-b165-e1d9e8d2ae2d\") " Dec 02 23:30:28 crc kubenswrapper[4696]: I1202 23:30:28.232973 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51588196-0173-4037-b165-e1d9e8d2ae2d-utilities\") pod \"51588196-0173-4037-b165-e1d9e8d2ae2d\" (UID: \"51588196-0173-4037-b165-e1d9e8d2ae2d\") " Dec 02 23:30:28 crc kubenswrapper[4696]: I1202 23:30:28.234049 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51588196-0173-4037-b165-e1d9e8d2ae2d-utilities" (OuterVolumeSpecName: "utilities") pod "51588196-0173-4037-b165-e1d9e8d2ae2d" (UID: "51588196-0173-4037-b165-e1d9e8d2ae2d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:30:28 crc kubenswrapper[4696]: I1202 23:30:28.241578 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51588196-0173-4037-b165-e1d9e8d2ae2d-kube-api-access-5jnt2" (OuterVolumeSpecName: "kube-api-access-5jnt2") pod "51588196-0173-4037-b165-e1d9e8d2ae2d" (UID: "51588196-0173-4037-b165-e1d9e8d2ae2d"). InnerVolumeSpecName "kube-api-access-5jnt2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:30:28 crc kubenswrapper[4696]: I1202 23:30:28.305622 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51588196-0173-4037-b165-e1d9e8d2ae2d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "51588196-0173-4037-b165-e1d9e8d2ae2d" (UID: "51588196-0173-4037-b165-e1d9e8d2ae2d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:30:28 crc kubenswrapper[4696]: I1202 23:30:28.336383 4696 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51588196-0173-4037-b165-e1d9e8d2ae2d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 23:30:28 crc kubenswrapper[4696]: I1202 23:30:28.336426 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5jnt2\" (UniqueName: \"kubernetes.io/projected/51588196-0173-4037-b165-e1d9e8d2ae2d-kube-api-access-5jnt2\") on node \"crc\" DevicePath \"\"" Dec 02 23:30:28 crc kubenswrapper[4696]: I1202 23:30:28.336443 4696 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51588196-0173-4037-b165-e1d9e8d2ae2d-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 23:30:28 crc kubenswrapper[4696]: I1202 23:30:28.643991 4696 generic.go:334] "Generic (PLEG): container finished" podID="51588196-0173-4037-b165-e1d9e8d2ae2d" containerID="f01d77489e61accb36e40e7a154d5dabca59491ba2b38d2eb50c9ecb462e7972" exitCode=0 Dec 02 23:30:28 crc kubenswrapper[4696]: I1202 23:30:28.644114 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hv9kv" Dec 02 23:30:28 crc kubenswrapper[4696]: I1202 23:30:28.644134 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hv9kv" event={"ID":"51588196-0173-4037-b165-e1d9e8d2ae2d","Type":"ContainerDied","Data":"f01d77489e61accb36e40e7a154d5dabca59491ba2b38d2eb50c9ecb462e7972"} Dec 02 23:30:28 crc kubenswrapper[4696]: I1202 23:30:28.644760 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hv9kv" event={"ID":"51588196-0173-4037-b165-e1d9e8d2ae2d","Type":"ContainerDied","Data":"4a070dce8d8a62a215f412d68ab7d436d3fa6f0e92187b5dbf9669628af072ef"} Dec 02 23:30:28 crc kubenswrapper[4696]: I1202 23:30:28.644790 4696 scope.go:117] "RemoveContainer" containerID="f01d77489e61accb36e40e7a154d5dabca59491ba2b38d2eb50c9ecb462e7972" Dec 02 23:30:28 crc kubenswrapper[4696]: I1202 23:30:28.680310 4696 scope.go:117] "RemoveContainer" containerID="2f7300d32ed79d7dfb42c987a642c425fe038e8ac04ba4e6268b46406c140bf5" Dec 02 23:30:28 crc kubenswrapper[4696]: I1202 23:30:28.701388 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hv9kv"] Dec 02 23:30:28 crc kubenswrapper[4696]: I1202 23:30:28.710932 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-hv9kv"] Dec 02 23:30:28 crc kubenswrapper[4696]: I1202 23:30:28.720677 4696 scope.go:117] "RemoveContainer" containerID="5b1ba0155e3f7ff69cb759991754127531218459ffec5d330233161f7790d482" Dec 02 23:30:28 crc kubenswrapper[4696]: I1202 23:30:28.772525 4696 scope.go:117] "RemoveContainer" containerID="f01d77489e61accb36e40e7a154d5dabca59491ba2b38d2eb50c9ecb462e7972" Dec 02 23:30:28 crc kubenswrapper[4696]: E1202 23:30:28.773153 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"f01d77489e61accb36e40e7a154d5dabca59491ba2b38d2eb50c9ecb462e7972\": container with ID starting with f01d77489e61accb36e40e7a154d5dabca59491ba2b38d2eb50c9ecb462e7972 not found: ID does not exist" containerID="f01d77489e61accb36e40e7a154d5dabca59491ba2b38d2eb50c9ecb462e7972" Dec 02 23:30:28 crc kubenswrapper[4696]: I1202 23:30:28.773282 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f01d77489e61accb36e40e7a154d5dabca59491ba2b38d2eb50c9ecb462e7972"} err="failed to get container status \"f01d77489e61accb36e40e7a154d5dabca59491ba2b38d2eb50c9ecb462e7972\": rpc error: code = NotFound desc = could not find container \"f01d77489e61accb36e40e7a154d5dabca59491ba2b38d2eb50c9ecb462e7972\": container with ID starting with f01d77489e61accb36e40e7a154d5dabca59491ba2b38d2eb50c9ecb462e7972 not found: ID does not exist" Dec 02 23:30:28 crc kubenswrapper[4696]: I1202 23:30:28.773371 4696 scope.go:117] "RemoveContainer" containerID="2f7300d32ed79d7dfb42c987a642c425fe038e8ac04ba4e6268b46406c140bf5" Dec 02 23:30:28 crc kubenswrapper[4696]: E1202 23:30:28.773933 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f7300d32ed79d7dfb42c987a642c425fe038e8ac04ba4e6268b46406c140bf5\": container with ID starting with 2f7300d32ed79d7dfb42c987a642c425fe038e8ac04ba4e6268b46406c140bf5 not found: ID does not exist" containerID="2f7300d32ed79d7dfb42c987a642c425fe038e8ac04ba4e6268b46406c140bf5" Dec 02 23:30:28 crc kubenswrapper[4696]: I1202 23:30:28.774015 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f7300d32ed79d7dfb42c987a642c425fe038e8ac04ba4e6268b46406c140bf5"} err="failed to get container status \"2f7300d32ed79d7dfb42c987a642c425fe038e8ac04ba4e6268b46406c140bf5\": rpc error: code = NotFound desc = could not find container \"2f7300d32ed79d7dfb42c987a642c425fe038e8ac04ba4e6268b46406c140bf5\": container with ID 
starting with 2f7300d32ed79d7dfb42c987a642c425fe038e8ac04ba4e6268b46406c140bf5 not found: ID does not exist" Dec 02 23:30:28 crc kubenswrapper[4696]: I1202 23:30:28.774082 4696 scope.go:117] "RemoveContainer" containerID="5b1ba0155e3f7ff69cb759991754127531218459ffec5d330233161f7790d482" Dec 02 23:30:28 crc kubenswrapper[4696]: E1202 23:30:28.774364 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b1ba0155e3f7ff69cb759991754127531218459ffec5d330233161f7790d482\": container with ID starting with 5b1ba0155e3f7ff69cb759991754127531218459ffec5d330233161f7790d482 not found: ID does not exist" containerID="5b1ba0155e3f7ff69cb759991754127531218459ffec5d330233161f7790d482" Dec 02 23:30:28 crc kubenswrapper[4696]: I1202 23:30:28.774438 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b1ba0155e3f7ff69cb759991754127531218459ffec5d330233161f7790d482"} err="failed to get container status \"5b1ba0155e3f7ff69cb759991754127531218459ffec5d330233161f7790d482\": rpc error: code = NotFound desc = could not find container \"5b1ba0155e3f7ff69cb759991754127531218459ffec5d330233161f7790d482\": container with ID starting with 5b1ba0155e3f7ff69cb759991754127531218459ffec5d330233161f7790d482 not found: ID does not exist" Dec 02 23:30:29 crc kubenswrapper[4696]: I1202 23:30:29.448021 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51588196-0173-4037-b165-e1d9e8d2ae2d" path="/var/lib/kubelet/pods/51588196-0173-4037-b165-e1d9e8d2ae2d/volumes" Dec 02 23:30:37 crc kubenswrapper[4696]: I1202 23:30:37.449360 4696 scope.go:117] "RemoveContainer" containerID="2a19bd054b1f9dbfbbbdb68ed5d00c5dc33b522a8f762339366f107d93efaf66" Dec 02 23:30:48 crc kubenswrapper[4696]: I1202 23:30:48.533386 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-g8st8"] Dec 02 23:30:48 crc kubenswrapper[4696]: E1202 
23:30:48.534548 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51588196-0173-4037-b165-e1d9e8d2ae2d" containerName="registry-server" Dec 02 23:30:48 crc kubenswrapper[4696]: I1202 23:30:48.534567 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="51588196-0173-4037-b165-e1d9e8d2ae2d" containerName="registry-server" Dec 02 23:30:48 crc kubenswrapper[4696]: E1202 23:30:48.534582 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51588196-0173-4037-b165-e1d9e8d2ae2d" containerName="extract-utilities" Dec 02 23:30:48 crc kubenswrapper[4696]: I1202 23:30:48.534591 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="51588196-0173-4037-b165-e1d9e8d2ae2d" containerName="extract-utilities" Dec 02 23:30:48 crc kubenswrapper[4696]: E1202 23:30:48.534613 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51588196-0173-4037-b165-e1d9e8d2ae2d" containerName="extract-content" Dec 02 23:30:48 crc kubenswrapper[4696]: I1202 23:30:48.534622 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="51588196-0173-4037-b165-e1d9e8d2ae2d" containerName="extract-content" Dec 02 23:30:48 crc kubenswrapper[4696]: I1202 23:30:48.534933 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="51588196-0173-4037-b165-e1d9e8d2ae2d" containerName="registry-server" Dec 02 23:30:48 crc kubenswrapper[4696]: I1202 23:30:48.536780 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-g8st8" Dec 02 23:30:48 crc kubenswrapper[4696]: I1202 23:30:48.557541 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-g8st8"] Dec 02 23:30:48 crc kubenswrapper[4696]: I1202 23:30:48.737166 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d89b498-2250-401d-8e84-96f8c4ebfd2a-utilities\") pod \"certified-operators-g8st8\" (UID: \"7d89b498-2250-401d-8e84-96f8c4ebfd2a\") " pod="openshift-marketplace/certified-operators-g8st8" Dec 02 23:30:48 crc kubenswrapper[4696]: I1202 23:30:48.737285 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d89b498-2250-401d-8e84-96f8c4ebfd2a-catalog-content\") pod \"certified-operators-g8st8\" (UID: \"7d89b498-2250-401d-8e84-96f8c4ebfd2a\") " pod="openshift-marketplace/certified-operators-g8st8" Dec 02 23:30:48 crc kubenswrapper[4696]: I1202 23:30:48.737334 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bq42\" (UniqueName: \"kubernetes.io/projected/7d89b498-2250-401d-8e84-96f8c4ebfd2a-kube-api-access-4bq42\") pod \"certified-operators-g8st8\" (UID: \"7d89b498-2250-401d-8e84-96f8c4ebfd2a\") " pod="openshift-marketplace/certified-operators-g8st8" Dec 02 23:30:48 crc kubenswrapper[4696]: I1202 23:30:48.839541 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d89b498-2250-401d-8e84-96f8c4ebfd2a-utilities\") pod \"certified-operators-g8st8\" (UID: \"7d89b498-2250-401d-8e84-96f8c4ebfd2a\") " pod="openshift-marketplace/certified-operators-g8st8" Dec 02 23:30:48 crc kubenswrapper[4696]: I1202 23:30:48.839673 4696 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d89b498-2250-401d-8e84-96f8c4ebfd2a-catalog-content\") pod \"certified-operators-g8st8\" (UID: \"7d89b498-2250-401d-8e84-96f8c4ebfd2a\") " pod="openshift-marketplace/certified-operators-g8st8" Dec 02 23:30:48 crc kubenswrapper[4696]: I1202 23:30:48.839727 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4bq42\" (UniqueName: \"kubernetes.io/projected/7d89b498-2250-401d-8e84-96f8c4ebfd2a-kube-api-access-4bq42\") pod \"certified-operators-g8st8\" (UID: \"7d89b498-2250-401d-8e84-96f8c4ebfd2a\") " pod="openshift-marketplace/certified-operators-g8st8" Dec 02 23:30:48 crc kubenswrapper[4696]: I1202 23:30:48.840406 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d89b498-2250-401d-8e84-96f8c4ebfd2a-utilities\") pod \"certified-operators-g8st8\" (UID: \"7d89b498-2250-401d-8e84-96f8c4ebfd2a\") " pod="openshift-marketplace/certified-operators-g8st8" Dec 02 23:30:48 crc kubenswrapper[4696]: I1202 23:30:48.840566 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d89b498-2250-401d-8e84-96f8c4ebfd2a-catalog-content\") pod \"certified-operators-g8st8\" (UID: \"7d89b498-2250-401d-8e84-96f8c4ebfd2a\") " pod="openshift-marketplace/certified-operators-g8st8" Dec 02 23:30:48 crc kubenswrapper[4696]: I1202 23:30:48.866285 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bq42\" (UniqueName: \"kubernetes.io/projected/7d89b498-2250-401d-8e84-96f8c4ebfd2a-kube-api-access-4bq42\") pod \"certified-operators-g8st8\" (UID: \"7d89b498-2250-401d-8e84-96f8c4ebfd2a\") " pod="openshift-marketplace/certified-operators-g8st8" Dec 02 23:30:48 crc kubenswrapper[4696]: I1202 23:30:48.889705 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-g8st8" Dec 02 23:30:49 crc kubenswrapper[4696]: I1202 23:30:49.403592 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-g8st8"] Dec 02 23:30:49 crc kubenswrapper[4696]: I1202 23:30:49.918713 4696 generic.go:334] "Generic (PLEG): container finished" podID="7d89b498-2250-401d-8e84-96f8c4ebfd2a" containerID="794aa6fbc9f578b053dc9c75bfd8884ea5d7b593645d1c0ce0cb8ffaebe038bc" exitCode=0 Dec 02 23:30:49 crc kubenswrapper[4696]: I1202 23:30:49.918856 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g8st8" event={"ID":"7d89b498-2250-401d-8e84-96f8c4ebfd2a","Type":"ContainerDied","Data":"794aa6fbc9f578b053dc9c75bfd8884ea5d7b593645d1c0ce0cb8ffaebe038bc"} Dec 02 23:30:49 crc kubenswrapper[4696]: I1202 23:30:49.919301 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g8st8" event={"ID":"7d89b498-2250-401d-8e84-96f8c4ebfd2a","Type":"ContainerStarted","Data":"931aebcfaf6b5d29fd0e3b9bc4bb7f1910f56defdf304d33f769e169dbf0b2df"} Dec 02 23:30:50 crc kubenswrapper[4696]: I1202 23:30:50.934389 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g8st8" event={"ID":"7d89b498-2250-401d-8e84-96f8c4ebfd2a","Type":"ContainerStarted","Data":"9a3918fbd39d6827496fdf65d331a3fb5dabccc10a51bba3b4513d5cc3acf8f7"} Dec 02 23:30:51 crc kubenswrapper[4696]: I1202 23:30:51.952908 4696 generic.go:334] "Generic (PLEG): container finished" podID="7d89b498-2250-401d-8e84-96f8c4ebfd2a" containerID="9a3918fbd39d6827496fdf65d331a3fb5dabccc10a51bba3b4513d5cc3acf8f7" exitCode=0 Dec 02 23:30:51 crc kubenswrapper[4696]: I1202 23:30:51.952969 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g8st8" 
event={"ID":"7d89b498-2250-401d-8e84-96f8c4ebfd2a","Type":"ContainerDied","Data":"9a3918fbd39d6827496fdf65d331a3fb5dabccc10a51bba3b4513d5cc3acf8f7"} Dec 02 23:30:52 crc kubenswrapper[4696]: I1202 23:30:52.969192 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g8st8" event={"ID":"7d89b498-2250-401d-8e84-96f8c4ebfd2a","Type":"ContainerStarted","Data":"e3ce387fee83d9cc01e25f080f86081a345790c1d7e1c13173f08c5f3be2027c"} Dec 02 23:30:52 crc kubenswrapper[4696]: I1202 23:30:52.973369 4696 patch_prober.go:28] interesting pod/machine-config-daemon-chq65 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 23:30:52 crc kubenswrapper[4696]: I1202 23:30:52.973433 4696 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 23:30:52 crc kubenswrapper[4696]: I1202 23:30:52.973477 4696 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-chq65" Dec 02 23:30:52 crc kubenswrapper[4696]: I1202 23:30:52.974431 4696 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8065f2d7bf333b48d27aa170233dbb7ad2ef7a547f6e30836fa968a411dbfeae"} pod="openshift-machine-config-operator/machine-config-daemon-chq65" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 23:30:52 crc kubenswrapper[4696]: I1202 23:30:52.974507 4696 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" containerName="machine-config-daemon" containerID="cri-o://8065f2d7bf333b48d27aa170233dbb7ad2ef7a547f6e30836fa968a411dbfeae" gracePeriod=600 Dec 02 23:30:52 crc kubenswrapper[4696]: I1202 23:30:52.994334 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-g8st8" podStartSLOduration=2.497993808 podStartE2EDuration="4.994313174s" podCreationTimestamp="2025-12-02 23:30:48 +0000 UTC" firstStartedPulling="2025-12-02 23:30:49.921394726 +0000 UTC m=+2912.802074737" lastFinishedPulling="2025-12-02 23:30:52.417714102 +0000 UTC m=+2915.298394103" observedRunningTime="2025-12-02 23:30:52.98921003 +0000 UTC m=+2915.869890031" watchObservedRunningTime="2025-12-02 23:30:52.994313174 +0000 UTC m=+2915.874993175" Dec 02 23:30:53 crc kubenswrapper[4696]: I1202 23:30:53.986389 4696 generic.go:334] "Generic (PLEG): container finished" podID="53353260-c7c9-435c-91eb-3d5a1b441c4a" containerID="8065f2d7bf333b48d27aa170233dbb7ad2ef7a547f6e30836fa968a411dbfeae" exitCode=0 Dec 02 23:30:53 crc kubenswrapper[4696]: I1202 23:30:53.986501 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-chq65" event={"ID":"53353260-c7c9-435c-91eb-3d5a1b441c4a","Type":"ContainerDied","Data":"8065f2d7bf333b48d27aa170233dbb7ad2ef7a547f6e30836fa968a411dbfeae"} Dec 02 23:30:53 crc kubenswrapper[4696]: I1202 23:30:53.987423 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-chq65" event={"ID":"53353260-c7c9-435c-91eb-3d5a1b441c4a","Type":"ContainerStarted","Data":"73eda48f4864032e0b69f81fb6a4cfe9fc2c55cd1b56397bf0fc4aa0a4b62168"} Dec 02 23:30:53 crc kubenswrapper[4696]: I1202 23:30:53.987458 4696 scope.go:117] "RemoveContainer" containerID="c60305cde64a9129d2cf1a8f8c830e468e22e0b6c2bcea72c16bb3a3f13d7f23" 
Dec 02 23:30:58 crc kubenswrapper[4696]: I1202 23:30:58.890560 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-g8st8" Dec 02 23:30:58 crc kubenswrapper[4696]: I1202 23:30:58.891429 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-g8st8" Dec 02 23:30:58 crc kubenswrapper[4696]: I1202 23:30:58.949650 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-g8st8" Dec 02 23:30:59 crc kubenswrapper[4696]: I1202 23:30:59.094875 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-g8st8" Dec 02 23:30:59 crc kubenswrapper[4696]: I1202 23:30:59.201000 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-g8st8"] Dec 02 23:31:01 crc kubenswrapper[4696]: I1202 23:31:01.075395 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-g8st8" podUID="7d89b498-2250-401d-8e84-96f8c4ebfd2a" containerName="registry-server" containerID="cri-o://e3ce387fee83d9cc01e25f080f86081a345790c1d7e1c13173f08c5f3be2027c" gracePeriod=2 Dec 02 23:31:01 crc kubenswrapper[4696]: I1202 23:31:01.588940 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-g8st8" Dec 02 23:31:01 crc kubenswrapper[4696]: I1202 23:31:01.767423 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d89b498-2250-401d-8e84-96f8c4ebfd2a-utilities\") pod \"7d89b498-2250-401d-8e84-96f8c4ebfd2a\" (UID: \"7d89b498-2250-401d-8e84-96f8c4ebfd2a\") " Dec 02 23:31:01 crc kubenswrapper[4696]: I1202 23:31:01.767791 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d89b498-2250-401d-8e84-96f8c4ebfd2a-catalog-content\") pod \"7d89b498-2250-401d-8e84-96f8c4ebfd2a\" (UID: \"7d89b498-2250-401d-8e84-96f8c4ebfd2a\") " Dec 02 23:31:01 crc kubenswrapper[4696]: I1202 23:31:01.768126 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4bq42\" (UniqueName: \"kubernetes.io/projected/7d89b498-2250-401d-8e84-96f8c4ebfd2a-kube-api-access-4bq42\") pod \"7d89b498-2250-401d-8e84-96f8c4ebfd2a\" (UID: \"7d89b498-2250-401d-8e84-96f8c4ebfd2a\") " Dec 02 23:31:01 crc kubenswrapper[4696]: I1202 23:31:01.769367 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d89b498-2250-401d-8e84-96f8c4ebfd2a-utilities" (OuterVolumeSpecName: "utilities") pod "7d89b498-2250-401d-8e84-96f8c4ebfd2a" (UID: "7d89b498-2250-401d-8e84-96f8c4ebfd2a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:31:01 crc kubenswrapper[4696]: I1202 23:31:01.775840 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d89b498-2250-401d-8e84-96f8c4ebfd2a-kube-api-access-4bq42" (OuterVolumeSpecName: "kube-api-access-4bq42") pod "7d89b498-2250-401d-8e84-96f8c4ebfd2a" (UID: "7d89b498-2250-401d-8e84-96f8c4ebfd2a"). InnerVolumeSpecName "kube-api-access-4bq42". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:31:01 crc kubenswrapper[4696]: I1202 23:31:01.828058 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d89b498-2250-401d-8e84-96f8c4ebfd2a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7d89b498-2250-401d-8e84-96f8c4ebfd2a" (UID: "7d89b498-2250-401d-8e84-96f8c4ebfd2a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:31:01 crc kubenswrapper[4696]: I1202 23:31:01.871503 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4bq42\" (UniqueName: \"kubernetes.io/projected/7d89b498-2250-401d-8e84-96f8c4ebfd2a-kube-api-access-4bq42\") on node \"crc\" DevicePath \"\"" Dec 02 23:31:01 crc kubenswrapper[4696]: I1202 23:31:01.871551 4696 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d89b498-2250-401d-8e84-96f8c4ebfd2a-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 23:31:01 crc kubenswrapper[4696]: I1202 23:31:01.871566 4696 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d89b498-2250-401d-8e84-96f8c4ebfd2a-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 23:31:02 crc kubenswrapper[4696]: I1202 23:31:02.090055 4696 generic.go:334] "Generic (PLEG): container finished" podID="7d89b498-2250-401d-8e84-96f8c4ebfd2a" containerID="e3ce387fee83d9cc01e25f080f86081a345790c1d7e1c13173f08c5f3be2027c" exitCode=0 Dec 02 23:31:02 crc kubenswrapper[4696]: I1202 23:31:02.090538 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-g8st8" Dec 02 23:31:02 crc kubenswrapper[4696]: I1202 23:31:02.090436 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g8st8" event={"ID":"7d89b498-2250-401d-8e84-96f8c4ebfd2a","Type":"ContainerDied","Data":"e3ce387fee83d9cc01e25f080f86081a345790c1d7e1c13173f08c5f3be2027c"} Dec 02 23:31:02 crc kubenswrapper[4696]: I1202 23:31:02.090693 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g8st8" event={"ID":"7d89b498-2250-401d-8e84-96f8c4ebfd2a","Type":"ContainerDied","Data":"931aebcfaf6b5d29fd0e3b9bc4bb7f1910f56defdf304d33f769e169dbf0b2df"} Dec 02 23:31:02 crc kubenswrapper[4696]: I1202 23:31:02.090720 4696 scope.go:117] "RemoveContainer" containerID="e3ce387fee83d9cc01e25f080f86081a345790c1d7e1c13173f08c5f3be2027c" Dec 02 23:31:02 crc kubenswrapper[4696]: I1202 23:31:02.132313 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-g8st8"] Dec 02 23:31:02 crc kubenswrapper[4696]: I1202 23:31:02.135295 4696 scope.go:117] "RemoveContainer" containerID="9a3918fbd39d6827496fdf65d331a3fb5dabccc10a51bba3b4513d5cc3acf8f7" Dec 02 23:31:02 crc kubenswrapper[4696]: I1202 23:31:02.142326 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-g8st8"] Dec 02 23:31:02 crc kubenswrapper[4696]: I1202 23:31:02.179334 4696 scope.go:117] "RemoveContainer" containerID="794aa6fbc9f578b053dc9c75bfd8884ea5d7b593645d1c0ce0cb8ffaebe038bc" Dec 02 23:31:02 crc kubenswrapper[4696]: I1202 23:31:02.247105 4696 scope.go:117] "RemoveContainer" containerID="e3ce387fee83d9cc01e25f080f86081a345790c1d7e1c13173f08c5f3be2027c" Dec 02 23:31:02 crc kubenswrapper[4696]: E1202 23:31:02.248341 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"e3ce387fee83d9cc01e25f080f86081a345790c1d7e1c13173f08c5f3be2027c\": container with ID starting with e3ce387fee83d9cc01e25f080f86081a345790c1d7e1c13173f08c5f3be2027c not found: ID does not exist" containerID="e3ce387fee83d9cc01e25f080f86081a345790c1d7e1c13173f08c5f3be2027c" Dec 02 23:31:02 crc kubenswrapper[4696]: I1202 23:31:02.248570 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3ce387fee83d9cc01e25f080f86081a345790c1d7e1c13173f08c5f3be2027c"} err="failed to get container status \"e3ce387fee83d9cc01e25f080f86081a345790c1d7e1c13173f08c5f3be2027c\": rpc error: code = NotFound desc = could not find container \"e3ce387fee83d9cc01e25f080f86081a345790c1d7e1c13173f08c5f3be2027c\": container with ID starting with e3ce387fee83d9cc01e25f080f86081a345790c1d7e1c13173f08c5f3be2027c not found: ID does not exist" Dec 02 23:31:02 crc kubenswrapper[4696]: I1202 23:31:02.248623 4696 scope.go:117] "RemoveContainer" containerID="9a3918fbd39d6827496fdf65d331a3fb5dabccc10a51bba3b4513d5cc3acf8f7" Dec 02 23:31:02 crc kubenswrapper[4696]: E1202 23:31:02.249408 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a3918fbd39d6827496fdf65d331a3fb5dabccc10a51bba3b4513d5cc3acf8f7\": container with ID starting with 9a3918fbd39d6827496fdf65d331a3fb5dabccc10a51bba3b4513d5cc3acf8f7 not found: ID does not exist" containerID="9a3918fbd39d6827496fdf65d331a3fb5dabccc10a51bba3b4513d5cc3acf8f7" Dec 02 23:31:02 crc kubenswrapper[4696]: I1202 23:31:02.249461 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a3918fbd39d6827496fdf65d331a3fb5dabccc10a51bba3b4513d5cc3acf8f7"} err="failed to get container status \"9a3918fbd39d6827496fdf65d331a3fb5dabccc10a51bba3b4513d5cc3acf8f7\": rpc error: code = NotFound desc = could not find container \"9a3918fbd39d6827496fdf65d331a3fb5dabccc10a51bba3b4513d5cc3acf8f7\": container with ID 
starting with 9a3918fbd39d6827496fdf65d331a3fb5dabccc10a51bba3b4513d5cc3acf8f7 not found: ID does not exist" Dec 02 23:31:02 crc kubenswrapper[4696]: I1202 23:31:02.249501 4696 scope.go:117] "RemoveContainer" containerID="794aa6fbc9f578b053dc9c75bfd8884ea5d7b593645d1c0ce0cb8ffaebe038bc" Dec 02 23:31:02 crc kubenswrapper[4696]: E1202 23:31:02.252031 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"794aa6fbc9f578b053dc9c75bfd8884ea5d7b593645d1c0ce0cb8ffaebe038bc\": container with ID starting with 794aa6fbc9f578b053dc9c75bfd8884ea5d7b593645d1c0ce0cb8ffaebe038bc not found: ID does not exist" containerID="794aa6fbc9f578b053dc9c75bfd8884ea5d7b593645d1c0ce0cb8ffaebe038bc" Dec 02 23:31:02 crc kubenswrapper[4696]: I1202 23:31:02.252083 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"794aa6fbc9f578b053dc9c75bfd8884ea5d7b593645d1c0ce0cb8ffaebe038bc"} err="failed to get container status \"794aa6fbc9f578b053dc9c75bfd8884ea5d7b593645d1c0ce0cb8ffaebe038bc\": rpc error: code = NotFound desc = could not find container \"794aa6fbc9f578b053dc9c75bfd8884ea5d7b593645d1c0ce0cb8ffaebe038bc\": container with ID starting with 794aa6fbc9f578b053dc9c75bfd8884ea5d7b593645d1c0ce0cb8ffaebe038bc not found: ID does not exist" Dec 02 23:31:03 crc kubenswrapper[4696]: I1202 23:31:03.453274 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d89b498-2250-401d-8e84-96f8c4ebfd2a" path="/var/lib/kubelet/pods/7d89b498-2250-401d-8e84-96f8c4ebfd2a/volumes" Dec 02 23:31:04 crc kubenswrapper[4696]: I1202 23:31:04.122019 4696 generic.go:334] "Generic (PLEG): container finished" podID="3c9ec356-4712-4484-9b78-9e5d4831dac1" containerID="89f070705d5b9ae666806f4c5972c77bddc06cadfbd14757677a3573b98cbc44" exitCode=0 Dec 02 23:31:04 crc kubenswrapper[4696]: I1202 23:31:04.122144 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4ljbq" event={"ID":"3c9ec356-4712-4484-9b78-9e5d4831dac1","Type":"ContainerDied","Data":"89f070705d5b9ae666806f4c5972c77bddc06cadfbd14757677a3573b98cbc44"} Dec 02 23:31:05 crc kubenswrapper[4696]: I1202 23:31:05.627024 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4ljbq" Dec 02 23:31:05 crc kubenswrapper[4696]: I1202 23:31:05.664006 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qhfzl\" (UniqueName: \"kubernetes.io/projected/3c9ec356-4712-4484-9b78-9e5d4831dac1-kube-api-access-qhfzl\") pod \"3c9ec356-4712-4484-9b78-9e5d4831dac1\" (UID: \"3c9ec356-4712-4484-9b78-9e5d4831dac1\") " Dec 02 23:31:05 crc kubenswrapper[4696]: I1202 23:31:05.664151 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3c9ec356-4712-4484-9b78-9e5d4831dac1-inventory\") pod \"3c9ec356-4712-4484-9b78-9e5d4831dac1\" (UID: \"3c9ec356-4712-4484-9b78-9e5d4831dac1\") " Dec 02 23:31:05 crc kubenswrapper[4696]: I1202 23:31:05.664204 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/3c9ec356-4712-4484-9b78-9e5d4831dac1-ceilometer-compute-config-data-1\") pod \"3c9ec356-4712-4484-9b78-9e5d4831dac1\" (UID: \"3c9ec356-4712-4484-9b78-9e5d4831dac1\") " Dec 02 23:31:05 crc kubenswrapper[4696]: I1202 23:31:05.664354 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/3c9ec356-4712-4484-9b78-9e5d4831dac1-ceilometer-compute-config-data-2\") pod \"3c9ec356-4712-4484-9b78-9e5d4831dac1\" (UID: \"3c9ec356-4712-4484-9b78-9e5d4831dac1\") " Dec 02 23:31:05 crc kubenswrapper[4696]: I1202 23:31:05.664572 4696 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c9ec356-4712-4484-9b78-9e5d4831dac1-telemetry-combined-ca-bundle\") pod \"3c9ec356-4712-4484-9b78-9e5d4831dac1\" (UID: \"3c9ec356-4712-4484-9b78-9e5d4831dac1\") " Dec 02 23:31:05 crc kubenswrapper[4696]: I1202 23:31:05.664649 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/3c9ec356-4712-4484-9b78-9e5d4831dac1-ceilometer-compute-config-data-0\") pod \"3c9ec356-4712-4484-9b78-9e5d4831dac1\" (UID: \"3c9ec356-4712-4484-9b78-9e5d4831dac1\") " Dec 02 23:31:05 crc kubenswrapper[4696]: I1202 23:31:05.672205 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c9ec356-4712-4484-9b78-9e5d4831dac1-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "3c9ec356-4712-4484-9b78-9e5d4831dac1" (UID: "3c9ec356-4712-4484-9b78-9e5d4831dac1"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:31:05 crc kubenswrapper[4696]: I1202 23:31:05.689253 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c9ec356-4712-4484-9b78-9e5d4831dac1-kube-api-access-qhfzl" (OuterVolumeSpecName: "kube-api-access-qhfzl") pod "3c9ec356-4712-4484-9b78-9e5d4831dac1" (UID: "3c9ec356-4712-4484-9b78-9e5d4831dac1"). InnerVolumeSpecName "kube-api-access-qhfzl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:31:05 crc kubenswrapper[4696]: I1202 23:31:05.710335 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c9ec356-4712-4484-9b78-9e5d4831dac1-inventory" (OuterVolumeSpecName: "inventory") pod "3c9ec356-4712-4484-9b78-9e5d4831dac1" (UID: "3c9ec356-4712-4484-9b78-9e5d4831dac1"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:31:05 crc kubenswrapper[4696]: I1202 23:31:05.712203 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c9ec356-4712-4484-9b78-9e5d4831dac1-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "3c9ec356-4712-4484-9b78-9e5d4831dac1" (UID: "3c9ec356-4712-4484-9b78-9e5d4831dac1"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:31:05 crc kubenswrapper[4696]: I1202 23:31:05.714020 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c9ec356-4712-4484-9b78-9e5d4831dac1-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "3c9ec356-4712-4484-9b78-9e5d4831dac1" (UID: "3c9ec356-4712-4484-9b78-9e5d4831dac1"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:31:05 crc kubenswrapper[4696]: I1202 23:31:05.717116 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c9ec356-4712-4484-9b78-9e5d4831dac1-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "3c9ec356-4712-4484-9b78-9e5d4831dac1" (UID: "3c9ec356-4712-4484-9b78-9e5d4831dac1"). InnerVolumeSpecName "ceilometer-compute-config-data-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:31:05 crc kubenswrapper[4696]: I1202 23:31:05.766424 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3c9ec356-4712-4484-9b78-9e5d4831dac1-ssh-key\") pod \"3c9ec356-4712-4484-9b78-9e5d4831dac1\" (UID: \"3c9ec356-4712-4484-9b78-9e5d4831dac1\") " Dec 02 23:31:05 crc kubenswrapper[4696]: I1202 23:31:05.767678 4696 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c9ec356-4712-4484-9b78-9e5d4831dac1-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 23:31:05 crc kubenswrapper[4696]: I1202 23:31:05.767709 4696 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/3c9ec356-4712-4484-9b78-9e5d4831dac1-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Dec 02 23:31:05 crc kubenswrapper[4696]: I1202 23:31:05.767726 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qhfzl\" (UniqueName: \"kubernetes.io/projected/3c9ec356-4712-4484-9b78-9e5d4831dac1-kube-api-access-qhfzl\") on node \"crc\" DevicePath \"\"" Dec 02 23:31:05 crc kubenswrapper[4696]: I1202 23:31:05.767765 4696 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3c9ec356-4712-4484-9b78-9e5d4831dac1-inventory\") on node \"crc\" DevicePath \"\"" Dec 02 23:31:05 crc kubenswrapper[4696]: I1202 23:31:05.767784 4696 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/3c9ec356-4712-4484-9b78-9e5d4831dac1-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Dec 02 23:31:05 crc kubenswrapper[4696]: I1202 23:31:05.767796 4696 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: 
\"kubernetes.io/secret/3c9ec356-4712-4484-9b78-9e5d4831dac1-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Dec 02 23:31:05 crc kubenswrapper[4696]: I1202 23:31:05.799237 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c9ec356-4712-4484-9b78-9e5d4831dac1-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "3c9ec356-4712-4484-9b78-9e5d4831dac1" (UID: "3c9ec356-4712-4484-9b78-9e5d4831dac1"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:31:05 crc kubenswrapper[4696]: I1202 23:31:05.870046 4696 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3c9ec356-4712-4484-9b78-9e5d4831dac1-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 23:31:06 crc kubenswrapper[4696]: I1202 23:31:06.149858 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4ljbq" event={"ID":"3c9ec356-4712-4484-9b78-9e5d4831dac1","Type":"ContainerDied","Data":"8242ab860b09deaa7b43d5879e0b5c8ef3564aacdb070d1f3194a91c687bab75"} Dec 02 23:31:06 crc kubenswrapper[4696]: I1202 23:31:06.149915 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8242ab860b09deaa7b43d5879e0b5c8ef3564aacdb070d1f3194a91c687bab75" Dec 02 23:31:06 crc kubenswrapper[4696]: I1202 23:31:06.150016 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4ljbq" Dec 02 23:31:12 crc kubenswrapper[4696]: I1202 23:31:12.866928 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-d6flt"] Dec 02 23:31:12 crc kubenswrapper[4696]: E1202 23:31:12.868390 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c9ec356-4712-4484-9b78-9e5d4831dac1" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Dec 02 23:31:12 crc kubenswrapper[4696]: I1202 23:31:12.868408 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c9ec356-4712-4484-9b78-9e5d4831dac1" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Dec 02 23:31:12 crc kubenswrapper[4696]: E1202 23:31:12.868429 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d89b498-2250-401d-8e84-96f8c4ebfd2a" containerName="extract-content" Dec 02 23:31:12 crc kubenswrapper[4696]: I1202 23:31:12.868438 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d89b498-2250-401d-8e84-96f8c4ebfd2a" containerName="extract-content" Dec 02 23:31:12 crc kubenswrapper[4696]: E1202 23:31:12.868482 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d89b498-2250-401d-8e84-96f8c4ebfd2a" containerName="extract-utilities" Dec 02 23:31:12 crc kubenswrapper[4696]: I1202 23:31:12.868489 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d89b498-2250-401d-8e84-96f8c4ebfd2a" containerName="extract-utilities" Dec 02 23:31:12 crc kubenswrapper[4696]: E1202 23:31:12.868503 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d89b498-2250-401d-8e84-96f8c4ebfd2a" containerName="registry-server" Dec 02 23:31:12 crc kubenswrapper[4696]: I1202 23:31:12.868508 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d89b498-2250-401d-8e84-96f8c4ebfd2a" containerName="registry-server" Dec 02 23:31:12 crc kubenswrapper[4696]: I1202 23:31:12.868770 4696 
memory_manager.go:354] "RemoveStaleState removing state" podUID="7d89b498-2250-401d-8e84-96f8c4ebfd2a" containerName="registry-server" Dec 02 23:31:12 crc kubenswrapper[4696]: I1202 23:31:12.868792 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c9ec356-4712-4484-9b78-9e5d4831dac1" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Dec 02 23:31:12 crc kubenswrapper[4696]: I1202 23:31:12.870521 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d6flt" Dec 02 23:31:12 crc kubenswrapper[4696]: I1202 23:31:12.888897 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-d6flt"] Dec 02 23:31:13 crc kubenswrapper[4696]: I1202 23:31:13.049104 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqjjl\" (UniqueName: \"kubernetes.io/projected/b1bdbe36-5c1a-4b38-9f78-8a8b9d12affd-kube-api-access-xqjjl\") pod \"redhat-marketplace-d6flt\" (UID: \"b1bdbe36-5c1a-4b38-9f78-8a8b9d12affd\") " pod="openshift-marketplace/redhat-marketplace-d6flt" Dec 02 23:31:13 crc kubenswrapper[4696]: I1202 23:31:13.049183 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1bdbe36-5c1a-4b38-9f78-8a8b9d12affd-catalog-content\") pod \"redhat-marketplace-d6flt\" (UID: \"b1bdbe36-5c1a-4b38-9f78-8a8b9d12affd\") " pod="openshift-marketplace/redhat-marketplace-d6flt" Dec 02 23:31:13 crc kubenswrapper[4696]: I1202 23:31:13.049295 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1bdbe36-5c1a-4b38-9f78-8a8b9d12affd-utilities\") pod \"redhat-marketplace-d6flt\" (UID: \"b1bdbe36-5c1a-4b38-9f78-8a8b9d12affd\") " pod="openshift-marketplace/redhat-marketplace-d6flt" Dec 02 23:31:13 crc 
kubenswrapper[4696]: I1202 23:31:13.080390 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-glhqg"] Dec 02 23:31:13 crc kubenswrapper[4696]: I1202 23:31:13.083959 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-glhqg" Dec 02 23:31:13 crc kubenswrapper[4696]: I1202 23:31:13.096049 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-glhqg"] Dec 02 23:31:13 crc kubenswrapper[4696]: I1202 23:31:13.151732 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1bdbe36-5c1a-4b38-9f78-8a8b9d12affd-utilities\") pod \"redhat-marketplace-d6flt\" (UID: \"b1bdbe36-5c1a-4b38-9f78-8a8b9d12affd\") " pod="openshift-marketplace/redhat-marketplace-d6flt" Dec 02 23:31:13 crc kubenswrapper[4696]: I1202 23:31:13.151927 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqjjl\" (UniqueName: \"kubernetes.io/projected/b1bdbe36-5c1a-4b38-9f78-8a8b9d12affd-kube-api-access-xqjjl\") pod \"redhat-marketplace-d6flt\" (UID: \"b1bdbe36-5c1a-4b38-9f78-8a8b9d12affd\") " pod="openshift-marketplace/redhat-marketplace-d6flt" Dec 02 23:31:13 crc kubenswrapper[4696]: I1202 23:31:13.151974 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1bdbe36-5c1a-4b38-9f78-8a8b9d12affd-catalog-content\") pod \"redhat-marketplace-d6flt\" (UID: \"b1bdbe36-5c1a-4b38-9f78-8a8b9d12affd\") " pod="openshift-marketplace/redhat-marketplace-d6flt" Dec 02 23:31:13 crc kubenswrapper[4696]: I1202 23:31:13.152516 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1bdbe36-5c1a-4b38-9f78-8a8b9d12affd-utilities\") pod \"redhat-marketplace-d6flt\" (UID: 
\"b1bdbe36-5c1a-4b38-9f78-8a8b9d12affd\") " pod="openshift-marketplace/redhat-marketplace-d6flt" Dec 02 23:31:13 crc kubenswrapper[4696]: I1202 23:31:13.152617 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1bdbe36-5c1a-4b38-9f78-8a8b9d12affd-catalog-content\") pod \"redhat-marketplace-d6flt\" (UID: \"b1bdbe36-5c1a-4b38-9f78-8a8b9d12affd\") " pod="openshift-marketplace/redhat-marketplace-d6flt" Dec 02 23:31:13 crc kubenswrapper[4696]: I1202 23:31:13.178810 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqjjl\" (UniqueName: \"kubernetes.io/projected/b1bdbe36-5c1a-4b38-9f78-8a8b9d12affd-kube-api-access-xqjjl\") pod \"redhat-marketplace-d6flt\" (UID: \"b1bdbe36-5c1a-4b38-9f78-8a8b9d12affd\") " pod="openshift-marketplace/redhat-marketplace-d6flt" Dec 02 23:31:13 crc kubenswrapper[4696]: I1202 23:31:13.220423 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d6flt" Dec 02 23:31:13 crc kubenswrapper[4696]: I1202 23:31:13.255267 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5w8kr\" (UniqueName: \"kubernetes.io/projected/a9fd5143-e4df-4530-a108-c612a47aa916-kube-api-access-5w8kr\") pod \"redhat-operators-glhqg\" (UID: \"a9fd5143-e4df-4530-a108-c612a47aa916\") " pod="openshift-marketplace/redhat-operators-glhqg" Dec 02 23:31:13 crc kubenswrapper[4696]: I1202 23:31:13.255592 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9fd5143-e4df-4530-a108-c612a47aa916-utilities\") pod \"redhat-operators-glhqg\" (UID: \"a9fd5143-e4df-4530-a108-c612a47aa916\") " pod="openshift-marketplace/redhat-operators-glhqg" Dec 02 23:31:13 crc kubenswrapper[4696]: I1202 23:31:13.255732 4696 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9fd5143-e4df-4530-a108-c612a47aa916-catalog-content\") pod \"redhat-operators-glhqg\" (UID: \"a9fd5143-e4df-4530-a108-c612a47aa916\") " pod="openshift-marketplace/redhat-operators-glhqg" Dec 02 23:31:13 crc kubenswrapper[4696]: I1202 23:31:13.357596 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9fd5143-e4df-4530-a108-c612a47aa916-catalog-content\") pod \"redhat-operators-glhqg\" (UID: \"a9fd5143-e4df-4530-a108-c612a47aa916\") " pod="openshift-marketplace/redhat-operators-glhqg" Dec 02 23:31:13 crc kubenswrapper[4696]: I1202 23:31:13.358026 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5w8kr\" (UniqueName: \"kubernetes.io/projected/a9fd5143-e4df-4530-a108-c612a47aa916-kube-api-access-5w8kr\") pod \"redhat-operators-glhqg\" (UID: \"a9fd5143-e4df-4530-a108-c612a47aa916\") " pod="openshift-marketplace/redhat-operators-glhqg" Dec 02 23:31:13 crc kubenswrapper[4696]: I1202 23:31:13.358070 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9fd5143-e4df-4530-a108-c612a47aa916-utilities\") pod \"redhat-operators-glhqg\" (UID: \"a9fd5143-e4df-4530-a108-c612a47aa916\") " pod="openshift-marketplace/redhat-operators-glhqg" Dec 02 23:31:13 crc kubenswrapper[4696]: I1202 23:31:13.358611 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9fd5143-e4df-4530-a108-c612a47aa916-utilities\") pod \"redhat-operators-glhqg\" (UID: \"a9fd5143-e4df-4530-a108-c612a47aa916\") " pod="openshift-marketplace/redhat-operators-glhqg" Dec 02 23:31:13 crc kubenswrapper[4696]: I1202 23:31:13.358852 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9fd5143-e4df-4530-a108-c612a47aa916-catalog-content\") pod \"redhat-operators-glhqg\" (UID: \"a9fd5143-e4df-4530-a108-c612a47aa916\") " pod="openshift-marketplace/redhat-operators-glhqg" Dec 02 23:31:13 crc kubenswrapper[4696]: I1202 23:31:13.391572 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5w8kr\" (UniqueName: \"kubernetes.io/projected/a9fd5143-e4df-4530-a108-c612a47aa916-kube-api-access-5w8kr\") pod \"redhat-operators-glhqg\" (UID: \"a9fd5143-e4df-4530-a108-c612a47aa916\") " pod="openshift-marketplace/redhat-operators-glhqg" Dec 02 23:31:13 crc kubenswrapper[4696]: I1202 23:31:13.426189 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-glhqg" Dec 02 23:31:13 crc kubenswrapper[4696]: I1202 23:31:13.779271 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-d6flt"] Dec 02 23:31:13 crc kubenswrapper[4696]: I1202 23:31:13.994174 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-glhqg"] Dec 02 23:31:14 crc kubenswrapper[4696]: W1202 23:31:14.064956 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda9fd5143_e4df_4530_a108_c612a47aa916.slice/crio-576a0e9196b022712706762baaa3becadbfbbacfa225fa28c7d5534d4cbb916c WatchSource:0}: Error finding container 576a0e9196b022712706762baaa3becadbfbbacfa225fa28c7d5534d4cbb916c: Status 404 returned error can't find the container with id 576a0e9196b022712706762baaa3becadbfbbacfa225fa28c7d5534d4cbb916c Dec 02 23:31:14 crc kubenswrapper[4696]: I1202 23:31:14.250515 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-glhqg" 
event={"ID":"a9fd5143-e4df-4530-a108-c612a47aa916","Type":"ContainerStarted","Data":"8c20e2090501b03d8a8ae3385d036746836faa18280ac1bc35c38b30cf79a620"} Dec 02 23:31:14 crc kubenswrapper[4696]: I1202 23:31:14.251097 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-glhqg" event={"ID":"a9fd5143-e4df-4530-a108-c612a47aa916","Type":"ContainerStarted","Data":"576a0e9196b022712706762baaa3becadbfbbacfa225fa28c7d5534d4cbb916c"} Dec 02 23:31:14 crc kubenswrapper[4696]: I1202 23:31:14.260723 4696 generic.go:334] "Generic (PLEG): container finished" podID="b1bdbe36-5c1a-4b38-9f78-8a8b9d12affd" containerID="86bf00a554f0e620933c9ae3748af69b5e80c64a0c2f13866cd2aa94d0681741" exitCode=0 Dec 02 23:31:14 crc kubenswrapper[4696]: I1202 23:31:14.260815 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d6flt" event={"ID":"b1bdbe36-5c1a-4b38-9f78-8a8b9d12affd","Type":"ContainerDied","Data":"86bf00a554f0e620933c9ae3748af69b5e80c64a0c2f13866cd2aa94d0681741"} Dec 02 23:31:14 crc kubenswrapper[4696]: I1202 23:31:14.260897 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d6flt" event={"ID":"b1bdbe36-5c1a-4b38-9f78-8a8b9d12affd","Type":"ContainerStarted","Data":"bdf0b79294bbd35f6dd3895ba3bf57b7d74360dbad9283ab81b7f14c84c751cc"} Dec 02 23:31:15 crc kubenswrapper[4696]: I1202 23:31:15.278332 4696 generic.go:334] "Generic (PLEG): container finished" podID="a9fd5143-e4df-4530-a108-c612a47aa916" containerID="8c20e2090501b03d8a8ae3385d036746836faa18280ac1bc35c38b30cf79a620" exitCode=0 Dec 02 23:31:15 crc kubenswrapper[4696]: I1202 23:31:15.278422 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-glhqg" event={"ID":"a9fd5143-e4df-4530-a108-c612a47aa916","Type":"ContainerDied","Data":"8c20e2090501b03d8a8ae3385d036746836faa18280ac1bc35c38b30cf79a620"} Dec 02 23:31:15 crc kubenswrapper[4696]: I1202 
23:31:15.296323 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d6flt" event={"ID":"b1bdbe36-5c1a-4b38-9f78-8a8b9d12affd","Type":"ContainerStarted","Data":"1d157c22078b0a9dc5ea52ca266295b9666cfd451a37023f058a95fe64fdeb10"} Dec 02 23:31:16 crc kubenswrapper[4696]: I1202 23:31:16.311571 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-glhqg" event={"ID":"a9fd5143-e4df-4530-a108-c612a47aa916","Type":"ContainerStarted","Data":"9518e85fff79a3a52210a0e7fd6714d172c15e2271e1427dcb143d7fd0867dad"} Dec 02 23:31:16 crc kubenswrapper[4696]: I1202 23:31:16.316349 4696 generic.go:334] "Generic (PLEG): container finished" podID="b1bdbe36-5c1a-4b38-9f78-8a8b9d12affd" containerID="1d157c22078b0a9dc5ea52ca266295b9666cfd451a37023f058a95fe64fdeb10" exitCode=0 Dec 02 23:31:16 crc kubenswrapper[4696]: I1202 23:31:16.316406 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d6flt" event={"ID":"b1bdbe36-5c1a-4b38-9f78-8a8b9d12affd","Type":"ContainerDied","Data":"1d157c22078b0a9dc5ea52ca266295b9666cfd451a37023f058a95fe64fdeb10"} Dec 02 23:31:18 crc kubenswrapper[4696]: I1202 23:31:18.343867 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d6flt" event={"ID":"b1bdbe36-5c1a-4b38-9f78-8a8b9d12affd","Type":"ContainerStarted","Data":"7c69cb173ab319efc4b0495c71cea4540c2736036ca569253f270659196e27f2"} Dec 02 23:31:18 crc kubenswrapper[4696]: I1202 23:31:18.378314 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-d6flt" podStartSLOduration=3.166686076 podStartE2EDuration="6.378288155s" podCreationTimestamp="2025-12-02 23:31:12 +0000 UTC" firstStartedPulling="2025-12-02 23:31:14.267539226 +0000 UTC m=+2937.148219227" lastFinishedPulling="2025-12-02 23:31:17.479141305 +0000 UTC m=+2940.359821306" observedRunningTime="2025-12-02 
23:31:18.368123028 +0000 UTC m=+2941.248803029" watchObservedRunningTime="2025-12-02 23:31:18.378288155 +0000 UTC m=+2941.258968156" Dec 02 23:31:19 crc kubenswrapper[4696]: I1202 23:31:19.356217 4696 generic.go:334] "Generic (PLEG): container finished" podID="a9fd5143-e4df-4530-a108-c612a47aa916" containerID="9518e85fff79a3a52210a0e7fd6714d172c15e2271e1427dcb143d7fd0867dad" exitCode=0 Dec 02 23:31:19 crc kubenswrapper[4696]: I1202 23:31:19.356944 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-glhqg" event={"ID":"a9fd5143-e4df-4530-a108-c612a47aa916","Type":"ContainerDied","Data":"9518e85fff79a3a52210a0e7fd6714d172c15e2271e1427dcb143d7fd0867dad"} Dec 02 23:31:21 crc kubenswrapper[4696]: I1202 23:31:21.384859 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-glhqg" event={"ID":"a9fd5143-e4df-4530-a108-c612a47aa916","Type":"ContainerStarted","Data":"54baa4f7f861ffa010675e4029520ded45b5089823441950e75264359e1cd1c0"} Dec 02 23:31:21 crc kubenswrapper[4696]: I1202 23:31:21.418709 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-glhqg" podStartSLOduration=3.489900975 podStartE2EDuration="8.418685405s" podCreationTimestamp="2025-12-02 23:31:13 +0000 UTC" firstStartedPulling="2025-12-02 23:31:15.28352056 +0000 UTC m=+2938.164200591" lastFinishedPulling="2025-12-02 23:31:20.21230502 +0000 UTC m=+2943.092985021" observedRunningTime="2025-12-02 23:31:21.407316414 +0000 UTC m=+2944.287996455" watchObservedRunningTime="2025-12-02 23:31:21.418685405 +0000 UTC m=+2944.299365406" Dec 02 23:31:23 crc kubenswrapper[4696]: I1202 23:31:23.221299 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-d6flt" Dec 02 23:31:23 crc kubenswrapper[4696]: I1202 23:31:23.221791 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-marketplace-d6flt" Dec 02 23:31:23 crc kubenswrapper[4696]: I1202 23:31:23.296297 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-d6flt" Dec 02 23:31:23 crc kubenswrapper[4696]: I1202 23:31:23.426996 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-glhqg" Dec 02 23:31:23 crc kubenswrapper[4696]: I1202 23:31:23.427061 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-glhqg" Dec 02 23:31:23 crc kubenswrapper[4696]: I1202 23:31:23.455532 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-d6flt" Dec 02 23:31:24 crc kubenswrapper[4696]: I1202 23:31:24.491027 4696 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-glhqg" podUID="a9fd5143-e4df-4530-a108-c612a47aa916" containerName="registry-server" probeResult="failure" output=< Dec 02 23:31:24 crc kubenswrapper[4696]: timeout: failed to connect service ":50051" within 1s Dec 02 23:31:24 crc kubenswrapper[4696]: > Dec 02 23:31:24 crc kubenswrapper[4696]: I1202 23:31:24.655324 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-d6flt"] Dec 02 23:31:25 crc kubenswrapper[4696]: I1202 23:31:25.440116 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-d6flt" podUID="b1bdbe36-5c1a-4b38-9f78-8a8b9d12affd" containerName="registry-server" containerID="cri-o://7c69cb173ab319efc4b0495c71cea4540c2736036ca569253f270659196e27f2" gracePeriod=2 Dec 02 23:31:25 crc kubenswrapper[4696]: I1202 23:31:25.966458 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d6flt" Dec 02 23:31:26 crc kubenswrapper[4696]: I1202 23:31:26.087755 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1bdbe36-5c1a-4b38-9f78-8a8b9d12affd-catalog-content\") pod \"b1bdbe36-5c1a-4b38-9f78-8a8b9d12affd\" (UID: \"b1bdbe36-5c1a-4b38-9f78-8a8b9d12affd\") " Dec 02 23:31:26 crc kubenswrapper[4696]: I1202 23:31:26.088353 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1bdbe36-5c1a-4b38-9f78-8a8b9d12affd-utilities\") pod \"b1bdbe36-5c1a-4b38-9f78-8a8b9d12affd\" (UID: \"b1bdbe36-5c1a-4b38-9f78-8a8b9d12affd\") " Dec 02 23:31:26 crc kubenswrapper[4696]: I1202 23:31:26.088466 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xqjjl\" (UniqueName: \"kubernetes.io/projected/b1bdbe36-5c1a-4b38-9f78-8a8b9d12affd-kube-api-access-xqjjl\") pod \"b1bdbe36-5c1a-4b38-9f78-8a8b9d12affd\" (UID: \"b1bdbe36-5c1a-4b38-9f78-8a8b9d12affd\") " Dec 02 23:31:26 crc kubenswrapper[4696]: I1202 23:31:26.089286 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1bdbe36-5c1a-4b38-9f78-8a8b9d12affd-utilities" (OuterVolumeSpecName: "utilities") pod "b1bdbe36-5c1a-4b38-9f78-8a8b9d12affd" (UID: "b1bdbe36-5c1a-4b38-9f78-8a8b9d12affd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:31:26 crc kubenswrapper[4696]: I1202 23:31:26.099899 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1bdbe36-5c1a-4b38-9f78-8a8b9d12affd-kube-api-access-xqjjl" (OuterVolumeSpecName: "kube-api-access-xqjjl") pod "b1bdbe36-5c1a-4b38-9f78-8a8b9d12affd" (UID: "b1bdbe36-5c1a-4b38-9f78-8a8b9d12affd"). InnerVolumeSpecName "kube-api-access-xqjjl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:31:26 crc kubenswrapper[4696]: I1202 23:31:26.112528 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1bdbe36-5c1a-4b38-9f78-8a8b9d12affd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b1bdbe36-5c1a-4b38-9f78-8a8b9d12affd" (UID: "b1bdbe36-5c1a-4b38-9f78-8a8b9d12affd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:31:26 crc kubenswrapper[4696]: I1202 23:31:26.191355 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xqjjl\" (UniqueName: \"kubernetes.io/projected/b1bdbe36-5c1a-4b38-9f78-8a8b9d12affd-kube-api-access-xqjjl\") on node \"crc\" DevicePath \"\"" Dec 02 23:31:26 crc kubenswrapper[4696]: I1202 23:31:26.191414 4696 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1bdbe36-5c1a-4b38-9f78-8a8b9d12affd-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 23:31:26 crc kubenswrapper[4696]: I1202 23:31:26.191433 4696 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1bdbe36-5c1a-4b38-9f78-8a8b9d12affd-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 23:31:26 crc kubenswrapper[4696]: I1202 23:31:26.455719 4696 generic.go:334] "Generic (PLEG): container finished" podID="b1bdbe36-5c1a-4b38-9f78-8a8b9d12affd" containerID="7c69cb173ab319efc4b0495c71cea4540c2736036ca569253f270659196e27f2" exitCode=0 Dec 02 23:31:26 crc kubenswrapper[4696]: I1202 23:31:26.455873 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d6flt" event={"ID":"b1bdbe36-5c1a-4b38-9f78-8a8b9d12affd","Type":"ContainerDied","Data":"7c69cb173ab319efc4b0495c71cea4540c2736036ca569253f270659196e27f2"} Dec 02 23:31:26 crc kubenswrapper[4696]: I1202 23:31:26.455928 4696 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-d6flt" event={"ID":"b1bdbe36-5c1a-4b38-9f78-8a8b9d12affd","Type":"ContainerDied","Data":"bdf0b79294bbd35f6dd3895ba3bf57b7d74360dbad9283ab81b7f14c84c751cc"} Dec 02 23:31:26 crc kubenswrapper[4696]: I1202 23:31:26.455957 4696 scope.go:117] "RemoveContainer" containerID="7c69cb173ab319efc4b0495c71cea4540c2736036ca569253f270659196e27f2" Dec 02 23:31:26 crc kubenswrapper[4696]: I1202 23:31:26.456187 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d6flt" Dec 02 23:31:26 crc kubenswrapper[4696]: I1202 23:31:26.510054 4696 scope.go:117] "RemoveContainer" containerID="1d157c22078b0a9dc5ea52ca266295b9666cfd451a37023f058a95fe64fdeb10" Dec 02 23:31:26 crc kubenswrapper[4696]: I1202 23:31:26.538013 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-d6flt"] Dec 02 23:31:26 crc kubenswrapper[4696]: I1202 23:31:26.547049 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-d6flt"] Dec 02 23:31:26 crc kubenswrapper[4696]: I1202 23:31:26.547443 4696 scope.go:117] "RemoveContainer" containerID="86bf00a554f0e620933c9ae3748af69b5e80c64a0c2f13866cd2aa94d0681741" Dec 02 23:31:26 crc kubenswrapper[4696]: I1202 23:31:26.621647 4696 scope.go:117] "RemoveContainer" containerID="7c69cb173ab319efc4b0495c71cea4540c2736036ca569253f270659196e27f2" Dec 02 23:31:26 crc kubenswrapper[4696]: E1202 23:31:26.622483 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c69cb173ab319efc4b0495c71cea4540c2736036ca569253f270659196e27f2\": container with ID starting with 7c69cb173ab319efc4b0495c71cea4540c2736036ca569253f270659196e27f2 not found: ID does not exist" containerID="7c69cb173ab319efc4b0495c71cea4540c2736036ca569253f270659196e27f2" Dec 02 23:31:26 crc kubenswrapper[4696]: I1202 23:31:26.622722 4696 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c69cb173ab319efc4b0495c71cea4540c2736036ca569253f270659196e27f2"} err="failed to get container status \"7c69cb173ab319efc4b0495c71cea4540c2736036ca569253f270659196e27f2\": rpc error: code = NotFound desc = could not find container \"7c69cb173ab319efc4b0495c71cea4540c2736036ca569253f270659196e27f2\": container with ID starting with 7c69cb173ab319efc4b0495c71cea4540c2736036ca569253f270659196e27f2 not found: ID does not exist" Dec 02 23:31:26 crc kubenswrapper[4696]: I1202 23:31:26.622787 4696 scope.go:117] "RemoveContainer" containerID="1d157c22078b0a9dc5ea52ca266295b9666cfd451a37023f058a95fe64fdeb10" Dec 02 23:31:26 crc kubenswrapper[4696]: E1202 23:31:26.623351 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d157c22078b0a9dc5ea52ca266295b9666cfd451a37023f058a95fe64fdeb10\": container with ID starting with 1d157c22078b0a9dc5ea52ca266295b9666cfd451a37023f058a95fe64fdeb10 not found: ID does not exist" containerID="1d157c22078b0a9dc5ea52ca266295b9666cfd451a37023f058a95fe64fdeb10" Dec 02 23:31:26 crc kubenswrapper[4696]: I1202 23:31:26.623387 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d157c22078b0a9dc5ea52ca266295b9666cfd451a37023f058a95fe64fdeb10"} err="failed to get container status \"1d157c22078b0a9dc5ea52ca266295b9666cfd451a37023f058a95fe64fdeb10\": rpc error: code = NotFound desc = could not find container \"1d157c22078b0a9dc5ea52ca266295b9666cfd451a37023f058a95fe64fdeb10\": container with ID starting with 1d157c22078b0a9dc5ea52ca266295b9666cfd451a37023f058a95fe64fdeb10 not found: ID does not exist" Dec 02 23:31:26 crc kubenswrapper[4696]: I1202 23:31:26.623410 4696 scope.go:117] "RemoveContainer" containerID="86bf00a554f0e620933c9ae3748af69b5e80c64a0c2f13866cd2aa94d0681741" Dec 02 23:31:26 crc kubenswrapper[4696]: E1202 
23:31:26.623681 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86bf00a554f0e620933c9ae3748af69b5e80c64a0c2f13866cd2aa94d0681741\": container with ID starting with 86bf00a554f0e620933c9ae3748af69b5e80c64a0c2f13866cd2aa94d0681741 not found: ID does not exist" containerID="86bf00a554f0e620933c9ae3748af69b5e80c64a0c2f13866cd2aa94d0681741" Dec 02 23:31:26 crc kubenswrapper[4696]: I1202 23:31:26.623716 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86bf00a554f0e620933c9ae3748af69b5e80c64a0c2f13866cd2aa94d0681741"} err="failed to get container status \"86bf00a554f0e620933c9ae3748af69b5e80c64a0c2f13866cd2aa94d0681741\": rpc error: code = NotFound desc = could not find container \"86bf00a554f0e620933c9ae3748af69b5e80c64a0c2f13866cd2aa94d0681741\": container with ID starting with 86bf00a554f0e620933c9ae3748af69b5e80c64a0c2f13866cd2aa94d0681741 not found: ID does not exist" Dec 02 23:31:27 crc kubenswrapper[4696]: I1202 23:31:27.448538 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1bdbe36-5c1a-4b38-9f78-8a8b9d12affd" path="/var/lib/kubelet/pods/b1bdbe36-5c1a-4b38-9f78-8a8b9d12affd/volumes" Dec 02 23:31:33 crc kubenswrapper[4696]: I1202 23:31:33.504081 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-glhqg" Dec 02 23:31:33 crc kubenswrapper[4696]: I1202 23:31:33.582323 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-glhqg" Dec 02 23:31:33 crc kubenswrapper[4696]: I1202 23:31:33.750774 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-glhqg"] Dec 02 23:31:34 crc kubenswrapper[4696]: I1202 23:31:34.769528 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-glhqg" 
podUID="a9fd5143-e4df-4530-a108-c612a47aa916" containerName="registry-server" containerID="cri-o://54baa4f7f861ffa010675e4029520ded45b5089823441950e75264359e1cd1c0" gracePeriod=2 Dec 02 23:31:35 crc kubenswrapper[4696]: I1202 23:31:35.355695 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-glhqg" Dec 02 23:31:35 crc kubenswrapper[4696]: I1202 23:31:35.507924 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9fd5143-e4df-4530-a108-c612a47aa916-utilities\") pod \"a9fd5143-e4df-4530-a108-c612a47aa916\" (UID: \"a9fd5143-e4df-4530-a108-c612a47aa916\") " Dec 02 23:31:35 crc kubenswrapper[4696]: I1202 23:31:35.508237 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5w8kr\" (UniqueName: \"kubernetes.io/projected/a9fd5143-e4df-4530-a108-c612a47aa916-kube-api-access-5w8kr\") pod \"a9fd5143-e4df-4530-a108-c612a47aa916\" (UID: \"a9fd5143-e4df-4530-a108-c612a47aa916\") " Dec 02 23:31:35 crc kubenswrapper[4696]: I1202 23:31:35.508314 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9fd5143-e4df-4530-a108-c612a47aa916-catalog-content\") pod \"a9fd5143-e4df-4530-a108-c612a47aa916\" (UID: \"a9fd5143-e4df-4530-a108-c612a47aa916\") " Dec 02 23:31:35 crc kubenswrapper[4696]: I1202 23:31:35.509138 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9fd5143-e4df-4530-a108-c612a47aa916-utilities" (OuterVolumeSpecName: "utilities") pod "a9fd5143-e4df-4530-a108-c612a47aa916" (UID: "a9fd5143-e4df-4530-a108-c612a47aa916"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:31:35 crc kubenswrapper[4696]: I1202 23:31:35.511913 4696 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9fd5143-e4df-4530-a108-c612a47aa916-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 23:31:35 crc kubenswrapper[4696]: I1202 23:31:35.517709 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9fd5143-e4df-4530-a108-c612a47aa916-kube-api-access-5w8kr" (OuterVolumeSpecName: "kube-api-access-5w8kr") pod "a9fd5143-e4df-4530-a108-c612a47aa916" (UID: "a9fd5143-e4df-4530-a108-c612a47aa916"). InnerVolumeSpecName "kube-api-access-5w8kr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:31:35 crc kubenswrapper[4696]: I1202 23:31:35.614624 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5w8kr\" (UniqueName: \"kubernetes.io/projected/a9fd5143-e4df-4530-a108-c612a47aa916-kube-api-access-5w8kr\") on node \"crc\" DevicePath \"\"" Dec 02 23:31:35 crc kubenswrapper[4696]: I1202 23:31:35.668699 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9fd5143-e4df-4530-a108-c612a47aa916-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a9fd5143-e4df-4530-a108-c612a47aa916" (UID: "a9fd5143-e4df-4530-a108-c612a47aa916"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:31:35 crc kubenswrapper[4696]: I1202 23:31:35.718296 4696 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9fd5143-e4df-4530-a108-c612a47aa916-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 23:31:35 crc kubenswrapper[4696]: I1202 23:31:35.790494 4696 generic.go:334] "Generic (PLEG): container finished" podID="a9fd5143-e4df-4530-a108-c612a47aa916" containerID="54baa4f7f861ffa010675e4029520ded45b5089823441950e75264359e1cd1c0" exitCode=0 Dec 02 23:31:35 crc kubenswrapper[4696]: I1202 23:31:35.790591 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-glhqg" event={"ID":"a9fd5143-e4df-4530-a108-c612a47aa916","Type":"ContainerDied","Data":"54baa4f7f861ffa010675e4029520ded45b5089823441950e75264359e1cd1c0"} Dec 02 23:31:35 crc kubenswrapper[4696]: I1202 23:31:35.790674 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-glhqg" Dec 02 23:31:35 crc kubenswrapper[4696]: I1202 23:31:35.791127 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-glhqg" event={"ID":"a9fd5143-e4df-4530-a108-c612a47aa916","Type":"ContainerDied","Data":"576a0e9196b022712706762baaa3becadbfbbacfa225fa28c7d5534d4cbb916c"} Dec 02 23:31:35 crc kubenswrapper[4696]: I1202 23:31:35.791151 4696 scope.go:117] "RemoveContainer" containerID="54baa4f7f861ffa010675e4029520ded45b5089823441950e75264359e1cd1c0" Dec 02 23:31:35 crc kubenswrapper[4696]: I1202 23:31:35.829620 4696 scope.go:117] "RemoveContainer" containerID="9518e85fff79a3a52210a0e7fd6714d172c15e2271e1427dcb143d7fd0867dad" Dec 02 23:31:35 crc kubenswrapper[4696]: I1202 23:31:35.858815 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-glhqg"] Dec 02 23:31:35 crc kubenswrapper[4696]: I1202 23:31:35.871314 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-glhqg"] Dec 02 23:31:35 crc kubenswrapper[4696]: I1202 23:31:35.873637 4696 scope.go:117] "RemoveContainer" containerID="8c20e2090501b03d8a8ae3385d036746836faa18280ac1bc35c38b30cf79a620" Dec 02 23:31:35 crc kubenswrapper[4696]: I1202 23:31:35.947780 4696 scope.go:117] "RemoveContainer" containerID="54baa4f7f861ffa010675e4029520ded45b5089823441950e75264359e1cd1c0" Dec 02 23:31:35 crc kubenswrapper[4696]: E1202 23:31:35.948544 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54baa4f7f861ffa010675e4029520ded45b5089823441950e75264359e1cd1c0\": container with ID starting with 54baa4f7f861ffa010675e4029520ded45b5089823441950e75264359e1cd1c0 not found: ID does not exist" containerID="54baa4f7f861ffa010675e4029520ded45b5089823441950e75264359e1cd1c0" Dec 02 23:31:35 crc kubenswrapper[4696]: I1202 23:31:35.948617 4696 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54baa4f7f861ffa010675e4029520ded45b5089823441950e75264359e1cd1c0"} err="failed to get container status \"54baa4f7f861ffa010675e4029520ded45b5089823441950e75264359e1cd1c0\": rpc error: code = NotFound desc = could not find container \"54baa4f7f861ffa010675e4029520ded45b5089823441950e75264359e1cd1c0\": container with ID starting with 54baa4f7f861ffa010675e4029520ded45b5089823441950e75264359e1cd1c0 not found: ID does not exist" Dec 02 23:31:35 crc kubenswrapper[4696]: I1202 23:31:35.948658 4696 scope.go:117] "RemoveContainer" containerID="9518e85fff79a3a52210a0e7fd6714d172c15e2271e1427dcb143d7fd0867dad" Dec 02 23:31:35 crc kubenswrapper[4696]: E1202 23:31:35.949159 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9518e85fff79a3a52210a0e7fd6714d172c15e2271e1427dcb143d7fd0867dad\": container with ID starting with 9518e85fff79a3a52210a0e7fd6714d172c15e2271e1427dcb143d7fd0867dad not found: ID does not exist" containerID="9518e85fff79a3a52210a0e7fd6714d172c15e2271e1427dcb143d7fd0867dad" Dec 02 23:31:35 crc kubenswrapper[4696]: I1202 23:31:35.949361 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9518e85fff79a3a52210a0e7fd6714d172c15e2271e1427dcb143d7fd0867dad"} err="failed to get container status \"9518e85fff79a3a52210a0e7fd6714d172c15e2271e1427dcb143d7fd0867dad\": rpc error: code = NotFound desc = could not find container \"9518e85fff79a3a52210a0e7fd6714d172c15e2271e1427dcb143d7fd0867dad\": container with ID starting with 9518e85fff79a3a52210a0e7fd6714d172c15e2271e1427dcb143d7fd0867dad not found: ID does not exist" Dec 02 23:31:35 crc kubenswrapper[4696]: I1202 23:31:35.949397 4696 scope.go:117] "RemoveContainer" containerID="8c20e2090501b03d8a8ae3385d036746836faa18280ac1bc35c38b30cf79a620" Dec 02 23:31:35 crc kubenswrapper[4696]: E1202 
23:31:35.950467 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c20e2090501b03d8a8ae3385d036746836faa18280ac1bc35c38b30cf79a620\": container with ID starting with 8c20e2090501b03d8a8ae3385d036746836faa18280ac1bc35c38b30cf79a620 not found: ID does not exist" containerID="8c20e2090501b03d8a8ae3385d036746836faa18280ac1bc35c38b30cf79a620" Dec 02 23:31:35 crc kubenswrapper[4696]: I1202 23:31:35.950575 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c20e2090501b03d8a8ae3385d036746836faa18280ac1bc35c38b30cf79a620"} err="failed to get container status \"8c20e2090501b03d8a8ae3385d036746836faa18280ac1bc35c38b30cf79a620\": rpc error: code = NotFound desc = could not find container \"8c20e2090501b03d8a8ae3385d036746836faa18280ac1bc35c38b30cf79a620\": container with ID starting with 8c20e2090501b03d8a8ae3385d036746836faa18280ac1bc35c38b30cf79a620 not found: ID does not exist" Dec 02 23:31:37 crc kubenswrapper[4696]: I1202 23:31:37.456379 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9fd5143-e4df-4530-a108-c612a47aa916" path="/var/lib/kubelet/pods/a9fd5143-e4df-4530-a108-c612a47aa916/volumes" Dec 02 23:31:43 crc kubenswrapper[4696]: I1202 23:31:43.893958 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 02 23:31:43 crc kubenswrapper[4696]: I1202 23:31:43.895282 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="f684370c-9731-4837-9e30-675a1f07992d" containerName="prometheus" containerID="cri-o://e9339fa31bcb4e462050530a6dd43d8e802377053cce141770f0f2fbcfb627e3" gracePeriod=600 Dec 02 23:31:43 crc kubenswrapper[4696]: I1202 23:31:43.895903 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" 
podUID="f684370c-9731-4837-9e30-675a1f07992d" containerName="config-reloader" containerID="cri-o://ddfc235b5bb0772d57e16495a8462d5cf42ca4182a8f3779f03d2787ed733fac" gracePeriod=600 Dec 02 23:31:43 crc kubenswrapper[4696]: I1202 23:31:43.895892 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="f684370c-9731-4837-9e30-675a1f07992d" containerName="thanos-sidecar" containerID="cri-o://75db3f0dd5722131cb38a869bf56e7046b5407824c01119b9a156cd34d3b0781" gracePeriod=600 Dec 02 23:31:44 crc kubenswrapper[4696]: I1202 23:31:44.926211 4696 generic.go:334] "Generic (PLEG): container finished" podID="f684370c-9731-4837-9e30-675a1f07992d" containerID="75db3f0dd5722131cb38a869bf56e7046b5407824c01119b9a156cd34d3b0781" exitCode=0 Dec 02 23:31:44 crc kubenswrapper[4696]: I1202 23:31:44.926642 4696 generic.go:334] "Generic (PLEG): container finished" podID="f684370c-9731-4837-9e30-675a1f07992d" containerID="ddfc235b5bb0772d57e16495a8462d5cf42ca4182a8f3779f03d2787ed733fac" exitCode=0 Dec 02 23:31:44 crc kubenswrapper[4696]: I1202 23:31:44.926658 4696 generic.go:334] "Generic (PLEG): container finished" podID="f684370c-9731-4837-9e30-675a1f07992d" containerID="e9339fa31bcb4e462050530a6dd43d8e802377053cce141770f0f2fbcfb627e3" exitCode=0 Dec 02 23:31:44 crc kubenswrapper[4696]: I1202 23:31:44.926688 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"f684370c-9731-4837-9e30-675a1f07992d","Type":"ContainerDied","Data":"75db3f0dd5722131cb38a869bf56e7046b5407824c01119b9a156cd34d3b0781"} Dec 02 23:31:44 crc kubenswrapper[4696]: I1202 23:31:44.926724 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"f684370c-9731-4837-9e30-675a1f07992d","Type":"ContainerDied","Data":"ddfc235b5bb0772d57e16495a8462d5cf42ca4182a8f3779f03d2787ed733fac"} Dec 02 23:31:44 crc kubenswrapper[4696]: I1202 23:31:44.926757 
4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"f684370c-9731-4837-9e30-675a1f07992d","Type":"ContainerDied","Data":"e9339fa31bcb4e462050530a6dd43d8e802377053cce141770f0f2fbcfb627e3"} Dec 02 23:31:44 crc kubenswrapper[4696]: I1202 23:31:44.926774 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"f684370c-9731-4837-9e30-675a1f07992d","Type":"ContainerDied","Data":"099099debcd5a303a0f4db0ae5c135cd96e22da621e12a050538b8ee97ebaa9a"} Dec 02 23:31:44 crc kubenswrapper[4696]: I1202 23:31:44.926790 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="099099debcd5a303a0f4db0ae5c135cd96e22da621e12a050538b8ee97ebaa9a" Dec 02 23:31:44 crc kubenswrapper[4696]: I1202 23:31:44.958234 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 02 23:31:44 crc kubenswrapper[4696]: I1202 23:31:44.972507 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f684370c-9731-4837-9e30-675a1f07992d-web-config\") pod \"f684370c-9731-4837-9e30-675a1f07992d\" (UID: \"f684370c-9731-4837-9e30-675a1f07992d\") " Dec 02 23:31:44 crc kubenswrapper[4696]: I1202 23:31:44.972584 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f684370c-9731-4837-9e30-675a1f07992d-prometheus-metric-storage-rulefiles-0\") pod \"f684370c-9731-4837-9e30-675a1f07992d\" (UID: \"f684370c-9731-4837-9e30-675a1f07992d\") " Dec 02 23:31:44 crc kubenswrapper[4696]: I1202 23:31:44.973404 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f684370c-9731-4837-9e30-675a1f07992d-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: 
"prometheus-metric-storage-rulefiles-0") pod "f684370c-9731-4837-9e30-675a1f07992d" (UID: "f684370c-9731-4837-9e30-675a1f07992d"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:31:44 crc kubenswrapper[4696]: I1202 23:31:44.973711 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ab2c851f-258a-4469-a351-d04930617bdc\") pod \"f684370c-9731-4837-9e30-675a1f07992d\" (UID: \"f684370c-9731-4837-9e30-675a1f07992d\") " Dec 02 23:31:44 crc kubenswrapper[4696]: I1202 23:31:44.973832 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/f684370c-9731-4837-9e30-675a1f07992d-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"f684370c-9731-4837-9e30-675a1f07992d\" (UID: \"f684370c-9731-4837-9e30-675a1f07992d\") " Dec 02 23:31:44 crc kubenswrapper[4696]: I1202 23:31:44.973928 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f684370c-9731-4837-9e30-675a1f07992d-tls-assets\") pod \"f684370c-9731-4837-9e30-675a1f07992d\" (UID: \"f684370c-9731-4837-9e30-675a1f07992d\") " Dec 02 23:31:44 crc kubenswrapper[4696]: I1202 23:31:44.973972 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f684370c-9731-4837-9e30-675a1f07992d-config\") pod \"f684370c-9731-4837-9e30-675a1f07992d\" (UID: \"f684370c-9731-4837-9e30-675a1f07992d\") " Dec 02 23:31:44 crc kubenswrapper[4696]: I1202 23:31:44.974064 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2tnnv\" (UniqueName: 
\"kubernetes.io/projected/f684370c-9731-4837-9e30-675a1f07992d-kube-api-access-2tnnv\") pod \"f684370c-9731-4837-9e30-675a1f07992d\" (UID: \"f684370c-9731-4837-9e30-675a1f07992d\") " Dec 02 23:31:44 crc kubenswrapper[4696]: I1202 23:31:44.974171 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f684370c-9731-4837-9e30-675a1f07992d-thanos-prometheus-http-client-file\") pod \"f684370c-9731-4837-9e30-675a1f07992d\" (UID: \"f684370c-9731-4837-9e30-675a1f07992d\") " Dec 02 23:31:44 crc kubenswrapper[4696]: I1202 23:31:44.974963 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/f684370c-9731-4837-9e30-675a1f07992d-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"f684370c-9731-4837-9e30-675a1f07992d\" (UID: \"f684370c-9731-4837-9e30-675a1f07992d\") " Dec 02 23:31:44 crc kubenswrapper[4696]: I1202 23:31:44.975076 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f684370c-9731-4837-9e30-675a1f07992d-config-out\") pod \"f684370c-9731-4837-9e30-675a1f07992d\" (UID: \"f684370c-9731-4837-9e30-675a1f07992d\") " Dec 02 23:31:44 crc kubenswrapper[4696]: I1202 23:31:44.975186 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f684370c-9731-4837-9e30-675a1f07992d-secret-combined-ca-bundle\") pod \"f684370c-9731-4837-9e30-675a1f07992d\" (UID: \"f684370c-9731-4837-9e30-675a1f07992d\") " Dec 02 23:31:44 crc kubenswrapper[4696]: I1202 23:31:44.977299 4696 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: 
\"kubernetes.io/configmap/f684370c-9731-4837-9e30-675a1f07992d-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Dec 02 23:31:44 crc kubenswrapper[4696]: I1202 23:31:44.980391 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f684370c-9731-4837-9e30-675a1f07992d-kube-api-access-2tnnv" (OuterVolumeSpecName: "kube-api-access-2tnnv") pod "f684370c-9731-4837-9e30-675a1f07992d" (UID: "f684370c-9731-4837-9e30-675a1f07992d"). InnerVolumeSpecName "kube-api-access-2tnnv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:31:44 crc kubenswrapper[4696]: I1202 23:31:44.981035 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f684370c-9731-4837-9e30-675a1f07992d-secret-combined-ca-bundle" (OuterVolumeSpecName: "secret-combined-ca-bundle") pod "f684370c-9731-4837-9e30-675a1f07992d" (UID: "f684370c-9731-4837-9e30-675a1f07992d"). InnerVolumeSpecName "secret-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:31:44 crc kubenswrapper[4696]: I1202 23:31:44.984194 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f684370c-9731-4837-9e30-675a1f07992d-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "f684370c-9731-4837-9e30-675a1f07992d" (UID: "f684370c-9731-4837-9e30-675a1f07992d"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:31:44 crc kubenswrapper[4696]: I1202 23:31:44.987721 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f684370c-9731-4837-9e30-675a1f07992d-config" (OuterVolumeSpecName: "config") pod "f684370c-9731-4837-9e30-675a1f07992d" (UID: "f684370c-9731-4837-9e30-675a1f07992d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:31:44 crc kubenswrapper[4696]: I1202 23:31:44.988623 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f684370c-9731-4837-9e30-675a1f07992d-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d" (OuterVolumeSpecName: "web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d") pod "f684370c-9731-4837-9e30-675a1f07992d" (UID: "f684370c-9731-4837-9e30-675a1f07992d"). InnerVolumeSpecName "web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:31:44 crc kubenswrapper[4696]: I1202 23:31:44.991196 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f684370c-9731-4837-9e30-675a1f07992d-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d" (OuterVolumeSpecName: "web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d") pod "f684370c-9731-4837-9e30-675a1f07992d" (UID: "f684370c-9731-4837-9e30-675a1f07992d"). InnerVolumeSpecName "web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:31:44 crc kubenswrapper[4696]: I1202 23:31:44.992455 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f684370c-9731-4837-9e30-675a1f07992d-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "f684370c-9731-4837-9e30-675a1f07992d" (UID: "f684370c-9731-4837-9e30-675a1f07992d"). InnerVolumeSpecName "thanos-prometheus-http-client-file". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:31:44 crc kubenswrapper[4696]: I1202 23:31:44.992558 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f684370c-9731-4837-9e30-675a1f07992d-config-out" (OuterVolumeSpecName: "config-out") pod "f684370c-9731-4837-9e30-675a1f07992d" (UID: "f684370c-9731-4837-9e30-675a1f07992d"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:31:45 crc kubenswrapper[4696]: I1202 23:31:45.048132 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ab2c851f-258a-4469-a351-d04930617bdc" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "f684370c-9731-4837-9e30-675a1f07992d" (UID: "f684370c-9731-4837-9e30-675a1f07992d"). InnerVolumeSpecName "pvc-ab2c851f-258a-4469-a351-d04930617bdc". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 02 23:31:45 crc kubenswrapper[4696]: I1202 23:31:45.080439 4696 reconciler_common.go:293] "Volume detached for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/f684370c-9731-4837-9e30-675a1f07992d-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") on node \"crc\" DevicePath \"\"" Dec 02 23:31:45 crc kubenswrapper[4696]: I1202 23:31:45.080470 4696 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f684370c-9731-4837-9e30-675a1f07992d-config-out\") on node \"crc\" DevicePath \"\"" Dec 02 23:31:45 crc kubenswrapper[4696]: I1202 23:31:45.080483 4696 reconciler_common.go:293] "Volume detached for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f684370c-9731-4837-9e30-675a1f07992d-secret-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 23:31:45 crc kubenswrapper[4696]: I1202 23:31:45.080525 4696 reconciler_common.go:286] 
"operationExecutor.UnmountDevice started for volume \"pvc-ab2c851f-258a-4469-a351-d04930617bdc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ab2c851f-258a-4469-a351-d04930617bdc\") on node \"crc\" " Dec 02 23:31:45 crc kubenswrapper[4696]: I1202 23:31:45.080538 4696 reconciler_common.go:293] "Volume detached for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/f684370c-9731-4837-9e30-675a1f07992d-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") on node \"crc\" DevicePath \"\"" Dec 02 23:31:45 crc kubenswrapper[4696]: I1202 23:31:45.080552 4696 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f684370c-9731-4837-9e30-675a1f07992d-tls-assets\") on node \"crc\" DevicePath \"\"" Dec 02 23:31:45 crc kubenswrapper[4696]: I1202 23:31:45.080565 4696 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/f684370c-9731-4837-9e30-675a1f07992d-config\") on node \"crc\" DevicePath \"\"" Dec 02 23:31:45 crc kubenswrapper[4696]: I1202 23:31:45.080575 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2tnnv\" (UniqueName: \"kubernetes.io/projected/f684370c-9731-4837-9e30-675a1f07992d-kube-api-access-2tnnv\") on node \"crc\" DevicePath \"\"" Dec 02 23:31:45 crc kubenswrapper[4696]: I1202 23:31:45.080585 4696 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f684370c-9731-4837-9e30-675a1f07992d-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Dec 02 23:31:45 crc kubenswrapper[4696]: I1202 23:31:45.110816 4696 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Dec 02 23:31:45 crc kubenswrapper[4696]: I1202 23:31:45.111045 4696 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-ab2c851f-258a-4469-a351-d04930617bdc" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ab2c851f-258a-4469-a351-d04930617bdc") on node "crc" Dec 02 23:31:45 crc kubenswrapper[4696]: I1202 23:31:45.121387 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f684370c-9731-4837-9e30-675a1f07992d-web-config" (OuterVolumeSpecName: "web-config") pod "f684370c-9731-4837-9e30-675a1f07992d" (UID: "f684370c-9731-4837-9e30-675a1f07992d"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:31:45 crc kubenswrapper[4696]: I1202 23:31:45.199678 4696 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f684370c-9731-4837-9e30-675a1f07992d-web-config\") on node \"crc\" DevicePath \"\"" Dec 02 23:31:45 crc kubenswrapper[4696]: I1202 23:31:45.199729 4696 reconciler_common.go:293] "Volume detached for volume \"pvc-ab2c851f-258a-4469-a351-d04930617bdc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ab2c851f-258a-4469-a351-d04930617bdc\") on node \"crc\" DevicePath \"\"" Dec 02 23:31:45 crc kubenswrapper[4696]: I1202 23:31:45.942460 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 02 23:31:45 crc kubenswrapper[4696]: I1202 23:31:45.988676 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 02 23:31:46 crc kubenswrapper[4696]: I1202 23:31:46.004188 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 02 23:31:46 crc kubenswrapper[4696]: I1202 23:31:46.029024 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 02 23:31:46 crc kubenswrapper[4696]: E1202 23:31:46.029495 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f684370c-9731-4837-9e30-675a1f07992d" containerName="thanos-sidecar" Dec 02 23:31:46 crc kubenswrapper[4696]: I1202 23:31:46.029520 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="f684370c-9731-4837-9e30-675a1f07992d" containerName="thanos-sidecar" Dec 02 23:31:46 crc kubenswrapper[4696]: E1202 23:31:46.029538 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f684370c-9731-4837-9e30-675a1f07992d" containerName="config-reloader" Dec 02 23:31:46 crc kubenswrapper[4696]: I1202 23:31:46.029546 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="f684370c-9731-4837-9e30-675a1f07992d" containerName="config-reloader" Dec 02 23:31:46 crc kubenswrapper[4696]: E1202 23:31:46.029561 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9fd5143-e4df-4530-a108-c612a47aa916" containerName="extract-content" Dec 02 23:31:46 crc kubenswrapper[4696]: I1202 23:31:46.029567 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9fd5143-e4df-4530-a108-c612a47aa916" containerName="extract-content" Dec 02 23:31:46 crc kubenswrapper[4696]: E1202 23:31:46.029587 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1bdbe36-5c1a-4b38-9f78-8a8b9d12affd" containerName="extract-utilities" Dec 02 23:31:46 crc kubenswrapper[4696]: I1202 
23:31:46.029593 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1bdbe36-5c1a-4b38-9f78-8a8b9d12affd" containerName="extract-utilities" Dec 02 23:31:46 crc kubenswrapper[4696]: E1202 23:31:46.029606 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f684370c-9731-4837-9e30-675a1f07992d" containerName="prometheus" Dec 02 23:31:46 crc kubenswrapper[4696]: I1202 23:31:46.029612 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="f684370c-9731-4837-9e30-675a1f07992d" containerName="prometheus" Dec 02 23:31:46 crc kubenswrapper[4696]: E1202 23:31:46.029624 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1bdbe36-5c1a-4b38-9f78-8a8b9d12affd" containerName="registry-server" Dec 02 23:31:46 crc kubenswrapper[4696]: I1202 23:31:46.029629 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1bdbe36-5c1a-4b38-9f78-8a8b9d12affd" containerName="registry-server" Dec 02 23:31:46 crc kubenswrapper[4696]: E1202 23:31:46.029642 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f684370c-9731-4837-9e30-675a1f07992d" containerName="init-config-reloader" Dec 02 23:31:46 crc kubenswrapper[4696]: I1202 23:31:46.029650 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="f684370c-9731-4837-9e30-675a1f07992d" containerName="init-config-reloader" Dec 02 23:31:46 crc kubenswrapper[4696]: E1202 23:31:46.029664 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9fd5143-e4df-4530-a108-c612a47aa916" containerName="extract-utilities" Dec 02 23:31:46 crc kubenswrapper[4696]: I1202 23:31:46.029671 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9fd5143-e4df-4530-a108-c612a47aa916" containerName="extract-utilities" Dec 02 23:31:46 crc kubenswrapper[4696]: E1202 23:31:46.029684 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9fd5143-e4df-4530-a108-c612a47aa916" containerName="registry-server" Dec 02 23:31:46 crc kubenswrapper[4696]: I1202 
23:31:46.029690 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9fd5143-e4df-4530-a108-c612a47aa916" containerName="registry-server" Dec 02 23:31:46 crc kubenswrapper[4696]: E1202 23:31:46.029705 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1bdbe36-5c1a-4b38-9f78-8a8b9d12affd" containerName="extract-content" Dec 02 23:31:46 crc kubenswrapper[4696]: I1202 23:31:46.029711 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1bdbe36-5c1a-4b38-9f78-8a8b9d12affd" containerName="extract-content" Dec 02 23:31:46 crc kubenswrapper[4696]: I1202 23:31:46.029923 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="f684370c-9731-4837-9e30-675a1f07992d" containerName="prometheus" Dec 02 23:31:46 crc kubenswrapper[4696]: I1202 23:31:46.029945 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9fd5143-e4df-4530-a108-c612a47aa916" containerName="registry-server" Dec 02 23:31:46 crc kubenswrapper[4696]: I1202 23:31:46.029953 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="f684370c-9731-4837-9e30-675a1f07992d" containerName="thanos-sidecar" Dec 02 23:31:46 crc kubenswrapper[4696]: I1202 23:31:46.029966 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1bdbe36-5c1a-4b38-9f78-8a8b9d12affd" containerName="registry-server" Dec 02 23:31:46 crc kubenswrapper[4696]: I1202 23:31:46.029976 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="f684370c-9731-4837-9e30-675a1f07992d" containerName="config-reloader" Dec 02 23:31:46 crc kubenswrapper[4696]: I1202 23:31:46.035367 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 02 23:31:46 crc kubenswrapper[4696]: I1202 23:31:46.037887 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Dec 02 23:31:46 crc kubenswrapper[4696]: I1202 23:31:46.038666 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Dec 02 23:31:46 crc kubenswrapper[4696]: I1202 23:31:46.040184 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Dec 02 23:31:46 crc kubenswrapper[4696]: I1202 23:31:46.040183 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Dec 02 23:31:46 crc kubenswrapper[4696]: I1202 23:31:46.040356 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-2dcqn" Dec 02 23:31:46 crc kubenswrapper[4696]: I1202 23:31:46.058501 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Dec 02 23:31:46 crc kubenswrapper[4696]: I1202 23:31:46.072486 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 02 23:31:46 crc kubenswrapper[4696]: I1202 23:31:46.121288 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/ef5ae851-44bb-46fa-9245-abc5b46b1771-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"ef5ae851-44bb-46fa-9245-abc5b46b1771\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:31:46 crc kubenswrapper[4696]: I1202 23:31:46.121358 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"pvc-ab2c851f-258a-4469-a351-d04930617bdc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ab2c851f-258a-4469-a351-d04930617bdc\") pod \"prometheus-metric-storage-0\" (UID: \"ef5ae851-44bb-46fa-9245-abc5b46b1771\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:31:46 crc kubenswrapper[4696]: I1202 23:31:46.121664 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ef5ae851-44bb-46fa-9245-abc5b46b1771-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"ef5ae851-44bb-46fa-9245-abc5b46b1771\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:31:46 crc kubenswrapper[4696]: I1202 23:31:46.121710 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef5ae851-44bb-46fa-9245-abc5b46b1771-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"ef5ae851-44bb-46fa-9245-abc5b46b1771\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:31:46 crc kubenswrapper[4696]: I1202 23:31:46.121734 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ef5ae851-44bb-46fa-9245-abc5b46b1771-config\") pod \"prometheus-metric-storage-0\" (UID: \"ef5ae851-44bb-46fa-9245-abc5b46b1771\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:31:46 crc kubenswrapper[4696]: I1202 23:31:46.121810 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/ef5ae851-44bb-46fa-9245-abc5b46b1771-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"ef5ae851-44bb-46fa-9245-abc5b46b1771\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:31:46 crc kubenswrapper[4696]: I1202 
23:31:46.121854 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/ef5ae851-44bb-46fa-9245-abc5b46b1771-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"ef5ae851-44bb-46fa-9245-abc5b46b1771\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:31:46 crc kubenswrapper[4696]: I1202 23:31:46.121881 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktjzp\" (UniqueName: \"kubernetes.io/projected/ef5ae851-44bb-46fa-9245-abc5b46b1771-kube-api-access-ktjzp\") pod \"prometheus-metric-storage-0\" (UID: \"ef5ae851-44bb-46fa-9245-abc5b46b1771\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:31:46 crc kubenswrapper[4696]: I1202 23:31:46.121908 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ef5ae851-44bb-46fa-9245-abc5b46b1771-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"ef5ae851-44bb-46fa-9245-abc5b46b1771\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:31:46 crc kubenswrapper[4696]: I1202 23:31:46.123184 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/ef5ae851-44bb-46fa-9245-abc5b46b1771-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"ef5ae851-44bb-46fa-9245-abc5b46b1771\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:31:46 crc kubenswrapper[4696]: I1202 23:31:46.123267 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ef5ae851-44bb-46fa-9245-abc5b46b1771-web-config\") pod 
\"prometheus-metric-storage-0\" (UID: \"ef5ae851-44bb-46fa-9245-abc5b46b1771\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:31:46 crc kubenswrapper[4696]: I1202 23:31:46.226203 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ef5ae851-44bb-46fa-9245-abc5b46b1771-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"ef5ae851-44bb-46fa-9245-abc5b46b1771\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:31:46 crc kubenswrapper[4696]: I1202 23:31:46.226883 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef5ae851-44bb-46fa-9245-abc5b46b1771-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"ef5ae851-44bb-46fa-9245-abc5b46b1771\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:31:46 crc kubenswrapper[4696]: I1202 23:31:46.226971 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ef5ae851-44bb-46fa-9245-abc5b46b1771-config\") pod \"prometheus-metric-storage-0\" (UID: \"ef5ae851-44bb-46fa-9245-abc5b46b1771\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:31:46 crc kubenswrapper[4696]: I1202 23:31:46.227061 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/ef5ae851-44bb-46fa-9245-abc5b46b1771-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"ef5ae851-44bb-46fa-9245-abc5b46b1771\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:31:46 crc kubenswrapper[4696]: I1202 23:31:46.227148 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: 
\"kubernetes.io/configmap/ef5ae851-44bb-46fa-9245-abc5b46b1771-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"ef5ae851-44bb-46fa-9245-abc5b46b1771\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:31:46 crc kubenswrapper[4696]: I1202 23:31:46.227252 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktjzp\" (UniqueName: \"kubernetes.io/projected/ef5ae851-44bb-46fa-9245-abc5b46b1771-kube-api-access-ktjzp\") pod \"prometheus-metric-storage-0\" (UID: \"ef5ae851-44bb-46fa-9245-abc5b46b1771\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:31:46 crc kubenswrapper[4696]: I1202 23:31:46.227368 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ef5ae851-44bb-46fa-9245-abc5b46b1771-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"ef5ae851-44bb-46fa-9245-abc5b46b1771\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:31:46 crc kubenswrapper[4696]: I1202 23:31:46.227491 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/ef5ae851-44bb-46fa-9245-abc5b46b1771-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"ef5ae851-44bb-46fa-9245-abc5b46b1771\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:31:46 crc kubenswrapper[4696]: I1202 23:31:46.227577 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ef5ae851-44bb-46fa-9245-abc5b46b1771-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"ef5ae851-44bb-46fa-9245-abc5b46b1771\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:31:46 crc kubenswrapper[4696]: I1202 23:31:46.227703 4696 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/ef5ae851-44bb-46fa-9245-abc5b46b1771-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"ef5ae851-44bb-46fa-9245-abc5b46b1771\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:31:46 crc kubenswrapper[4696]: I1202 23:31:46.227828 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-ab2c851f-258a-4469-a351-d04930617bdc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ab2c851f-258a-4469-a351-d04930617bdc\") pod \"prometheus-metric-storage-0\" (UID: \"ef5ae851-44bb-46fa-9245-abc5b46b1771\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:31:46 crc kubenswrapper[4696]: I1202 23:31:46.229336 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/ef5ae851-44bb-46fa-9245-abc5b46b1771-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"ef5ae851-44bb-46fa-9245-abc5b46b1771\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:31:46 crc kubenswrapper[4696]: I1202 23:31:46.233341 4696 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 02 23:31:46 crc kubenswrapper[4696]: I1202 23:31:46.233400 4696 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-ab2c851f-258a-4469-a351-d04930617bdc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ab2c851f-258a-4469-a351-d04930617bdc\") pod \"prometheus-metric-storage-0\" (UID: \"ef5ae851-44bb-46fa-9245-abc5b46b1771\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/5fca820ef489b1541daddc3ea9aef396303f957b46bb94f2636f2ae9edc8d588/globalmount\"" pod="openstack/prometheus-metric-storage-0" Dec 02 23:31:46 crc kubenswrapper[4696]: I1202 23:31:46.237039 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/ef5ae851-44bb-46fa-9245-abc5b46b1771-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"ef5ae851-44bb-46fa-9245-abc5b46b1771\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:31:46 crc kubenswrapper[4696]: I1202 23:31:46.237897 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef5ae851-44bb-46fa-9245-abc5b46b1771-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"ef5ae851-44bb-46fa-9245-abc5b46b1771\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:31:46 crc kubenswrapper[4696]: I1202 23:31:46.239485 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ef5ae851-44bb-46fa-9245-abc5b46b1771-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"ef5ae851-44bb-46fa-9245-abc5b46b1771\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:31:46 crc kubenswrapper[4696]: I1202 23:31:46.239908 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" 
(UniqueName: \"kubernetes.io/secret/ef5ae851-44bb-46fa-9245-abc5b46b1771-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"ef5ae851-44bb-46fa-9245-abc5b46b1771\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:31:46 crc kubenswrapper[4696]: I1202 23:31:46.241087 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/ef5ae851-44bb-46fa-9245-abc5b46b1771-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"ef5ae851-44bb-46fa-9245-abc5b46b1771\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:31:46 crc kubenswrapper[4696]: I1202 23:31:46.249919 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ef5ae851-44bb-46fa-9245-abc5b46b1771-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"ef5ae851-44bb-46fa-9245-abc5b46b1771\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:31:46 crc kubenswrapper[4696]: I1202 23:31:46.250295 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/ef5ae851-44bb-46fa-9245-abc5b46b1771-config\") pod \"prometheus-metric-storage-0\" (UID: \"ef5ae851-44bb-46fa-9245-abc5b46b1771\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:31:46 crc kubenswrapper[4696]: I1202 23:31:46.258618 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ef5ae851-44bb-46fa-9245-abc5b46b1771-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"ef5ae851-44bb-46fa-9245-abc5b46b1771\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:31:46 crc kubenswrapper[4696]: I1202 23:31:46.259514 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-ktjzp\" (UniqueName: \"kubernetes.io/projected/ef5ae851-44bb-46fa-9245-abc5b46b1771-kube-api-access-ktjzp\") pod \"prometheus-metric-storage-0\" (UID: \"ef5ae851-44bb-46fa-9245-abc5b46b1771\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:31:46 crc kubenswrapper[4696]: I1202 23:31:46.317050 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-ab2c851f-258a-4469-a351-d04930617bdc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ab2c851f-258a-4469-a351-d04930617bdc\") pod \"prometheus-metric-storage-0\" (UID: \"ef5ae851-44bb-46fa-9245-abc5b46b1771\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:31:46 crc kubenswrapper[4696]: I1202 23:31:46.367768 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 02 23:31:46 crc kubenswrapper[4696]: I1202 23:31:46.688552 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 02 23:31:46 crc kubenswrapper[4696]: I1202 23:31:46.957691 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"ef5ae851-44bb-46fa-9245-abc5b46b1771","Type":"ContainerStarted","Data":"85729c5756a0baa48f7aab0ce26aa83d7a6391d3074f78acd4fcf3e173d25e0d"} Dec 02 23:31:47 crc kubenswrapper[4696]: I1202 23:31:47.445843 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f684370c-9731-4837-9e30-675a1f07992d" path="/var/lib/kubelet/pods/f684370c-9731-4837-9e30-675a1f07992d/volumes" Dec 02 23:31:52 crc kubenswrapper[4696]: I1202 23:31:52.025662 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"ef5ae851-44bb-46fa-9245-abc5b46b1771","Type":"ContainerStarted","Data":"185483b490010233fdc29b534e9bdc74a3907ca674a6138bc4f4f3cbc63eca43"} Dec 02 23:32:02 crc kubenswrapper[4696]: I1202 23:32:02.171332 4696 generic.go:334] "Generic 
(PLEG): container finished" podID="ef5ae851-44bb-46fa-9245-abc5b46b1771" containerID="185483b490010233fdc29b534e9bdc74a3907ca674a6138bc4f4f3cbc63eca43" exitCode=0 Dec 02 23:32:02 crc kubenswrapper[4696]: I1202 23:32:02.171442 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"ef5ae851-44bb-46fa-9245-abc5b46b1771","Type":"ContainerDied","Data":"185483b490010233fdc29b534e9bdc74a3907ca674a6138bc4f4f3cbc63eca43"} Dec 02 23:32:03 crc kubenswrapper[4696]: I1202 23:32:03.194314 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"ef5ae851-44bb-46fa-9245-abc5b46b1771","Type":"ContainerStarted","Data":"305ad3a6489883472c53854430aa7a33fbcf6f4e6ad98bdd465bb7c95b99c5eb"} Dec 02 23:32:07 crc kubenswrapper[4696]: I1202 23:32:07.247452 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"ef5ae851-44bb-46fa-9245-abc5b46b1771","Type":"ContainerStarted","Data":"9fa8d14ff1c99e037359e65e6d9b853f3601179f81ee537ed0f65ace521b72a8"} Dec 02 23:32:08 crc kubenswrapper[4696]: I1202 23:32:08.265388 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"ef5ae851-44bb-46fa-9245-abc5b46b1771","Type":"ContainerStarted","Data":"9bdbdb285a238d2b52024ee772faa96f224ba66d28e5f2b5a25d5259892e3f2a"} Dec 02 23:32:08 crc kubenswrapper[4696]: I1202 23:32:08.328016 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=23.327988198 podStartE2EDuration="23.327988198s" podCreationTimestamp="2025-12-02 23:31:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:32:08.324813188 +0000 UTC m=+2991.205493219" watchObservedRunningTime="2025-12-02 23:32:08.327988198 +0000 UTC m=+2991.208668199" Dec 02 
23:32:11 crc kubenswrapper[4696]: I1202 23:32:11.369006 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Dec 02 23:32:16 crc kubenswrapper[4696]: I1202 23:32:16.369256 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Dec 02 23:32:16 crc kubenswrapper[4696]: I1202 23:32:16.387659 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Dec 02 23:32:17 crc kubenswrapper[4696]: I1202 23:32:17.389663 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Dec 02 23:32:37 crc kubenswrapper[4696]: I1202 23:32:37.636526 4696 scope.go:117] "RemoveContainer" containerID="5904a25317bb6f8f674fde8baca90ba8678adb0286936a3a753d35d329879616" Dec 02 23:32:37 crc kubenswrapper[4696]: I1202 23:32:37.697913 4696 scope.go:117] "RemoveContainer" containerID="ddfc235b5bb0772d57e16495a8462d5cf42ca4182a8f3779f03d2787ed733fac" Dec 02 23:32:37 crc kubenswrapper[4696]: I1202 23:32:37.730194 4696 scope.go:117] "RemoveContainer" containerID="75db3f0dd5722131cb38a869bf56e7046b5407824c01119b9a156cd34d3b0781" Dec 02 23:32:37 crc kubenswrapper[4696]: I1202 23:32:37.780847 4696 scope.go:117] "RemoveContainer" containerID="e9339fa31bcb4e462050530a6dd43d8e802377053cce141770f0f2fbcfb627e3" Dec 02 23:32:38 crc kubenswrapper[4696]: I1202 23:32:38.811657 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Dec 02 23:32:38 crc kubenswrapper[4696]: I1202 23:32:38.813339 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 02 23:32:38 crc kubenswrapper[4696]: I1202 23:32:38.816357 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Dec 02 23:32:38 crc kubenswrapper[4696]: I1202 23:32:38.816705 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Dec 02 23:32:38 crc kubenswrapper[4696]: I1202 23:32:38.816972 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Dec 02 23:32:38 crc kubenswrapper[4696]: I1202 23:32:38.817429 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-w6blf" Dec 02 23:32:38 crc kubenswrapper[4696]: I1202 23:32:38.827760 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Dec 02 23:32:38 crc kubenswrapper[4696]: I1202 23:32:38.921275 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/4881d1aa-7494-45fe-b21b-5cae7bfe2f41-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"4881d1aa-7494-45fe-b21b-5cae7bfe2f41\") " pod="openstack/tempest-tests-tempest" Dec 02 23:32:38 crc kubenswrapper[4696]: I1202 23:32:38.921347 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4881d1aa-7494-45fe-b21b-5cae7bfe2f41-config-data\") pod \"tempest-tests-tempest\" (UID: \"4881d1aa-7494-45fe-b21b-5cae7bfe2f41\") " pod="openstack/tempest-tests-tempest" Dec 02 23:32:38 crc kubenswrapper[4696]: I1202 23:32:38.921372 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4881d1aa-7494-45fe-b21b-5cae7bfe2f41-openstack-config\") pod 
\"tempest-tests-tempest\" (UID: \"4881d1aa-7494-45fe-b21b-5cae7bfe2f41\") " pod="openstack/tempest-tests-tempest" Dec 02 23:32:38 crc kubenswrapper[4696]: I1202 23:32:38.921395 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/4881d1aa-7494-45fe-b21b-5cae7bfe2f41-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"4881d1aa-7494-45fe-b21b-5cae7bfe2f41\") " pod="openstack/tempest-tests-tempest" Dec 02 23:32:38 crc kubenswrapper[4696]: I1202 23:32:38.921445 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hls5b\" (UniqueName: \"kubernetes.io/projected/4881d1aa-7494-45fe-b21b-5cae7bfe2f41-kube-api-access-hls5b\") pod \"tempest-tests-tempest\" (UID: \"4881d1aa-7494-45fe-b21b-5cae7bfe2f41\") " pod="openstack/tempest-tests-tempest" Dec 02 23:32:38 crc kubenswrapper[4696]: I1202 23:32:38.921509 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4881d1aa-7494-45fe-b21b-5cae7bfe2f41-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"4881d1aa-7494-45fe-b21b-5cae7bfe2f41\") " pod="openstack/tempest-tests-tempest" Dec 02 23:32:38 crc kubenswrapper[4696]: I1202 23:32:38.921554 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4881d1aa-7494-45fe-b21b-5cae7bfe2f41-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"4881d1aa-7494-45fe-b21b-5cae7bfe2f41\") " pod="openstack/tempest-tests-tempest" Dec 02 23:32:38 crc kubenswrapper[4696]: I1202 23:32:38.921577 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: 
\"kubernetes.io/empty-dir/4881d1aa-7494-45fe-b21b-5cae7bfe2f41-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"4881d1aa-7494-45fe-b21b-5cae7bfe2f41\") " pod="openstack/tempest-tests-tempest" Dec 02 23:32:38 crc kubenswrapper[4696]: I1202 23:32:38.921597 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"tempest-tests-tempest\" (UID: \"4881d1aa-7494-45fe-b21b-5cae7bfe2f41\") " pod="openstack/tempest-tests-tempest" Dec 02 23:32:39 crc kubenswrapper[4696]: I1202 23:32:39.023547 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hls5b\" (UniqueName: \"kubernetes.io/projected/4881d1aa-7494-45fe-b21b-5cae7bfe2f41-kube-api-access-hls5b\") pod \"tempest-tests-tempest\" (UID: \"4881d1aa-7494-45fe-b21b-5cae7bfe2f41\") " pod="openstack/tempest-tests-tempest" Dec 02 23:32:39 crc kubenswrapper[4696]: I1202 23:32:39.023650 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4881d1aa-7494-45fe-b21b-5cae7bfe2f41-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"4881d1aa-7494-45fe-b21b-5cae7bfe2f41\") " pod="openstack/tempest-tests-tempest" Dec 02 23:32:39 crc kubenswrapper[4696]: I1202 23:32:39.023701 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4881d1aa-7494-45fe-b21b-5cae7bfe2f41-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"4881d1aa-7494-45fe-b21b-5cae7bfe2f41\") " pod="openstack/tempest-tests-tempest" Dec 02 23:32:39 crc kubenswrapper[4696]: I1202 23:32:39.023725 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: 
\"kubernetes.io/empty-dir/4881d1aa-7494-45fe-b21b-5cae7bfe2f41-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"4881d1aa-7494-45fe-b21b-5cae7bfe2f41\") " pod="openstack/tempest-tests-tempest" Dec 02 23:32:39 crc kubenswrapper[4696]: I1202 23:32:39.023803 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"tempest-tests-tempest\" (UID: \"4881d1aa-7494-45fe-b21b-5cae7bfe2f41\") " pod="openstack/tempest-tests-tempest" Dec 02 23:32:39 crc kubenswrapper[4696]: I1202 23:32:39.023867 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/4881d1aa-7494-45fe-b21b-5cae7bfe2f41-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"4881d1aa-7494-45fe-b21b-5cae7bfe2f41\") " pod="openstack/tempest-tests-tempest" Dec 02 23:32:39 crc kubenswrapper[4696]: I1202 23:32:39.023894 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4881d1aa-7494-45fe-b21b-5cae7bfe2f41-config-data\") pod \"tempest-tests-tempest\" (UID: \"4881d1aa-7494-45fe-b21b-5cae7bfe2f41\") " pod="openstack/tempest-tests-tempest" Dec 02 23:32:39 crc kubenswrapper[4696]: I1202 23:32:39.023917 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4881d1aa-7494-45fe-b21b-5cae7bfe2f41-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"4881d1aa-7494-45fe-b21b-5cae7bfe2f41\") " pod="openstack/tempest-tests-tempest" Dec 02 23:32:39 crc kubenswrapper[4696]: I1202 23:32:39.023941 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/4881d1aa-7494-45fe-b21b-5cae7bfe2f41-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" 
(UID: \"4881d1aa-7494-45fe-b21b-5cae7bfe2f41\") " pod="openstack/tempest-tests-tempest" Dec 02 23:32:39 crc kubenswrapper[4696]: I1202 23:32:39.024346 4696 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"tempest-tests-tempest\" (UID: \"4881d1aa-7494-45fe-b21b-5cae7bfe2f41\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/tempest-tests-tempest" Dec 02 23:32:39 crc kubenswrapper[4696]: I1202 23:32:39.024693 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/4881d1aa-7494-45fe-b21b-5cae7bfe2f41-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"4881d1aa-7494-45fe-b21b-5cae7bfe2f41\") " pod="openstack/tempest-tests-tempest" Dec 02 23:32:39 crc kubenswrapper[4696]: I1202 23:32:39.025149 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/4881d1aa-7494-45fe-b21b-5cae7bfe2f41-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"4881d1aa-7494-45fe-b21b-5cae7bfe2f41\") " pod="openstack/tempest-tests-tempest" Dec 02 23:32:39 crc kubenswrapper[4696]: I1202 23:32:39.025763 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4881d1aa-7494-45fe-b21b-5cae7bfe2f41-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"4881d1aa-7494-45fe-b21b-5cae7bfe2f41\") " pod="openstack/tempest-tests-tempest" Dec 02 23:32:39 crc kubenswrapper[4696]: I1202 23:32:39.025969 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4881d1aa-7494-45fe-b21b-5cae7bfe2f41-config-data\") pod \"tempest-tests-tempest\" (UID: \"4881d1aa-7494-45fe-b21b-5cae7bfe2f41\") " 
pod="openstack/tempest-tests-tempest" Dec 02 23:32:39 crc kubenswrapper[4696]: I1202 23:32:39.039385 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4881d1aa-7494-45fe-b21b-5cae7bfe2f41-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"4881d1aa-7494-45fe-b21b-5cae7bfe2f41\") " pod="openstack/tempest-tests-tempest" Dec 02 23:32:39 crc kubenswrapper[4696]: I1202 23:32:39.039640 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/4881d1aa-7494-45fe-b21b-5cae7bfe2f41-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"4881d1aa-7494-45fe-b21b-5cae7bfe2f41\") " pod="openstack/tempest-tests-tempest" Dec 02 23:32:39 crc kubenswrapper[4696]: I1202 23:32:39.039678 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4881d1aa-7494-45fe-b21b-5cae7bfe2f41-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"4881d1aa-7494-45fe-b21b-5cae7bfe2f41\") " pod="openstack/tempest-tests-tempest" Dec 02 23:32:39 crc kubenswrapper[4696]: I1202 23:32:39.045801 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hls5b\" (UniqueName: \"kubernetes.io/projected/4881d1aa-7494-45fe-b21b-5cae7bfe2f41-kube-api-access-hls5b\") pod \"tempest-tests-tempest\" (UID: \"4881d1aa-7494-45fe-b21b-5cae7bfe2f41\") " pod="openstack/tempest-tests-tempest" Dec 02 23:32:39 crc kubenswrapper[4696]: I1202 23:32:39.063647 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"tempest-tests-tempest\" (UID: \"4881d1aa-7494-45fe-b21b-5cae7bfe2f41\") " pod="openstack/tempest-tests-tempest" Dec 02 23:32:39 crc kubenswrapper[4696]: I1202 23:32:39.136850 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 02 23:32:39 crc kubenswrapper[4696]: I1202 23:32:39.725124 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Dec 02 23:32:40 crc kubenswrapper[4696]: I1202 23:32:40.667074 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"4881d1aa-7494-45fe-b21b-5cae7bfe2f41","Type":"ContainerStarted","Data":"e80116cf4964c80bd3f3d2af6576647888f0966a6ae74ef91fa217d9c1200079"} Dec 02 23:32:52 crc kubenswrapper[4696]: I1202 23:32:52.823102 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"4881d1aa-7494-45fe-b21b-5cae7bfe2f41","Type":"ContainerStarted","Data":"60700ed9991efb7d98d1cd75467245e2863222c007e4d6ba953c590fd95638f0"} Dec 02 23:32:52 crc kubenswrapper[4696]: I1202 23:32:52.850708 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=4.16180102 podStartE2EDuration="15.850684929s" podCreationTimestamp="2025-12-02 23:32:37 +0000 UTC" firstStartedPulling="2025-12-02 23:32:39.734434083 +0000 UTC m=+3022.615114084" lastFinishedPulling="2025-12-02 23:32:51.423317992 +0000 UTC m=+3034.303997993" observedRunningTime="2025-12-02 23:32:52.845525953 +0000 UTC m=+3035.726205964" watchObservedRunningTime="2025-12-02 23:32:52.850684929 +0000 UTC m=+3035.731364930" Dec 02 23:33:22 crc kubenswrapper[4696]: I1202 23:33:22.974453 4696 patch_prober.go:28] interesting pod/machine-config-daemon-chq65 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 23:33:22 crc kubenswrapper[4696]: I1202 23:33:22.975376 4696 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-chq65" 
podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 23:33:52 crc kubenswrapper[4696]: I1202 23:33:52.974442 4696 patch_prober.go:28] interesting pod/machine-config-daemon-chq65 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 23:33:52 crc kubenswrapper[4696]: I1202 23:33:52.975213 4696 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 23:34:22 crc kubenswrapper[4696]: I1202 23:34:22.974187 4696 patch_prober.go:28] interesting pod/machine-config-daemon-chq65 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 23:34:22 crc kubenswrapper[4696]: I1202 23:34:22.975188 4696 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 23:34:22 crc kubenswrapper[4696]: I1202 23:34:22.975264 4696 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-chq65" Dec 02 23:34:22 crc kubenswrapper[4696]: I1202 23:34:22.976698 4696 
kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"73eda48f4864032e0b69f81fb6a4cfe9fc2c55cd1b56397bf0fc4aa0a4b62168"} pod="openshift-machine-config-operator/machine-config-daemon-chq65" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 23:34:22 crc kubenswrapper[4696]: I1202 23:34:22.976883 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" containerName="machine-config-daemon" containerID="cri-o://73eda48f4864032e0b69f81fb6a4cfe9fc2c55cd1b56397bf0fc4aa0a4b62168" gracePeriod=600 Dec 02 23:34:23 crc kubenswrapper[4696]: E1202 23:34:23.107334 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 02 23:34:23 crc kubenswrapper[4696]: I1202 23:34:23.986286 4696 generic.go:334] "Generic (PLEG): container finished" podID="53353260-c7c9-435c-91eb-3d5a1b441c4a" containerID="73eda48f4864032e0b69f81fb6a4cfe9fc2c55cd1b56397bf0fc4aa0a4b62168" exitCode=0 Dec 02 23:34:23 crc kubenswrapper[4696]: I1202 23:34:23.986731 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-chq65" event={"ID":"53353260-c7c9-435c-91eb-3d5a1b441c4a","Type":"ContainerDied","Data":"73eda48f4864032e0b69f81fb6a4cfe9fc2c55cd1b56397bf0fc4aa0a4b62168"} Dec 02 23:34:23 crc kubenswrapper[4696]: I1202 23:34:23.986792 4696 scope.go:117] "RemoveContainer" 
containerID="8065f2d7bf333b48d27aa170233dbb7ad2ef7a547f6e30836fa968a411dbfeae" Dec 02 23:34:23 crc kubenswrapper[4696]: I1202 23:34:23.987548 4696 scope.go:117] "RemoveContainer" containerID="73eda48f4864032e0b69f81fb6a4cfe9fc2c55cd1b56397bf0fc4aa0a4b62168" Dec 02 23:34:23 crc kubenswrapper[4696]: E1202 23:34:23.987834 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 02 23:34:39 crc kubenswrapper[4696]: I1202 23:34:39.432230 4696 scope.go:117] "RemoveContainer" containerID="73eda48f4864032e0b69f81fb6a4cfe9fc2c55cd1b56397bf0fc4aa0a4b62168" Dec 02 23:34:39 crc kubenswrapper[4696]: E1202 23:34:39.433440 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 02 23:34:50 crc kubenswrapper[4696]: I1202 23:34:50.432490 4696 scope.go:117] "RemoveContainer" containerID="73eda48f4864032e0b69f81fb6a4cfe9fc2c55cd1b56397bf0fc4aa0a4b62168" Dec 02 23:34:50 crc kubenswrapper[4696]: E1202 23:34:50.433898 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 02 23:35:02 crc kubenswrapper[4696]: I1202 23:35:02.432398 4696 scope.go:117] "RemoveContainer" containerID="73eda48f4864032e0b69f81fb6a4cfe9fc2c55cd1b56397bf0fc4aa0a4b62168" Dec 02 23:35:02 crc kubenswrapper[4696]: E1202 23:35:02.433566 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 02 23:35:13 crc kubenswrapper[4696]: I1202 23:35:13.432519 4696 scope.go:117] "RemoveContainer" containerID="73eda48f4864032e0b69f81fb6a4cfe9fc2c55cd1b56397bf0fc4aa0a4b62168" Dec 02 23:35:13 crc kubenswrapper[4696]: E1202 23:35:13.433582 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 02 23:35:28 crc kubenswrapper[4696]: I1202 23:35:28.432303 4696 scope.go:117] "RemoveContainer" containerID="73eda48f4864032e0b69f81fb6a4cfe9fc2c55cd1b56397bf0fc4aa0a4b62168" Dec 02 23:35:28 crc kubenswrapper[4696]: E1202 23:35:28.433446 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 02 23:35:39 crc kubenswrapper[4696]: I1202 23:35:39.433988 4696 scope.go:117] "RemoveContainer" containerID="73eda48f4864032e0b69f81fb6a4cfe9fc2c55cd1b56397bf0fc4aa0a4b62168" Dec 02 23:35:39 crc kubenswrapper[4696]: E1202 23:35:39.435239 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 02 23:35:51 crc kubenswrapper[4696]: I1202 23:35:51.432475 4696 scope.go:117] "RemoveContainer" containerID="73eda48f4864032e0b69f81fb6a4cfe9fc2c55cd1b56397bf0fc4aa0a4b62168" Dec 02 23:35:51 crc kubenswrapper[4696]: E1202 23:35:51.433777 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 02 23:36:03 crc kubenswrapper[4696]: I1202 23:36:03.432865 4696 scope.go:117] "RemoveContainer" containerID="73eda48f4864032e0b69f81fb6a4cfe9fc2c55cd1b56397bf0fc4aa0a4b62168" Dec 02 23:36:03 crc kubenswrapper[4696]: E1202 23:36:03.434225 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 02 23:36:17 crc kubenswrapper[4696]: I1202 23:36:17.444797 4696 scope.go:117] "RemoveContainer" containerID="73eda48f4864032e0b69f81fb6a4cfe9fc2c55cd1b56397bf0fc4aa0a4b62168" Dec 02 23:36:17 crc kubenswrapper[4696]: E1202 23:36:17.446408 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 02 23:36:31 crc kubenswrapper[4696]: I1202 23:36:31.529124 4696 scope.go:117] "RemoveContainer" containerID="73eda48f4864032e0b69f81fb6a4cfe9fc2c55cd1b56397bf0fc4aa0a4b62168" Dec 02 23:36:31 crc kubenswrapper[4696]: E1202 23:36:31.530109 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 02 23:36:42 crc kubenswrapper[4696]: I1202 23:36:42.432108 4696 scope.go:117] "RemoveContainer" containerID="73eda48f4864032e0b69f81fb6a4cfe9fc2c55cd1b56397bf0fc4aa0a4b62168" Dec 02 23:36:42 crc kubenswrapper[4696]: E1202 23:36:42.433185 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 02 23:36:53 crc kubenswrapper[4696]: I1202 23:36:53.432555 4696 scope.go:117] "RemoveContainer" containerID="73eda48f4864032e0b69f81fb6a4cfe9fc2c55cd1b56397bf0fc4aa0a4b62168" Dec 02 23:36:53 crc kubenswrapper[4696]: E1202 23:36:53.433955 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 02 23:37:06 crc kubenswrapper[4696]: I1202 23:37:06.432559 4696 scope.go:117] "RemoveContainer" containerID="73eda48f4864032e0b69f81fb6a4cfe9fc2c55cd1b56397bf0fc4aa0a4b62168" Dec 02 23:37:06 crc kubenswrapper[4696]: E1202 23:37:06.434122 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 02 23:37:17 crc kubenswrapper[4696]: I1202 23:37:17.439840 4696 scope.go:117] "RemoveContainer" containerID="73eda48f4864032e0b69f81fb6a4cfe9fc2c55cd1b56397bf0fc4aa0a4b62168" Dec 02 23:37:17 crc kubenswrapper[4696]: E1202 23:37:17.442058 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 02 23:37:31 crc kubenswrapper[4696]: I1202 23:37:31.578476 4696 scope.go:117] "RemoveContainer" containerID="73eda48f4864032e0b69f81fb6a4cfe9fc2c55cd1b56397bf0fc4aa0a4b62168" Dec 02 23:37:31 crc kubenswrapper[4696]: E1202 23:37:31.580485 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 02 23:37:43 crc kubenswrapper[4696]: I1202 23:37:43.432365 4696 scope.go:117] "RemoveContainer" containerID="73eda48f4864032e0b69f81fb6a4cfe9fc2c55cd1b56397bf0fc4aa0a4b62168" Dec 02 23:37:43 crc kubenswrapper[4696]: E1202 23:37:43.433710 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 02 23:37:56 crc kubenswrapper[4696]: I1202 23:37:56.432272 4696 scope.go:117] "RemoveContainer" containerID="73eda48f4864032e0b69f81fb6a4cfe9fc2c55cd1b56397bf0fc4aa0a4b62168" Dec 02 23:37:56 crc kubenswrapper[4696]: E1202 23:37:56.433243 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 02 23:38:08 crc kubenswrapper[4696]: I1202 23:38:08.432794 4696 scope.go:117] "RemoveContainer" containerID="73eda48f4864032e0b69f81fb6a4cfe9fc2c55cd1b56397bf0fc4aa0a4b62168" Dec 02 23:38:08 crc kubenswrapper[4696]: E1202 23:38:08.434213 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 02 23:38:21 crc kubenswrapper[4696]: I1202 23:38:21.432547 4696 scope.go:117] "RemoveContainer" containerID="73eda48f4864032e0b69f81fb6a4cfe9fc2c55cd1b56397bf0fc4aa0a4b62168" Dec 02 23:38:21 crc kubenswrapper[4696]: E1202 23:38:21.433514 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 02 23:38:36 crc kubenswrapper[4696]: I1202 23:38:36.432698 4696 scope.go:117] "RemoveContainer" containerID="73eda48f4864032e0b69f81fb6a4cfe9fc2c55cd1b56397bf0fc4aa0a4b62168" Dec 02 23:38:36 crc kubenswrapper[4696]: E1202 23:38:36.433605 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 02 23:38:49 crc kubenswrapper[4696]: I1202 23:38:49.431922 4696 scope.go:117] "RemoveContainer" containerID="73eda48f4864032e0b69f81fb6a4cfe9fc2c55cd1b56397bf0fc4aa0a4b62168" Dec 02 23:38:49 crc kubenswrapper[4696]: E1202 23:38:49.432869 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 02 23:39:01 crc kubenswrapper[4696]: I1202 23:39:01.432459 4696 scope.go:117] "RemoveContainer" containerID="73eda48f4864032e0b69f81fb6a4cfe9fc2c55cd1b56397bf0fc4aa0a4b62168" Dec 02 23:39:01 crc kubenswrapper[4696]: E1202 23:39:01.434282 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 02 23:39:13 crc kubenswrapper[4696]: I1202 23:39:13.432117 4696 scope.go:117] "RemoveContainer" containerID="73eda48f4864032e0b69f81fb6a4cfe9fc2c55cd1b56397bf0fc4aa0a4b62168" Dec 02 23:39:13 crc kubenswrapper[4696]: E1202 23:39:13.433283 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 02 23:39:24 crc kubenswrapper[4696]: I1202 23:39:24.432093 4696 scope.go:117] "RemoveContainer" containerID="73eda48f4864032e0b69f81fb6a4cfe9fc2c55cd1b56397bf0fc4aa0a4b62168" Dec 02 23:39:25 crc kubenswrapper[4696]: I1202 23:39:25.551580 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-chq65" event={"ID":"53353260-c7c9-435c-91eb-3d5a1b441c4a","Type":"ContainerStarted","Data":"c3dadcb47bbdf8fa03bd4ae613ed1247b206e2b96f53e86e2dbee5fd3ba26e1b"} Dec 02 23:41:02 crc kubenswrapper[4696]: I1202 23:41:02.374661 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-z8db2"] Dec 02 23:41:02 crc kubenswrapper[4696]: I1202 23:41:02.379799 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-z8db2" Dec 02 23:41:02 crc kubenswrapper[4696]: I1202 23:41:02.391402 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-z8db2"] Dec 02 23:41:02 crc kubenswrapper[4696]: I1202 23:41:02.460652 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ae9211d-1bef-4287-9e0e-22d6ce9fbf49-utilities\") pod \"certified-operators-z8db2\" (UID: \"9ae9211d-1bef-4287-9e0e-22d6ce9fbf49\") " pod="openshift-marketplace/certified-operators-z8db2" Dec 02 23:41:02 crc kubenswrapper[4696]: I1202 23:41:02.461352 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ft4t\" (UniqueName: \"kubernetes.io/projected/9ae9211d-1bef-4287-9e0e-22d6ce9fbf49-kube-api-access-6ft4t\") pod \"certified-operators-z8db2\" (UID: \"9ae9211d-1bef-4287-9e0e-22d6ce9fbf49\") " pod="openshift-marketplace/certified-operators-z8db2" Dec 02 23:41:02 crc kubenswrapper[4696]: I1202 23:41:02.461393 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ae9211d-1bef-4287-9e0e-22d6ce9fbf49-catalog-content\") pod \"certified-operators-z8db2\" (UID: \"9ae9211d-1bef-4287-9e0e-22d6ce9fbf49\") " pod="openshift-marketplace/certified-operators-z8db2" Dec 02 23:41:02 crc kubenswrapper[4696]: I1202 23:41:02.565422 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ae9211d-1bef-4287-9e0e-22d6ce9fbf49-utilities\") pod \"certified-operators-z8db2\" (UID: \"9ae9211d-1bef-4287-9e0e-22d6ce9fbf49\") " pod="openshift-marketplace/certified-operators-z8db2" Dec 02 23:41:02 crc kubenswrapper[4696]: I1202 23:41:02.566088 4696 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ae9211d-1bef-4287-9e0e-22d6ce9fbf49-utilities\") pod \"certified-operators-z8db2\" (UID: \"9ae9211d-1bef-4287-9e0e-22d6ce9fbf49\") " pod="openshift-marketplace/certified-operators-z8db2" Dec 02 23:41:02 crc kubenswrapper[4696]: I1202 23:41:02.566386 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ft4t\" (UniqueName: \"kubernetes.io/projected/9ae9211d-1bef-4287-9e0e-22d6ce9fbf49-kube-api-access-6ft4t\") pod \"certified-operators-z8db2\" (UID: \"9ae9211d-1bef-4287-9e0e-22d6ce9fbf49\") " pod="openshift-marketplace/certified-operators-z8db2" Dec 02 23:41:02 crc kubenswrapper[4696]: I1202 23:41:02.566449 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ae9211d-1bef-4287-9e0e-22d6ce9fbf49-catalog-content\") pod \"certified-operators-z8db2\" (UID: \"9ae9211d-1bef-4287-9e0e-22d6ce9fbf49\") " pod="openshift-marketplace/certified-operators-z8db2" Dec 02 23:41:02 crc kubenswrapper[4696]: I1202 23:41:02.567010 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ae9211d-1bef-4287-9e0e-22d6ce9fbf49-catalog-content\") pod \"certified-operators-z8db2\" (UID: \"9ae9211d-1bef-4287-9e0e-22d6ce9fbf49\") " pod="openshift-marketplace/certified-operators-z8db2" Dec 02 23:41:02 crc kubenswrapper[4696]: I1202 23:41:02.575266 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-729c4"] Dec 02 23:41:02 crc kubenswrapper[4696]: I1202 23:41:02.577522 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-729c4" Dec 02 23:41:02 crc kubenswrapper[4696]: I1202 23:41:02.597939 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ft4t\" (UniqueName: \"kubernetes.io/projected/9ae9211d-1bef-4287-9e0e-22d6ce9fbf49-kube-api-access-6ft4t\") pod \"certified-operators-z8db2\" (UID: \"9ae9211d-1bef-4287-9e0e-22d6ce9fbf49\") " pod="openshift-marketplace/certified-operators-z8db2" Dec 02 23:41:02 crc kubenswrapper[4696]: I1202 23:41:02.604780 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-729c4"] Dec 02 23:41:02 crc kubenswrapper[4696]: I1202 23:41:02.669069 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2q5fb\" (UniqueName: \"kubernetes.io/projected/43ca0c70-2881-48d4-a890-a924f7c351db-kube-api-access-2q5fb\") pod \"community-operators-729c4\" (UID: \"43ca0c70-2881-48d4-a890-a924f7c351db\") " pod="openshift-marketplace/community-operators-729c4" Dec 02 23:41:02 crc kubenswrapper[4696]: I1202 23:41:02.669254 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43ca0c70-2881-48d4-a890-a924f7c351db-utilities\") pod \"community-operators-729c4\" (UID: \"43ca0c70-2881-48d4-a890-a924f7c351db\") " pod="openshift-marketplace/community-operators-729c4" Dec 02 23:41:02 crc kubenswrapper[4696]: I1202 23:41:02.669332 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43ca0c70-2881-48d4-a890-a924f7c351db-catalog-content\") pod \"community-operators-729c4\" (UID: \"43ca0c70-2881-48d4-a890-a924f7c351db\") " pod="openshift-marketplace/community-operators-729c4" Dec 02 23:41:02 crc kubenswrapper[4696]: I1202 23:41:02.717537 4696 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-z8db2" Dec 02 23:41:02 crc kubenswrapper[4696]: I1202 23:41:02.771069 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2q5fb\" (UniqueName: \"kubernetes.io/projected/43ca0c70-2881-48d4-a890-a924f7c351db-kube-api-access-2q5fb\") pod \"community-operators-729c4\" (UID: \"43ca0c70-2881-48d4-a890-a924f7c351db\") " pod="openshift-marketplace/community-operators-729c4" Dec 02 23:41:02 crc kubenswrapper[4696]: I1202 23:41:02.771165 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43ca0c70-2881-48d4-a890-a924f7c351db-utilities\") pod \"community-operators-729c4\" (UID: \"43ca0c70-2881-48d4-a890-a924f7c351db\") " pod="openshift-marketplace/community-operators-729c4" Dec 02 23:41:02 crc kubenswrapper[4696]: I1202 23:41:02.771200 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43ca0c70-2881-48d4-a890-a924f7c351db-catalog-content\") pod \"community-operators-729c4\" (UID: \"43ca0c70-2881-48d4-a890-a924f7c351db\") " pod="openshift-marketplace/community-operators-729c4" Dec 02 23:41:02 crc kubenswrapper[4696]: I1202 23:41:02.771839 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43ca0c70-2881-48d4-a890-a924f7c351db-catalog-content\") pod \"community-operators-729c4\" (UID: \"43ca0c70-2881-48d4-a890-a924f7c351db\") " pod="openshift-marketplace/community-operators-729c4" Dec 02 23:41:02 crc kubenswrapper[4696]: I1202 23:41:02.772098 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43ca0c70-2881-48d4-a890-a924f7c351db-utilities\") pod \"community-operators-729c4\" (UID: \"43ca0c70-2881-48d4-a890-a924f7c351db\") " 
pod="openshift-marketplace/community-operators-729c4" Dec 02 23:41:02 crc kubenswrapper[4696]: I1202 23:41:02.792129 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2q5fb\" (UniqueName: \"kubernetes.io/projected/43ca0c70-2881-48d4-a890-a924f7c351db-kube-api-access-2q5fb\") pod \"community-operators-729c4\" (UID: \"43ca0c70-2881-48d4-a890-a924f7c351db\") " pod="openshift-marketplace/community-operators-729c4" Dec 02 23:41:02 crc kubenswrapper[4696]: I1202 23:41:02.946534 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-729c4" Dec 02 23:41:03 crc kubenswrapper[4696]: I1202 23:41:03.455303 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-z8db2"] Dec 02 23:41:03 crc kubenswrapper[4696]: I1202 23:41:03.680288 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-729c4"] Dec 02 23:41:03 crc kubenswrapper[4696]: W1202 23:41:03.681901 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43ca0c70_2881_48d4_a890_a924f7c351db.slice/crio-30ab4b2f5520647e8704cf041d7b968012dfc8657e7eaf220847489cb31e546a WatchSource:0}: Error finding container 30ab4b2f5520647e8704cf041d7b968012dfc8657e7eaf220847489cb31e546a: Status 404 returned error can't find the container with id 30ab4b2f5520647e8704cf041d7b968012dfc8657e7eaf220847489cb31e546a Dec 02 23:41:03 crc kubenswrapper[4696]: I1202 23:41:03.729356 4696 generic.go:334] "Generic (PLEG): container finished" podID="9ae9211d-1bef-4287-9e0e-22d6ce9fbf49" containerID="52bf33afe125307d806dd31dc65a6bb582aa268d2bc3428e633f6fe0069fd0bf" exitCode=0 Dec 02 23:41:03 crc kubenswrapper[4696]: I1202 23:41:03.729512 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z8db2" 
event={"ID":"9ae9211d-1bef-4287-9e0e-22d6ce9fbf49","Type":"ContainerDied","Data":"52bf33afe125307d806dd31dc65a6bb582aa268d2bc3428e633f6fe0069fd0bf"} Dec 02 23:41:03 crc kubenswrapper[4696]: I1202 23:41:03.729760 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z8db2" event={"ID":"9ae9211d-1bef-4287-9e0e-22d6ce9fbf49","Type":"ContainerStarted","Data":"b9cb396abfd6c96e60e083eb46df38fa250bd18939ea3fb482fd70305a8dd901"} Dec 02 23:41:03 crc kubenswrapper[4696]: I1202 23:41:03.734430 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-729c4" event={"ID":"43ca0c70-2881-48d4-a890-a924f7c351db","Type":"ContainerStarted","Data":"30ab4b2f5520647e8704cf041d7b968012dfc8657e7eaf220847489cb31e546a"} Dec 02 23:41:03 crc kubenswrapper[4696]: I1202 23:41:03.734590 4696 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 02 23:41:04 crc kubenswrapper[4696]: I1202 23:41:04.746377 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z8db2" event={"ID":"9ae9211d-1bef-4287-9e0e-22d6ce9fbf49","Type":"ContainerStarted","Data":"e0223dc5054bab7aaf097840dfbc690c74d79e133c7a55bacaf953c67c4b3776"} Dec 02 23:41:04 crc kubenswrapper[4696]: I1202 23:41:04.751202 4696 generic.go:334] "Generic (PLEG): container finished" podID="43ca0c70-2881-48d4-a890-a924f7c351db" containerID="e9df302ae37ce4fdc17a5d985e0dc111d9d4dc5a4122bbca5142715fdb044f30" exitCode=0 Dec 02 23:41:04 crc kubenswrapper[4696]: I1202 23:41:04.751267 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-729c4" event={"ID":"43ca0c70-2881-48d4-a890-a924f7c351db","Type":"ContainerDied","Data":"e9df302ae37ce4fdc17a5d985e0dc111d9d4dc5a4122bbca5142715fdb044f30"} Dec 02 23:41:05 crc kubenswrapper[4696]: I1202 23:41:05.763644 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-729c4" event={"ID":"43ca0c70-2881-48d4-a890-a924f7c351db","Type":"ContainerStarted","Data":"45b60597138ecc169f789bf58d1739acc76c23274a7158e1788abdd92eda5b03"} Dec 02 23:41:05 crc kubenswrapper[4696]: I1202 23:41:05.765583 4696 generic.go:334] "Generic (PLEG): container finished" podID="9ae9211d-1bef-4287-9e0e-22d6ce9fbf49" containerID="e0223dc5054bab7aaf097840dfbc690c74d79e133c7a55bacaf953c67c4b3776" exitCode=0 Dec 02 23:41:05 crc kubenswrapper[4696]: I1202 23:41:05.765636 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z8db2" event={"ID":"9ae9211d-1bef-4287-9e0e-22d6ce9fbf49","Type":"ContainerDied","Data":"e0223dc5054bab7aaf097840dfbc690c74d79e133c7a55bacaf953c67c4b3776"} Dec 02 23:41:06 crc kubenswrapper[4696]: I1202 23:41:06.778686 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z8db2" event={"ID":"9ae9211d-1bef-4287-9e0e-22d6ce9fbf49","Type":"ContainerStarted","Data":"9e2a5ad0d1b8c8d3b1c28a28854978acbc3f9e5934b6953006b35a4d4061e07c"} Dec 02 23:41:06 crc kubenswrapper[4696]: I1202 23:41:06.780488 4696 generic.go:334] "Generic (PLEG): container finished" podID="43ca0c70-2881-48d4-a890-a924f7c351db" containerID="45b60597138ecc169f789bf58d1739acc76c23274a7158e1788abdd92eda5b03" exitCode=0 Dec 02 23:41:06 crc kubenswrapper[4696]: I1202 23:41:06.780540 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-729c4" event={"ID":"43ca0c70-2881-48d4-a890-a924f7c351db","Type":"ContainerDied","Data":"45b60597138ecc169f789bf58d1739acc76c23274a7158e1788abdd92eda5b03"} Dec 02 23:41:06 crc kubenswrapper[4696]: I1202 23:41:06.802547 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-z8db2" podStartSLOduration=2.188652946 podStartE2EDuration="4.802524455s" podCreationTimestamp="2025-12-02 23:41:02 +0000 UTC" 
firstStartedPulling="2025-12-02 23:41:03.732705591 +0000 UTC m=+3526.613385592" lastFinishedPulling="2025-12-02 23:41:06.34657708 +0000 UTC m=+3529.227257101" observedRunningTime="2025-12-02 23:41:06.800391174 +0000 UTC m=+3529.681071175" watchObservedRunningTime="2025-12-02 23:41:06.802524455 +0000 UTC m=+3529.683204446" Dec 02 23:41:08 crc kubenswrapper[4696]: I1202 23:41:08.806435 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-729c4" event={"ID":"43ca0c70-2881-48d4-a890-a924f7c351db","Type":"ContainerStarted","Data":"0fc0338af40ed1a948cf9ca857d8e4f720cc716cef4dc3520eb45b3b127a9bcd"} Dec 02 23:41:08 crc kubenswrapper[4696]: I1202 23:41:08.852937 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-729c4" podStartSLOduration=3.6765931739999997 podStartE2EDuration="6.852907379s" podCreationTimestamp="2025-12-02 23:41:02 +0000 UTC" firstStartedPulling="2025-12-02 23:41:04.755765103 +0000 UTC m=+3527.636445104" lastFinishedPulling="2025-12-02 23:41:07.932079308 +0000 UTC m=+3530.812759309" observedRunningTime="2025-12-02 23:41:08.833054628 +0000 UTC m=+3531.713734669" watchObservedRunningTime="2025-12-02 23:41:08.852907379 +0000 UTC m=+3531.733587400" Dec 02 23:41:12 crc kubenswrapper[4696]: I1202 23:41:12.718726 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-z8db2" Dec 02 23:41:12 crc kubenswrapper[4696]: I1202 23:41:12.719694 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-z8db2" Dec 02 23:41:12 crc kubenswrapper[4696]: I1202 23:41:12.787185 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-z8db2" Dec 02 23:41:12 crc kubenswrapper[4696]: I1202 23:41:12.930199 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/certified-operators-z8db2" Dec 02 23:41:12 crc kubenswrapper[4696]: I1202 23:41:12.946809 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-729c4" Dec 02 23:41:12 crc kubenswrapper[4696]: I1202 23:41:12.946901 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-729c4" Dec 02 23:41:13 crc kubenswrapper[4696]: I1202 23:41:13.037094 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-729c4" Dec 02 23:41:13 crc kubenswrapper[4696]: I1202 23:41:13.912495 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-729c4" Dec 02 23:41:14 crc kubenswrapper[4696]: I1202 23:41:14.372494 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-z8db2"] Dec 02 23:41:14 crc kubenswrapper[4696]: I1202 23:41:14.870567 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-z8db2" podUID="9ae9211d-1bef-4287-9e0e-22d6ce9fbf49" containerName="registry-server" containerID="cri-o://9e2a5ad0d1b8c8d3b1c28a28854978acbc3f9e5934b6953006b35a4d4061e07c" gracePeriod=2 Dec 02 23:41:15 crc kubenswrapper[4696]: I1202 23:41:15.369263 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-729c4"] Dec 02 23:41:15 crc kubenswrapper[4696]: I1202 23:41:15.444013 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-z8db2" Dec 02 23:41:15 crc kubenswrapper[4696]: I1202 23:41:15.614694 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ae9211d-1bef-4287-9e0e-22d6ce9fbf49-catalog-content\") pod \"9ae9211d-1bef-4287-9e0e-22d6ce9fbf49\" (UID: \"9ae9211d-1bef-4287-9e0e-22d6ce9fbf49\") " Dec 02 23:41:15 crc kubenswrapper[4696]: I1202 23:41:15.614831 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ft4t\" (UniqueName: \"kubernetes.io/projected/9ae9211d-1bef-4287-9e0e-22d6ce9fbf49-kube-api-access-6ft4t\") pod \"9ae9211d-1bef-4287-9e0e-22d6ce9fbf49\" (UID: \"9ae9211d-1bef-4287-9e0e-22d6ce9fbf49\") " Dec 02 23:41:15 crc kubenswrapper[4696]: I1202 23:41:15.614928 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ae9211d-1bef-4287-9e0e-22d6ce9fbf49-utilities\") pod \"9ae9211d-1bef-4287-9e0e-22d6ce9fbf49\" (UID: \"9ae9211d-1bef-4287-9e0e-22d6ce9fbf49\") " Dec 02 23:41:15 crc kubenswrapper[4696]: I1202 23:41:15.615809 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ae9211d-1bef-4287-9e0e-22d6ce9fbf49-utilities" (OuterVolumeSpecName: "utilities") pod "9ae9211d-1bef-4287-9e0e-22d6ce9fbf49" (UID: "9ae9211d-1bef-4287-9e0e-22d6ce9fbf49"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:41:15 crc kubenswrapper[4696]: I1202 23:41:15.626015 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ae9211d-1bef-4287-9e0e-22d6ce9fbf49-kube-api-access-6ft4t" (OuterVolumeSpecName: "kube-api-access-6ft4t") pod "9ae9211d-1bef-4287-9e0e-22d6ce9fbf49" (UID: "9ae9211d-1bef-4287-9e0e-22d6ce9fbf49"). InnerVolumeSpecName "kube-api-access-6ft4t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:41:15 crc kubenswrapper[4696]: I1202 23:41:15.673384 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ae9211d-1bef-4287-9e0e-22d6ce9fbf49-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9ae9211d-1bef-4287-9e0e-22d6ce9fbf49" (UID: "9ae9211d-1bef-4287-9e0e-22d6ce9fbf49"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:41:15 crc kubenswrapper[4696]: I1202 23:41:15.717939 4696 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ae9211d-1bef-4287-9e0e-22d6ce9fbf49-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 23:41:15 crc kubenswrapper[4696]: I1202 23:41:15.717983 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ft4t\" (UniqueName: \"kubernetes.io/projected/9ae9211d-1bef-4287-9e0e-22d6ce9fbf49-kube-api-access-6ft4t\") on node \"crc\" DevicePath \"\"" Dec 02 23:41:15 crc kubenswrapper[4696]: I1202 23:41:15.718000 4696 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ae9211d-1bef-4287-9e0e-22d6ce9fbf49-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 23:41:15 crc kubenswrapper[4696]: I1202 23:41:15.886118 4696 generic.go:334] "Generic (PLEG): container finished" podID="9ae9211d-1bef-4287-9e0e-22d6ce9fbf49" containerID="9e2a5ad0d1b8c8d3b1c28a28854978acbc3f9e5934b6953006b35a4d4061e07c" exitCode=0 Dec 02 23:41:15 crc kubenswrapper[4696]: I1202 23:41:15.886191 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-z8db2" Dec 02 23:41:15 crc kubenswrapper[4696]: I1202 23:41:15.886200 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z8db2" event={"ID":"9ae9211d-1bef-4287-9e0e-22d6ce9fbf49","Type":"ContainerDied","Data":"9e2a5ad0d1b8c8d3b1c28a28854978acbc3f9e5934b6953006b35a4d4061e07c"} Dec 02 23:41:15 crc kubenswrapper[4696]: I1202 23:41:15.886266 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z8db2" event={"ID":"9ae9211d-1bef-4287-9e0e-22d6ce9fbf49","Type":"ContainerDied","Data":"b9cb396abfd6c96e60e083eb46df38fa250bd18939ea3fb482fd70305a8dd901"} Dec 02 23:41:15 crc kubenswrapper[4696]: I1202 23:41:15.886301 4696 scope.go:117] "RemoveContainer" containerID="9e2a5ad0d1b8c8d3b1c28a28854978acbc3f9e5934b6953006b35a4d4061e07c" Dec 02 23:41:15 crc kubenswrapper[4696]: I1202 23:41:15.886933 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-729c4" podUID="43ca0c70-2881-48d4-a890-a924f7c351db" containerName="registry-server" containerID="cri-o://0fc0338af40ed1a948cf9ca857d8e4f720cc716cef4dc3520eb45b3b127a9bcd" gracePeriod=2 Dec 02 23:41:15 crc kubenswrapper[4696]: I1202 23:41:15.919084 4696 scope.go:117] "RemoveContainer" containerID="e0223dc5054bab7aaf097840dfbc690c74d79e133c7a55bacaf953c67c4b3776" Dec 02 23:41:15 crc kubenswrapper[4696]: I1202 23:41:15.930620 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-z8db2"] Dec 02 23:41:15 crc kubenswrapper[4696]: I1202 23:41:15.942648 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-z8db2"] Dec 02 23:41:15 crc kubenswrapper[4696]: I1202 23:41:15.952106 4696 scope.go:117] "RemoveContainer" containerID="52bf33afe125307d806dd31dc65a6bb582aa268d2bc3428e633f6fe0069fd0bf" Dec 02 23:41:16 crc 
kubenswrapper[4696]: I1202 23:41:16.144543 4696 scope.go:117] "RemoveContainer" containerID="9e2a5ad0d1b8c8d3b1c28a28854978acbc3f9e5934b6953006b35a4d4061e07c" Dec 02 23:41:16 crc kubenswrapper[4696]: E1202 23:41:16.145766 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e2a5ad0d1b8c8d3b1c28a28854978acbc3f9e5934b6953006b35a4d4061e07c\": container with ID starting with 9e2a5ad0d1b8c8d3b1c28a28854978acbc3f9e5934b6953006b35a4d4061e07c not found: ID does not exist" containerID="9e2a5ad0d1b8c8d3b1c28a28854978acbc3f9e5934b6953006b35a4d4061e07c" Dec 02 23:41:16 crc kubenswrapper[4696]: I1202 23:41:16.145839 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e2a5ad0d1b8c8d3b1c28a28854978acbc3f9e5934b6953006b35a4d4061e07c"} err="failed to get container status \"9e2a5ad0d1b8c8d3b1c28a28854978acbc3f9e5934b6953006b35a4d4061e07c\": rpc error: code = NotFound desc = could not find container \"9e2a5ad0d1b8c8d3b1c28a28854978acbc3f9e5934b6953006b35a4d4061e07c\": container with ID starting with 9e2a5ad0d1b8c8d3b1c28a28854978acbc3f9e5934b6953006b35a4d4061e07c not found: ID does not exist" Dec 02 23:41:16 crc kubenswrapper[4696]: I1202 23:41:16.145881 4696 scope.go:117] "RemoveContainer" containerID="e0223dc5054bab7aaf097840dfbc690c74d79e133c7a55bacaf953c67c4b3776" Dec 02 23:41:16 crc kubenswrapper[4696]: E1202 23:41:16.146253 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0223dc5054bab7aaf097840dfbc690c74d79e133c7a55bacaf953c67c4b3776\": container with ID starting with e0223dc5054bab7aaf097840dfbc690c74d79e133c7a55bacaf953c67c4b3776 not found: ID does not exist" containerID="e0223dc5054bab7aaf097840dfbc690c74d79e133c7a55bacaf953c67c4b3776" Dec 02 23:41:16 crc kubenswrapper[4696]: I1202 23:41:16.146289 4696 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e0223dc5054bab7aaf097840dfbc690c74d79e133c7a55bacaf953c67c4b3776"} err="failed to get container status \"e0223dc5054bab7aaf097840dfbc690c74d79e133c7a55bacaf953c67c4b3776\": rpc error: code = NotFound desc = could not find container \"e0223dc5054bab7aaf097840dfbc690c74d79e133c7a55bacaf953c67c4b3776\": container with ID starting with e0223dc5054bab7aaf097840dfbc690c74d79e133c7a55bacaf953c67c4b3776 not found: ID does not exist" Dec 02 23:41:16 crc kubenswrapper[4696]: I1202 23:41:16.146313 4696 scope.go:117] "RemoveContainer" containerID="52bf33afe125307d806dd31dc65a6bb582aa268d2bc3428e633f6fe0069fd0bf" Dec 02 23:41:16 crc kubenswrapper[4696]: E1202 23:41:16.146584 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52bf33afe125307d806dd31dc65a6bb582aa268d2bc3428e633f6fe0069fd0bf\": container with ID starting with 52bf33afe125307d806dd31dc65a6bb582aa268d2bc3428e633f6fe0069fd0bf not found: ID does not exist" containerID="52bf33afe125307d806dd31dc65a6bb582aa268d2bc3428e633f6fe0069fd0bf" Dec 02 23:41:16 crc kubenswrapper[4696]: I1202 23:41:16.146636 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52bf33afe125307d806dd31dc65a6bb582aa268d2bc3428e633f6fe0069fd0bf"} err="failed to get container status \"52bf33afe125307d806dd31dc65a6bb582aa268d2bc3428e633f6fe0069fd0bf\": rpc error: code = NotFound desc = could not find container \"52bf33afe125307d806dd31dc65a6bb582aa268d2bc3428e633f6fe0069fd0bf\": container with ID starting with 52bf33afe125307d806dd31dc65a6bb582aa268d2bc3428e633f6fe0069fd0bf not found: ID does not exist" Dec 02 23:41:16 crc kubenswrapper[4696]: I1202 23:41:16.402672 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-729c4" Dec 02 23:41:16 crc kubenswrapper[4696]: I1202 23:41:16.535714 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2q5fb\" (UniqueName: \"kubernetes.io/projected/43ca0c70-2881-48d4-a890-a924f7c351db-kube-api-access-2q5fb\") pod \"43ca0c70-2881-48d4-a890-a924f7c351db\" (UID: \"43ca0c70-2881-48d4-a890-a924f7c351db\") " Dec 02 23:41:16 crc kubenswrapper[4696]: I1202 23:41:16.535920 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43ca0c70-2881-48d4-a890-a924f7c351db-catalog-content\") pod \"43ca0c70-2881-48d4-a890-a924f7c351db\" (UID: \"43ca0c70-2881-48d4-a890-a924f7c351db\") " Dec 02 23:41:16 crc kubenswrapper[4696]: I1202 23:41:16.536221 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43ca0c70-2881-48d4-a890-a924f7c351db-utilities\") pod \"43ca0c70-2881-48d4-a890-a924f7c351db\" (UID: \"43ca0c70-2881-48d4-a890-a924f7c351db\") " Dec 02 23:41:16 crc kubenswrapper[4696]: I1202 23:41:16.540163 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43ca0c70-2881-48d4-a890-a924f7c351db-utilities" (OuterVolumeSpecName: "utilities") pod "43ca0c70-2881-48d4-a890-a924f7c351db" (UID: "43ca0c70-2881-48d4-a890-a924f7c351db"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:41:16 crc kubenswrapper[4696]: I1202 23:41:16.545111 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43ca0c70-2881-48d4-a890-a924f7c351db-kube-api-access-2q5fb" (OuterVolumeSpecName: "kube-api-access-2q5fb") pod "43ca0c70-2881-48d4-a890-a924f7c351db" (UID: "43ca0c70-2881-48d4-a890-a924f7c351db"). InnerVolumeSpecName "kube-api-access-2q5fb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:41:16 crc kubenswrapper[4696]: I1202 23:41:16.596085 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43ca0c70-2881-48d4-a890-a924f7c351db-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "43ca0c70-2881-48d4-a890-a924f7c351db" (UID: "43ca0c70-2881-48d4-a890-a924f7c351db"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:41:16 crc kubenswrapper[4696]: I1202 23:41:16.638484 4696 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43ca0c70-2881-48d4-a890-a924f7c351db-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 23:41:16 crc kubenswrapper[4696]: I1202 23:41:16.638526 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2q5fb\" (UniqueName: \"kubernetes.io/projected/43ca0c70-2881-48d4-a890-a924f7c351db-kube-api-access-2q5fb\") on node \"crc\" DevicePath \"\"" Dec 02 23:41:16 crc kubenswrapper[4696]: I1202 23:41:16.638539 4696 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43ca0c70-2881-48d4-a890-a924f7c351db-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 23:41:16 crc kubenswrapper[4696]: I1202 23:41:16.900133 4696 generic.go:334] "Generic (PLEG): container finished" podID="43ca0c70-2881-48d4-a890-a924f7c351db" containerID="0fc0338af40ed1a948cf9ca857d8e4f720cc716cef4dc3520eb45b3b127a9bcd" exitCode=0 Dec 02 23:41:16 crc kubenswrapper[4696]: I1202 23:41:16.900243 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-729c4" Dec 02 23:41:16 crc kubenswrapper[4696]: I1202 23:41:16.900271 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-729c4" event={"ID":"43ca0c70-2881-48d4-a890-a924f7c351db","Type":"ContainerDied","Data":"0fc0338af40ed1a948cf9ca857d8e4f720cc716cef4dc3520eb45b3b127a9bcd"} Dec 02 23:41:16 crc kubenswrapper[4696]: I1202 23:41:16.902252 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-729c4" event={"ID":"43ca0c70-2881-48d4-a890-a924f7c351db","Type":"ContainerDied","Data":"30ab4b2f5520647e8704cf041d7b968012dfc8657e7eaf220847489cb31e546a"} Dec 02 23:41:16 crc kubenswrapper[4696]: I1202 23:41:16.902276 4696 scope.go:117] "RemoveContainer" containerID="0fc0338af40ed1a948cf9ca857d8e4f720cc716cef4dc3520eb45b3b127a9bcd" Dec 02 23:41:16 crc kubenswrapper[4696]: I1202 23:41:16.931462 4696 scope.go:117] "RemoveContainer" containerID="45b60597138ecc169f789bf58d1739acc76c23274a7158e1788abdd92eda5b03" Dec 02 23:41:16 crc kubenswrapper[4696]: I1202 23:41:16.958775 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-729c4"] Dec 02 23:41:16 crc kubenswrapper[4696]: I1202 23:41:16.973727 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-729c4"] Dec 02 23:41:16 crc kubenswrapper[4696]: I1202 23:41:16.992601 4696 scope.go:117] "RemoveContainer" containerID="e9df302ae37ce4fdc17a5d985e0dc111d9d4dc5a4122bbca5142715fdb044f30" Dec 02 23:41:17 crc kubenswrapper[4696]: I1202 23:41:17.029548 4696 scope.go:117] "RemoveContainer" containerID="0fc0338af40ed1a948cf9ca857d8e4f720cc716cef4dc3520eb45b3b127a9bcd" Dec 02 23:41:17 crc kubenswrapper[4696]: E1202 23:41:17.030123 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"0fc0338af40ed1a948cf9ca857d8e4f720cc716cef4dc3520eb45b3b127a9bcd\": container with ID starting with 0fc0338af40ed1a948cf9ca857d8e4f720cc716cef4dc3520eb45b3b127a9bcd not found: ID does not exist" containerID="0fc0338af40ed1a948cf9ca857d8e4f720cc716cef4dc3520eb45b3b127a9bcd" Dec 02 23:41:17 crc kubenswrapper[4696]: I1202 23:41:17.030198 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0fc0338af40ed1a948cf9ca857d8e4f720cc716cef4dc3520eb45b3b127a9bcd"} err="failed to get container status \"0fc0338af40ed1a948cf9ca857d8e4f720cc716cef4dc3520eb45b3b127a9bcd\": rpc error: code = NotFound desc = could not find container \"0fc0338af40ed1a948cf9ca857d8e4f720cc716cef4dc3520eb45b3b127a9bcd\": container with ID starting with 0fc0338af40ed1a948cf9ca857d8e4f720cc716cef4dc3520eb45b3b127a9bcd not found: ID does not exist" Dec 02 23:41:17 crc kubenswrapper[4696]: I1202 23:41:17.030248 4696 scope.go:117] "RemoveContainer" containerID="45b60597138ecc169f789bf58d1739acc76c23274a7158e1788abdd92eda5b03" Dec 02 23:41:17 crc kubenswrapper[4696]: E1202 23:41:17.033208 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45b60597138ecc169f789bf58d1739acc76c23274a7158e1788abdd92eda5b03\": container with ID starting with 45b60597138ecc169f789bf58d1739acc76c23274a7158e1788abdd92eda5b03 not found: ID does not exist" containerID="45b60597138ecc169f789bf58d1739acc76c23274a7158e1788abdd92eda5b03" Dec 02 23:41:17 crc kubenswrapper[4696]: I1202 23:41:17.033261 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45b60597138ecc169f789bf58d1739acc76c23274a7158e1788abdd92eda5b03"} err="failed to get container status \"45b60597138ecc169f789bf58d1739acc76c23274a7158e1788abdd92eda5b03\": rpc error: code = NotFound desc = could not find container \"45b60597138ecc169f789bf58d1739acc76c23274a7158e1788abdd92eda5b03\": container with ID 
starting with 45b60597138ecc169f789bf58d1739acc76c23274a7158e1788abdd92eda5b03 not found: ID does not exist" Dec 02 23:41:17 crc kubenswrapper[4696]: I1202 23:41:17.033292 4696 scope.go:117] "RemoveContainer" containerID="e9df302ae37ce4fdc17a5d985e0dc111d9d4dc5a4122bbca5142715fdb044f30" Dec 02 23:41:17 crc kubenswrapper[4696]: E1202 23:41:17.033837 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9df302ae37ce4fdc17a5d985e0dc111d9d4dc5a4122bbca5142715fdb044f30\": container with ID starting with e9df302ae37ce4fdc17a5d985e0dc111d9d4dc5a4122bbca5142715fdb044f30 not found: ID does not exist" containerID="e9df302ae37ce4fdc17a5d985e0dc111d9d4dc5a4122bbca5142715fdb044f30" Dec 02 23:41:17 crc kubenswrapper[4696]: I1202 23:41:17.034007 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9df302ae37ce4fdc17a5d985e0dc111d9d4dc5a4122bbca5142715fdb044f30"} err="failed to get container status \"e9df302ae37ce4fdc17a5d985e0dc111d9d4dc5a4122bbca5142715fdb044f30\": rpc error: code = NotFound desc = could not find container \"e9df302ae37ce4fdc17a5d985e0dc111d9d4dc5a4122bbca5142715fdb044f30\": container with ID starting with e9df302ae37ce4fdc17a5d985e0dc111d9d4dc5a4122bbca5142715fdb044f30 not found: ID does not exist" Dec 02 23:41:17 crc kubenswrapper[4696]: I1202 23:41:17.446240 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43ca0c70-2881-48d4-a890-a924f7c351db" path="/var/lib/kubelet/pods/43ca0c70-2881-48d4-a890-a924f7c351db/volumes" Dec 02 23:41:17 crc kubenswrapper[4696]: I1202 23:41:17.447383 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ae9211d-1bef-4287-9e0e-22d6ce9fbf49" path="/var/lib/kubelet/pods/9ae9211d-1bef-4287-9e0e-22d6ce9fbf49/volumes" Dec 02 23:41:17 crc kubenswrapper[4696]: I1202 23:41:17.573718 4696 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-operators-6bzln"] Dec 02 23:41:17 crc kubenswrapper[4696]: E1202 23:41:17.574623 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43ca0c70-2881-48d4-a890-a924f7c351db" containerName="extract-utilities" Dec 02 23:41:17 crc kubenswrapper[4696]: I1202 23:41:17.574643 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="43ca0c70-2881-48d4-a890-a924f7c351db" containerName="extract-utilities" Dec 02 23:41:17 crc kubenswrapper[4696]: E1202 23:41:17.574669 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ae9211d-1bef-4287-9e0e-22d6ce9fbf49" containerName="extract-content" Dec 02 23:41:17 crc kubenswrapper[4696]: I1202 23:41:17.574678 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ae9211d-1bef-4287-9e0e-22d6ce9fbf49" containerName="extract-content" Dec 02 23:41:17 crc kubenswrapper[4696]: E1202 23:41:17.574686 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ae9211d-1bef-4287-9e0e-22d6ce9fbf49" containerName="registry-server" Dec 02 23:41:17 crc kubenswrapper[4696]: I1202 23:41:17.574694 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ae9211d-1bef-4287-9e0e-22d6ce9fbf49" containerName="registry-server" Dec 02 23:41:17 crc kubenswrapper[4696]: E1202 23:41:17.574712 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ae9211d-1bef-4287-9e0e-22d6ce9fbf49" containerName="extract-utilities" Dec 02 23:41:17 crc kubenswrapper[4696]: I1202 23:41:17.574718 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ae9211d-1bef-4287-9e0e-22d6ce9fbf49" containerName="extract-utilities" Dec 02 23:41:17 crc kubenswrapper[4696]: E1202 23:41:17.574726 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43ca0c70-2881-48d4-a890-a924f7c351db" containerName="extract-content" Dec 02 23:41:17 crc kubenswrapper[4696]: I1202 23:41:17.574732 4696 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="43ca0c70-2881-48d4-a890-a924f7c351db" containerName="extract-content" Dec 02 23:41:17 crc kubenswrapper[4696]: E1202 23:41:17.574772 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43ca0c70-2881-48d4-a890-a924f7c351db" containerName="registry-server" Dec 02 23:41:17 crc kubenswrapper[4696]: I1202 23:41:17.574777 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="43ca0c70-2881-48d4-a890-a924f7c351db" containerName="registry-server" Dec 02 23:41:17 crc kubenswrapper[4696]: I1202 23:41:17.574993 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="43ca0c70-2881-48d4-a890-a924f7c351db" containerName="registry-server" Dec 02 23:41:17 crc kubenswrapper[4696]: I1202 23:41:17.575009 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ae9211d-1bef-4287-9e0e-22d6ce9fbf49" containerName="registry-server" Dec 02 23:41:17 crc kubenswrapper[4696]: I1202 23:41:17.576634 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6bzln" Dec 02 23:41:17 crc kubenswrapper[4696]: I1202 23:41:17.590895 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6bzln"] Dec 02 23:41:17 crc kubenswrapper[4696]: I1202 23:41:17.663946 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/370e7916-b443-4b3f-82b3-8978f37793b9-utilities\") pod \"redhat-operators-6bzln\" (UID: \"370e7916-b443-4b3f-82b3-8978f37793b9\") " pod="openshift-marketplace/redhat-operators-6bzln" Dec 02 23:41:17 crc kubenswrapper[4696]: I1202 23:41:17.664166 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/370e7916-b443-4b3f-82b3-8978f37793b9-catalog-content\") pod \"redhat-operators-6bzln\" (UID: \"370e7916-b443-4b3f-82b3-8978f37793b9\") " 
pod="openshift-marketplace/redhat-operators-6bzln" Dec 02 23:41:17 crc kubenswrapper[4696]: I1202 23:41:17.664480 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wdwp\" (UniqueName: \"kubernetes.io/projected/370e7916-b443-4b3f-82b3-8978f37793b9-kube-api-access-5wdwp\") pod \"redhat-operators-6bzln\" (UID: \"370e7916-b443-4b3f-82b3-8978f37793b9\") " pod="openshift-marketplace/redhat-operators-6bzln" Dec 02 23:41:17 crc kubenswrapper[4696]: I1202 23:41:17.767356 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5wdwp\" (UniqueName: \"kubernetes.io/projected/370e7916-b443-4b3f-82b3-8978f37793b9-kube-api-access-5wdwp\") pod \"redhat-operators-6bzln\" (UID: \"370e7916-b443-4b3f-82b3-8978f37793b9\") " pod="openshift-marketplace/redhat-operators-6bzln" Dec 02 23:41:17 crc kubenswrapper[4696]: I1202 23:41:17.767515 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/370e7916-b443-4b3f-82b3-8978f37793b9-utilities\") pod \"redhat-operators-6bzln\" (UID: \"370e7916-b443-4b3f-82b3-8978f37793b9\") " pod="openshift-marketplace/redhat-operators-6bzln" Dec 02 23:41:17 crc kubenswrapper[4696]: I1202 23:41:17.767600 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/370e7916-b443-4b3f-82b3-8978f37793b9-catalog-content\") pod \"redhat-operators-6bzln\" (UID: \"370e7916-b443-4b3f-82b3-8978f37793b9\") " pod="openshift-marketplace/redhat-operators-6bzln" Dec 02 23:41:17 crc kubenswrapper[4696]: I1202 23:41:17.768363 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/370e7916-b443-4b3f-82b3-8978f37793b9-catalog-content\") pod \"redhat-operators-6bzln\" (UID: \"370e7916-b443-4b3f-82b3-8978f37793b9\") " 
pod="openshift-marketplace/redhat-operators-6bzln" Dec 02 23:41:17 crc kubenswrapper[4696]: I1202 23:41:17.768462 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/370e7916-b443-4b3f-82b3-8978f37793b9-utilities\") pod \"redhat-operators-6bzln\" (UID: \"370e7916-b443-4b3f-82b3-8978f37793b9\") " pod="openshift-marketplace/redhat-operators-6bzln" Dec 02 23:41:17 crc kubenswrapper[4696]: I1202 23:41:17.800811 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wdwp\" (UniqueName: \"kubernetes.io/projected/370e7916-b443-4b3f-82b3-8978f37793b9-kube-api-access-5wdwp\") pod \"redhat-operators-6bzln\" (UID: \"370e7916-b443-4b3f-82b3-8978f37793b9\") " pod="openshift-marketplace/redhat-operators-6bzln" Dec 02 23:41:17 crc kubenswrapper[4696]: I1202 23:41:17.914420 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6bzln" Dec 02 23:41:18 crc kubenswrapper[4696]: I1202 23:41:18.424953 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6bzln"] Dec 02 23:41:18 crc kubenswrapper[4696]: I1202 23:41:18.928698 4696 generic.go:334] "Generic (PLEG): container finished" podID="370e7916-b443-4b3f-82b3-8978f37793b9" containerID="ef0cd926ec1284390517ad312b991689cd98c1e81790e60d99e512223ffd7b5f" exitCode=0 Dec 02 23:41:18 crc kubenswrapper[4696]: I1202 23:41:18.928904 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6bzln" event={"ID":"370e7916-b443-4b3f-82b3-8978f37793b9","Type":"ContainerDied","Data":"ef0cd926ec1284390517ad312b991689cd98c1e81790e60d99e512223ffd7b5f"} Dec 02 23:41:18 crc kubenswrapper[4696]: I1202 23:41:18.929269 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6bzln" 
event={"ID":"370e7916-b443-4b3f-82b3-8978f37793b9","Type":"ContainerStarted","Data":"821ad6b8bfc82c6d99284a5acc48e1d6d9e305a14387cd97f50c8faf841ecb60"} Dec 02 23:41:19 crc kubenswrapper[4696]: I1202 23:41:19.963835 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6bzln" event={"ID":"370e7916-b443-4b3f-82b3-8978f37793b9","Type":"ContainerStarted","Data":"4301c86d9884bd3d590858d5ed1f4c724c2c1999632ea699c7d5d32be7fe696a"} Dec 02 23:41:23 crc kubenswrapper[4696]: I1202 23:41:23.003650 4696 generic.go:334] "Generic (PLEG): container finished" podID="370e7916-b443-4b3f-82b3-8978f37793b9" containerID="4301c86d9884bd3d590858d5ed1f4c724c2c1999632ea699c7d5d32be7fe696a" exitCode=0 Dec 02 23:41:23 crc kubenswrapper[4696]: I1202 23:41:23.003808 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6bzln" event={"ID":"370e7916-b443-4b3f-82b3-8978f37793b9","Type":"ContainerDied","Data":"4301c86d9884bd3d590858d5ed1f4c724c2c1999632ea699c7d5d32be7fe696a"} Dec 02 23:41:24 crc kubenswrapper[4696]: I1202 23:41:24.025196 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6bzln" event={"ID":"370e7916-b443-4b3f-82b3-8978f37793b9","Type":"ContainerStarted","Data":"315b2026beb7ff42b6635a7f3b69e48736af2ee4f50be5ecf692e0032a0c12e6"} Dec 02 23:41:24 crc kubenswrapper[4696]: I1202 23:41:24.054153 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-6bzln" podStartSLOduration=2.426443826 podStartE2EDuration="7.054131184s" podCreationTimestamp="2025-12-02 23:41:17 +0000 UTC" firstStartedPulling="2025-12-02 23:41:18.931158549 +0000 UTC m=+3541.811838550" lastFinishedPulling="2025-12-02 23:41:23.558845887 +0000 UTC m=+3546.439525908" observedRunningTime="2025-12-02 23:41:24.050601464 +0000 UTC m=+3546.931281475" watchObservedRunningTime="2025-12-02 23:41:24.054131184 +0000 UTC m=+3546.934811195" 
Dec 02 23:41:27 crc kubenswrapper[4696]: I1202 23:41:27.915132 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-6bzln" Dec 02 23:41:27 crc kubenswrapper[4696]: I1202 23:41:27.917298 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-6bzln" Dec 02 23:41:28 crc kubenswrapper[4696]: I1202 23:41:28.969160 4696 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-6bzln" podUID="370e7916-b443-4b3f-82b3-8978f37793b9" containerName="registry-server" probeResult="failure" output=< Dec 02 23:41:28 crc kubenswrapper[4696]: timeout: failed to connect service ":50051" within 1s Dec 02 23:41:28 crc kubenswrapper[4696]: > Dec 02 23:41:34 crc kubenswrapper[4696]: I1202 23:41:34.926544 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8q594"] Dec 02 23:41:34 crc kubenswrapper[4696]: I1202 23:41:34.933056 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8q594" Dec 02 23:41:34 crc kubenswrapper[4696]: I1202 23:41:34.946056 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8q594"] Dec 02 23:41:35 crc kubenswrapper[4696]: I1202 23:41:35.086132 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25733c6d-3c68-4243-902a-7cc102d26924-utilities\") pod \"redhat-marketplace-8q594\" (UID: \"25733c6d-3c68-4243-902a-7cc102d26924\") " pod="openshift-marketplace/redhat-marketplace-8q594" Dec 02 23:41:35 crc kubenswrapper[4696]: I1202 23:41:35.086182 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25733c6d-3c68-4243-902a-7cc102d26924-catalog-content\") pod \"redhat-marketplace-8q594\" (UID: \"25733c6d-3c68-4243-902a-7cc102d26924\") " pod="openshift-marketplace/redhat-marketplace-8q594" Dec 02 23:41:35 crc kubenswrapper[4696]: I1202 23:41:35.086495 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5z72w\" (UniqueName: \"kubernetes.io/projected/25733c6d-3c68-4243-902a-7cc102d26924-kube-api-access-5z72w\") pod \"redhat-marketplace-8q594\" (UID: \"25733c6d-3c68-4243-902a-7cc102d26924\") " pod="openshift-marketplace/redhat-marketplace-8q594" Dec 02 23:41:35 crc kubenswrapper[4696]: I1202 23:41:35.189622 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5z72w\" (UniqueName: \"kubernetes.io/projected/25733c6d-3c68-4243-902a-7cc102d26924-kube-api-access-5z72w\") pod \"redhat-marketplace-8q594\" (UID: \"25733c6d-3c68-4243-902a-7cc102d26924\") " pod="openshift-marketplace/redhat-marketplace-8q594" Dec 02 23:41:35 crc kubenswrapper[4696]: I1202 23:41:35.189737 4696 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25733c6d-3c68-4243-902a-7cc102d26924-utilities\") pod \"redhat-marketplace-8q594\" (UID: \"25733c6d-3c68-4243-902a-7cc102d26924\") " pod="openshift-marketplace/redhat-marketplace-8q594" Dec 02 23:41:35 crc kubenswrapper[4696]: I1202 23:41:35.189785 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25733c6d-3c68-4243-902a-7cc102d26924-catalog-content\") pod \"redhat-marketplace-8q594\" (UID: \"25733c6d-3c68-4243-902a-7cc102d26924\") " pod="openshift-marketplace/redhat-marketplace-8q594" Dec 02 23:41:35 crc kubenswrapper[4696]: I1202 23:41:35.190481 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25733c6d-3c68-4243-902a-7cc102d26924-catalog-content\") pod \"redhat-marketplace-8q594\" (UID: \"25733c6d-3c68-4243-902a-7cc102d26924\") " pod="openshift-marketplace/redhat-marketplace-8q594" Dec 02 23:41:35 crc kubenswrapper[4696]: I1202 23:41:35.190870 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25733c6d-3c68-4243-902a-7cc102d26924-utilities\") pod \"redhat-marketplace-8q594\" (UID: \"25733c6d-3c68-4243-902a-7cc102d26924\") " pod="openshift-marketplace/redhat-marketplace-8q594" Dec 02 23:41:35 crc kubenswrapper[4696]: I1202 23:41:35.232536 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5z72w\" (UniqueName: \"kubernetes.io/projected/25733c6d-3c68-4243-902a-7cc102d26924-kube-api-access-5z72w\") pod \"redhat-marketplace-8q594\" (UID: \"25733c6d-3c68-4243-902a-7cc102d26924\") " pod="openshift-marketplace/redhat-marketplace-8q594" Dec 02 23:41:35 crc kubenswrapper[4696]: I1202 23:41:35.270349 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8q594" Dec 02 23:41:35 crc kubenswrapper[4696]: W1202 23:41:35.891183 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25733c6d_3c68_4243_902a_7cc102d26924.slice/crio-53b905f4b1cbac45c9c04c7141b472dcc097df5bcaccf05b78199597da2572c4 WatchSource:0}: Error finding container 53b905f4b1cbac45c9c04c7141b472dcc097df5bcaccf05b78199597da2572c4: Status 404 returned error can't find the container with id 53b905f4b1cbac45c9c04c7141b472dcc097df5bcaccf05b78199597da2572c4 Dec 02 23:41:35 crc kubenswrapper[4696]: I1202 23:41:35.895852 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8q594"] Dec 02 23:41:36 crc kubenswrapper[4696]: I1202 23:41:36.171988 4696 generic.go:334] "Generic (PLEG): container finished" podID="25733c6d-3c68-4243-902a-7cc102d26924" containerID="4313779999f58ce5f34eb372adc4048bd92dc346d9dc30deb22d2c9e223b6fe5" exitCode=0 Dec 02 23:41:36 crc kubenswrapper[4696]: I1202 23:41:36.172107 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8q594" event={"ID":"25733c6d-3c68-4243-902a-7cc102d26924","Type":"ContainerDied","Data":"4313779999f58ce5f34eb372adc4048bd92dc346d9dc30deb22d2c9e223b6fe5"} Dec 02 23:41:36 crc kubenswrapper[4696]: I1202 23:41:36.172557 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8q594" event={"ID":"25733c6d-3c68-4243-902a-7cc102d26924","Type":"ContainerStarted","Data":"53b905f4b1cbac45c9c04c7141b472dcc097df5bcaccf05b78199597da2572c4"} Dec 02 23:41:37 crc kubenswrapper[4696]: I1202 23:41:37.195764 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8q594" 
event={"ID":"25733c6d-3c68-4243-902a-7cc102d26924","Type":"ContainerStarted","Data":"dfcacf34ed79fb20b1c58fe4678b96ed4a49a7c13a95c6b409ef707b020fb785"} Dec 02 23:41:37 crc kubenswrapper[4696]: I1202 23:41:37.998647 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-6bzln" Dec 02 23:41:38 crc kubenswrapper[4696]: I1202 23:41:38.062528 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-6bzln" Dec 02 23:41:38 crc kubenswrapper[4696]: I1202 23:41:38.208581 4696 generic.go:334] "Generic (PLEG): container finished" podID="25733c6d-3c68-4243-902a-7cc102d26924" containerID="dfcacf34ed79fb20b1c58fe4678b96ed4a49a7c13a95c6b409ef707b020fb785" exitCode=0 Dec 02 23:41:38 crc kubenswrapper[4696]: I1202 23:41:38.208692 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8q594" event={"ID":"25733c6d-3c68-4243-902a-7cc102d26924","Type":"ContainerDied","Data":"dfcacf34ed79fb20b1c58fe4678b96ed4a49a7c13a95c6b409ef707b020fb785"} Dec 02 23:41:39 crc kubenswrapper[4696]: I1202 23:41:39.222357 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8q594" event={"ID":"25733c6d-3c68-4243-902a-7cc102d26924","Type":"ContainerStarted","Data":"d90c58aee6842b2c7a89f25d9b2d73b6269c333439bda48c461dd413b9364cc4"} Dec 02 23:41:39 crc kubenswrapper[4696]: I1202 23:41:39.257794 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8q594" podStartSLOduration=2.740319114 podStartE2EDuration="5.257767397s" podCreationTimestamp="2025-12-02 23:41:34 +0000 UTC" firstStartedPulling="2025-12-02 23:41:36.177424666 +0000 UTC m=+3559.058104667" lastFinishedPulling="2025-12-02 23:41:38.694872949 +0000 UTC m=+3561.575552950" observedRunningTime="2025-12-02 23:41:39.247107156 +0000 UTC m=+3562.127787197" 
watchObservedRunningTime="2025-12-02 23:41:39.257767397 +0000 UTC m=+3562.138447398" Dec 02 23:41:45 crc kubenswrapper[4696]: I1202 23:41:45.272236 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8q594" Dec 02 23:41:45 crc kubenswrapper[4696]: I1202 23:41:45.273214 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8q594" Dec 02 23:41:45 crc kubenswrapper[4696]: I1202 23:41:45.346839 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8q594" Dec 02 23:41:45 crc kubenswrapper[4696]: I1202 23:41:45.406832 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-8q594" Dec 02 23:41:46 crc kubenswrapper[4696]: I1202 23:41:46.513124 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6bzln"] Dec 02 23:41:46 crc kubenswrapper[4696]: I1202 23:41:46.513975 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-6bzln" podUID="370e7916-b443-4b3f-82b3-8978f37793b9" containerName="registry-server" containerID="cri-o://315b2026beb7ff42b6635a7f3b69e48736af2ee4f50be5ecf692e0032a0c12e6" gracePeriod=2 Dec 02 23:41:47 crc kubenswrapper[4696]: I1202 23:41:47.038670 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6bzln" Dec 02 23:41:47 crc kubenswrapper[4696]: I1202 23:41:47.199493 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5wdwp\" (UniqueName: \"kubernetes.io/projected/370e7916-b443-4b3f-82b3-8978f37793b9-kube-api-access-5wdwp\") pod \"370e7916-b443-4b3f-82b3-8978f37793b9\" (UID: \"370e7916-b443-4b3f-82b3-8978f37793b9\") " Dec 02 23:41:47 crc kubenswrapper[4696]: I1202 23:41:47.199607 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/370e7916-b443-4b3f-82b3-8978f37793b9-catalog-content\") pod \"370e7916-b443-4b3f-82b3-8978f37793b9\" (UID: \"370e7916-b443-4b3f-82b3-8978f37793b9\") " Dec 02 23:41:47 crc kubenswrapper[4696]: I1202 23:41:47.200048 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/370e7916-b443-4b3f-82b3-8978f37793b9-utilities\") pod \"370e7916-b443-4b3f-82b3-8978f37793b9\" (UID: \"370e7916-b443-4b3f-82b3-8978f37793b9\") " Dec 02 23:41:47 crc kubenswrapper[4696]: I1202 23:41:47.206202 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/370e7916-b443-4b3f-82b3-8978f37793b9-utilities" (OuterVolumeSpecName: "utilities") pod "370e7916-b443-4b3f-82b3-8978f37793b9" (UID: "370e7916-b443-4b3f-82b3-8978f37793b9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:41:47 crc kubenswrapper[4696]: I1202 23:41:47.227360 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/370e7916-b443-4b3f-82b3-8978f37793b9-kube-api-access-5wdwp" (OuterVolumeSpecName: "kube-api-access-5wdwp") pod "370e7916-b443-4b3f-82b3-8978f37793b9" (UID: "370e7916-b443-4b3f-82b3-8978f37793b9"). InnerVolumeSpecName "kube-api-access-5wdwp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:41:47 crc kubenswrapper[4696]: I1202 23:41:47.305623 4696 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/370e7916-b443-4b3f-82b3-8978f37793b9-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 23:41:47 crc kubenswrapper[4696]: I1202 23:41:47.305673 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5wdwp\" (UniqueName: \"kubernetes.io/projected/370e7916-b443-4b3f-82b3-8978f37793b9-kube-api-access-5wdwp\") on node \"crc\" DevicePath \"\"" Dec 02 23:41:47 crc kubenswrapper[4696]: I1202 23:41:47.319715 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/370e7916-b443-4b3f-82b3-8978f37793b9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "370e7916-b443-4b3f-82b3-8978f37793b9" (UID: "370e7916-b443-4b3f-82b3-8978f37793b9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:41:47 crc kubenswrapper[4696]: I1202 23:41:47.323951 4696 generic.go:334] "Generic (PLEG): container finished" podID="370e7916-b443-4b3f-82b3-8978f37793b9" containerID="315b2026beb7ff42b6635a7f3b69e48736af2ee4f50be5ecf692e0032a0c12e6" exitCode=0 Dec 02 23:41:47 crc kubenswrapper[4696]: I1202 23:41:47.324012 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6bzln" event={"ID":"370e7916-b443-4b3f-82b3-8978f37793b9","Type":"ContainerDied","Data":"315b2026beb7ff42b6635a7f3b69e48736af2ee4f50be5ecf692e0032a0c12e6"} Dec 02 23:41:47 crc kubenswrapper[4696]: I1202 23:41:47.324052 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6bzln" event={"ID":"370e7916-b443-4b3f-82b3-8978f37793b9","Type":"ContainerDied","Data":"821ad6b8bfc82c6d99284a5acc48e1d6d9e305a14387cd97f50c8faf841ecb60"} Dec 02 23:41:47 crc kubenswrapper[4696]: I1202 23:41:47.324054 
4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6bzln" Dec 02 23:41:47 crc kubenswrapper[4696]: I1202 23:41:47.324075 4696 scope.go:117] "RemoveContainer" containerID="315b2026beb7ff42b6635a7f3b69e48736af2ee4f50be5ecf692e0032a0c12e6" Dec 02 23:41:47 crc kubenswrapper[4696]: I1202 23:41:47.350957 4696 scope.go:117] "RemoveContainer" containerID="4301c86d9884bd3d590858d5ed1f4c724c2c1999632ea699c7d5d32be7fe696a" Dec 02 23:41:47 crc kubenswrapper[4696]: I1202 23:41:47.365636 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6bzln"] Dec 02 23:41:47 crc kubenswrapper[4696]: I1202 23:41:47.382860 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-6bzln"] Dec 02 23:41:47 crc kubenswrapper[4696]: I1202 23:41:47.392709 4696 scope.go:117] "RemoveContainer" containerID="ef0cd926ec1284390517ad312b991689cd98c1e81790e60d99e512223ffd7b5f" Dec 02 23:41:47 crc kubenswrapper[4696]: I1202 23:41:47.408156 4696 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/370e7916-b443-4b3f-82b3-8978f37793b9-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 23:41:47 crc kubenswrapper[4696]: I1202 23:41:47.436834 4696 scope.go:117] "RemoveContainer" containerID="315b2026beb7ff42b6635a7f3b69e48736af2ee4f50be5ecf692e0032a0c12e6" Dec 02 23:41:47 crc kubenswrapper[4696]: E1202 23:41:47.438016 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"315b2026beb7ff42b6635a7f3b69e48736af2ee4f50be5ecf692e0032a0c12e6\": container with ID starting with 315b2026beb7ff42b6635a7f3b69e48736af2ee4f50be5ecf692e0032a0c12e6 not found: ID does not exist" containerID="315b2026beb7ff42b6635a7f3b69e48736af2ee4f50be5ecf692e0032a0c12e6" Dec 02 23:41:47 crc kubenswrapper[4696]: I1202 23:41:47.438106 4696 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"315b2026beb7ff42b6635a7f3b69e48736af2ee4f50be5ecf692e0032a0c12e6"} err="failed to get container status \"315b2026beb7ff42b6635a7f3b69e48736af2ee4f50be5ecf692e0032a0c12e6\": rpc error: code = NotFound desc = could not find container \"315b2026beb7ff42b6635a7f3b69e48736af2ee4f50be5ecf692e0032a0c12e6\": container with ID starting with 315b2026beb7ff42b6635a7f3b69e48736af2ee4f50be5ecf692e0032a0c12e6 not found: ID does not exist" Dec 02 23:41:47 crc kubenswrapper[4696]: I1202 23:41:47.438177 4696 scope.go:117] "RemoveContainer" containerID="4301c86d9884bd3d590858d5ed1f4c724c2c1999632ea699c7d5d32be7fe696a" Dec 02 23:41:47 crc kubenswrapper[4696]: E1202 23:41:47.438607 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4301c86d9884bd3d590858d5ed1f4c724c2c1999632ea699c7d5d32be7fe696a\": container with ID starting with 4301c86d9884bd3d590858d5ed1f4c724c2c1999632ea699c7d5d32be7fe696a not found: ID does not exist" containerID="4301c86d9884bd3d590858d5ed1f4c724c2c1999632ea699c7d5d32be7fe696a" Dec 02 23:41:47 crc kubenswrapper[4696]: I1202 23:41:47.438714 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4301c86d9884bd3d590858d5ed1f4c724c2c1999632ea699c7d5d32be7fe696a"} err="failed to get container status \"4301c86d9884bd3d590858d5ed1f4c724c2c1999632ea699c7d5d32be7fe696a\": rpc error: code = NotFound desc = could not find container \"4301c86d9884bd3d590858d5ed1f4c724c2c1999632ea699c7d5d32be7fe696a\": container with ID starting with 4301c86d9884bd3d590858d5ed1f4c724c2c1999632ea699c7d5d32be7fe696a not found: ID does not exist" Dec 02 23:41:47 crc kubenswrapper[4696]: I1202 23:41:47.438818 4696 scope.go:117] "RemoveContainer" containerID="ef0cd926ec1284390517ad312b991689cd98c1e81790e60d99e512223ffd7b5f" Dec 02 23:41:47 crc kubenswrapper[4696]: E1202 
23:41:47.439180 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef0cd926ec1284390517ad312b991689cd98c1e81790e60d99e512223ffd7b5f\": container with ID starting with ef0cd926ec1284390517ad312b991689cd98c1e81790e60d99e512223ffd7b5f not found: ID does not exist" containerID="ef0cd926ec1284390517ad312b991689cd98c1e81790e60d99e512223ffd7b5f" Dec 02 23:41:47 crc kubenswrapper[4696]: I1202 23:41:47.439265 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef0cd926ec1284390517ad312b991689cd98c1e81790e60d99e512223ffd7b5f"} err="failed to get container status \"ef0cd926ec1284390517ad312b991689cd98c1e81790e60d99e512223ffd7b5f\": rpc error: code = NotFound desc = could not find container \"ef0cd926ec1284390517ad312b991689cd98c1e81790e60d99e512223ffd7b5f\": container with ID starting with ef0cd926ec1284390517ad312b991689cd98c1e81790e60d99e512223ffd7b5f not found: ID does not exist" Dec 02 23:41:47 crc kubenswrapper[4696]: I1202 23:41:47.445914 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="370e7916-b443-4b3f-82b3-8978f37793b9" path="/var/lib/kubelet/pods/370e7916-b443-4b3f-82b3-8978f37793b9/volumes" Dec 02 23:41:49 crc kubenswrapper[4696]: I1202 23:41:49.706908 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8q594"] Dec 02 23:41:49 crc kubenswrapper[4696]: I1202 23:41:49.708005 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-8q594" podUID="25733c6d-3c68-4243-902a-7cc102d26924" containerName="registry-server" containerID="cri-o://d90c58aee6842b2c7a89f25d9b2d73b6269c333439bda48c461dd413b9364cc4" gracePeriod=2 Dec 02 23:41:50 crc kubenswrapper[4696]: I1202 23:41:50.255637 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8q594" Dec 02 23:41:50 crc kubenswrapper[4696]: I1202 23:41:50.364077 4696 generic.go:334] "Generic (PLEG): container finished" podID="25733c6d-3c68-4243-902a-7cc102d26924" containerID="d90c58aee6842b2c7a89f25d9b2d73b6269c333439bda48c461dd413b9364cc4" exitCode=0 Dec 02 23:41:50 crc kubenswrapper[4696]: I1202 23:41:50.364150 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8q594" event={"ID":"25733c6d-3c68-4243-902a-7cc102d26924","Type":"ContainerDied","Data":"d90c58aee6842b2c7a89f25d9b2d73b6269c333439bda48c461dd413b9364cc4"} Dec 02 23:41:50 crc kubenswrapper[4696]: I1202 23:41:50.364180 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8q594" Dec 02 23:41:50 crc kubenswrapper[4696]: I1202 23:41:50.364197 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8q594" event={"ID":"25733c6d-3c68-4243-902a-7cc102d26924","Type":"ContainerDied","Data":"53b905f4b1cbac45c9c04c7141b472dcc097df5bcaccf05b78199597da2572c4"} Dec 02 23:41:50 crc kubenswrapper[4696]: I1202 23:41:50.364228 4696 scope.go:117] "RemoveContainer" containerID="d90c58aee6842b2c7a89f25d9b2d73b6269c333439bda48c461dd413b9364cc4" Dec 02 23:41:50 crc kubenswrapper[4696]: I1202 23:41:50.392557 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5z72w\" (UniqueName: \"kubernetes.io/projected/25733c6d-3c68-4243-902a-7cc102d26924-kube-api-access-5z72w\") pod \"25733c6d-3c68-4243-902a-7cc102d26924\" (UID: \"25733c6d-3c68-4243-902a-7cc102d26924\") " Dec 02 23:41:50 crc kubenswrapper[4696]: I1202 23:41:50.392945 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25733c6d-3c68-4243-902a-7cc102d26924-catalog-content\") pod 
\"25733c6d-3c68-4243-902a-7cc102d26924\" (UID: \"25733c6d-3c68-4243-902a-7cc102d26924\") " Dec 02 23:41:50 crc kubenswrapper[4696]: I1202 23:41:50.393084 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25733c6d-3c68-4243-902a-7cc102d26924-utilities\") pod \"25733c6d-3c68-4243-902a-7cc102d26924\" (UID: \"25733c6d-3c68-4243-902a-7cc102d26924\") " Dec 02 23:41:50 crc kubenswrapper[4696]: I1202 23:41:50.398246 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25733c6d-3c68-4243-902a-7cc102d26924-utilities" (OuterVolumeSpecName: "utilities") pod "25733c6d-3c68-4243-902a-7cc102d26924" (UID: "25733c6d-3c68-4243-902a-7cc102d26924"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:41:50 crc kubenswrapper[4696]: I1202 23:41:50.401065 4696 scope.go:117] "RemoveContainer" containerID="dfcacf34ed79fb20b1c58fe4678b96ed4a49a7c13a95c6b409ef707b020fb785" Dec 02 23:41:50 crc kubenswrapper[4696]: I1202 23:41:50.404360 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25733c6d-3c68-4243-902a-7cc102d26924-kube-api-access-5z72w" (OuterVolumeSpecName: "kube-api-access-5z72w") pod "25733c6d-3c68-4243-902a-7cc102d26924" (UID: "25733c6d-3c68-4243-902a-7cc102d26924"). InnerVolumeSpecName "kube-api-access-5z72w". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:41:50 crc kubenswrapper[4696]: I1202 23:41:50.430499 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25733c6d-3c68-4243-902a-7cc102d26924-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "25733c6d-3c68-4243-902a-7cc102d26924" (UID: "25733c6d-3c68-4243-902a-7cc102d26924"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:41:50 crc kubenswrapper[4696]: I1202 23:41:50.471233 4696 scope.go:117] "RemoveContainer" containerID="4313779999f58ce5f34eb372adc4048bd92dc346d9dc30deb22d2c9e223b6fe5" Dec 02 23:41:50 crc kubenswrapper[4696]: I1202 23:41:50.497167 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5z72w\" (UniqueName: \"kubernetes.io/projected/25733c6d-3c68-4243-902a-7cc102d26924-kube-api-access-5z72w\") on node \"crc\" DevicePath \"\"" Dec 02 23:41:50 crc kubenswrapper[4696]: I1202 23:41:50.497224 4696 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25733c6d-3c68-4243-902a-7cc102d26924-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 23:41:50 crc kubenswrapper[4696]: I1202 23:41:50.497256 4696 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25733c6d-3c68-4243-902a-7cc102d26924-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 23:41:50 crc kubenswrapper[4696]: I1202 23:41:50.527870 4696 scope.go:117] "RemoveContainer" containerID="d90c58aee6842b2c7a89f25d9b2d73b6269c333439bda48c461dd413b9364cc4" Dec 02 23:41:50 crc kubenswrapper[4696]: E1202 23:41:50.528531 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d90c58aee6842b2c7a89f25d9b2d73b6269c333439bda48c461dd413b9364cc4\": container with ID starting with d90c58aee6842b2c7a89f25d9b2d73b6269c333439bda48c461dd413b9364cc4 not found: ID does not exist" containerID="d90c58aee6842b2c7a89f25d9b2d73b6269c333439bda48c461dd413b9364cc4" Dec 02 23:41:50 crc kubenswrapper[4696]: I1202 23:41:50.528607 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d90c58aee6842b2c7a89f25d9b2d73b6269c333439bda48c461dd413b9364cc4"} err="failed to get container status 
\"d90c58aee6842b2c7a89f25d9b2d73b6269c333439bda48c461dd413b9364cc4\": rpc error: code = NotFound desc = could not find container \"d90c58aee6842b2c7a89f25d9b2d73b6269c333439bda48c461dd413b9364cc4\": container with ID starting with d90c58aee6842b2c7a89f25d9b2d73b6269c333439bda48c461dd413b9364cc4 not found: ID does not exist" Dec 02 23:41:50 crc kubenswrapper[4696]: I1202 23:41:50.528655 4696 scope.go:117] "RemoveContainer" containerID="dfcacf34ed79fb20b1c58fe4678b96ed4a49a7c13a95c6b409ef707b020fb785" Dec 02 23:41:50 crc kubenswrapper[4696]: E1202 23:41:50.529290 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dfcacf34ed79fb20b1c58fe4678b96ed4a49a7c13a95c6b409ef707b020fb785\": container with ID starting with dfcacf34ed79fb20b1c58fe4678b96ed4a49a7c13a95c6b409ef707b020fb785 not found: ID does not exist" containerID="dfcacf34ed79fb20b1c58fe4678b96ed4a49a7c13a95c6b409ef707b020fb785" Dec 02 23:41:50 crc kubenswrapper[4696]: I1202 23:41:50.529334 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dfcacf34ed79fb20b1c58fe4678b96ed4a49a7c13a95c6b409ef707b020fb785"} err="failed to get container status \"dfcacf34ed79fb20b1c58fe4678b96ed4a49a7c13a95c6b409ef707b020fb785\": rpc error: code = NotFound desc = could not find container \"dfcacf34ed79fb20b1c58fe4678b96ed4a49a7c13a95c6b409ef707b020fb785\": container with ID starting with dfcacf34ed79fb20b1c58fe4678b96ed4a49a7c13a95c6b409ef707b020fb785 not found: ID does not exist" Dec 02 23:41:50 crc kubenswrapper[4696]: I1202 23:41:50.529350 4696 scope.go:117] "RemoveContainer" containerID="4313779999f58ce5f34eb372adc4048bd92dc346d9dc30deb22d2c9e223b6fe5" Dec 02 23:41:50 crc kubenswrapper[4696]: E1202 23:41:50.529791 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"4313779999f58ce5f34eb372adc4048bd92dc346d9dc30deb22d2c9e223b6fe5\": container with ID starting with 4313779999f58ce5f34eb372adc4048bd92dc346d9dc30deb22d2c9e223b6fe5 not found: ID does not exist" containerID="4313779999f58ce5f34eb372adc4048bd92dc346d9dc30deb22d2c9e223b6fe5" Dec 02 23:41:50 crc kubenswrapper[4696]: I1202 23:41:50.529847 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4313779999f58ce5f34eb372adc4048bd92dc346d9dc30deb22d2c9e223b6fe5"} err="failed to get container status \"4313779999f58ce5f34eb372adc4048bd92dc346d9dc30deb22d2c9e223b6fe5\": rpc error: code = NotFound desc = could not find container \"4313779999f58ce5f34eb372adc4048bd92dc346d9dc30deb22d2c9e223b6fe5\": container with ID starting with 4313779999f58ce5f34eb372adc4048bd92dc346d9dc30deb22d2c9e223b6fe5 not found: ID does not exist" Dec 02 23:41:50 crc kubenswrapper[4696]: I1202 23:41:50.702199 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8q594"] Dec 02 23:41:50 crc kubenswrapper[4696]: I1202 23:41:50.712825 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-8q594"] Dec 02 23:41:51 crc kubenswrapper[4696]: I1202 23:41:51.448935 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25733c6d-3c68-4243-902a-7cc102d26924" path="/var/lib/kubelet/pods/25733c6d-3c68-4243-902a-7cc102d26924/volumes" Dec 02 23:41:52 crc kubenswrapper[4696]: I1202 23:41:52.974274 4696 patch_prober.go:28] interesting pod/machine-config-daemon-chq65 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 23:41:52 crc kubenswrapper[4696]: I1202 23:41:52.974775 4696 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 23:42:22 crc kubenswrapper[4696]: I1202 23:42:22.974428 4696 patch_prober.go:28] interesting pod/machine-config-daemon-chq65 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 23:42:22 crc kubenswrapper[4696]: I1202 23:42:22.975471 4696 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 23:42:52 crc kubenswrapper[4696]: I1202 23:42:52.973687 4696 patch_prober.go:28] interesting pod/machine-config-daemon-chq65 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 23:42:52 crc kubenswrapper[4696]: I1202 23:42:52.974701 4696 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 23:42:52 crc kubenswrapper[4696]: I1202 23:42:52.974806 4696 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-chq65" Dec 02 23:42:52 crc 
kubenswrapper[4696]: I1202 23:42:52.976158 4696 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c3dadcb47bbdf8fa03bd4ae613ed1247b206e2b96f53e86e2dbee5fd3ba26e1b"} pod="openshift-machine-config-operator/machine-config-daemon-chq65" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 23:42:52 crc kubenswrapper[4696]: I1202 23:42:52.976276 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" containerName="machine-config-daemon" containerID="cri-o://c3dadcb47bbdf8fa03bd4ae613ed1247b206e2b96f53e86e2dbee5fd3ba26e1b" gracePeriod=600 Dec 02 23:42:53 crc kubenswrapper[4696]: I1202 23:42:53.268991 4696 generic.go:334] "Generic (PLEG): container finished" podID="53353260-c7c9-435c-91eb-3d5a1b441c4a" containerID="c3dadcb47bbdf8fa03bd4ae613ed1247b206e2b96f53e86e2dbee5fd3ba26e1b" exitCode=0 Dec 02 23:42:53 crc kubenswrapper[4696]: I1202 23:42:53.269083 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-chq65" event={"ID":"53353260-c7c9-435c-91eb-3d5a1b441c4a","Type":"ContainerDied","Data":"c3dadcb47bbdf8fa03bd4ae613ed1247b206e2b96f53e86e2dbee5fd3ba26e1b"} Dec 02 23:42:53 crc kubenswrapper[4696]: I1202 23:42:53.269470 4696 scope.go:117] "RemoveContainer" containerID="73eda48f4864032e0b69f81fb6a4cfe9fc2c55cd1b56397bf0fc4aa0a4b62168" Dec 02 23:42:54 crc kubenswrapper[4696]: I1202 23:42:54.284995 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-chq65" event={"ID":"53353260-c7c9-435c-91eb-3d5a1b441c4a","Type":"ContainerStarted","Data":"989f11838b2017b8bf102dc5cad0dbd9f0fd8376c5fc08f926460a91e9668011"} Dec 02 23:45:00 crc kubenswrapper[4696]: I1202 23:45:00.198772 4696 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411985-59q7x"] Dec 02 23:45:00 crc kubenswrapper[4696]: E1202 23:45:00.200389 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25733c6d-3c68-4243-902a-7cc102d26924" containerName="extract-content" Dec 02 23:45:00 crc kubenswrapper[4696]: I1202 23:45:00.200405 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="25733c6d-3c68-4243-902a-7cc102d26924" containerName="extract-content" Dec 02 23:45:00 crc kubenswrapper[4696]: E1202 23:45:00.200442 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25733c6d-3c68-4243-902a-7cc102d26924" containerName="registry-server" Dec 02 23:45:00 crc kubenswrapper[4696]: I1202 23:45:00.200449 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="25733c6d-3c68-4243-902a-7cc102d26924" containerName="registry-server" Dec 02 23:45:00 crc kubenswrapper[4696]: E1202 23:45:00.200474 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="370e7916-b443-4b3f-82b3-8978f37793b9" containerName="extract-content" Dec 02 23:45:00 crc kubenswrapper[4696]: I1202 23:45:00.200479 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="370e7916-b443-4b3f-82b3-8978f37793b9" containerName="extract-content" Dec 02 23:45:00 crc kubenswrapper[4696]: E1202 23:45:00.200493 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="370e7916-b443-4b3f-82b3-8978f37793b9" containerName="registry-server" Dec 02 23:45:00 crc kubenswrapper[4696]: I1202 23:45:00.200499 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="370e7916-b443-4b3f-82b3-8978f37793b9" containerName="registry-server" Dec 02 23:45:00 crc kubenswrapper[4696]: E1202 23:45:00.200525 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="370e7916-b443-4b3f-82b3-8978f37793b9" containerName="extract-utilities" Dec 02 23:45:00 crc kubenswrapper[4696]: I1202 23:45:00.200532 4696 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="370e7916-b443-4b3f-82b3-8978f37793b9" containerName="extract-utilities" Dec 02 23:45:00 crc kubenswrapper[4696]: E1202 23:45:00.200560 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25733c6d-3c68-4243-902a-7cc102d26924" containerName="extract-utilities" Dec 02 23:45:00 crc kubenswrapper[4696]: I1202 23:45:00.200567 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="25733c6d-3c68-4243-902a-7cc102d26924" containerName="extract-utilities" Dec 02 23:45:00 crc kubenswrapper[4696]: I1202 23:45:00.201002 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="25733c6d-3c68-4243-902a-7cc102d26924" containerName="registry-server" Dec 02 23:45:00 crc kubenswrapper[4696]: I1202 23:45:00.201032 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="370e7916-b443-4b3f-82b3-8978f37793b9" containerName="registry-server" Dec 02 23:45:00 crc kubenswrapper[4696]: I1202 23:45:00.202138 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411985-59q7x" Dec 02 23:45:00 crc kubenswrapper[4696]: I1202 23:45:00.206806 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 02 23:45:00 crc kubenswrapper[4696]: I1202 23:45:00.207067 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 02 23:45:00 crc kubenswrapper[4696]: I1202 23:45:00.239787 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411985-59q7x"] Dec 02 23:45:00 crc kubenswrapper[4696]: I1202 23:45:00.333278 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/190fc3cd-f0ad-4e7e-83b6-d5c20b3372a0-secret-volume\") pod \"collect-profiles-29411985-59q7x\" (UID: \"190fc3cd-f0ad-4e7e-83b6-d5c20b3372a0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411985-59q7x" Dec 02 23:45:00 crc kubenswrapper[4696]: I1202 23:45:00.333709 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktpt6\" (UniqueName: \"kubernetes.io/projected/190fc3cd-f0ad-4e7e-83b6-d5c20b3372a0-kube-api-access-ktpt6\") pod \"collect-profiles-29411985-59q7x\" (UID: \"190fc3cd-f0ad-4e7e-83b6-d5c20b3372a0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411985-59q7x" Dec 02 23:45:00 crc kubenswrapper[4696]: I1202 23:45:00.333842 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/190fc3cd-f0ad-4e7e-83b6-d5c20b3372a0-config-volume\") pod \"collect-profiles-29411985-59q7x\" (UID: \"190fc3cd-f0ad-4e7e-83b6-d5c20b3372a0\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29411985-59q7x" Dec 02 23:45:00 crc kubenswrapper[4696]: I1202 23:45:00.436580 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/190fc3cd-f0ad-4e7e-83b6-d5c20b3372a0-secret-volume\") pod \"collect-profiles-29411985-59q7x\" (UID: \"190fc3cd-f0ad-4e7e-83b6-d5c20b3372a0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411985-59q7x" Dec 02 23:45:00 crc kubenswrapper[4696]: I1202 23:45:00.437304 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktpt6\" (UniqueName: \"kubernetes.io/projected/190fc3cd-f0ad-4e7e-83b6-d5c20b3372a0-kube-api-access-ktpt6\") pod \"collect-profiles-29411985-59q7x\" (UID: \"190fc3cd-f0ad-4e7e-83b6-d5c20b3372a0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411985-59q7x" Dec 02 23:45:00 crc kubenswrapper[4696]: I1202 23:45:00.437406 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/190fc3cd-f0ad-4e7e-83b6-d5c20b3372a0-config-volume\") pod \"collect-profiles-29411985-59q7x\" (UID: \"190fc3cd-f0ad-4e7e-83b6-d5c20b3372a0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411985-59q7x" Dec 02 23:45:00 crc kubenswrapper[4696]: I1202 23:45:00.439468 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/190fc3cd-f0ad-4e7e-83b6-d5c20b3372a0-config-volume\") pod \"collect-profiles-29411985-59q7x\" (UID: \"190fc3cd-f0ad-4e7e-83b6-d5c20b3372a0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411985-59q7x" Dec 02 23:45:00 crc kubenswrapper[4696]: I1202 23:45:00.446661 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/190fc3cd-f0ad-4e7e-83b6-d5c20b3372a0-secret-volume\") pod \"collect-profiles-29411985-59q7x\" (UID: \"190fc3cd-f0ad-4e7e-83b6-d5c20b3372a0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411985-59q7x" Dec 02 23:45:00 crc kubenswrapper[4696]: I1202 23:45:00.455963 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktpt6\" (UniqueName: \"kubernetes.io/projected/190fc3cd-f0ad-4e7e-83b6-d5c20b3372a0-kube-api-access-ktpt6\") pod \"collect-profiles-29411985-59q7x\" (UID: \"190fc3cd-f0ad-4e7e-83b6-d5c20b3372a0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411985-59q7x" Dec 02 23:45:00 crc kubenswrapper[4696]: I1202 23:45:00.527269 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411985-59q7x" Dec 02 23:45:01 crc kubenswrapper[4696]: I1202 23:45:01.076413 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411985-59q7x"] Dec 02 23:45:01 crc kubenswrapper[4696]: I1202 23:45:01.923996 4696 generic.go:334] "Generic (PLEG): container finished" podID="190fc3cd-f0ad-4e7e-83b6-d5c20b3372a0" containerID="7a9d9dd3d838107725ee0f43606cf2c82bf9c990916acccb438aebbc09c37405" exitCode=0 Dec 02 23:45:01 crc kubenswrapper[4696]: I1202 23:45:01.924062 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411985-59q7x" event={"ID":"190fc3cd-f0ad-4e7e-83b6-d5c20b3372a0","Type":"ContainerDied","Data":"7a9d9dd3d838107725ee0f43606cf2c82bf9c990916acccb438aebbc09c37405"} Dec 02 23:45:01 crc kubenswrapper[4696]: I1202 23:45:01.924099 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411985-59q7x" 
event={"ID":"190fc3cd-f0ad-4e7e-83b6-d5c20b3372a0","Type":"ContainerStarted","Data":"e54f045a042d8f5942c61589d99ec30bb067e7a5bc4a4bbb4dc6310e99f8189f"} Dec 02 23:45:03 crc kubenswrapper[4696]: I1202 23:45:03.387287 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411985-59q7x" Dec 02 23:45:03 crc kubenswrapper[4696]: I1202 23:45:03.506703 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ktpt6\" (UniqueName: \"kubernetes.io/projected/190fc3cd-f0ad-4e7e-83b6-d5c20b3372a0-kube-api-access-ktpt6\") pod \"190fc3cd-f0ad-4e7e-83b6-d5c20b3372a0\" (UID: \"190fc3cd-f0ad-4e7e-83b6-d5c20b3372a0\") " Dec 02 23:45:03 crc kubenswrapper[4696]: I1202 23:45:03.507034 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/190fc3cd-f0ad-4e7e-83b6-d5c20b3372a0-secret-volume\") pod \"190fc3cd-f0ad-4e7e-83b6-d5c20b3372a0\" (UID: \"190fc3cd-f0ad-4e7e-83b6-d5c20b3372a0\") " Dec 02 23:45:03 crc kubenswrapper[4696]: I1202 23:45:03.507167 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/190fc3cd-f0ad-4e7e-83b6-d5c20b3372a0-config-volume\") pod \"190fc3cd-f0ad-4e7e-83b6-d5c20b3372a0\" (UID: \"190fc3cd-f0ad-4e7e-83b6-d5c20b3372a0\") " Dec 02 23:45:03 crc kubenswrapper[4696]: I1202 23:45:03.508241 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/190fc3cd-f0ad-4e7e-83b6-d5c20b3372a0-config-volume" (OuterVolumeSpecName: "config-volume") pod "190fc3cd-f0ad-4e7e-83b6-d5c20b3372a0" (UID: "190fc3cd-f0ad-4e7e-83b6-d5c20b3372a0"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:45:03 crc kubenswrapper[4696]: I1202 23:45:03.515003 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/190fc3cd-f0ad-4e7e-83b6-d5c20b3372a0-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "190fc3cd-f0ad-4e7e-83b6-d5c20b3372a0" (UID: "190fc3cd-f0ad-4e7e-83b6-d5c20b3372a0"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:45:03 crc kubenswrapper[4696]: I1202 23:45:03.515504 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/190fc3cd-f0ad-4e7e-83b6-d5c20b3372a0-kube-api-access-ktpt6" (OuterVolumeSpecName: "kube-api-access-ktpt6") pod "190fc3cd-f0ad-4e7e-83b6-d5c20b3372a0" (UID: "190fc3cd-f0ad-4e7e-83b6-d5c20b3372a0"). InnerVolumeSpecName "kube-api-access-ktpt6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:45:03 crc kubenswrapper[4696]: I1202 23:45:03.609974 4696 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/190fc3cd-f0ad-4e7e-83b6-d5c20b3372a0-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 02 23:45:03 crc kubenswrapper[4696]: I1202 23:45:03.610021 4696 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/190fc3cd-f0ad-4e7e-83b6-d5c20b3372a0-config-volume\") on node \"crc\" DevicePath \"\"" Dec 02 23:45:03 crc kubenswrapper[4696]: I1202 23:45:03.610034 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ktpt6\" (UniqueName: \"kubernetes.io/projected/190fc3cd-f0ad-4e7e-83b6-d5c20b3372a0-kube-api-access-ktpt6\") on node \"crc\" DevicePath \"\"" Dec 02 23:45:03 crc kubenswrapper[4696]: I1202 23:45:03.963683 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411985-59q7x" 
event={"ID":"190fc3cd-f0ad-4e7e-83b6-d5c20b3372a0","Type":"ContainerDied","Data":"e54f045a042d8f5942c61589d99ec30bb067e7a5bc4a4bbb4dc6310e99f8189f"} Dec 02 23:45:03 crc kubenswrapper[4696]: I1202 23:45:03.964137 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e54f045a042d8f5942c61589d99ec30bb067e7a5bc4a4bbb4dc6310e99f8189f" Dec 02 23:45:03 crc kubenswrapper[4696]: I1202 23:45:03.964274 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411985-59q7x" Dec 02 23:45:04 crc kubenswrapper[4696]: I1202 23:45:04.482178 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411940-z9xq4"] Dec 02 23:45:04 crc kubenswrapper[4696]: I1202 23:45:04.494302 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411940-z9xq4"] Dec 02 23:45:05 crc kubenswrapper[4696]: I1202 23:45:05.447903 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c442b172-a329-4974-a896-e36bd604cf10" path="/var/lib/kubelet/pods/c442b172-a329-4974-a896-e36bd604cf10/volumes" Dec 02 23:45:22 crc kubenswrapper[4696]: I1202 23:45:22.974092 4696 patch_prober.go:28] interesting pod/machine-config-daemon-chq65 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 23:45:22 crc kubenswrapper[4696]: I1202 23:45:22.974851 4696 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 23:45:38 crc 
kubenswrapper[4696]: I1202 23:45:38.235155 4696 scope.go:117] "RemoveContainer" containerID="b475b52b8fa8826639dce84ea06bb0dae3cfbe33afdf08411b93969ae748ee08" Dec 02 23:45:52 crc kubenswrapper[4696]: I1202 23:45:52.974392 4696 patch_prober.go:28] interesting pod/machine-config-daemon-chq65 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 23:45:52 crc kubenswrapper[4696]: I1202 23:45:52.975178 4696 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 23:46:22 crc kubenswrapper[4696]: I1202 23:46:22.974666 4696 patch_prober.go:28] interesting pod/machine-config-daemon-chq65 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 23:46:22 crc kubenswrapper[4696]: I1202 23:46:22.975521 4696 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 23:46:22 crc kubenswrapper[4696]: I1202 23:46:22.975607 4696 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-chq65" Dec 02 23:46:22 crc kubenswrapper[4696]: I1202 23:46:22.976943 4696 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"989f11838b2017b8bf102dc5cad0dbd9f0fd8376c5fc08f926460a91e9668011"} pod="openshift-machine-config-operator/machine-config-daemon-chq65" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 23:46:22 crc kubenswrapper[4696]: I1202 23:46:22.977055 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" containerName="machine-config-daemon" containerID="cri-o://989f11838b2017b8bf102dc5cad0dbd9f0fd8376c5fc08f926460a91e9668011" gracePeriod=600 Dec 02 23:46:23 crc kubenswrapper[4696]: E1202 23:46:23.129662 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 02 23:46:23 crc kubenswrapper[4696]: I1202 23:46:23.971940 4696 generic.go:334] "Generic (PLEG): container finished" podID="53353260-c7c9-435c-91eb-3d5a1b441c4a" containerID="989f11838b2017b8bf102dc5cad0dbd9f0fd8376c5fc08f926460a91e9668011" exitCode=0 Dec 02 23:46:23 crc kubenswrapper[4696]: I1202 23:46:23.972287 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-chq65" event={"ID":"53353260-c7c9-435c-91eb-3d5a1b441c4a","Type":"ContainerDied","Data":"989f11838b2017b8bf102dc5cad0dbd9f0fd8376c5fc08f926460a91e9668011"} Dec 02 23:46:23 crc kubenswrapper[4696]: I1202 23:46:23.972570 4696 scope.go:117] "RemoveContainer" containerID="c3dadcb47bbdf8fa03bd4ae613ed1247b206e2b96f53e86e2dbee5fd3ba26e1b" Dec 02 23:46:23 crc 
kubenswrapper[4696]: I1202 23:46:23.973664 4696 scope.go:117] "RemoveContainer" containerID="989f11838b2017b8bf102dc5cad0dbd9f0fd8376c5fc08f926460a91e9668011" Dec 02 23:46:23 crc kubenswrapper[4696]: E1202 23:46:23.974314 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 02 23:46:39 crc kubenswrapper[4696]: I1202 23:46:39.433674 4696 scope.go:117] "RemoveContainer" containerID="989f11838b2017b8bf102dc5cad0dbd9f0fd8376c5fc08f926460a91e9668011" Dec 02 23:46:39 crc kubenswrapper[4696]: E1202 23:46:39.435674 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 02 23:46:50 crc kubenswrapper[4696]: I1202 23:46:50.431599 4696 scope.go:117] "RemoveContainer" containerID="989f11838b2017b8bf102dc5cad0dbd9f0fd8376c5fc08f926460a91e9668011" Dec 02 23:46:50 crc kubenswrapper[4696]: E1202 23:46:50.432630 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 
02 23:47:05 crc kubenswrapper[4696]: I1202 23:47:05.432369 4696 scope.go:117] "RemoveContainer" containerID="989f11838b2017b8bf102dc5cad0dbd9f0fd8376c5fc08f926460a91e9668011" Dec 02 23:47:05 crc kubenswrapper[4696]: E1202 23:47:05.433201 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 02 23:47:12 crc kubenswrapper[4696]: E1202 23:47:12.282734 4696 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.9:41828->38.102.83.9:41669: write tcp 38.102.83.9:41828->38.102.83.9:41669: write: broken pipe Dec 02 23:47:17 crc kubenswrapper[4696]: I1202 23:47:17.440377 4696 scope.go:117] "RemoveContainer" containerID="989f11838b2017b8bf102dc5cad0dbd9f0fd8376c5fc08f926460a91e9668011" Dec 02 23:47:17 crc kubenswrapper[4696]: E1202 23:47:17.441371 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 02 23:47:31 crc kubenswrapper[4696]: I1202 23:47:31.433150 4696 scope.go:117] "RemoveContainer" containerID="989f11838b2017b8bf102dc5cad0dbd9f0fd8376c5fc08f926460a91e9668011" Dec 02 23:47:31 crc kubenswrapper[4696]: E1202 23:47:31.434792 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 02 23:47:46 crc kubenswrapper[4696]: I1202 23:47:46.432039 4696 scope.go:117] "RemoveContainer" containerID="989f11838b2017b8bf102dc5cad0dbd9f0fd8376c5fc08f926460a91e9668011" Dec 02 23:47:46 crc kubenswrapper[4696]: E1202 23:47:46.433461 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 02 23:47:58 crc kubenswrapper[4696]: I1202 23:47:58.432814 4696 scope.go:117] "RemoveContainer" containerID="989f11838b2017b8bf102dc5cad0dbd9f0fd8376c5fc08f926460a91e9668011" Dec 02 23:47:58 crc kubenswrapper[4696]: E1202 23:47:58.434415 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 02 23:48:10 crc kubenswrapper[4696]: I1202 23:48:10.432390 4696 scope.go:117] "RemoveContainer" containerID="989f11838b2017b8bf102dc5cad0dbd9f0fd8376c5fc08f926460a91e9668011" Dec 02 23:48:10 crc kubenswrapper[4696]: E1202 23:48:10.433335 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 02 23:48:21 crc kubenswrapper[4696]: I1202 23:48:21.431487 4696 scope.go:117] "RemoveContainer" containerID="989f11838b2017b8bf102dc5cad0dbd9f0fd8376c5fc08f926460a91e9668011" Dec 02 23:48:21 crc kubenswrapper[4696]: E1202 23:48:21.432524 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 02 23:48:32 crc kubenswrapper[4696]: I1202 23:48:32.431803 4696 scope.go:117] "RemoveContainer" containerID="989f11838b2017b8bf102dc5cad0dbd9f0fd8376c5fc08f926460a91e9668011" Dec 02 23:48:32 crc kubenswrapper[4696]: E1202 23:48:32.432689 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 02 23:48:43 crc kubenswrapper[4696]: I1202 23:48:43.433142 4696 scope.go:117] "RemoveContainer" containerID="989f11838b2017b8bf102dc5cad0dbd9f0fd8376c5fc08f926460a91e9668011" Dec 02 23:48:43 crc kubenswrapper[4696]: E1202 23:48:43.434214 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 02 23:48:54 crc kubenswrapper[4696]: I1202 23:48:54.432471 4696 scope.go:117] "RemoveContainer" containerID="989f11838b2017b8bf102dc5cad0dbd9f0fd8376c5fc08f926460a91e9668011" Dec 02 23:48:54 crc kubenswrapper[4696]: E1202 23:48:54.433536 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 02 23:49:06 crc kubenswrapper[4696]: I1202 23:49:06.434710 4696 scope.go:117] "RemoveContainer" containerID="989f11838b2017b8bf102dc5cad0dbd9f0fd8376c5fc08f926460a91e9668011" Dec 02 23:49:06 crc kubenswrapper[4696]: E1202 23:49:06.436442 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 02 23:49:18 crc kubenswrapper[4696]: I1202 23:49:18.432692 4696 scope.go:117] "RemoveContainer" containerID="989f11838b2017b8bf102dc5cad0dbd9f0fd8376c5fc08f926460a91e9668011" Dec 02 23:49:18 crc kubenswrapper[4696]: E1202 23:49:18.433758 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 02 23:49:30 crc kubenswrapper[4696]: I1202 23:49:30.432408 4696 scope.go:117] "RemoveContainer" containerID="989f11838b2017b8bf102dc5cad0dbd9f0fd8376c5fc08f926460a91e9668011" Dec 02 23:49:30 crc kubenswrapper[4696]: E1202 23:49:30.433868 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 02 23:49:42 crc kubenswrapper[4696]: I1202 23:49:42.431994 4696 scope.go:117] "RemoveContainer" containerID="989f11838b2017b8bf102dc5cad0dbd9f0fd8376c5fc08f926460a91e9668011" Dec 02 23:49:42 crc kubenswrapper[4696]: E1202 23:49:42.434036 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 02 23:49:54 crc kubenswrapper[4696]: I1202 23:49:54.432850 4696 scope.go:117] "RemoveContainer" containerID="989f11838b2017b8bf102dc5cad0dbd9f0fd8376c5fc08f926460a91e9668011" Dec 02 23:49:54 crc kubenswrapper[4696]: E1202 23:49:54.434368 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 02 23:50:07 crc kubenswrapper[4696]: I1202 23:50:07.438810 4696 scope.go:117] "RemoveContainer" containerID="989f11838b2017b8bf102dc5cad0dbd9f0fd8376c5fc08f926460a91e9668011" Dec 02 23:50:07 crc kubenswrapper[4696]: E1202 23:50:07.439933 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 02 23:50:18 crc kubenswrapper[4696]: I1202 23:50:18.432189 4696 scope.go:117] "RemoveContainer" containerID="989f11838b2017b8bf102dc5cad0dbd9f0fd8376c5fc08f926460a91e9668011" Dec 02 23:50:18 crc kubenswrapper[4696]: E1202 23:50:18.433651 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 02 23:50:30 crc kubenswrapper[4696]: I1202 23:50:30.432232 4696 scope.go:117] "RemoveContainer" containerID="989f11838b2017b8bf102dc5cad0dbd9f0fd8376c5fc08f926460a91e9668011" Dec 02 23:50:30 crc kubenswrapper[4696]: E1202 23:50:30.433288 4696 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 02 23:50:42 crc kubenswrapper[4696]: I1202 23:50:42.432593 4696 scope.go:117] "RemoveContainer" containerID="989f11838b2017b8bf102dc5cad0dbd9f0fd8376c5fc08f926460a91e9668011" Dec 02 23:50:42 crc kubenswrapper[4696]: E1202 23:50:42.433709 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 02 23:50:56 crc kubenswrapper[4696]: I1202 23:50:56.432797 4696 scope.go:117] "RemoveContainer" containerID="989f11838b2017b8bf102dc5cad0dbd9f0fd8376c5fc08f926460a91e9668011" Dec 02 23:50:56 crc kubenswrapper[4696]: E1202 23:50:56.434386 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 02 23:51:09 crc kubenswrapper[4696]: I1202 23:51:09.432535 4696 scope.go:117] "RemoveContainer" containerID="989f11838b2017b8bf102dc5cad0dbd9f0fd8376c5fc08f926460a91e9668011" Dec 02 23:51:09 crc kubenswrapper[4696]: E1202 23:51:09.436600 4696 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 02 23:51:21 crc kubenswrapper[4696]: I1202 23:51:21.431976 4696 scope.go:117] "RemoveContainer" containerID="989f11838b2017b8bf102dc5cad0dbd9f0fd8376c5fc08f926460a91e9668011" Dec 02 23:51:21 crc kubenswrapper[4696]: E1202 23:51:21.433100 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 02 23:51:23 crc kubenswrapper[4696]: I1202 23:51:23.525338 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-jf228"] Dec 02 23:51:23 crc kubenswrapper[4696]: E1202 23:51:23.530540 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="190fc3cd-f0ad-4e7e-83b6-d5c20b3372a0" containerName="collect-profiles" Dec 02 23:51:23 crc kubenswrapper[4696]: I1202 23:51:23.530574 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="190fc3cd-f0ad-4e7e-83b6-d5c20b3372a0" containerName="collect-profiles" Dec 02 23:51:23 crc kubenswrapper[4696]: I1202 23:51:23.530935 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="190fc3cd-f0ad-4e7e-83b6-d5c20b3372a0" containerName="collect-profiles" Dec 02 23:51:23 crc kubenswrapper[4696]: I1202 23:51:23.533084 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jf228" Dec 02 23:51:23 crc kubenswrapper[4696]: I1202 23:51:23.548150 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jf228"] Dec 02 23:51:23 crc kubenswrapper[4696]: I1202 23:51:23.571042 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkqqk\" (UniqueName: \"kubernetes.io/projected/211a6206-054a-4460-b90d-f587ccfced6c-kube-api-access-tkqqk\") pod \"community-operators-jf228\" (UID: \"211a6206-054a-4460-b90d-f587ccfced6c\") " pod="openshift-marketplace/community-operators-jf228" Dec 02 23:51:23 crc kubenswrapper[4696]: I1202 23:51:23.571231 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/211a6206-054a-4460-b90d-f587ccfced6c-catalog-content\") pod \"community-operators-jf228\" (UID: \"211a6206-054a-4460-b90d-f587ccfced6c\") " pod="openshift-marketplace/community-operators-jf228" Dec 02 23:51:23 crc kubenswrapper[4696]: I1202 23:51:23.571383 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/211a6206-054a-4460-b90d-f587ccfced6c-utilities\") pod \"community-operators-jf228\" (UID: \"211a6206-054a-4460-b90d-f587ccfced6c\") " pod="openshift-marketplace/community-operators-jf228" Dec 02 23:51:23 crc kubenswrapper[4696]: I1202 23:51:23.673382 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tkqqk\" (UniqueName: \"kubernetes.io/projected/211a6206-054a-4460-b90d-f587ccfced6c-kube-api-access-tkqqk\") pod \"community-operators-jf228\" (UID: \"211a6206-054a-4460-b90d-f587ccfced6c\") " pod="openshift-marketplace/community-operators-jf228" Dec 02 23:51:23 crc kubenswrapper[4696]: I1202 23:51:23.673663 4696 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/211a6206-054a-4460-b90d-f587ccfced6c-catalog-content\") pod \"community-operators-jf228\" (UID: \"211a6206-054a-4460-b90d-f587ccfced6c\") " pod="openshift-marketplace/community-operators-jf228" Dec 02 23:51:23 crc kubenswrapper[4696]: I1202 23:51:23.673951 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/211a6206-054a-4460-b90d-f587ccfced6c-utilities\") pod \"community-operators-jf228\" (UID: \"211a6206-054a-4460-b90d-f587ccfced6c\") " pod="openshift-marketplace/community-operators-jf228" Dec 02 23:51:23 crc kubenswrapper[4696]: I1202 23:51:23.674669 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/211a6206-054a-4460-b90d-f587ccfced6c-catalog-content\") pod \"community-operators-jf228\" (UID: \"211a6206-054a-4460-b90d-f587ccfced6c\") " pod="openshift-marketplace/community-operators-jf228" Dec 02 23:51:23 crc kubenswrapper[4696]: I1202 23:51:23.674677 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/211a6206-054a-4460-b90d-f587ccfced6c-utilities\") pod \"community-operators-jf228\" (UID: \"211a6206-054a-4460-b90d-f587ccfced6c\") " pod="openshift-marketplace/community-operators-jf228" Dec 02 23:51:23 crc kubenswrapper[4696]: I1202 23:51:23.711648 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkqqk\" (UniqueName: \"kubernetes.io/projected/211a6206-054a-4460-b90d-f587ccfced6c-kube-api-access-tkqqk\") pod \"community-operators-jf228\" (UID: \"211a6206-054a-4460-b90d-f587ccfced6c\") " pod="openshift-marketplace/community-operators-jf228" Dec 02 23:51:23 crc kubenswrapper[4696]: I1202 23:51:23.871042 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jf228" Dec 02 23:51:24 crc kubenswrapper[4696]: I1202 23:51:24.515856 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jf228"] Dec 02 23:51:24 crc kubenswrapper[4696]: I1202 23:51:24.647993 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jf228" event={"ID":"211a6206-054a-4460-b90d-f587ccfced6c","Type":"ContainerStarted","Data":"66d600fd439c90fa938118b1a3f94477c6041fc822e4cde0ac9fc7d7c50eefeb"} Dec 02 23:51:25 crc kubenswrapper[4696]: I1202 23:51:25.668596 4696 generic.go:334] "Generic (PLEG): container finished" podID="211a6206-054a-4460-b90d-f587ccfced6c" containerID="9d8d846c0791f38c82fead347da8a277221e507bb17ef8a4e45e0125fa49d61f" exitCode=0 Dec 02 23:51:25 crc kubenswrapper[4696]: I1202 23:51:25.668679 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jf228" event={"ID":"211a6206-054a-4460-b90d-f587ccfced6c","Type":"ContainerDied","Data":"9d8d846c0791f38c82fead347da8a277221e507bb17ef8a4e45e0125fa49d61f"} Dec 02 23:51:25 crc kubenswrapper[4696]: I1202 23:51:25.674123 4696 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 02 23:51:26 crc kubenswrapper[4696]: I1202 23:51:26.680845 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jf228" event={"ID":"211a6206-054a-4460-b90d-f587ccfced6c","Type":"ContainerStarted","Data":"04c6ffbcad0dc4dcd0c5bc9b428dd9820ea11b0e67e1bf65b9ea5a8ba92ad702"} Dec 02 23:51:28 crc kubenswrapper[4696]: I1202 23:51:28.709413 4696 generic.go:334] "Generic (PLEG): container finished" podID="211a6206-054a-4460-b90d-f587ccfced6c" containerID="04c6ffbcad0dc4dcd0c5bc9b428dd9820ea11b0e67e1bf65b9ea5a8ba92ad702" exitCode=0 Dec 02 23:51:28 crc kubenswrapper[4696]: I1202 23:51:28.709466 4696 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-jf228" event={"ID":"211a6206-054a-4460-b90d-f587ccfced6c","Type":"ContainerDied","Data":"04c6ffbcad0dc4dcd0c5bc9b428dd9820ea11b0e67e1bf65b9ea5a8ba92ad702"} Dec 02 23:51:29 crc kubenswrapper[4696]: I1202 23:51:29.729183 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jf228" event={"ID":"211a6206-054a-4460-b90d-f587ccfced6c","Type":"ContainerStarted","Data":"078ae3b04d306befac6945eeacd574ca003702e9b69d5d5ddd271f472625cbe0"} Dec 02 23:51:29 crc kubenswrapper[4696]: I1202 23:51:29.757949 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-jf228" podStartSLOduration=3.258125455 podStartE2EDuration="6.757924855s" podCreationTimestamp="2025-12-02 23:51:23 +0000 UTC" firstStartedPulling="2025-12-02 23:51:25.673874262 +0000 UTC m=+4148.554554263" lastFinishedPulling="2025-12-02 23:51:29.173673652 +0000 UTC m=+4152.054353663" observedRunningTime="2025-12-02 23:51:29.751463583 +0000 UTC m=+4152.632143594" watchObservedRunningTime="2025-12-02 23:51:29.757924855 +0000 UTC m=+4152.638604866" Dec 02 23:51:33 crc kubenswrapper[4696]: I1202 23:51:33.872845 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-jf228" Dec 02 23:51:33 crc kubenswrapper[4696]: I1202 23:51:33.873845 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-jf228" Dec 02 23:51:33 crc kubenswrapper[4696]: I1202 23:51:33.945759 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-jf228" Dec 02 23:51:34 crc kubenswrapper[4696]: I1202 23:51:34.883733 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-jf228" Dec 02 23:51:34 crc kubenswrapper[4696]: I1202 
23:51:34.945482 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jf228"] Dec 02 23:51:35 crc kubenswrapper[4696]: I1202 23:51:35.432835 4696 scope.go:117] "RemoveContainer" containerID="989f11838b2017b8bf102dc5cad0dbd9f0fd8376c5fc08f926460a91e9668011" Dec 02 23:51:35 crc kubenswrapper[4696]: I1202 23:51:35.845783 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-chq65" event={"ID":"53353260-c7c9-435c-91eb-3d5a1b441c4a","Type":"ContainerStarted","Data":"4b888d94eacd48a1d144fb73273a6e82d76f2dad033afa343843e553683f6e08"} Dec 02 23:51:36 crc kubenswrapper[4696]: I1202 23:51:36.857823 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-jf228" podUID="211a6206-054a-4460-b90d-f587ccfced6c" containerName="registry-server" containerID="cri-o://078ae3b04d306befac6945eeacd574ca003702e9b69d5d5ddd271f472625cbe0" gracePeriod=2 Dec 02 23:51:37 crc kubenswrapper[4696]: I1202 23:51:37.508151 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jf228" Dec 02 23:51:37 crc kubenswrapper[4696]: I1202 23:51:37.540934 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/211a6206-054a-4460-b90d-f587ccfced6c-catalog-content\") pod \"211a6206-054a-4460-b90d-f587ccfced6c\" (UID: \"211a6206-054a-4460-b90d-f587ccfced6c\") " Dec 02 23:51:37 crc kubenswrapper[4696]: I1202 23:51:37.541030 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/211a6206-054a-4460-b90d-f587ccfced6c-utilities\") pod \"211a6206-054a-4460-b90d-f587ccfced6c\" (UID: \"211a6206-054a-4460-b90d-f587ccfced6c\") " Dec 02 23:51:37 crc kubenswrapper[4696]: I1202 23:51:37.541242 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tkqqk\" (UniqueName: \"kubernetes.io/projected/211a6206-054a-4460-b90d-f587ccfced6c-kube-api-access-tkqqk\") pod \"211a6206-054a-4460-b90d-f587ccfced6c\" (UID: \"211a6206-054a-4460-b90d-f587ccfced6c\") " Dec 02 23:51:37 crc kubenswrapper[4696]: I1202 23:51:37.546385 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/211a6206-054a-4460-b90d-f587ccfced6c-utilities" (OuterVolumeSpecName: "utilities") pod "211a6206-054a-4460-b90d-f587ccfced6c" (UID: "211a6206-054a-4460-b90d-f587ccfced6c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:51:37 crc kubenswrapper[4696]: I1202 23:51:37.554872 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/211a6206-054a-4460-b90d-f587ccfced6c-kube-api-access-tkqqk" (OuterVolumeSpecName: "kube-api-access-tkqqk") pod "211a6206-054a-4460-b90d-f587ccfced6c" (UID: "211a6206-054a-4460-b90d-f587ccfced6c"). InnerVolumeSpecName "kube-api-access-tkqqk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:51:37 crc kubenswrapper[4696]: I1202 23:51:37.604008 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/211a6206-054a-4460-b90d-f587ccfced6c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "211a6206-054a-4460-b90d-f587ccfced6c" (UID: "211a6206-054a-4460-b90d-f587ccfced6c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:51:37 crc kubenswrapper[4696]: I1202 23:51:37.643960 4696 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/211a6206-054a-4460-b90d-f587ccfced6c-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 23:51:37 crc kubenswrapper[4696]: I1202 23:51:37.644013 4696 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/211a6206-054a-4460-b90d-f587ccfced6c-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 23:51:37 crc kubenswrapper[4696]: I1202 23:51:37.644024 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tkqqk\" (UniqueName: \"kubernetes.io/projected/211a6206-054a-4460-b90d-f587ccfced6c-kube-api-access-tkqqk\") on node \"crc\" DevicePath \"\"" Dec 02 23:51:37 crc kubenswrapper[4696]: I1202 23:51:37.872024 4696 generic.go:334] "Generic (PLEG): container finished" podID="211a6206-054a-4460-b90d-f587ccfced6c" containerID="078ae3b04d306befac6945eeacd574ca003702e9b69d5d5ddd271f472625cbe0" exitCode=0 Dec 02 23:51:37 crc kubenswrapper[4696]: I1202 23:51:37.872099 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jf228" event={"ID":"211a6206-054a-4460-b90d-f587ccfced6c","Type":"ContainerDied","Data":"078ae3b04d306befac6945eeacd574ca003702e9b69d5d5ddd271f472625cbe0"} Dec 02 23:51:37 crc kubenswrapper[4696]: I1202 23:51:37.872202 4696 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/community-operators-jf228" Dec 02 23:51:37 crc kubenswrapper[4696]: I1202 23:51:37.872593 4696 scope.go:117] "RemoveContainer" containerID="078ae3b04d306befac6945eeacd574ca003702e9b69d5d5ddd271f472625cbe0" Dec 02 23:51:37 crc kubenswrapper[4696]: I1202 23:51:37.872570 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jf228" event={"ID":"211a6206-054a-4460-b90d-f587ccfced6c","Type":"ContainerDied","Data":"66d600fd439c90fa938118b1a3f94477c6041fc822e4cde0ac9fc7d7c50eefeb"} Dec 02 23:51:37 crc kubenswrapper[4696]: I1202 23:51:37.910531 4696 scope.go:117] "RemoveContainer" containerID="04c6ffbcad0dc4dcd0c5bc9b428dd9820ea11b0e67e1bf65b9ea5a8ba92ad702" Dec 02 23:51:37 crc kubenswrapper[4696]: I1202 23:51:37.929810 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jf228"] Dec 02 23:51:37 crc kubenswrapper[4696]: I1202 23:51:37.939970 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-jf228"] Dec 02 23:51:37 crc kubenswrapper[4696]: I1202 23:51:37.956109 4696 scope.go:117] "RemoveContainer" containerID="9d8d846c0791f38c82fead347da8a277221e507bb17ef8a4e45e0125fa49d61f" Dec 02 23:51:38 crc kubenswrapper[4696]: I1202 23:51:38.018174 4696 scope.go:117] "RemoveContainer" containerID="078ae3b04d306befac6945eeacd574ca003702e9b69d5d5ddd271f472625cbe0" Dec 02 23:51:38 crc kubenswrapper[4696]: E1202 23:51:38.019005 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"078ae3b04d306befac6945eeacd574ca003702e9b69d5d5ddd271f472625cbe0\": container with ID starting with 078ae3b04d306befac6945eeacd574ca003702e9b69d5d5ddd271f472625cbe0 not found: ID does not exist" containerID="078ae3b04d306befac6945eeacd574ca003702e9b69d5d5ddd271f472625cbe0" Dec 02 23:51:38 crc kubenswrapper[4696]: I1202 23:51:38.019063 
4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"078ae3b04d306befac6945eeacd574ca003702e9b69d5d5ddd271f472625cbe0"} err="failed to get container status \"078ae3b04d306befac6945eeacd574ca003702e9b69d5d5ddd271f472625cbe0\": rpc error: code = NotFound desc = could not find container \"078ae3b04d306befac6945eeacd574ca003702e9b69d5d5ddd271f472625cbe0\": container with ID starting with 078ae3b04d306befac6945eeacd574ca003702e9b69d5d5ddd271f472625cbe0 not found: ID does not exist" Dec 02 23:51:38 crc kubenswrapper[4696]: I1202 23:51:38.019106 4696 scope.go:117] "RemoveContainer" containerID="04c6ffbcad0dc4dcd0c5bc9b428dd9820ea11b0e67e1bf65b9ea5a8ba92ad702" Dec 02 23:51:38 crc kubenswrapper[4696]: E1202 23:51:38.019702 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04c6ffbcad0dc4dcd0c5bc9b428dd9820ea11b0e67e1bf65b9ea5a8ba92ad702\": container with ID starting with 04c6ffbcad0dc4dcd0c5bc9b428dd9820ea11b0e67e1bf65b9ea5a8ba92ad702 not found: ID does not exist" containerID="04c6ffbcad0dc4dcd0c5bc9b428dd9820ea11b0e67e1bf65b9ea5a8ba92ad702" Dec 02 23:51:38 crc kubenswrapper[4696]: I1202 23:51:38.019778 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04c6ffbcad0dc4dcd0c5bc9b428dd9820ea11b0e67e1bf65b9ea5a8ba92ad702"} err="failed to get container status \"04c6ffbcad0dc4dcd0c5bc9b428dd9820ea11b0e67e1bf65b9ea5a8ba92ad702\": rpc error: code = NotFound desc = could not find container \"04c6ffbcad0dc4dcd0c5bc9b428dd9820ea11b0e67e1bf65b9ea5a8ba92ad702\": container with ID starting with 04c6ffbcad0dc4dcd0c5bc9b428dd9820ea11b0e67e1bf65b9ea5a8ba92ad702 not found: ID does not exist" Dec 02 23:51:38 crc kubenswrapper[4696]: I1202 23:51:38.019819 4696 scope.go:117] "RemoveContainer" containerID="9d8d846c0791f38c82fead347da8a277221e507bb17ef8a4e45e0125fa49d61f" Dec 02 23:51:38 crc kubenswrapper[4696]: E1202 
23:51:38.020574 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d8d846c0791f38c82fead347da8a277221e507bb17ef8a4e45e0125fa49d61f\": container with ID starting with 9d8d846c0791f38c82fead347da8a277221e507bb17ef8a4e45e0125fa49d61f not found: ID does not exist" containerID="9d8d846c0791f38c82fead347da8a277221e507bb17ef8a4e45e0125fa49d61f" Dec 02 23:51:38 crc kubenswrapper[4696]: I1202 23:51:38.020641 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d8d846c0791f38c82fead347da8a277221e507bb17ef8a4e45e0125fa49d61f"} err="failed to get container status \"9d8d846c0791f38c82fead347da8a277221e507bb17ef8a4e45e0125fa49d61f\": rpc error: code = NotFound desc = could not find container \"9d8d846c0791f38c82fead347da8a277221e507bb17ef8a4e45e0125fa49d61f\": container with ID starting with 9d8d846c0791f38c82fead347da8a277221e507bb17ef8a4e45e0125fa49d61f not found: ID does not exist" Dec 02 23:51:39 crc kubenswrapper[4696]: I1202 23:51:39.448890 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="211a6206-054a-4460-b90d-f587ccfced6c" path="/var/lib/kubelet/pods/211a6206-054a-4460-b90d-f587ccfced6c/volumes" Dec 02 23:51:49 crc kubenswrapper[4696]: I1202 23:51:49.968530 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-gx2t2"] Dec 02 23:51:49 crc kubenswrapper[4696]: E1202 23:51:49.970218 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="211a6206-054a-4460-b90d-f587ccfced6c" containerName="extract-content" Dec 02 23:51:49 crc kubenswrapper[4696]: I1202 23:51:49.970238 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="211a6206-054a-4460-b90d-f587ccfced6c" containerName="extract-content" Dec 02 23:51:49 crc kubenswrapper[4696]: E1202 23:51:49.970257 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="211a6206-054a-4460-b90d-f587ccfced6c" 
containerName="registry-server" Dec 02 23:51:49 crc kubenswrapper[4696]: I1202 23:51:49.970267 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="211a6206-054a-4460-b90d-f587ccfced6c" containerName="registry-server" Dec 02 23:51:49 crc kubenswrapper[4696]: E1202 23:51:49.970311 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="211a6206-054a-4460-b90d-f587ccfced6c" containerName="extract-utilities" Dec 02 23:51:49 crc kubenswrapper[4696]: I1202 23:51:49.970320 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="211a6206-054a-4460-b90d-f587ccfced6c" containerName="extract-utilities" Dec 02 23:51:49 crc kubenswrapper[4696]: I1202 23:51:49.970620 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="211a6206-054a-4460-b90d-f587ccfced6c" containerName="registry-server" Dec 02 23:51:49 crc kubenswrapper[4696]: I1202 23:51:49.972673 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gx2t2" Dec 02 23:51:49 crc kubenswrapper[4696]: I1202 23:51:49.981730 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gx2t2"] Dec 02 23:51:50 crc kubenswrapper[4696]: I1202 23:51:50.132390 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fdb65518-2323-4cf8-bd4f-739cd5e4ab82-catalog-content\") pod \"redhat-operators-gx2t2\" (UID: \"fdb65518-2323-4cf8-bd4f-739cd5e4ab82\") " pod="openshift-marketplace/redhat-operators-gx2t2" Dec 02 23:51:50 crc kubenswrapper[4696]: I1202 23:51:50.133103 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twtrd\" (UniqueName: \"kubernetes.io/projected/fdb65518-2323-4cf8-bd4f-739cd5e4ab82-kube-api-access-twtrd\") pod \"redhat-operators-gx2t2\" (UID: \"fdb65518-2323-4cf8-bd4f-739cd5e4ab82\") " 
pod="openshift-marketplace/redhat-operators-gx2t2" Dec 02 23:51:50 crc kubenswrapper[4696]: I1202 23:51:50.133145 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fdb65518-2323-4cf8-bd4f-739cd5e4ab82-utilities\") pod \"redhat-operators-gx2t2\" (UID: \"fdb65518-2323-4cf8-bd4f-739cd5e4ab82\") " pod="openshift-marketplace/redhat-operators-gx2t2" Dec 02 23:51:50 crc kubenswrapper[4696]: I1202 23:51:50.235228 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fdb65518-2323-4cf8-bd4f-739cd5e4ab82-catalog-content\") pod \"redhat-operators-gx2t2\" (UID: \"fdb65518-2323-4cf8-bd4f-739cd5e4ab82\") " pod="openshift-marketplace/redhat-operators-gx2t2" Dec 02 23:51:50 crc kubenswrapper[4696]: I1202 23:51:50.235423 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twtrd\" (UniqueName: \"kubernetes.io/projected/fdb65518-2323-4cf8-bd4f-739cd5e4ab82-kube-api-access-twtrd\") pod \"redhat-operators-gx2t2\" (UID: \"fdb65518-2323-4cf8-bd4f-739cd5e4ab82\") " pod="openshift-marketplace/redhat-operators-gx2t2" Dec 02 23:51:50 crc kubenswrapper[4696]: I1202 23:51:50.235454 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fdb65518-2323-4cf8-bd4f-739cd5e4ab82-utilities\") pod \"redhat-operators-gx2t2\" (UID: \"fdb65518-2323-4cf8-bd4f-739cd5e4ab82\") " pod="openshift-marketplace/redhat-operators-gx2t2" Dec 02 23:51:50 crc kubenswrapper[4696]: I1202 23:51:50.235860 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fdb65518-2323-4cf8-bd4f-739cd5e4ab82-catalog-content\") pod \"redhat-operators-gx2t2\" (UID: \"fdb65518-2323-4cf8-bd4f-739cd5e4ab82\") " 
pod="openshift-marketplace/redhat-operators-gx2t2" Dec 02 23:51:50 crc kubenswrapper[4696]: I1202 23:51:50.235930 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fdb65518-2323-4cf8-bd4f-739cd5e4ab82-utilities\") pod \"redhat-operators-gx2t2\" (UID: \"fdb65518-2323-4cf8-bd4f-739cd5e4ab82\") " pod="openshift-marketplace/redhat-operators-gx2t2" Dec 02 23:51:50 crc kubenswrapper[4696]: I1202 23:51:50.275391 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twtrd\" (UniqueName: \"kubernetes.io/projected/fdb65518-2323-4cf8-bd4f-739cd5e4ab82-kube-api-access-twtrd\") pod \"redhat-operators-gx2t2\" (UID: \"fdb65518-2323-4cf8-bd4f-739cd5e4ab82\") " pod="openshift-marketplace/redhat-operators-gx2t2" Dec 02 23:51:50 crc kubenswrapper[4696]: I1202 23:51:50.296860 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gx2t2" Dec 02 23:51:50 crc kubenswrapper[4696]: I1202 23:51:50.810054 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gx2t2"] Dec 02 23:51:51 crc kubenswrapper[4696]: I1202 23:51:51.082073 4696 generic.go:334] "Generic (PLEG): container finished" podID="fdb65518-2323-4cf8-bd4f-739cd5e4ab82" containerID="a7f8aa268e2ffc46ef361dd0c86be703763848f440b4d67d3364899f9c181130" exitCode=0 Dec 02 23:51:51 crc kubenswrapper[4696]: I1202 23:51:51.082468 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gx2t2" event={"ID":"fdb65518-2323-4cf8-bd4f-739cd5e4ab82","Type":"ContainerDied","Data":"a7f8aa268e2ffc46ef361dd0c86be703763848f440b4d67d3364899f9c181130"} Dec 02 23:51:51 crc kubenswrapper[4696]: I1202 23:51:51.082506 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gx2t2" 
event={"ID":"fdb65518-2323-4cf8-bd4f-739cd5e4ab82","Type":"ContainerStarted","Data":"57cfbc248bd434b398e10529c58546e5d72b7aa16da0db1d3243c3a35f700eb4"} Dec 02 23:51:53 crc kubenswrapper[4696]: I1202 23:51:53.111715 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gx2t2" event={"ID":"fdb65518-2323-4cf8-bd4f-739cd5e4ab82","Type":"ContainerStarted","Data":"936cb6e40611cd1d53ffff00c6462e8abd51e42c81bc2ad0242c45030968291b"} Dec 02 23:51:55 crc kubenswrapper[4696]: I1202 23:51:55.148578 4696 generic.go:334] "Generic (PLEG): container finished" podID="fdb65518-2323-4cf8-bd4f-739cd5e4ab82" containerID="936cb6e40611cd1d53ffff00c6462e8abd51e42c81bc2ad0242c45030968291b" exitCode=0 Dec 02 23:51:55 crc kubenswrapper[4696]: I1202 23:51:55.148691 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gx2t2" event={"ID":"fdb65518-2323-4cf8-bd4f-739cd5e4ab82","Type":"ContainerDied","Data":"936cb6e40611cd1d53ffff00c6462e8abd51e42c81bc2ad0242c45030968291b"} Dec 02 23:51:56 crc kubenswrapper[4696]: I1202 23:51:56.161873 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gx2t2" event={"ID":"fdb65518-2323-4cf8-bd4f-739cd5e4ab82","Type":"ContainerStarted","Data":"169a7464e33f669c72b1627d104e87ba6f17fae9357b2f439922ccd5257c44cb"} Dec 02 23:51:56 crc kubenswrapper[4696]: I1202 23:51:56.187687 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-gx2t2" podStartSLOduration=2.522403582 podStartE2EDuration="7.187662612s" podCreationTimestamp="2025-12-02 23:51:49 +0000 UTC" firstStartedPulling="2025-12-02 23:51:51.085201192 +0000 UTC m=+4173.965881193" lastFinishedPulling="2025-12-02 23:51:55.750460222 +0000 UTC m=+4178.631140223" observedRunningTime="2025-12-02 23:51:56.183761892 +0000 UTC m=+4179.064441893" watchObservedRunningTime="2025-12-02 23:51:56.187662612 +0000 UTC m=+4179.068342623" 
Dec 02 23:52:00 crc kubenswrapper[4696]: I1202 23:52:00.298419 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-gx2t2" Dec 02 23:52:00 crc kubenswrapper[4696]: I1202 23:52:00.300069 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-gx2t2" Dec 02 23:52:01 crc kubenswrapper[4696]: I1202 23:52:01.728116 4696 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-gx2t2" podUID="fdb65518-2323-4cf8-bd4f-739cd5e4ab82" containerName="registry-server" probeResult="failure" output=< Dec 02 23:52:01 crc kubenswrapper[4696]: timeout: failed to connect service ":50051" within 1s Dec 02 23:52:01 crc kubenswrapper[4696]: > Dec 02 23:52:09 crc kubenswrapper[4696]: I1202 23:52:09.180967 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-m6pb8"] Dec 02 23:52:09 crc kubenswrapper[4696]: I1202 23:52:09.187118 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m6pb8" Dec 02 23:52:09 crc kubenswrapper[4696]: I1202 23:52:09.218179 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-m6pb8"] Dec 02 23:52:09 crc kubenswrapper[4696]: I1202 23:52:09.299629 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwhz2\" (UniqueName: \"kubernetes.io/projected/48685ba0-42f7-4a23-bbe4-0c9ea8a48e9b-kube-api-access-vwhz2\") pod \"redhat-marketplace-m6pb8\" (UID: \"48685ba0-42f7-4a23-bbe4-0c9ea8a48e9b\") " pod="openshift-marketplace/redhat-marketplace-m6pb8" Dec 02 23:52:09 crc kubenswrapper[4696]: I1202 23:52:09.299913 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48685ba0-42f7-4a23-bbe4-0c9ea8a48e9b-utilities\") pod \"redhat-marketplace-m6pb8\" (UID: \"48685ba0-42f7-4a23-bbe4-0c9ea8a48e9b\") " pod="openshift-marketplace/redhat-marketplace-m6pb8" Dec 02 23:52:09 crc kubenswrapper[4696]: I1202 23:52:09.300430 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48685ba0-42f7-4a23-bbe4-0c9ea8a48e9b-catalog-content\") pod \"redhat-marketplace-m6pb8\" (UID: \"48685ba0-42f7-4a23-bbe4-0c9ea8a48e9b\") " pod="openshift-marketplace/redhat-marketplace-m6pb8" Dec 02 23:52:09 crc kubenswrapper[4696]: I1202 23:52:09.402980 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48685ba0-42f7-4a23-bbe4-0c9ea8a48e9b-catalog-content\") pod \"redhat-marketplace-m6pb8\" (UID: \"48685ba0-42f7-4a23-bbe4-0c9ea8a48e9b\") " pod="openshift-marketplace/redhat-marketplace-m6pb8" Dec 02 23:52:09 crc kubenswrapper[4696]: I1202 23:52:09.403129 4696 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-vwhz2\" (UniqueName: \"kubernetes.io/projected/48685ba0-42f7-4a23-bbe4-0c9ea8a48e9b-kube-api-access-vwhz2\") pod \"redhat-marketplace-m6pb8\" (UID: \"48685ba0-42f7-4a23-bbe4-0c9ea8a48e9b\") " pod="openshift-marketplace/redhat-marketplace-m6pb8" Dec 02 23:52:09 crc kubenswrapper[4696]: I1202 23:52:09.403177 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48685ba0-42f7-4a23-bbe4-0c9ea8a48e9b-utilities\") pod \"redhat-marketplace-m6pb8\" (UID: \"48685ba0-42f7-4a23-bbe4-0c9ea8a48e9b\") " pod="openshift-marketplace/redhat-marketplace-m6pb8" Dec 02 23:52:09 crc kubenswrapper[4696]: I1202 23:52:09.403821 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48685ba0-42f7-4a23-bbe4-0c9ea8a48e9b-catalog-content\") pod \"redhat-marketplace-m6pb8\" (UID: \"48685ba0-42f7-4a23-bbe4-0c9ea8a48e9b\") " pod="openshift-marketplace/redhat-marketplace-m6pb8" Dec 02 23:52:09 crc kubenswrapper[4696]: I1202 23:52:09.403853 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48685ba0-42f7-4a23-bbe4-0c9ea8a48e9b-utilities\") pod \"redhat-marketplace-m6pb8\" (UID: \"48685ba0-42f7-4a23-bbe4-0c9ea8a48e9b\") " pod="openshift-marketplace/redhat-marketplace-m6pb8" Dec 02 23:52:09 crc kubenswrapper[4696]: I1202 23:52:09.432509 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwhz2\" (UniqueName: \"kubernetes.io/projected/48685ba0-42f7-4a23-bbe4-0c9ea8a48e9b-kube-api-access-vwhz2\") pod \"redhat-marketplace-m6pb8\" (UID: \"48685ba0-42f7-4a23-bbe4-0c9ea8a48e9b\") " pod="openshift-marketplace/redhat-marketplace-m6pb8" Dec 02 23:52:09 crc kubenswrapper[4696]: I1202 23:52:09.512172 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m6pb8" Dec 02 23:52:10 crc kubenswrapper[4696]: I1202 23:52:10.021061 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-m6pb8"] Dec 02 23:52:10 crc kubenswrapper[4696]: I1202 23:52:10.333399 4696 generic.go:334] "Generic (PLEG): container finished" podID="48685ba0-42f7-4a23-bbe4-0c9ea8a48e9b" containerID="acb45e510df02b6804ac51502d04fd0e4e351164622529ba31c33739c26f5032" exitCode=0 Dec 02 23:52:10 crc kubenswrapper[4696]: I1202 23:52:10.333492 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m6pb8" event={"ID":"48685ba0-42f7-4a23-bbe4-0c9ea8a48e9b","Type":"ContainerDied","Data":"acb45e510df02b6804ac51502d04fd0e4e351164622529ba31c33739c26f5032"} Dec 02 23:52:10 crc kubenswrapper[4696]: I1202 23:52:10.333859 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m6pb8" event={"ID":"48685ba0-42f7-4a23-bbe4-0c9ea8a48e9b","Type":"ContainerStarted","Data":"9eb40e2bcdc205229cf93d0bed5ac92a83715390fcbb4ca2fdced1e0aab44db9"} Dec 02 23:52:10 crc kubenswrapper[4696]: I1202 23:52:10.363006 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-gx2t2" Dec 02 23:52:10 crc kubenswrapper[4696]: I1202 23:52:10.443017 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-gx2t2" Dec 02 23:52:12 crc kubenswrapper[4696]: I1202 23:52:12.743301 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gx2t2"] Dec 02 23:52:12 crc kubenswrapper[4696]: I1202 23:52:12.744481 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-gx2t2" podUID="fdb65518-2323-4cf8-bd4f-739cd5e4ab82" containerName="registry-server" 
containerID="cri-o://169a7464e33f669c72b1627d104e87ba6f17fae9357b2f439922ccd5257c44cb" gracePeriod=2 Dec 02 23:52:13 crc kubenswrapper[4696]: I1202 23:52:13.376468 4696 generic.go:334] "Generic (PLEG): container finished" podID="48685ba0-42f7-4a23-bbe4-0c9ea8a48e9b" containerID="ddc0edcfc436b074f092e9a13ca826e8e5d3eda95d8731384e631a56dde5b5b7" exitCode=0 Dec 02 23:52:13 crc kubenswrapper[4696]: I1202 23:52:13.376552 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m6pb8" event={"ID":"48685ba0-42f7-4a23-bbe4-0c9ea8a48e9b","Type":"ContainerDied","Data":"ddc0edcfc436b074f092e9a13ca826e8e5d3eda95d8731384e631a56dde5b5b7"} Dec 02 23:52:14 crc kubenswrapper[4696]: I1202 23:52:14.140136 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gx2t2" Dec 02 23:52:14 crc kubenswrapper[4696]: I1202 23:52:14.323565 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fdb65518-2323-4cf8-bd4f-739cd5e4ab82-utilities\") pod \"fdb65518-2323-4cf8-bd4f-739cd5e4ab82\" (UID: \"fdb65518-2323-4cf8-bd4f-739cd5e4ab82\") " Dec 02 23:52:14 crc kubenswrapper[4696]: I1202 23:52:14.323710 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-twtrd\" (UniqueName: \"kubernetes.io/projected/fdb65518-2323-4cf8-bd4f-739cd5e4ab82-kube-api-access-twtrd\") pod \"fdb65518-2323-4cf8-bd4f-739cd5e4ab82\" (UID: \"fdb65518-2323-4cf8-bd4f-739cd5e4ab82\") " Dec 02 23:52:14 crc kubenswrapper[4696]: I1202 23:52:14.323956 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fdb65518-2323-4cf8-bd4f-739cd5e4ab82-catalog-content\") pod \"fdb65518-2323-4cf8-bd4f-739cd5e4ab82\" (UID: \"fdb65518-2323-4cf8-bd4f-739cd5e4ab82\") " Dec 02 23:52:14 crc kubenswrapper[4696]: I1202 
23:52:14.325029 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fdb65518-2323-4cf8-bd4f-739cd5e4ab82-utilities" (OuterVolumeSpecName: "utilities") pod "fdb65518-2323-4cf8-bd4f-739cd5e4ab82" (UID: "fdb65518-2323-4cf8-bd4f-739cd5e4ab82"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:52:14 crc kubenswrapper[4696]: I1202 23:52:14.391603 4696 generic.go:334] "Generic (PLEG): container finished" podID="fdb65518-2323-4cf8-bd4f-739cd5e4ab82" containerID="169a7464e33f669c72b1627d104e87ba6f17fae9357b2f439922ccd5257c44cb" exitCode=0 Dec 02 23:52:14 crc kubenswrapper[4696]: I1202 23:52:14.391668 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gx2t2" Dec 02 23:52:14 crc kubenswrapper[4696]: I1202 23:52:14.391690 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gx2t2" event={"ID":"fdb65518-2323-4cf8-bd4f-739cd5e4ab82","Type":"ContainerDied","Data":"169a7464e33f669c72b1627d104e87ba6f17fae9357b2f439922ccd5257c44cb"} Dec 02 23:52:14 crc kubenswrapper[4696]: I1202 23:52:14.391807 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gx2t2" event={"ID":"fdb65518-2323-4cf8-bd4f-739cd5e4ab82","Type":"ContainerDied","Data":"57cfbc248bd434b398e10529c58546e5d72b7aa16da0db1d3243c3a35f700eb4"} Dec 02 23:52:14 crc kubenswrapper[4696]: I1202 23:52:14.391835 4696 scope.go:117] "RemoveContainer" containerID="169a7464e33f669c72b1627d104e87ba6f17fae9357b2f439922ccd5257c44cb" Dec 02 23:52:14 crc kubenswrapper[4696]: I1202 23:52:14.398358 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m6pb8" event={"ID":"48685ba0-42f7-4a23-bbe4-0c9ea8a48e9b","Type":"ContainerStarted","Data":"cedcf87263a1f6d7c0dc2c13282630b3150482ef5935a3eb36f9ca95ee7b2bd3"} Dec 02 23:52:14 crc 
kubenswrapper[4696]: I1202 23:52:14.419698 4696 scope.go:117] "RemoveContainer" containerID="936cb6e40611cd1d53ffff00c6462e8abd51e42c81bc2ad0242c45030968291b" Dec 02 23:52:14 crc kubenswrapper[4696]: I1202 23:52:14.423228 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-m6pb8" podStartSLOduration=1.81615818 podStartE2EDuration="5.423179066s" podCreationTimestamp="2025-12-02 23:52:09 +0000 UTC" firstStartedPulling="2025-12-02 23:52:10.337045284 +0000 UTC m=+4193.217725285" lastFinishedPulling="2025-12-02 23:52:13.94406617 +0000 UTC m=+4196.824746171" observedRunningTime="2025-12-02 23:52:14.420168801 +0000 UTC m=+4197.300848802" watchObservedRunningTime="2025-12-02 23:52:14.423179066 +0000 UTC m=+4197.303859067" Dec 02 23:52:14 crc kubenswrapper[4696]: I1202 23:52:14.426961 4696 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fdb65518-2323-4cf8-bd4f-739cd5e4ab82-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 23:52:14 crc kubenswrapper[4696]: I1202 23:52:14.446223 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fdb65518-2323-4cf8-bd4f-739cd5e4ab82-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fdb65518-2323-4cf8-bd4f-739cd5e4ab82" (UID: "fdb65518-2323-4cf8-bd4f-739cd5e4ab82"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:52:14 crc kubenswrapper[4696]: I1202 23:52:14.530237 4696 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fdb65518-2323-4cf8-bd4f-739cd5e4ab82-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 23:52:14 crc kubenswrapper[4696]: I1202 23:52:14.878577 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fdb65518-2323-4cf8-bd4f-739cd5e4ab82-kube-api-access-twtrd" (OuterVolumeSpecName: "kube-api-access-twtrd") pod "fdb65518-2323-4cf8-bd4f-739cd5e4ab82" (UID: "fdb65518-2323-4cf8-bd4f-739cd5e4ab82"). InnerVolumeSpecName "kube-api-access-twtrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:52:14 crc kubenswrapper[4696]: I1202 23:52:14.900215 4696 scope.go:117] "RemoveContainer" containerID="a7f8aa268e2ffc46ef361dd0c86be703763848f440b4d67d3364899f9c181130" Dec 02 23:52:14 crc kubenswrapper[4696]: I1202 23:52:14.938563 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-twtrd\" (UniqueName: \"kubernetes.io/projected/fdb65518-2323-4cf8-bd4f-739cd5e4ab82-kube-api-access-twtrd\") on node \"crc\" DevicePath \"\"" Dec 02 23:52:15 crc kubenswrapper[4696]: I1202 23:52:15.002851 4696 scope.go:117] "RemoveContainer" containerID="169a7464e33f669c72b1627d104e87ba6f17fae9357b2f439922ccd5257c44cb" Dec 02 23:52:15 crc kubenswrapper[4696]: E1202 23:52:15.003585 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"169a7464e33f669c72b1627d104e87ba6f17fae9357b2f439922ccd5257c44cb\": container with ID starting with 169a7464e33f669c72b1627d104e87ba6f17fae9357b2f439922ccd5257c44cb not found: ID does not exist" containerID="169a7464e33f669c72b1627d104e87ba6f17fae9357b2f439922ccd5257c44cb" Dec 02 23:52:15 crc kubenswrapper[4696]: I1202 23:52:15.003640 4696 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"169a7464e33f669c72b1627d104e87ba6f17fae9357b2f439922ccd5257c44cb"} err="failed to get container status \"169a7464e33f669c72b1627d104e87ba6f17fae9357b2f439922ccd5257c44cb\": rpc error: code = NotFound desc = could not find container \"169a7464e33f669c72b1627d104e87ba6f17fae9357b2f439922ccd5257c44cb\": container with ID starting with 169a7464e33f669c72b1627d104e87ba6f17fae9357b2f439922ccd5257c44cb not found: ID does not exist" Dec 02 23:52:15 crc kubenswrapper[4696]: I1202 23:52:15.003673 4696 scope.go:117] "RemoveContainer" containerID="936cb6e40611cd1d53ffff00c6462e8abd51e42c81bc2ad0242c45030968291b" Dec 02 23:52:15 crc kubenswrapper[4696]: E1202 23:52:15.004247 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"936cb6e40611cd1d53ffff00c6462e8abd51e42c81bc2ad0242c45030968291b\": container with ID starting with 936cb6e40611cd1d53ffff00c6462e8abd51e42c81bc2ad0242c45030968291b not found: ID does not exist" containerID="936cb6e40611cd1d53ffff00c6462e8abd51e42c81bc2ad0242c45030968291b" Dec 02 23:52:15 crc kubenswrapper[4696]: I1202 23:52:15.004316 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"936cb6e40611cd1d53ffff00c6462e8abd51e42c81bc2ad0242c45030968291b"} err="failed to get container status \"936cb6e40611cd1d53ffff00c6462e8abd51e42c81bc2ad0242c45030968291b\": rpc error: code = NotFound desc = could not find container \"936cb6e40611cd1d53ffff00c6462e8abd51e42c81bc2ad0242c45030968291b\": container with ID starting with 936cb6e40611cd1d53ffff00c6462e8abd51e42c81bc2ad0242c45030968291b not found: ID does not exist" Dec 02 23:52:15 crc kubenswrapper[4696]: I1202 23:52:15.004359 4696 scope.go:117] "RemoveContainer" containerID="a7f8aa268e2ffc46ef361dd0c86be703763848f440b4d67d3364899f9c181130" Dec 02 23:52:15 crc kubenswrapper[4696]: E1202 23:52:15.005114 4696 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7f8aa268e2ffc46ef361dd0c86be703763848f440b4d67d3364899f9c181130\": container with ID starting with a7f8aa268e2ffc46ef361dd0c86be703763848f440b4d67d3364899f9c181130 not found: ID does not exist" containerID="a7f8aa268e2ffc46ef361dd0c86be703763848f440b4d67d3364899f9c181130" Dec 02 23:52:15 crc kubenswrapper[4696]: I1202 23:52:15.005154 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7f8aa268e2ffc46ef361dd0c86be703763848f440b4d67d3364899f9c181130"} err="failed to get container status \"a7f8aa268e2ffc46ef361dd0c86be703763848f440b4d67d3364899f9c181130\": rpc error: code = NotFound desc = could not find container \"a7f8aa268e2ffc46ef361dd0c86be703763848f440b4d67d3364899f9c181130\": container with ID starting with a7f8aa268e2ffc46ef361dd0c86be703763848f440b4d67d3364899f9c181130 not found: ID does not exist" Dec 02 23:52:15 crc kubenswrapper[4696]: I1202 23:52:15.083016 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gx2t2"] Dec 02 23:52:15 crc kubenswrapper[4696]: I1202 23:52:15.095595 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-gx2t2"] Dec 02 23:52:15 crc kubenswrapper[4696]: I1202 23:52:15.452705 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fdb65518-2323-4cf8-bd4f-739cd5e4ab82" path="/var/lib/kubelet/pods/fdb65518-2323-4cf8-bd4f-739cd5e4ab82/volumes" Dec 02 23:52:19 crc kubenswrapper[4696]: I1202 23:52:19.512351 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-m6pb8" Dec 02 23:52:19 crc kubenswrapper[4696]: I1202 23:52:19.513243 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-m6pb8" Dec 02 23:52:19 crc kubenswrapper[4696]: I1202 
23:52:19.567271 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-m6pb8" Dec 02 23:52:20 crc kubenswrapper[4696]: I1202 23:52:20.537151 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-m6pb8" Dec 02 23:52:21 crc kubenswrapper[4696]: I1202 23:52:21.751004 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-m6pb8"] Dec 02 23:52:22 crc kubenswrapper[4696]: I1202 23:52:22.500385 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-m6pb8" podUID="48685ba0-42f7-4a23-bbe4-0c9ea8a48e9b" containerName="registry-server" containerID="cri-o://cedcf87263a1f6d7c0dc2c13282630b3150482ef5935a3eb36f9ca95ee7b2bd3" gracePeriod=2 Dec 02 23:52:23 crc kubenswrapper[4696]: I1202 23:52:23.022171 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m6pb8" Dec 02 23:52:23 crc kubenswrapper[4696]: I1202 23:52:23.048148 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48685ba0-42f7-4a23-bbe4-0c9ea8a48e9b-catalog-content\") pod \"48685ba0-42f7-4a23-bbe4-0c9ea8a48e9b\" (UID: \"48685ba0-42f7-4a23-bbe4-0c9ea8a48e9b\") " Dec 02 23:52:23 crc kubenswrapper[4696]: I1202 23:52:23.048267 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48685ba0-42f7-4a23-bbe4-0c9ea8a48e9b-utilities\") pod \"48685ba0-42f7-4a23-bbe4-0c9ea8a48e9b\" (UID: \"48685ba0-42f7-4a23-bbe4-0c9ea8a48e9b\") " Dec 02 23:52:23 crc kubenswrapper[4696]: I1202 23:52:23.048371 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vwhz2\" (UniqueName: 
\"kubernetes.io/projected/48685ba0-42f7-4a23-bbe4-0c9ea8a48e9b-kube-api-access-vwhz2\") pod \"48685ba0-42f7-4a23-bbe4-0c9ea8a48e9b\" (UID: \"48685ba0-42f7-4a23-bbe4-0c9ea8a48e9b\") " Dec 02 23:52:23 crc kubenswrapper[4696]: I1202 23:52:23.049162 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48685ba0-42f7-4a23-bbe4-0c9ea8a48e9b-utilities" (OuterVolumeSpecName: "utilities") pod "48685ba0-42f7-4a23-bbe4-0c9ea8a48e9b" (UID: "48685ba0-42f7-4a23-bbe4-0c9ea8a48e9b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:52:23 crc kubenswrapper[4696]: I1202 23:52:23.056683 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48685ba0-42f7-4a23-bbe4-0c9ea8a48e9b-kube-api-access-vwhz2" (OuterVolumeSpecName: "kube-api-access-vwhz2") pod "48685ba0-42f7-4a23-bbe4-0c9ea8a48e9b" (UID: "48685ba0-42f7-4a23-bbe4-0c9ea8a48e9b"). InnerVolumeSpecName "kube-api-access-vwhz2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:52:23 crc kubenswrapper[4696]: I1202 23:52:23.070292 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48685ba0-42f7-4a23-bbe4-0c9ea8a48e9b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "48685ba0-42f7-4a23-bbe4-0c9ea8a48e9b" (UID: "48685ba0-42f7-4a23-bbe4-0c9ea8a48e9b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:52:23 crc kubenswrapper[4696]: I1202 23:52:23.150797 4696 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48685ba0-42f7-4a23-bbe4-0c9ea8a48e9b-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 23:52:23 crc kubenswrapper[4696]: I1202 23:52:23.150841 4696 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48685ba0-42f7-4a23-bbe4-0c9ea8a48e9b-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 23:52:23 crc kubenswrapper[4696]: I1202 23:52:23.150852 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vwhz2\" (UniqueName: \"kubernetes.io/projected/48685ba0-42f7-4a23-bbe4-0c9ea8a48e9b-kube-api-access-vwhz2\") on node \"crc\" DevicePath \"\"" Dec 02 23:52:23 crc kubenswrapper[4696]: I1202 23:52:23.516019 4696 generic.go:334] "Generic (PLEG): container finished" podID="48685ba0-42f7-4a23-bbe4-0c9ea8a48e9b" containerID="cedcf87263a1f6d7c0dc2c13282630b3150482ef5935a3eb36f9ca95ee7b2bd3" exitCode=0 Dec 02 23:52:23 crc kubenswrapper[4696]: I1202 23:52:23.516073 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m6pb8" event={"ID":"48685ba0-42f7-4a23-bbe4-0c9ea8a48e9b","Type":"ContainerDied","Data":"cedcf87263a1f6d7c0dc2c13282630b3150482ef5935a3eb36f9ca95ee7b2bd3"} Dec 02 23:52:23 crc kubenswrapper[4696]: I1202 23:52:23.516447 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m6pb8" event={"ID":"48685ba0-42f7-4a23-bbe4-0c9ea8a48e9b","Type":"ContainerDied","Data":"9eb40e2bcdc205229cf93d0bed5ac92a83715390fcbb4ca2fdced1e0aab44db9"} Dec 02 23:52:23 crc kubenswrapper[4696]: I1202 23:52:23.516474 4696 scope.go:117] "RemoveContainer" containerID="cedcf87263a1f6d7c0dc2c13282630b3150482ef5935a3eb36f9ca95ee7b2bd3" Dec 02 23:52:23 crc kubenswrapper[4696]: I1202 
23:52:23.516133 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m6pb8" Dec 02 23:52:23 crc kubenswrapper[4696]: I1202 23:52:23.554602 4696 scope.go:117] "RemoveContainer" containerID="ddc0edcfc436b074f092e9a13ca826e8e5d3eda95d8731384e631a56dde5b5b7" Dec 02 23:52:23 crc kubenswrapper[4696]: I1202 23:52:23.564147 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-m6pb8"] Dec 02 23:52:23 crc kubenswrapper[4696]: I1202 23:52:23.584229 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-m6pb8"] Dec 02 23:52:23 crc kubenswrapper[4696]: I1202 23:52:23.586705 4696 scope.go:117] "RemoveContainer" containerID="acb45e510df02b6804ac51502d04fd0e4e351164622529ba31c33739c26f5032" Dec 02 23:52:23 crc kubenswrapper[4696]: I1202 23:52:23.663662 4696 scope.go:117] "RemoveContainer" containerID="cedcf87263a1f6d7c0dc2c13282630b3150482ef5935a3eb36f9ca95ee7b2bd3" Dec 02 23:52:23 crc kubenswrapper[4696]: E1202 23:52:23.664266 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cedcf87263a1f6d7c0dc2c13282630b3150482ef5935a3eb36f9ca95ee7b2bd3\": container with ID starting with cedcf87263a1f6d7c0dc2c13282630b3150482ef5935a3eb36f9ca95ee7b2bd3 not found: ID does not exist" containerID="cedcf87263a1f6d7c0dc2c13282630b3150482ef5935a3eb36f9ca95ee7b2bd3" Dec 02 23:52:23 crc kubenswrapper[4696]: I1202 23:52:23.664330 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cedcf87263a1f6d7c0dc2c13282630b3150482ef5935a3eb36f9ca95ee7b2bd3"} err="failed to get container status \"cedcf87263a1f6d7c0dc2c13282630b3150482ef5935a3eb36f9ca95ee7b2bd3\": rpc error: code = NotFound desc = could not find container \"cedcf87263a1f6d7c0dc2c13282630b3150482ef5935a3eb36f9ca95ee7b2bd3\": container with ID starting with 
cedcf87263a1f6d7c0dc2c13282630b3150482ef5935a3eb36f9ca95ee7b2bd3 not found: ID does not exist" Dec 02 23:52:23 crc kubenswrapper[4696]: I1202 23:52:23.664369 4696 scope.go:117] "RemoveContainer" containerID="ddc0edcfc436b074f092e9a13ca826e8e5d3eda95d8731384e631a56dde5b5b7" Dec 02 23:52:23 crc kubenswrapper[4696]: E1202 23:52:23.676150 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ddc0edcfc436b074f092e9a13ca826e8e5d3eda95d8731384e631a56dde5b5b7\": container with ID starting with ddc0edcfc436b074f092e9a13ca826e8e5d3eda95d8731384e631a56dde5b5b7 not found: ID does not exist" containerID="ddc0edcfc436b074f092e9a13ca826e8e5d3eda95d8731384e631a56dde5b5b7" Dec 02 23:52:23 crc kubenswrapper[4696]: I1202 23:52:23.676204 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddc0edcfc436b074f092e9a13ca826e8e5d3eda95d8731384e631a56dde5b5b7"} err="failed to get container status \"ddc0edcfc436b074f092e9a13ca826e8e5d3eda95d8731384e631a56dde5b5b7\": rpc error: code = NotFound desc = could not find container \"ddc0edcfc436b074f092e9a13ca826e8e5d3eda95d8731384e631a56dde5b5b7\": container with ID starting with ddc0edcfc436b074f092e9a13ca826e8e5d3eda95d8731384e631a56dde5b5b7 not found: ID does not exist" Dec 02 23:52:23 crc kubenswrapper[4696]: I1202 23:52:23.676237 4696 scope.go:117] "RemoveContainer" containerID="acb45e510df02b6804ac51502d04fd0e4e351164622529ba31c33739c26f5032" Dec 02 23:52:23 crc kubenswrapper[4696]: E1202 23:52:23.676684 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"acb45e510df02b6804ac51502d04fd0e4e351164622529ba31c33739c26f5032\": container with ID starting with acb45e510df02b6804ac51502d04fd0e4e351164622529ba31c33739c26f5032 not found: ID does not exist" containerID="acb45e510df02b6804ac51502d04fd0e4e351164622529ba31c33739c26f5032" Dec 02 23:52:23 crc 
kubenswrapper[4696]: I1202 23:52:23.676705 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"acb45e510df02b6804ac51502d04fd0e4e351164622529ba31c33739c26f5032"} err="failed to get container status \"acb45e510df02b6804ac51502d04fd0e4e351164622529ba31c33739c26f5032\": rpc error: code = NotFound desc = could not find container \"acb45e510df02b6804ac51502d04fd0e4e351164622529ba31c33739c26f5032\": container with ID starting with acb45e510df02b6804ac51502d04fd0e4e351164622529ba31c33739c26f5032 not found: ID does not exist" Dec 02 23:52:25 crc kubenswrapper[4696]: I1202 23:52:25.455946 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48685ba0-42f7-4a23-bbe4-0c9ea8a48e9b" path="/var/lib/kubelet/pods/48685ba0-42f7-4a23-bbe4-0c9ea8a48e9b/volumes" Dec 02 23:53:52 crc kubenswrapper[4696]: I1202 23:53:52.973949 4696 patch_prober.go:28] interesting pod/machine-config-daemon-chq65 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 23:53:52 crc kubenswrapper[4696]: I1202 23:53:52.974895 4696 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 23:54:22 crc kubenswrapper[4696]: I1202 23:54:22.973996 4696 patch_prober.go:28] interesting pod/machine-config-daemon-chq65 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 23:54:22 crc kubenswrapper[4696]: I1202 23:54:22.974703 4696 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 23:54:52 crc kubenswrapper[4696]: I1202 23:54:52.974724 4696 patch_prober.go:28] interesting pod/machine-config-daemon-chq65 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 23:54:52 crc kubenswrapper[4696]: I1202 23:54:52.975384 4696 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 23:54:52 crc kubenswrapper[4696]: I1202 23:54:52.975445 4696 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-chq65" Dec 02 23:54:52 crc kubenswrapper[4696]: I1202 23:54:52.976447 4696 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4b888d94eacd48a1d144fb73273a6e82d76f2dad033afa343843e553683f6e08"} pod="openshift-machine-config-operator/machine-config-daemon-chq65" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 23:54:52 crc kubenswrapper[4696]: I1202 23:54:52.976509 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" 
containerName="machine-config-daemon" containerID="cri-o://4b888d94eacd48a1d144fb73273a6e82d76f2dad033afa343843e553683f6e08" gracePeriod=600 Dec 02 23:54:53 crc kubenswrapper[4696]: I1202 23:54:53.382559 4696 generic.go:334] "Generic (PLEG): container finished" podID="53353260-c7c9-435c-91eb-3d5a1b441c4a" containerID="4b888d94eacd48a1d144fb73273a6e82d76f2dad033afa343843e553683f6e08" exitCode=0 Dec 02 23:54:53 crc kubenswrapper[4696]: I1202 23:54:53.382601 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-chq65" event={"ID":"53353260-c7c9-435c-91eb-3d5a1b441c4a","Type":"ContainerDied","Data":"4b888d94eacd48a1d144fb73273a6e82d76f2dad033afa343843e553683f6e08"} Dec 02 23:54:53 crc kubenswrapper[4696]: I1202 23:54:53.382730 4696 scope.go:117] "RemoveContainer" containerID="989f11838b2017b8bf102dc5cad0dbd9f0fd8376c5fc08f926460a91e9668011" Dec 02 23:54:54 crc kubenswrapper[4696]: I1202 23:54:54.401907 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-chq65" event={"ID":"53353260-c7c9-435c-91eb-3d5a1b441c4a","Type":"ContainerStarted","Data":"4e9c28618ad46c6b1bc6c5edf0e736f35d65060ade5cc08206e332e18207549b"} Dec 02 23:57:22 crc kubenswrapper[4696]: I1202 23:57:22.973905 4696 patch_prober.go:28] interesting pod/machine-config-daemon-chq65 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 23:57:22 crc kubenswrapper[4696]: I1202 23:57:22.974629 4696 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 
02 23:57:52 crc kubenswrapper[4696]: I1202 23:57:52.973869 4696 patch_prober.go:28] interesting pod/machine-config-daemon-chq65 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 23:57:52 crc kubenswrapper[4696]: I1202 23:57:52.974679 4696 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 23:58:22 crc kubenswrapper[4696]: I1202 23:58:22.974811 4696 patch_prober.go:28] interesting pod/machine-config-daemon-chq65 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 23:58:22 crc kubenswrapper[4696]: I1202 23:58:22.975686 4696 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 23:58:22 crc kubenswrapper[4696]: I1202 23:58:22.975793 4696 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-chq65" Dec 02 23:58:22 crc kubenswrapper[4696]: I1202 23:58:22.977381 4696 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4e9c28618ad46c6b1bc6c5edf0e736f35d65060ade5cc08206e332e18207549b"} 
pod="openshift-machine-config-operator/machine-config-daemon-chq65" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 23:58:22 crc kubenswrapper[4696]: I1202 23:58:22.977516 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" containerName="machine-config-daemon" containerID="cri-o://4e9c28618ad46c6b1bc6c5edf0e736f35d65060ade5cc08206e332e18207549b" gracePeriod=600 Dec 02 23:58:23 crc kubenswrapper[4696]: E1202 23:58:23.120708 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 02 23:58:23 crc kubenswrapper[4696]: I1202 23:58:23.945022 4696 generic.go:334] "Generic (PLEG): container finished" podID="53353260-c7c9-435c-91eb-3d5a1b441c4a" containerID="4e9c28618ad46c6b1bc6c5edf0e736f35d65060ade5cc08206e332e18207549b" exitCode=0 Dec 02 23:58:23 crc kubenswrapper[4696]: I1202 23:58:23.945343 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-chq65" event={"ID":"53353260-c7c9-435c-91eb-3d5a1b441c4a","Type":"ContainerDied","Data":"4e9c28618ad46c6b1bc6c5edf0e736f35d65060ade5cc08206e332e18207549b"} Dec 02 23:58:23 crc kubenswrapper[4696]: I1202 23:58:23.945463 4696 scope.go:117] "RemoveContainer" containerID="4b888d94eacd48a1d144fb73273a6e82d76f2dad033afa343843e553683f6e08" Dec 02 23:58:23 crc kubenswrapper[4696]: I1202 23:58:23.946666 4696 scope.go:117] "RemoveContainer" containerID="4e9c28618ad46c6b1bc6c5edf0e736f35d65060ade5cc08206e332e18207549b" Dec 
02 23:58:23 crc kubenswrapper[4696]: E1202 23:58:23.947129 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 02 23:58:36 crc kubenswrapper[4696]: I1202 23:58:36.432100 4696 scope.go:117] "RemoveContainer" containerID="4e9c28618ad46c6b1bc6c5edf0e736f35d65060ade5cc08206e332e18207549b" Dec 02 23:58:36 crc kubenswrapper[4696]: E1202 23:58:36.433329 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 02 23:58:36 crc kubenswrapper[4696]: I1202 23:58:36.803143 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-mfbqd"] Dec 02 23:58:36 crc kubenswrapper[4696]: E1202 23:58:36.803963 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdb65518-2323-4cf8-bd4f-739cd5e4ab82" containerName="extract-utilities" Dec 02 23:58:36 crc kubenswrapper[4696]: I1202 23:58:36.803999 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdb65518-2323-4cf8-bd4f-739cd5e4ab82" containerName="extract-utilities" Dec 02 23:58:36 crc kubenswrapper[4696]: E1202 23:58:36.804027 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48685ba0-42f7-4a23-bbe4-0c9ea8a48e9b" containerName="extract-content" Dec 02 23:58:36 crc kubenswrapper[4696]: I1202 23:58:36.804037 
4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="48685ba0-42f7-4a23-bbe4-0c9ea8a48e9b" containerName="extract-content" Dec 02 23:58:36 crc kubenswrapper[4696]: E1202 23:58:36.804058 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48685ba0-42f7-4a23-bbe4-0c9ea8a48e9b" containerName="extract-utilities" Dec 02 23:58:36 crc kubenswrapper[4696]: I1202 23:58:36.804069 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="48685ba0-42f7-4a23-bbe4-0c9ea8a48e9b" containerName="extract-utilities" Dec 02 23:58:36 crc kubenswrapper[4696]: E1202 23:58:36.804093 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdb65518-2323-4cf8-bd4f-739cd5e4ab82" containerName="registry-server" Dec 02 23:58:36 crc kubenswrapper[4696]: I1202 23:58:36.804101 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdb65518-2323-4cf8-bd4f-739cd5e4ab82" containerName="registry-server" Dec 02 23:58:36 crc kubenswrapper[4696]: E1202 23:58:36.804123 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdb65518-2323-4cf8-bd4f-739cd5e4ab82" containerName="extract-content" Dec 02 23:58:36 crc kubenswrapper[4696]: I1202 23:58:36.804131 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdb65518-2323-4cf8-bd4f-739cd5e4ab82" containerName="extract-content" Dec 02 23:58:36 crc kubenswrapper[4696]: E1202 23:58:36.804151 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48685ba0-42f7-4a23-bbe4-0c9ea8a48e9b" containerName="registry-server" Dec 02 23:58:36 crc kubenswrapper[4696]: I1202 23:58:36.804161 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="48685ba0-42f7-4a23-bbe4-0c9ea8a48e9b" containerName="registry-server" Dec 02 23:58:36 crc kubenswrapper[4696]: I1202 23:58:36.804454 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="48685ba0-42f7-4a23-bbe4-0c9ea8a48e9b" containerName="registry-server" Dec 02 23:58:36 crc kubenswrapper[4696]: I1202 23:58:36.804475 4696 
memory_manager.go:354] "RemoveStaleState removing state" podUID="fdb65518-2323-4cf8-bd4f-739cd5e4ab82" containerName="registry-server" Dec 02 23:58:36 crc kubenswrapper[4696]: I1202 23:58:36.806889 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mfbqd" Dec 02 23:58:36 crc kubenswrapper[4696]: I1202 23:58:36.817531 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mfbqd"] Dec 02 23:58:36 crc kubenswrapper[4696]: I1202 23:58:36.871880 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81b9f2d4-0e95-4b18-a589-bcad5d884a82-catalog-content\") pod \"certified-operators-mfbqd\" (UID: \"81b9f2d4-0e95-4b18-a589-bcad5d884a82\") " pod="openshift-marketplace/certified-operators-mfbqd" Dec 02 23:58:36 crc kubenswrapper[4696]: I1202 23:58:36.871974 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgbmm\" (UniqueName: \"kubernetes.io/projected/81b9f2d4-0e95-4b18-a589-bcad5d884a82-kube-api-access-tgbmm\") pod \"certified-operators-mfbqd\" (UID: \"81b9f2d4-0e95-4b18-a589-bcad5d884a82\") " pod="openshift-marketplace/certified-operators-mfbqd" Dec 02 23:58:36 crc kubenswrapper[4696]: I1202 23:58:36.872061 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81b9f2d4-0e95-4b18-a589-bcad5d884a82-utilities\") pod \"certified-operators-mfbqd\" (UID: \"81b9f2d4-0e95-4b18-a589-bcad5d884a82\") " pod="openshift-marketplace/certified-operators-mfbqd" Dec 02 23:58:36 crc kubenswrapper[4696]: I1202 23:58:36.973506 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/81b9f2d4-0e95-4b18-a589-bcad5d884a82-catalog-content\") pod \"certified-operators-mfbqd\" (UID: \"81b9f2d4-0e95-4b18-a589-bcad5d884a82\") " pod="openshift-marketplace/certified-operators-mfbqd" Dec 02 23:58:36 crc kubenswrapper[4696]: I1202 23:58:36.973589 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgbmm\" (UniqueName: \"kubernetes.io/projected/81b9f2d4-0e95-4b18-a589-bcad5d884a82-kube-api-access-tgbmm\") pod \"certified-operators-mfbqd\" (UID: \"81b9f2d4-0e95-4b18-a589-bcad5d884a82\") " pod="openshift-marketplace/certified-operators-mfbqd" Dec 02 23:58:36 crc kubenswrapper[4696]: I1202 23:58:36.973642 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81b9f2d4-0e95-4b18-a589-bcad5d884a82-utilities\") pod \"certified-operators-mfbqd\" (UID: \"81b9f2d4-0e95-4b18-a589-bcad5d884a82\") " pod="openshift-marketplace/certified-operators-mfbqd" Dec 02 23:58:36 crc kubenswrapper[4696]: I1202 23:58:36.974268 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81b9f2d4-0e95-4b18-a589-bcad5d884a82-utilities\") pod \"certified-operators-mfbqd\" (UID: \"81b9f2d4-0e95-4b18-a589-bcad5d884a82\") " pod="openshift-marketplace/certified-operators-mfbqd" Dec 02 23:58:36 crc kubenswrapper[4696]: I1202 23:58:36.974508 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81b9f2d4-0e95-4b18-a589-bcad5d884a82-catalog-content\") pod \"certified-operators-mfbqd\" (UID: \"81b9f2d4-0e95-4b18-a589-bcad5d884a82\") " pod="openshift-marketplace/certified-operators-mfbqd" Dec 02 23:58:37 crc kubenswrapper[4696]: I1202 23:58:37.008672 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgbmm\" (UniqueName: 
\"kubernetes.io/projected/81b9f2d4-0e95-4b18-a589-bcad5d884a82-kube-api-access-tgbmm\") pod \"certified-operators-mfbqd\" (UID: \"81b9f2d4-0e95-4b18-a589-bcad5d884a82\") " pod="openshift-marketplace/certified-operators-mfbqd" Dec 02 23:58:37 crc kubenswrapper[4696]: I1202 23:58:37.146329 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mfbqd" Dec 02 23:58:37 crc kubenswrapper[4696]: I1202 23:58:37.749727 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mfbqd"] Dec 02 23:58:39 crc kubenswrapper[4696]: I1202 23:58:39.158001 4696 generic.go:334] "Generic (PLEG): container finished" podID="81b9f2d4-0e95-4b18-a589-bcad5d884a82" containerID="58fa8af06e35de3921ccff0c395c22a70787515c71a34ee8965a06e7ae5b833a" exitCode=0 Dec 02 23:58:39 crc kubenswrapper[4696]: I1202 23:58:39.158098 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mfbqd" event={"ID":"81b9f2d4-0e95-4b18-a589-bcad5d884a82","Type":"ContainerDied","Data":"58fa8af06e35de3921ccff0c395c22a70787515c71a34ee8965a06e7ae5b833a"} Dec 02 23:58:39 crc kubenswrapper[4696]: I1202 23:58:39.159149 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mfbqd" event={"ID":"81b9f2d4-0e95-4b18-a589-bcad5d884a82","Type":"ContainerStarted","Data":"94b856c86a3f61e67d71ea29df4018563328193f96a95e948d6436b09759a656"} Dec 02 23:58:39 crc kubenswrapper[4696]: I1202 23:58:39.160621 4696 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 02 23:58:40 crc kubenswrapper[4696]: I1202 23:58:40.173978 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mfbqd" event={"ID":"81b9f2d4-0e95-4b18-a589-bcad5d884a82","Type":"ContainerStarted","Data":"8af11bfbbdca3f02e3ca94901ee46a76e2fd933f828afd9bcbdf4e3d8facd05e"} Dec 02 23:58:41 
crc kubenswrapper[4696]: I1202 23:58:41.188840 4696 generic.go:334] "Generic (PLEG): container finished" podID="81b9f2d4-0e95-4b18-a589-bcad5d884a82" containerID="8af11bfbbdca3f02e3ca94901ee46a76e2fd933f828afd9bcbdf4e3d8facd05e" exitCode=0 Dec 02 23:58:41 crc kubenswrapper[4696]: I1202 23:58:41.188965 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mfbqd" event={"ID":"81b9f2d4-0e95-4b18-a589-bcad5d884a82","Type":"ContainerDied","Data":"8af11bfbbdca3f02e3ca94901ee46a76e2fd933f828afd9bcbdf4e3d8facd05e"} Dec 02 23:58:42 crc kubenswrapper[4696]: I1202 23:58:42.211897 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mfbqd" event={"ID":"81b9f2d4-0e95-4b18-a589-bcad5d884a82","Type":"ContainerStarted","Data":"2b6553a0295f061f49ddb04762a799566fb3453f611a2a323bbc5dfe2f0ae28d"} Dec 02 23:58:42 crc kubenswrapper[4696]: I1202 23:58:42.242572 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-mfbqd" podStartSLOduration=3.598422868 podStartE2EDuration="6.242550618s" podCreationTimestamp="2025-12-02 23:58:36 +0000 UTC" firstStartedPulling="2025-12-02 23:58:39.160377434 +0000 UTC m=+4582.041057425" lastFinishedPulling="2025-12-02 23:58:41.804505174 +0000 UTC m=+4584.685185175" observedRunningTime="2025-12-02 23:58:42.236908048 +0000 UTC m=+4585.117588059" watchObservedRunningTime="2025-12-02 23:58:42.242550618 +0000 UTC m=+4585.123230619" Dec 02 23:58:47 crc kubenswrapper[4696]: I1202 23:58:47.147045 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-mfbqd" Dec 02 23:58:47 crc kubenswrapper[4696]: I1202 23:58:47.147775 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-mfbqd" Dec 02 23:58:47 crc kubenswrapper[4696]: I1202 23:58:47.231809 4696 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-mfbqd" Dec 02 23:58:47 crc kubenswrapper[4696]: I1202 23:58:47.337481 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-mfbqd" Dec 02 23:58:47 crc kubenswrapper[4696]: I1202 23:58:47.469222 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mfbqd"] Dec 02 23:58:48 crc kubenswrapper[4696]: I1202 23:58:48.432660 4696 scope.go:117] "RemoveContainer" containerID="4e9c28618ad46c6b1bc6c5edf0e736f35d65060ade5cc08206e332e18207549b" Dec 02 23:58:48 crc kubenswrapper[4696]: E1202 23:58:48.433637 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 02 23:58:49 crc kubenswrapper[4696]: I1202 23:58:49.295603 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-mfbqd" podUID="81b9f2d4-0e95-4b18-a589-bcad5d884a82" containerName="registry-server" containerID="cri-o://2b6553a0295f061f49ddb04762a799566fb3453f611a2a323bbc5dfe2f0ae28d" gracePeriod=2 Dec 02 23:58:50 crc kubenswrapper[4696]: I1202 23:58:50.309462 4696 generic.go:334] "Generic (PLEG): container finished" podID="81b9f2d4-0e95-4b18-a589-bcad5d884a82" containerID="2b6553a0295f061f49ddb04762a799566fb3453f611a2a323bbc5dfe2f0ae28d" exitCode=0 Dec 02 23:58:50 crc kubenswrapper[4696]: I1202 23:58:50.309531 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mfbqd" 
event={"ID":"81b9f2d4-0e95-4b18-a589-bcad5d884a82","Type":"ContainerDied","Data":"2b6553a0295f061f49ddb04762a799566fb3453f611a2a323bbc5dfe2f0ae28d"} Dec 02 23:58:51 crc kubenswrapper[4696]: I1202 23:58:51.428273 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mfbqd" Dec 02 23:58:51 crc kubenswrapper[4696]: I1202 23:58:51.537353 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81b9f2d4-0e95-4b18-a589-bcad5d884a82-catalog-content\") pod \"81b9f2d4-0e95-4b18-a589-bcad5d884a82\" (UID: \"81b9f2d4-0e95-4b18-a589-bcad5d884a82\") " Dec 02 23:58:51 crc kubenswrapper[4696]: I1202 23:58:51.537419 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81b9f2d4-0e95-4b18-a589-bcad5d884a82-utilities\") pod \"81b9f2d4-0e95-4b18-a589-bcad5d884a82\" (UID: \"81b9f2d4-0e95-4b18-a589-bcad5d884a82\") " Dec 02 23:58:51 crc kubenswrapper[4696]: I1202 23:58:51.537577 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tgbmm\" (UniqueName: \"kubernetes.io/projected/81b9f2d4-0e95-4b18-a589-bcad5d884a82-kube-api-access-tgbmm\") pod \"81b9f2d4-0e95-4b18-a589-bcad5d884a82\" (UID: \"81b9f2d4-0e95-4b18-a589-bcad5d884a82\") " Dec 02 23:58:51 crc kubenswrapper[4696]: I1202 23:58:51.538576 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81b9f2d4-0e95-4b18-a589-bcad5d884a82-utilities" (OuterVolumeSpecName: "utilities") pod "81b9f2d4-0e95-4b18-a589-bcad5d884a82" (UID: "81b9f2d4-0e95-4b18-a589-bcad5d884a82"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:58:51 crc kubenswrapper[4696]: I1202 23:58:51.552225 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81b9f2d4-0e95-4b18-a589-bcad5d884a82-kube-api-access-tgbmm" (OuterVolumeSpecName: "kube-api-access-tgbmm") pod "81b9f2d4-0e95-4b18-a589-bcad5d884a82" (UID: "81b9f2d4-0e95-4b18-a589-bcad5d884a82"). InnerVolumeSpecName "kube-api-access-tgbmm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:58:51 crc kubenswrapper[4696]: I1202 23:58:51.591367 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81b9f2d4-0e95-4b18-a589-bcad5d884a82-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "81b9f2d4-0e95-4b18-a589-bcad5d884a82" (UID: "81b9f2d4-0e95-4b18-a589-bcad5d884a82"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:58:51 crc kubenswrapper[4696]: I1202 23:58:51.641046 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tgbmm\" (UniqueName: \"kubernetes.io/projected/81b9f2d4-0e95-4b18-a589-bcad5d884a82-kube-api-access-tgbmm\") on node \"crc\" DevicePath \"\"" Dec 02 23:58:51 crc kubenswrapper[4696]: I1202 23:58:51.641096 4696 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81b9f2d4-0e95-4b18-a589-bcad5d884a82-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 23:58:51 crc kubenswrapper[4696]: I1202 23:58:51.641110 4696 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81b9f2d4-0e95-4b18-a589-bcad5d884a82-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 23:58:52 crc kubenswrapper[4696]: I1202 23:58:52.335682 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mfbqd" 
event={"ID":"81b9f2d4-0e95-4b18-a589-bcad5d884a82","Type":"ContainerDied","Data":"94b856c86a3f61e67d71ea29df4018563328193f96a95e948d6436b09759a656"} Dec 02 23:58:52 crc kubenswrapper[4696]: I1202 23:58:52.335797 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mfbqd" Dec 02 23:58:52 crc kubenswrapper[4696]: I1202 23:58:52.336238 4696 scope.go:117] "RemoveContainer" containerID="2b6553a0295f061f49ddb04762a799566fb3453f611a2a323bbc5dfe2f0ae28d" Dec 02 23:58:52 crc kubenswrapper[4696]: I1202 23:58:52.363238 4696 scope.go:117] "RemoveContainer" containerID="8af11bfbbdca3f02e3ca94901ee46a76e2fd933f828afd9bcbdf4e3d8facd05e" Dec 02 23:58:52 crc kubenswrapper[4696]: I1202 23:58:52.372220 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mfbqd"] Dec 02 23:58:52 crc kubenswrapper[4696]: I1202 23:58:52.381690 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-mfbqd"] Dec 02 23:58:53 crc kubenswrapper[4696]: I1202 23:58:53.093191 4696 scope.go:117] "RemoveContainer" containerID="58fa8af06e35de3921ccff0c395c22a70787515c71a34ee8965a06e7ae5b833a" Dec 02 23:58:53 crc kubenswrapper[4696]: I1202 23:58:53.475951 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81b9f2d4-0e95-4b18-a589-bcad5d884a82" path="/var/lib/kubelet/pods/81b9f2d4-0e95-4b18-a589-bcad5d884a82/volumes" Dec 02 23:59:01 crc kubenswrapper[4696]: I1202 23:59:01.432515 4696 scope.go:117] "RemoveContainer" containerID="4e9c28618ad46c6b1bc6c5edf0e736f35d65060ade5cc08206e332e18207549b" Dec 02 23:59:01 crc kubenswrapper[4696]: E1202 23:59:01.434441 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 02 23:59:15 crc kubenswrapper[4696]: I1202 23:59:15.432852 4696 scope.go:117] "RemoveContainer" containerID="4e9c28618ad46c6b1bc6c5edf0e736f35d65060ade5cc08206e332e18207549b" Dec 02 23:59:15 crc kubenswrapper[4696]: E1202 23:59:15.434501 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 02 23:59:26 crc kubenswrapper[4696]: I1202 23:59:26.433553 4696 scope.go:117] "RemoveContainer" containerID="4e9c28618ad46c6b1bc6c5edf0e736f35d65060ade5cc08206e332e18207549b" Dec 02 23:59:26 crc kubenswrapper[4696]: E1202 23:59:26.435145 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 02 23:59:41 crc kubenswrapper[4696]: I1202 23:59:41.432028 4696 scope.go:117] "RemoveContainer" containerID="4e9c28618ad46c6b1bc6c5edf0e736f35d65060ade5cc08206e332e18207549b" Dec 02 23:59:41 crc kubenswrapper[4696]: E1202 23:59:41.433003 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 02 23:59:55 crc kubenswrapper[4696]: I1202 23:59:55.431691 4696 scope.go:117] "RemoveContainer" containerID="4e9c28618ad46c6b1bc6c5edf0e736f35d65060ade5cc08206e332e18207549b" Dec 02 23:59:55 crc kubenswrapper[4696]: E1202 23:59:55.432712 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 03 00:00:00 crc kubenswrapper[4696]: I1203 00:00:00.200371 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412000-dr8v2"] Dec 03 00:00:00 crc kubenswrapper[4696]: E1203 00:00:00.201897 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81b9f2d4-0e95-4b18-a589-bcad5d884a82" containerName="registry-server" Dec 03 00:00:00 crc kubenswrapper[4696]: I1203 00:00:00.201920 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="81b9f2d4-0e95-4b18-a589-bcad5d884a82" containerName="registry-server" Dec 03 00:00:00 crc kubenswrapper[4696]: E1203 00:00:00.201938 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81b9f2d4-0e95-4b18-a589-bcad5d884a82" containerName="extract-utilities" Dec 03 00:00:00 crc kubenswrapper[4696]: I1203 00:00:00.201946 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="81b9f2d4-0e95-4b18-a589-bcad5d884a82" containerName="extract-utilities" Dec 03 00:00:00 crc kubenswrapper[4696]: E1203 00:00:00.201966 4696 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81b9f2d4-0e95-4b18-a589-bcad5d884a82" containerName="extract-content" Dec 03 00:00:00 crc kubenswrapper[4696]: I1203 00:00:00.201975 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="81b9f2d4-0e95-4b18-a589-bcad5d884a82" containerName="extract-content" Dec 03 00:00:00 crc kubenswrapper[4696]: I1203 00:00:00.202259 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="81b9f2d4-0e95-4b18-a589-bcad5d884a82" containerName="registry-server" Dec 03 00:00:00 crc kubenswrapper[4696]: I1203 00:00:00.203337 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412000-dr8v2" Dec 03 00:00:00 crc kubenswrapper[4696]: I1203 00:00:00.207388 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 03 00:00:00 crc kubenswrapper[4696]: I1203 00:00:00.207815 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 03 00:00:00 crc kubenswrapper[4696]: I1203 00:00:00.244566 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-purge-29412000-ws52s"] Dec 03 00:00:00 crc kubenswrapper[4696]: I1203 00:00:00.246183 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-purge-29412000-ws52s" Dec 03 00:00:00 crc kubenswrapper[4696]: I1203 00:00:00.249538 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Dec 03 00:00:00 crc kubenswrapper[4696]: I1203 00:00:00.260447 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-purge-29412000-fd8ck"] Dec 03 00:00:00 crc kubenswrapper[4696]: I1203 00:00:00.262124 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-purge-29412000-fd8ck" Dec 03 00:00:00 crc kubenswrapper[4696]: I1203 00:00:00.267413 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Dec 03 00:00:00 crc kubenswrapper[4696]: I1203 00:00:00.277003 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/388060e7-4b9e-4a38-a24c-528bdb1771a0-secret-volume\") pod \"collect-profiles-29412000-dr8v2\" (UID: \"388060e7-4b9e-4a38-a24c-528bdb1771a0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412000-dr8v2" Dec 03 00:00:00 crc kubenswrapper[4696]: I1203 00:00:00.277072 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f44b5e9-136f-4cba-9f87-6bb1d73fb496-config-data\") pod \"nova-cell0-db-purge-29412000-ws52s\" (UID: \"8f44b5e9-136f-4cba-9f87-6bb1d73fb496\") " pod="openstack/nova-cell0-db-purge-29412000-ws52s" Dec 03 00:00:00 crc kubenswrapper[4696]: I1203 00:00:00.277110 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jr59\" (UniqueName: \"kubernetes.io/projected/8f44b5e9-136f-4cba-9f87-6bb1d73fb496-kube-api-access-5jr59\") pod \"nova-cell0-db-purge-29412000-ws52s\" (UID: \"8f44b5e9-136f-4cba-9f87-6bb1d73fb496\") " pod="openstack/nova-cell0-db-purge-29412000-ws52s" Dec 03 00:00:00 crc kubenswrapper[4696]: I1203 00:00:00.277205 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/388060e7-4b9e-4a38-a24c-528bdb1771a0-config-volume\") pod \"collect-profiles-29412000-dr8v2\" (UID: \"388060e7-4b9e-4a38-a24c-528bdb1771a0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412000-dr8v2" Dec 03 00:00:00 crc 
kubenswrapper[4696]: I1203 00:00:00.277229 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f44b5e9-136f-4cba-9f87-6bb1d73fb496-scripts\") pod \"nova-cell0-db-purge-29412000-ws52s\" (UID: \"8f44b5e9-136f-4cba-9f87-6bb1d73fb496\") " pod="openstack/nova-cell0-db-purge-29412000-ws52s" Dec 03 00:00:00 crc kubenswrapper[4696]: I1203 00:00:00.277282 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f44b5e9-136f-4cba-9f87-6bb1d73fb496-combined-ca-bundle\") pod \"nova-cell0-db-purge-29412000-ws52s\" (UID: \"8f44b5e9-136f-4cba-9f87-6bb1d73fb496\") " pod="openstack/nova-cell0-db-purge-29412000-ws52s" Dec 03 00:00:00 crc kubenswrapper[4696]: I1203 00:00:00.277336 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85kg2\" (UniqueName: \"kubernetes.io/projected/388060e7-4b9e-4a38-a24c-528bdb1771a0-kube-api-access-85kg2\") pod \"collect-profiles-29412000-dr8v2\" (UID: \"388060e7-4b9e-4a38-a24c-528bdb1771a0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412000-dr8v2" Dec 03 00:00:00 crc kubenswrapper[4696]: I1203 00:00:00.283969 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-pruner-29412000-sbsc6"] Dec 03 00:00:00 crc kubenswrapper[4696]: I1203 00:00:00.292539 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-pruner-29412000-sbsc6" Dec 03 00:00:00 crc kubenswrapper[4696]: I1203 00:00:00.296884 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"serviceca" Dec 03 00:00:00 crc kubenswrapper[4696]: I1203 00:00:00.297309 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"pruner-dockercfg-p7bcw" Dec 03 00:00:00 crc kubenswrapper[4696]: I1203 00:00:00.307511 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-purge-29412000-ws52s"] Dec 03 00:00:00 crc kubenswrapper[4696]: I1203 00:00:00.319488 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412000-dr8v2"] Dec 03 00:00:00 crc kubenswrapper[4696]: I1203 00:00:00.330035 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-pruner-29412000-sbsc6"] Dec 03 00:00:00 crc kubenswrapper[4696]: I1203 00:00:00.341986 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-purge-29412000-fd8ck"] Dec 03 00:00:00 crc kubenswrapper[4696]: I1203 00:00:00.379596 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85kg2\" (UniqueName: \"kubernetes.io/projected/388060e7-4b9e-4a38-a24c-528bdb1771a0-kube-api-access-85kg2\") pod \"collect-profiles-29412000-dr8v2\" (UID: \"388060e7-4b9e-4a38-a24c-528bdb1771a0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412000-dr8v2" Dec 03 00:00:00 crc kubenswrapper[4696]: I1203 00:00:00.379684 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/388060e7-4b9e-4a38-a24c-528bdb1771a0-secret-volume\") pod \"collect-profiles-29412000-dr8v2\" (UID: \"388060e7-4b9e-4a38-a24c-528bdb1771a0\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29412000-dr8v2" Dec 03 00:00:00 crc kubenswrapper[4696]: I1203 00:00:00.379727 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c48nf\" (UniqueName: \"kubernetes.io/projected/0a2082a5-4293-40c5-ad8d-bb7a4bc43626-kube-api-access-c48nf\") pod \"nova-cell1-db-purge-29412000-fd8ck\" (UID: \"0a2082a5-4293-40c5-ad8d-bb7a4bc43626\") " pod="openstack/nova-cell1-db-purge-29412000-fd8ck" Dec 03 00:00:00 crc kubenswrapper[4696]: I1203 00:00:00.379776 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f44b5e9-136f-4cba-9f87-6bb1d73fb496-config-data\") pod \"nova-cell0-db-purge-29412000-ws52s\" (UID: \"8f44b5e9-136f-4cba-9f87-6bb1d73fb496\") " pod="openstack/nova-cell0-db-purge-29412000-ws52s" Dec 03 00:00:00 crc kubenswrapper[4696]: I1203 00:00:00.379800 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a2082a5-4293-40c5-ad8d-bb7a4bc43626-combined-ca-bundle\") pod \"nova-cell1-db-purge-29412000-fd8ck\" (UID: \"0a2082a5-4293-40c5-ad8d-bb7a4bc43626\") " pod="openstack/nova-cell1-db-purge-29412000-fd8ck" Dec 03 00:00:00 crc kubenswrapper[4696]: I1203 00:00:00.379828 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jr59\" (UniqueName: \"kubernetes.io/projected/8f44b5e9-136f-4cba-9f87-6bb1d73fb496-kube-api-access-5jr59\") pod \"nova-cell0-db-purge-29412000-ws52s\" (UID: \"8f44b5e9-136f-4cba-9f87-6bb1d73fb496\") " pod="openstack/nova-cell0-db-purge-29412000-ws52s" Dec 03 00:00:00 crc kubenswrapper[4696]: I1203 00:00:00.379888 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48dcf\" (UniqueName: 
\"kubernetes.io/projected/f86a6141-746a-49f8-b35a-5fa4e334cf8e-kube-api-access-48dcf\") pod \"image-pruner-29412000-sbsc6\" (UID: \"f86a6141-746a-49f8-b35a-5fa4e334cf8e\") " pod="openshift-image-registry/image-pruner-29412000-sbsc6" Dec 03 00:00:00 crc kubenswrapper[4696]: I1203 00:00:00.379916 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/f86a6141-746a-49f8-b35a-5fa4e334cf8e-serviceca\") pod \"image-pruner-29412000-sbsc6\" (UID: \"f86a6141-746a-49f8-b35a-5fa4e334cf8e\") " pod="openshift-image-registry/image-pruner-29412000-sbsc6" Dec 03 00:00:00 crc kubenswrapper[4696]: I1203 00:00:00.379969 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/388060e7-4b9e-4a38-a24c-528bdb1771a0-config-volume\") pod \"collect-profiles-29412000-dr8v2\" (UID: \"388060e7-4b9e-4a38-a24c-528bdb1771a0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412000-dr8v2" Dec 03 00:00:00 crc kubenswrapper[4696]: I1203 00:00:00.379996 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a2082a5-4293-40c5-ad8d-bb7a4bc43626-config-data\") pod \"nova-cell1-db-purge-29412000-fd8ck\" (UID: \"0a2082a5-4293-40c5-ad8d-bb7a4bc43626\") " pod="openstack/nova-cell1-db-purge-29412000-fd8ck" Dec 03 00:00:00 crc kubenswrapper[4696]: I1203 00:00:00.380024 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f44b5e9-136f-4cba-9f87-6bb1d73fb496-scripts\") pod \"nova-cell0-db-purge-29412000-ws52s\" (UID: \"8f44b5e9-136f-4cba-9f87-6bb1d73fb496\") " pod="openstack/nova-cell0-db-purge-29412000-ws52s" Dec 03 00:00:00 crc kubenswrapper[4696]: I1203 00:00:00.380086 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f44b5e9-136f-4cba-9f87-6bb1d73fb496-combined-ca-bundle\") pod \"nova-cell0-db-purge-29412000-ws52s\" (UID: \"8f44b5e9-136f-4cba-9f87-6bb1d73fb496\") " pod="openstack/nova-cell0-db-purge-29412000-ws52s" Dec 03 00:00:00 crc kubenswrapper[4696]: I1203 00:00:00.380121 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a2082a5-4293-40c5-ad8d-bb7a4bc43626-scripts\") pod \"nova-cell1-db-purge-29412000-fd8ck\" (UID: \"0a2082a5-4293-40c5-ad8d-bb7a4bc43626\") " pod="openstack/nova-cell1-db-purge-29412000-fd8ck" Dec 03 00:00:00 crc kubenswrapper[4696]: I1203 00:00:00.381587 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/388060e7-4b9e-4a38-a24c-528bdb1771a0-config-volume\") pod \"collect-profiles-29412000-dr8v2\" (UID: \"388060e7-4b9e-4a38-a24c-528bdb1771a0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412000-dr8v2" Dec 03 00:00:00 crc kubenswrapper[4696]: I1203 00:00:00.387996 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f44b5e9-136f-4cba-9f87-6bb1d73fb496-scripts\") pod \"nova-cell0-db-purge-29412000-ws52s\" (UID: \"8f44b5e9-136f-4cba-9f87-6bb1d73fb496\") " pod="openstack/nova-cell0-db-purge-29412000-ws52s" Dec 03 00:00:00 crc kubenswrapper[4696]: I1203 00:00:00.388110 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f44b5e9-136f-4cba-9f87-6bb1d73fb496-combined-ca-bundle\") pod \"nova-cell0-db-purge-29412000-ws52s\" (UID: \"8f44b5e9-136f-4cba-9f87-6bb1d73fb496\") " pod="openstack/nova-cell0-db-purge-29412000-ws52s" Dec 03 00:00:00 crc kubenswrapper[4696]: I1203 00:00:00.388846 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/8f44b5e9-136f-4cba-9f87-6bb1d73fb496-config-data\") pod \"nova-cell0-db-purge-29412000-ws52s\" (UID: \"8f44b5e9-136f-4cba-9f87-6bb1d73fb496\") " pod="openstack/nova-cell0-db-purge-29412000-ws52s" Dec 03 00:00:00 crc kubenswrapper[4696]: I1203 00:00:00.389367 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/388060e7-4b9e-4a38-a24c-528bdb1771a0-secret-volume\") pod \"collect-profiles-29412000-dr8v2\" (UID: \"388060e7-4b9e-4a38-a24c-528bdb1771a0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412000-dr8v2" Dec 03 00:00:00 crc kubenswrapper[4696]: I1203 00:00:00.407380 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85kg2\" (UniqueName: \"kubernetes.io/projected/388060e7-4b9e-4a38-a24c-528bdb1771a0-kube-api-access-85kg2\") pod \"collect-profiles-29412000-dr8v2\" (UID: \"388060e7-4b9e-4a38-a24c-528bdb1771a0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412000-dr8v2" Dec 03 00:00:00 crc kubenswrapper[4696]: I1203 00:00:00.408606 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jr59\" (UniqueName: \"kubernetes.io/projected/8f44b5e9-136f-4cba-9f87-6bb1d73fb496-kube-api-access-5jr59\") pod \"nova-cell0-db-purge-29412000-ws52s\" (UID: \"8f44b5e9-136f-4cba-9f87-6bb1d73fb496\") " pod="openstack/nova-cell0-db-purge-29412000-ws52s" Dec 03 00:00:00 crc kubenswrapper[4696]: I1203 00:00:00.483310 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c48nf\" (UniqueName: \"kubernetes.io/projected/0a2082a5-4293-40c5-ad8d-bb7a4bc43626-kube-api-access-c48nf\") pod \"nova-cell1-db-purge-29412000-fd8ck\" (UID: \"0a2082a5-4293-40c5-ad8d-bb7a4bc43626\") " pod="openstack/nova-cell1-db-purge-29412000-fd8ck" Dec 03 00:00:00 crc kubenswrapper[4696]: I1203 00:00:00.483364 4696 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a2082a5-4293-40c5-ad8d-bb7a4bc43626-combined-ca-bundle\") pod \"nova-cell1-db-purge-29412000-fd8ck\" (UID: \"0a2082a5-4293-40c5-ad8d-bb7a4bc43626\") " pod="openstack/nova-cell1-db-purge-29412000-fd8ck" Dec 03 00:00:00 crc kubenswrapper[4696]: I1203 00:00:00.483436 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48dcf\" (UniqueName: \"kubernetes.io/projected/f86a6141-746a-49f8-b35a-5fa4e334cf8e-kube-api-access-48dcf\") pod \"image-pruner-29412000-sbsc6\" (UID: \"f86a6141-746a-49f8-b35a-5fa4e334cf8e\") " pod="openshift-image-registry/image-pruner-29412000-sbsc6" Dec 03 00:00:00 crc kubenswrapper[4696]: I1203 00:00:00.483464 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/f86a6141-746a-49f8-b35a-5fa4e334cf8e-serviceca\") pod \"image-pruner-29412000-sbsc6\" (UID: \"f86a6141-746a-49f8-b35a-5fa4e334cf8e\") " pod="openshift-image-registry/image-pruner-29412000-sbsc6" Dec 03 00:00:00 crc kubenswrapper[4696]: I1203 00:00:00.483521 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a2082a5-4293-40c5-ad8d-bb7a4bc43626-config-data\") pod \"nova-cell1-db-purge-29412000-fd8ck\" (UID: \"0a2082a5-4293-40c5-ad8d-bb7a4bc43626\") " pod="openstack/nova-cell1-db-purge-29412000-fd8ck" Dec 03 00:00:00 crc kubenswrapper[4696]: I1203 00:00:00.483624 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a2082a5-4293-40c5-ad8d-bb7a4bc43626-scripts\") pod \"nova-cell1-db-purge-29412000-fd8ck\" (UID: \"0a2082a5-4293-40c5-ad8d-bb7a4bc43626\") " pod="openstack/nova-cell1-db-purge-29412000-fd8ck" Dec 03 00:00:00 crc kubenswrapper[4696]: I1203 00:00:00.486847 4696 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/f86a6141-746a-49f8-b35a-5fa4e334cf8e-serviceca\") pod \"image-pruner-29412000-sbsc6\" (UID: \"f86a6141-746a-49f8-b35a-5fa4e334cf8e\") " pod="openshift-image-registry/image-pruner-29412000-sbsc6" Dec 03 00:00:00 crc kubenswrapper[4696]: I1203 00:00:00.490103 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a2082a5-4293-40c5-ad8d-bb7a4bc43626-scripts\") pod \"nova-cell1-db-purge-29412000-fd8ck\" (UID: \"0a2082a5-4293-40c5-ad8d-bb7a4bc43626\") " pod="openstack/nova-cell1-db-purge-29412000-fd8ck" Dec 03 00:00:00 crc kubenswrapper[4696]: I1203 00:00:00.490944 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a2082a5-4293-40c5-ad8d-bb7a4bc43626-combined-ca-bundle\") pod \"nova-cell1-db-purge-29412000-fd8ck\" (UID: \"0a2082a5-4293-40c5-ad8d-bb7a4bc43626\") " pod="openstack/nova-cell1-db-purge-29412000-fd8ck" Dec 03 00:00:00 crc kubenswrapper[4696]: I1203 00:00:00.491346 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a2082a5-4293-40c5-ad8d-bb7a4bc43626-config-data\") pod \"nova-cell1-db-purge-29412000-fd8ck\" (UID: \"0a2082a5-4293-40c5-ad8d-bb7a4bc43626\") " pod="openstack/nova-cell1-db-purge-29412000-fd8ck" Dec 03 00:00:00 crc kubenswrapper[4696]: I1203 00:00:00.504630 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c48nf\" (UniqueName: \"kubernetes.io/projected/0a2082a5-4293-40c5-ad8d-bb7a4bc43626-kube-api-access-c48nf\") pod \"nova-cell1-db-purge-29412000-fd8ck\" (UID: \"0a2082a5-4293-40c5-ad8d-bb7a4bc43626\") " pod="openstack/nova-cell1-db-purge-29412000-fd8ck" Dec 03 00:00:00 crc kubenswrapper[4696]: I1203 00:00:00.511180 4696 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-48dcf\" (UniqueName: \"kubernetes.io/projected/f86a6141-746a-49f8-b35a-5fa4e334cf8e-kube-api-access-48dcf\") pod \"image-pruner-29412000-sbsc6\" (UID: \"f86a6141-746a-49f8-b35a-5fa4e334cf8e\") " pod="openshift-image-registry/image-pruner-29412000-sbsc6" Dec 03 00:00:00 crc kubenswrapper[4696]: I1203 00:00:00.587225 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412000-dr8v2" Dec 03 00:00:00 crc kubenswrapper[4696]: I1203 00:00:00.614183 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-purge-29412000-ws52s" Dec 03 00:00:00 crc kubenswrapper[4696]: I1203 00:00:00.630572 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-purge-29412000-fd8ck" Dec 03 00:00:00 crc kubenswrapper[4696]: I1203 00:00:00.645290 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-pruner-29412000-sbsc6" Dec 03 00:00:01 crc kubenswrapper[4696]: I1203 00:00:01.181992 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412000-dr8v2"] Dec 03 00:00:01 crc kubenswrapper[4696]: I1203 00:00:01.336640 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-purge-29412000-ws52s"] Dec 03 00:00:01 crc kubenswrapper[4696]: I1203 00:00:01.481611 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-pruner-29412000-sbsc6"] Dec 03 00:00:01 crc kubenswrapper[4696]: I1203 00:00:01.491914 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-purge-29412000-fd8ck"] Dec 03 00:00:01 crc kubenswrapper[4696]: W1203 00:00:01.495502 4696 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf86a6141_746a_49f8_b35a_5fa4e334cf8e.slice/crio-98f1206ef720c14ca092552bf462e46aac92b2a008885a5cf45667aa7c191fee WatchSource:0}: Error finding container 98f1206ef720c14ca092552bf462e46aac92b2a008885a5cf45667aa7c191fee: Status 404 returned error can't find the container with id 98f1206ef720c14ca092552bf462e46aac92b2a008885a5cf45667aa7c191fee Dec 03 00:00:02 crc kubenswrapper[4696]: I1203 00:00:02.237188 4696 generic.go:334] "Generic (PLEG): container finished" podID="388060e7-4b9e-4a38-a24c-528bdb1771a0" containerID="e1a97a37ea8f50f129d68cb9597d4ffc3cf4c99ede4ee1026a1122bee3cc319f" exitCode=0 Dec 03 00:00:02 crc kubenswrapper[4696]: I1203 00:00:02.237369 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412000-dr8v2" event={"ID":"388060e7-4b9e-4a38-a24c-528bdb1771a0","Type":"ContainerDied","Data":"e1a97a37ea8f50f129d68cb9597d4ffc3cf4c99ede4ee1026a1122bee3cc319f"} Dec 03 00:00:02 crc kubenswrapper[4696]: I1203 00:00:02.237990 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412000-dr8v2" event={"ID":"388060e7-4b9e-4a38-a24c-528bdb1771a0","Type":"ContainerStarted","Data":"5fa94adeb1820c7cfc318f0a2f803a46f8203c17eba97a3eb17c84c527b7cd47"} Dec 03 00:00:02 crc kubenswrapper[4696]: I1203 00:00:02.242633 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-purge-29412000-ws52s" event={"ID":"8f44b5e9-136f-4cba-9f87-6bb1d73fb496","Type":"ContainerStarted","Data":"582a7925f2d9d04f54c2fa73cdd4ab728a8798b64e1df163b5a082744178b92c"} Dec 03 00:00:02 crc kubenswrapper[4696]: I1203 00:00:02.242662 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-purge-29412000-ws52s" event={"ID":"8f44b5e9-136f-4cba-9f87-6bb1d73fb496","Type":"ContainerStarted","Data":"e44da26b35664cd1cc560cf36529c3081724381fc20021dba49faa10c8c800e4"} Dec 
03 00:00:02 crc kubenswrapper[4696]: I1203 00:00:02.250328 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29412000-sbsc6" event={"ID":"f86a6141-746a-49f8-b35a-5fa4e334cf8e","Type":"ContainerStarted","Data":"f603b518af04553e19c7bb61528510e1a92d10e27c873f55f758c348a1f1fd11"} Dec 03 00:00:02 crc kubenswrapper[4696]: I1203 00:00:02.250401 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29412000-sbsc6" event={"ID":"f86a6141-746a-49f8-b35a-5fa4e334cf8e","Type":"ContainerStarted","Data":"98f1206ef720c14ca092552bf462e46aac92b2a008885a5cf45667aa7c191fee"} Dec 03 00:00:02 crc kubenswrapper[4696]: I1203 00:00:02.296997 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-db-purge-29412000-ws52s" podStartSLOduration=2.296977441 podStartE2EDuration="2.296977441s" podCreationTimestamp="2025-12-03 00:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 00:00:02.29024164 +0000 UTC m=+4665.170921641" watchObservedRunningTime="2025-12-03 00:00:02.296977441 +0000 UTC m=+4665.177657442" Dec 03 00:00:02 crc kubenswrapper[4696]: I1203 00:00:02.297479 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-purge-29412000-fd8ck" event={"ID":"0a2082a5-4293-40c5-ad8d-bb7a4bc43626","Type":"ContainerStarted","Data":"24d08b115f31141a901255dae9b2f5e4569d676fea037a9f884fbd32ab7b9d82"} Dec 03 00:00:02 crc kubenswrapper[4696]: I1203 00:00:02.297554 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-purge-29412000-fd8ck" event={"ID":"0a2082a5-4293-40c5-ad8d-bb7a4bc43626","Type":"ContainerStarted","Data":"d46c72b25984bee4b2763b79402d6271bf2987d6df6c5113c36e757c8c7d58e7"} Dec 03 00:00:02 crc kubenswrapper[4696]: I1203 00:00:02.340924 4696 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-image-registry/image-pruner-29412000-sbsc6" podStartSLOduration=2.340897364 podStartE2EDuration="2.340897364s" podCreationTimestamp="2025-12-03 00:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 00:00:02.312902591 +0000 UTC m=+4665.193582602" watchObservedRunningTime="2025-12-03 00:00:02.340897364 +0000 UTC m=+4665.221577365" Dec 03 00:00:02 crc kubenswrapper[4696]: I1203 00:00:02.349016 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-purge-29412000-fd8ck" podStartSLOduration=2.348984953 podStartE2EDuration="2.348984953s" podCreationTimestamp="2025-12-03 00:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 00:00:02.336295854 +0000 UTC m=+4665.216975855" watchObservedRunningTime="2025-12-03 00:00:02.348984953 +0000 UTC m=+4665.229664944" Dec 03 00:00:03 crc kubenswrapper[4696]: I1203 00:00:03.310435 4696 generic.go:334] "Generic (PLEG): container finished" podID="f86a6141-746a-49f8-b35a-5fa4e334cf8e" containerID="f603b518af04553e19c7bb61528510e1a92d10e27c873f55f758c348a1f1fd11" exitCode=0 Dec 03 00:00:03 crc kubenswrapper[4696]: I1203 00:00:03.311102 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29412000-sbsc6" event={"ID":"f86a6141-746a-49f8-b35a-5fa4e334cf8e","Type":"ContainerDied","Data":"f603b518af04553e19c7bb61528510e1a92d10e27c873f55f758c348a1f1fd11"} Dec 03 00:00:03 crc kubenswrapper[4696]: I1203 00:00:03.785187 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412000-dr8v2" Dec 03 00:00:03 crc kubenswrapper[4696]: I1203 00:00:03.944581 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/388060e7-4b9e-4a38-a24c-528bdb1771a0-secret-volume\") pod \"388060e7-4b9e-4a38-a24c-528bdb1771a0\" (UID: \"388060e7-4b9e-4a38-a24c-528bdb1771a0\") " Dec 03 00:00:03 crc kubenswrapper[4696]: I1203 00:00:03.944768 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/388060e7-4b9e-4a38-a24c-528bdb1771a0-config-volume\") pod \"388060e7-4b9e-4a38-a24c-528bdb1771a0\" (UID: \"388060e7-4b9e-4a38-a24c-528bdb1771a0\") " Dec 03 00:00:03 crc kubenswrapper[4696]: I1203 00:00:03.944804 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-85kg2\" (UniqueName: \"kubernetes.io/projected/388060e7-4b9e-4a38-a24c-528bdb1771a0-kube-api-access-85kg2\") pod \"388060e7-4b9e-4a38-a24c-528bdb1771a0\" (UID: \"388060e7-4b9e-4a38-a24c-528bdb1771a0\") " Dec 03 00:00:03 crc kubenswrapper[4696]: I1203 00:00:03.946963 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/388060e7-4b9e-4a38-a24c-528bdb1771a0-config-volume" (OuterVolumeSpecName: "config-volume") pod "388060e7-4b9e-4a38-a24c-528bdb1771a0" (UID: "388060e7-4b9e-4a38-a24c-528bdb1771a0"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 00:00:03 crc kubenswrapper[4696]: I1203 00:00:03.952841 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/388060e7-4b9e-4a38-a24c-528bdb1771a0-kube-api-access-85kg2" (OuterVolumeSpecName: "kube-api-access-85kg2") pod "388060e7-4b9e-4a38-a24c-528bdb1771a0" (UID: "388060e7-4b9e-4a38-a24c-528bdb1771a0"). 
InnerVolumeSpecName "kube-api-access-85kg2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 00:00:03 crc kubenswrapper[4696]: I1203 00:00:03.953608 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/388060e7-4b9e-4a38-a24c-528bdb1771a0-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "388060e7-4b9e-4a38-a24c-528bdb1771a0" (UID: "388060e7-4b9e-4a38-a24c-528bdb1771a0"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 00:00:04 crc kubenswrapper[4696]: I1203 00:00:04.047638 4696 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/388060e7-4b9e-4a38-a24c-528bdb1771a0-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 03 00:00:04 crc kubenswrapper[4696]: I1203 00:00:04.048082 4696 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/388060e7-4b9e-4a38-a24c-528bdb1771a0-config-volume\") on node \"crc\" DevicePath \"\"" Dec 03 00:00:04 crc kubenswrapper[4696]: I1203 00:00:04.048096 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-85kg2\" (UniqueName: \"kubernetes.io/projected/388060e7-4b9e-4a38-a24c-528bdb1771a0-kube-api-access-85kg2\") on node \"crc\" DevicePath \"\"" Dec 03 00:00:04 crc kubenswrapper[4696]: I1203 00:00:04.321796 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412000-dr8v2" Dec 03 00:00:04 crc kubenswrapper[4696]: I1203 00:00:04.321791 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412000-dr8v2" event={"ID":"388060e7-4b9e-4a38-a24c-528bdb1771a0","Type":"ContainerDied","Data":"5fa94adeb1820c7cfc318f0a2f803a46f8203c17eba97a3eb17c84c527b7cd47"} Dec 03 00:00:04 crc kubenswrapper[4696]: I1203 00:00:04.321854 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5fa94adeb1820c7cfc318f0a2f803a46f8203c17eba97a3eb17c84c527b7cd47" Dec 03 00:00:04 crc kubenswrapper[4696]: I1203 00:00:04.677693 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-pruner-29412000-sbsc6" Dec 03 00:00:04 crc kubenswrapper[4696]: I1203 00:00:04.765109 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-48dcf\" (UniqueName: \"kubernetes.io/projected/f86a6141-746a-49f8-b35a-5fa4e334cf8e-kube-api-access-48dcf\") pod \"f86a6141-746a-49f8-b35a-5fa4e334cf8e\" (UID: \"f86a6141-746a-49f8-b35a-5fa4e334cf8e\") " Dec 03 00:00:04 crc kubenswrapper[4696]: I1203 00:00:04.765198 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/f86a6141-746a-49f8-b35a-5fa4e334cf8e-serviceca\") pod \"f86a6141-746a-49f8-b35a-5fa4e334cf8e\" (UID: \"f86a6141-746a-49f8-b35a-5fa4e334cf8e\") " Dec 03 00:00:04 crc kubenswrapper[4696]: I1203 00:00:04.766021 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f86a6141-746a-49f8-b35a-5fa4e334cf8e-serviceca" (OuterVolumeSpecName: "serviceca") pod "f86a6141-746a-49f8-b35a-5fa4e334cf8e" (UID: "f86a6141-746a-49f8-b35a-5fa4e334cf8e"). InnerVolumeSpecName "serviceca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 00:00:04 crc kubenswrapper[4696]: I1203 00:00:04.772063 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f86a6141-746a-49f8-b35a-5fa4e334cf8e-kube-api-access-48dcf" (OuterVolumeSpecName: "kube-api-access-48dcf") pod "f86a6141-746a-49f8-b35a-5fa4e334cf8e" (UID: "f86a6141-746a-49f8-b35a-5fa4e334cf8e"). InnerVolumeSpecName "kube-api-access-48dcf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 00:00:04 crc kubenswrapper[4696]: I1203 00:00:04.867783 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-48dcf\" (UniqueName: \"kubernetes.io/projected/f86a6141-746a-49f8-b35a-5fa4e334cf8e-kube-api-access-48dcf\") on node \"crc\" DevicePath \"\"" Dec 03 00:00:04 crc kubenswrapper[4696]: I1203 00:00:04.867820 4696 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/f86a6141-746a-49f8-b35a-5fa4e334cf8e-serviceca\") on node \"crc\" DevicePath \"\"" Dec 03 00:00:04 crc kubenswrapper[4696]: I1203 00:00:04.869018 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411955-pvdn7"] Dec 03 00:00:04 crc kubenswrapper[4696]: I1203 00:00:04.882125 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411955-pvdn7"] Dec 03 00:00:05 crc kubenswrapper[4696]: I1203 00:00:05.335811 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29412000-sbsc6" event={"ID":"f86a6141-746a-49f8-b35a-5fa4e334cf8e","Type":"ContainerDied","Data":"98f1206ef720c14ca092552bf462e46aac92b2a008885a5cf45667aa7c191fee"} Dec 03 00:00:05 crc kubenswrapper[4696]: I1203 00:00:05.336208 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="98f1206ef720c14ca092552bf462e46aac92b2a008885a5cf45667aa7c191fee" Dec 03 
00:00:05 crc kubenswrapper[4696]: I1203 00:00:05.335855 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-pruner-29412000-sbsc6" Dec 03 00:00:05 crc kubenswrapper[4696]: I1203 00:00:05.446727 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70cd65fe-f070-4d40-aa5b-dc5568eca34e" path="/var/lib/kubelet/pods/70cd65fe-f070-4d40-aa5b-dc5568eca34e/volumes" Dec 03 00:00:08 crc kubenswrapper[4696]: I1203 00:00:08.377587 4696 generic.go:334] "Generic (PLEG): container finished" podID="0a2082a5-4293-40c5-ad8d-bb7a4bc43626" containerID="24d08b115f31141a901255dae9b2f5e4569d676fea037a9f884fbd32ab7b9d82" exitCode=0 Dec 03 00:00:08 crc kubenswrapper[4696]: I1203 00:00:08.377636 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-purge-29412000-fd8ck" event={"ID":"0a2082a5-4293-40c5-ad8d-bb7a4bc43626","Type":"ContainerDied","Data":"24d08b115f31141a901255dae9b2f5e4569d676fea037a9f884fbd32ab7b9d82"} Dec 03 00:00:08 crc kubenswrapper[4696]: I1203 00:00:08.383203 4696 generic.go:334] "Generic (PLEG): container finished" podID="8f44b5e9-136f-4cba-9f87-6bb1d73fb496" containerID="582a7925f2d9d04f54c2fa73cdd4ab728a8798b64e1df163b5a082744178b92c" exitCode=0 Dec 03 00:00:08 crc kubenswrapper[4696]: I1203 00:00:08.383254 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-purge-29412000-ws52s" event={"ID":"8f44b5e9-136f-4cba-9f87-6bb1d73fb496","Type":"ContainerDied","Data":"582a7925f2d9d04f54c2fa73cdd4ab728a8798b64e1df163b5a082744178b92c"} Dec 03 00:00:08 crc kubenswrapper[4696]: I1203 00:00:08.431487 4696 scope.go:117] "RemoveContainer" containerID="4e9c28618ad46c6b1bc6c5edf0e736f35d65060ade5cc08206e332e18207549b" Dec 03 00:00:08 crc kubenswrapper[4696]: E1203 00:00:08.432040 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 03 00:00:09 crc kubenswrapper[4696]: I1203 00:00:09.892868 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-purge-29412000-ws52s" Dec 03 00:00:09 crc kubenswrapper[4696]: I1203 00:00:09.897992 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-purge-29412000-fd8ck" Dec 03 00:00:09 crc kubenswrapper[4696]: I1203 00:00:09.985381 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f44b5e9-136f-4cba-9f87-6bb1d73fb496-combined-ca-bundle\") pod \"8f44b5e9-136f-4cba-9f87-6bb1d73fb496\" (UID: \"8f44b5e9-136f-4cba-9f87-6bb1d73fb496\") " Dec 03 00:00:09 crc kubenswrapper[4696]: I1203 00:00:09.985474 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a2082a5-4293-40c5-ad8d-bb7a4bc43626-scripts\") pod \"0a2082a5-4293-40c5-ad8d-bb7a4bc43626\" (UID: \"0a2082a5-4293-40c5-ad8d-bb7a4bc43626\") " Dec 03 00:00:09 crc kubenswrapper[4696]: I1203 00:00:09.985522 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a2082a5-4293-40c5-ad8d-bb7a4bc43626-config-data\") pod \"0a2082a5-4293-40c5-ad8d-bb7a4bc43626\" (UID: \"0a2082a5-4293-40c5-ad8d-bb7a4bc43626\") " Dec 03 00:00:09 crc kubenswrapper[4696]: I1203 00:00:09.985693 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c48nf\" (UniqueName: \"kubernetes.io/projected/0a2082a5-4293-40c5-ad8d-bb7a4bc43626-kube-api-access-c48nf\") pod 
\"0a2082a5-4293-40c5-ad8d-bb7a4bc43626\" (UID: \"0a2082a5-4293-40c5-ad8d-bb7a4bc43626\") " Dec 03 00:00:09 crc kubenswrapper[4696]: I1203 00:00:09.985780 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f44b5e9-136f-4cba-9f87-6bb1d73fb496-scripts\") pod \"8f44b5e9-136f-4cba-9f87-6bb1d73fb496\" (UID: \"8f44b5e9-136f-4cba-9f87-6bb1d73fb496\") " Dec 03 00:00:09 crc kubenswrapper[4696]: I1203 00:00:09.985835 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5jr59\" (UniqueName: \"kubernetes.io/projected/8f44b5e9-136f-4cba-9f87-6bb1d73fb496-kube-api-access-5jr59\") pod \"8f44b5e9-136f-4cba-9f87-6bb1d73fb496\" (UID: \"8f44b5e9-136f-4cba-9f87-6bb1d73fb496\") " Dec 03 00:00:09 crc kubenswrapper[4696]: I1203 00:00:09.985931 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a2082a5-4293-40c5-ad8d-bb7a4bc43626-combined-ca-bundle\") pod \"0a2082a5-4293-40c5-ad8d-bb7a4bc43626\" (UID: \"0a2082a5-4293-40c5-ad8d-bb7a4bc43626\") " Dec 03 00:00:09 crc kubenswrapper[4696]: I1203 00:00:09.985965 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f44b5e9-136f-4cba-9f87-6bb1d73fb496-config-data\") pod \"8f44b5e9-136f-4cba-9f87-6bb1d73fb496\" (UID: \"8f44b5e9-136f-4cba-9f87-6bb1d73fb496\") " Dec 03 00:00:09 crc kubenswrapper[4696]: I1203 00:00:09.992627 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f44b5e9-136f-4cba-9f87-6bb1d73fb496-scripts" (OuterVolumeSpecName: "scripts") pod "8f44b5e9-136f-4cba-9f87-6bb1d73fb496" (UID: "8f44b5e9-136f-4cba-9f87-6bb1d73fb496"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 00:00:10 crc kubenswrapper[4696]: I1203 00:00:10.006561 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a2082a5-4293-40c5-ad8d-bb7a4bc43626-scripts" (OuterVolumeSpecName: "scripts") pod "0a2082a5-4293-40c5-ad8d-bb7a4bc43626" (UID: "0a2082a5-4293-40c5-ad8d-bb7a4bc43626"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 00:00:10 crc kubenswrapper[4696]: I1203 00:00:10.007214 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f44b5e9-136f-4cba-9f87-6bb1d73fb496-kube-api-access-5jr59" (OuterVolumeSpecName: "kube-api-access-5jr59") pod "8f44b5e9-136f-4cba-9f87-6bb1d73fb496" (UID: "8f44b5e9-136f-4cba-9f87-6bb1d73fb496"). InnerVolumeSpecName "kube-api-access-5jr59". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 00:00:10 crc kubenswrapper[4696]: I1203 00:00:10.007344 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a2082a5-4293-40c5-ad8d-bb7a4bc43626-kube-api-access-c48nf" (OuterVolumeSpecName: "kube-api-access-c48nf") pod "0a2082a5-4293-40c5-ad8d-bb7a4bc43626" (UID: "0a2082a5-4293-40c5-ad8d-bb7a4bc43626"). InnerVolumeSpecName "kube-api-access-c48nf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 00:00:10 crc kubenswrapper[4696]: I1203 00:00:10.018300 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f44b5e9-136f-4cba-9f87-6bb1d73fb496-config-data" (OuterVolumeSpecName: "config-data") pod "8f44b5e9-136f-4cba-9f87-6bb1d73fb496" (UID: "8f44b5e9-136f-4cba-9f87-6bb1d73fb496"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 00:00:10 crc kubenswrapper[4696]: I1203 00:00:10.021597 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a2082a5-4293-40c5-ad8d-bb7a4bc43626-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0a2082a5-4293-40c5-ad8d-bb7a4bc43626" (UID: "0a2082a5-4293-40c5-ad8d-bb7a4bc43626"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 00:00:10 crc kubenswrapper[4696]: I1203 00:00:10.031227 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a2082a5-4293-40c5-ad8d-bb7a4bc43626-config-data" (OuterVolumeSpecName: "config-data") pod "0a2082a5-4293-40c5-ad8d-bb7a4bc43626" (UID: "0a2082a5-4293-40c5-ad8d-bb7a4bc43626"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 00:00:10 crc kubenswrapper[4696]: I1203 00:00:10.041531 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f44b5e9-136f-4cba-9f87-6bb1d73fb496-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8f44b5e9-136f-4cba-9f87-6bb1d73fb496" (UID: "8f44b5e9-136f-4cba-9f87-6bb1d73fb496"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 00:00:10 crc kubenswrapper[4696]: I1203 00:00:10.089453 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5jr59\" (UniqueName: \"kubernetes.io/projected/8f44b5e9-136f-4cba-9f87-6bb1d73fb496-kube-api-access-5jr59\") on node \"crc\" DevicePath \"\"" Dec 03 00:00:10 crc kubenswrapper[4696]: I1203 00:00:10.089787 4696 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a2082a5-4293-40c5-ad8d-bb7a4bc43626-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 00:00:10 crc kubenswrapper[4696]: I1203 00:00:10.089799 4696 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f44b5e9-136f-4cba-9f87-6bb1d73fb496-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 00:00:10 crc kubenswrapper[4696]: I1203 00:00:10.089808 4696 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f44b5e9-136f-4cba-9f87-6bb1d73fb496-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 00:00:10 crc kubenswrapper[4696]: I1203 00:00:10.089816 4696 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a2082a5-4293-40c5-ad8d-bb7a4bc43626-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 00:00:10 crc kubenswrapper[4696]: I1203 00:00:10.089826 4696 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a2082a5-4293-40c5-ad8d-bb7a4bc43626-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 00:00:10 crc kubenswrapper[4696]: I1203 00:00:10.089839 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c48nf\" (UniqueName: \"kubernetes.io/projected/0a2082a5-4293-40c5-ad8d-bb7a4bc43626-kube-api-access-c48nf\") on node \"crc\" DevicePath \"\"" Dec 03 00:00:10 crc kubenswrapper[4696]: I1203 00:00:10.089851 4696 
reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f44b5e9-136f-4cba-9f87-6bb1d73fb496-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 00:00:10 crc kubenswrapper[4696]: I1203 00:00:10.406489 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-purge-29412000-ws52s" event={"ID":"8f44b5e9-136f-4cba-9f87-6bb1d73fb496","Type":"ContainerDied","Data":"e44da26b35664cd1cc560cf36529c3081724381fc20021dba49faa10c8c800e4"} Dec 03 00:00:10 crc kubenswrapper[4696]: I1203 00:00:10.406557 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e44da26b35664cd1cc560cf36529c3081724381fc20021dba49faa10c8c800e4" Dec 03 00:00:10 crc kubenswrapper[4696]: I1203 00:00:10.406647 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-purge-29412000-ws52s" Dec 03 00:00:10 crc kubenswrapper[4696]: I1203 00:00:10.409847 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-purge-29412000-fd8ck" Dec 03 00:00:10 crc kubenswrapper[4696]: I1203 00:00:10.409800 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-purge-29412000-fd8ck" event={"ID":"0a2082a5-4293-40c5-ad8d-bb7a4bc43626","Type":"ContainerDied","Data":"d46c72b25984bee4b2763b79402d6271bf2987d6df6c5113c36e757c8c7d58e7"} Dec 03 00:00:10 crc kubenswrapper[4696]: I1203 00:00:10.409962 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d46c72b25984bee4b2763b79402d6271bf2987d6df6c5113c36e757c8c7d58e7" Dec 03 00:00:21 crc kubenswrapper[4696]: I1203 00:00:21.432609 4696 scope.go:117] "RemoveContainer" containerID="4e9c28618ad46c6b1bc6c5edf0e736f35d65060ade5cc08206e332e18207549b" Dec 03 00:00:21 crc kubenswrapper[4696]: E1203 00:00:21.433830 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 03 00:00:36 crc kubenswrapper[4696]: I1203 00:00:36.432325 4696 scope.go:117] "RemoveContainer" containerID="4e9c28618ad46c6b1bc6c5edf0e736f35d65060ade5cc08206e332e18207549b" Dec 03 00:00:36 crc kubenswrapper[4696]: E1203 00:00:36.433349 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 03 00:00:38 crc kubenswrapper[4696]: 
I1203 00:00:38.784492 4696 scope.go:117] "RemoveContainer" containerID="5a6192be7462a74e57388633d4a3a462044a84efd0e4091053ff6e1973bc0fab" Dec 03 00:00:48 crc kubenswrapper[4696]: I1203 00:00:48.432508 4696 scope.go:117] "RemoveContainer" containerID="4e9c28618ad46c6b1bc6c5edf0e736f35d65060ade5cc08206e332e18207549b" Dec 03 00:00:48 crc kubenswrapper[4696]: E1203 00:00:48.433562 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 03 00:01:00 crc kubenswrapper[4696]: I1203 00:01:00.183573 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-purge-29412001-jvcl6"] Dec 03 00:01:00 crc kubenswrapper[4696]: E1203 00:01:00.185100 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a2082a5-4293-40c5-ad8d-bb7a4bc43626" containerName="nova-manage" Dec 03 00:01:00 crc kubenswrapper[4696]: I1203 00:01:00.185139 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a2082a5-4293-40c5-ad8d-bb7a4bc43626" containerName="nova-manage" Dec 03 00:01:00 crc kubenswrapper[4696]: E1203 00:01:00.185200 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f86a6141-746a-49f8-b35a-5fa4e334cf8e" containerName="image-pruner" Dec 03 00:01:00 crc kubenswrapper[4696]: I1203 00:01:00.185211 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="f86a6141-746a-49f8-b35a-5fa4e334cf8e" containerName="image-pruner" Dec 03 00:01:00 crc kubenswrapper[4696]: E1203 00:01:00.185240 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="388060e7-4b9e-4a38-a24c-528bdb1771a0" containerName="collect-profiles" Dec 03 00:01:00 crc kubenswrapper[4696]: 
I1203 00:01:00.185252 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="388060e7-4b9e-4a38-a24c-528bdb1771a0" containerName="collect-profiles" Dec 03 00:01:00 crc kubenswrapper[4696]: E1203 00:01:00.185267 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f44b5e9-136f-4cba-9f87-6bb1d73fb496" containerName="nova-manage" Dec 03 00:01:00 crc kubenswrapper[4696]: I1203 00:01:00.185275 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f44b5e9-136f-4cba-9f87-6bb1d73fb496" containerName="nova-manage" Dec 03 00:01:00 crc kubenswrapper[4696]: I1203 00:01:00.185576 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="f86a6141-746a-49f8-b35a-5fa4e334cf8e" containerName="image-pruner" Dec 03 00:01:00 crc kubenswrapper[4696]: I1203 00:01:00.185595 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f44b5e9-136f-4cba-9f87-6bb1d73fb496" containerName="nova-manage" Dec 03 00:01:00 crc kubenswrapper[4696]: I1203 00:01:00.185607 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="388060e7-4b9e-4a38-a24c-528bdb1771a0" containerName="collect-profiles" Dec 03 00:01:00 crc kubenswrapper[4696]: I1203 00:01:00.185621 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a2082a5-4293-40c5-ad8d-bb7a4bc43626" containerName="nova-manage" Dec 03 00:01:00 crc kubenswrapper[4696]: I1203 00:01:00.186863 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-purge-29412001-jvcl6" Dec 03 00:01:00 crc kubenswrapper[4696]: I1203 00:01:00.198031 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29412001-b2pg6"] Dec 03 00:01:00 crc kubenswrapper[4696]: I1203 00:01:00.199517 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29412001-b2pg6" Dec 03 00:01:00 crc kubenswrapper[4696]: I1203 00:01:00.215319 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-purge-29412001-htr7n"] Dec 03 00:01:00 crc kubenswrapper[4696]: I1203 00:01:00.217565 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-purge-29412001-htr7n" Dec 03 00:01:00 crc kubenswrapper[4696]: I1203 00:01:00.222086 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Dec 03 00:01:00 crc kubenswrapper[4696]: I1203 00:01:00.230434 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29412001-b2pg6"] Dec 03 00:01:00 crc kubenswrapper[4696]: I1203 00:01:00.239228 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cc2c412-c05d-4914-b5a4-e0a0e40b8a59-config-data\") pod \"glance-db-purge-29412001-htr7n\" (UID: \"2cc2c412-c05d-4914-b5a4-e0a0e40b8a59\") " pod="openstack/glance-db-purge-29412001-htr7n" Dec 03 00:01:00 crc kubenswrapper[4696]: I1203 00:01:00.239281 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c211c59c-65a7-4672-8a9e-7b9d20220ef5-config-data\") pod \"keystone-cron-29412001-b2pg6\" (UID: \"c211c59c-65a7-4672-8a9e-7b9d20220ef5\") " pod="openstack/keystone-cron-29412001-b2pg6" Dec 03 00:01:00 crc kubenswrapper[4696]: I1203 00:01:00.239339 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-purge-config-data\" (UniqueName: \"kubernetes.io/secret/f9b76748-e694-4766-a355-d01c0fc857e0-db-purge-config-data\") pod \"cinder-db-purge-29412001-jvcl6\" (UID: \"f9b76748-e694-4766-a355-d01c0fc857e0\") " pod="openstack/cinder-db-purge-29412001-jvcl6" Dec 03 00:01:00 crc 
kubenswrapper[4696]: I1203 00:01:00.239398 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9b76748-e694-4766-a355-d01c0fc857e0-config-data\") pod \"cinder-db-purge-29412001-jvcl6\" (UID: \"f9b76748-e694-4766-a355-d01c0fc857e0\") " pod="openstack/cinder-db-purge-29412001-jvcl6" Dec 03 00:01:00 crc kubenswrapper[4696]: I1203 00:01:00.239431 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9b76748-e694-4766-a355-d01c0fc857e0-combined-ca-bundle\") pod \"cinder-db-purge-29412001-jvcl6\" (UID: \"f9b76748-e694-4766-a355-d01c0fc857e0\") " pod="openstack/cinder-db-purge-29412001-jvcl6" Dec 03 00:01:00 crc kubenswrapper[4696]: I1203 00:01:00.239662 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45fww\" (UniqueName: \"kubernetes.io/projected/f9b76748-e694-4766-a355-d01c0fc857e0-kube-api-access-45fww\") pod \"cinder-db-purge-29412001-jvcl6\" (UID: \"f9b76748-e694-4766-a355-d01c0fc857e0\") " pod="openstack/cinder-db-purge-29412001-jvcl6" Dec 03 00:01:00 crc kubenswrapper[4696]: I1203 00:01:00.239839 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-purge-config-data\" (UniqueName: \"kubernetes.io/secret/2cc2c412-c05d-4914-b5a4-e0a0e40b8a59-db-purge-config-data\") pod \"glance-db-purge-29412001-htr7n\" (UID: \"2cc2c412-c05d-4914-b5a4-e0a0e40b8a59\") " pod="openstack/glance-db-purge-29412001-htr7n" Dec 03 00:01:00 crc kubenswrapper[4696]: I1203 00:01:00.239903 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7v5g\" (UniqueName: \"kubernetes.io/projected/c211c59c-65a7-4672-8a9e-7b9d20220ef5-kube-api-access-q7v5g\") pod \"keystone-cron-29412001-b2pg6\" (UID: 
\"c211c59c-65a7-4672-8a9e-7b9d20220ef5\") " pod="openstack/keystone-cron-29412001-b2pg6" Dec 03 00:01:00 crc kubenswrapper[4696]: I1203 00:01:00.239964 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cc2c412-c05d-4914-b5a4-e0a0e40b8a59-combined-ca-bundle\") pod \"glance-db-purge-29412001-htr7n\" (UID: \"2cc2c412-c05d-4914-b5a4-e0a0e40b8a59\") " pod="openstack/glance-db-purge-29412001-htr7n" Dec 03 00:01:00 crc kubenswrapper[4696]: I1203 00:01:00.239992 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvtp8\" (UniqueName: \"kubernetes.io/projected/2cc2c412-c05d-4914-b5a4-e0a0e40b8a59-kube-api-access-jvtp8\") pod \"glance-db-purge-29412001-htr7n\" (UID: \"2cc2c412-c05d-4914-b5a4-e0a0e40b8a59\") " pod="openstack/glance-db-purge-29412001-htr7n" Dec 03 00:01:00 crc kubenswrapper[4696]: I1203 00:01:00.240092 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c211c59c-65a7-4672-8a9e-7b9d20220ef5-combined-ca-bundle\") pod \"keystone-cron-29412001-b2pg6\" (UID: \"c211c59c-65a7-4672-8a9e-7b9d20220ef5\") " pod="openstack/keystone-cron-29412001-b2pg6" Dec 03 00:01:00 crc kubenswrapper[4696]: I1203 00:01:00.240198 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c211c59c-65a7-4672-8a9e-7b9d20220ef5-fernet-keys\") pod \"keystone-cron-29412001-b2pg6\" (UID: \"c211c59c-65a7-4672-8a9e-7b9d20220ef5\") " pod="openstack/keystone-cron-29412001-b2pg6" Dec 03 00:01:00 crc kubenswrapper[4696]: I1203 00:01:00.244104 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-purge-29412001-htr7n"] Dec 03 00:01:00 crc kubenswrapper[4696]: I1203 00:01:00.252908 4696 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-purge-29412001-jvcl6"] Dec 03 00:01:00 crc kubenswrapper[4696]: I1203 00:01:00.342894 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-purge-config-data\" (UniqueName: \"kubernetes.io/secret/f9b76748-e694-4766-a355-d01c0fc857e0-db-purge-config-data\") pod \"cinder-db-purge-29412001-jvcl6\" (UID: \"f9b76748-e694-4766-a355-d01c0fc857e0\") " pod="openstack/cinder-db-purge-29412001-jvcl6" Dec 03 00:01:00 crc kubenswrapper[4696]: I1203 00:01:00.343111 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9b76748-e694-4766-a355-d01c0fc857e0-config-data\") pod \"cinder-db-purge-29412001-jvcl6\" (UID: \"f9b76748-e694-4766-a355-d01c0fc857e0\") " pod="openstack/cinder-db-purge-29412001-jvcl6" Dec 03 00:01:00 crc kubenswrapper[4696]: I1203 00:01:00.343235 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9b76748-e694-4766-a355-d01c0fc857e0-combined-ca-bundle\") pod \"cinder-db-purge-29412001-jvcl6\" (UID: \"f9b76748-e694-4766-a355-d01c0fc857e0\") " pod="openstack/cinder-db-purge-29412001-jvcl6" Dec 03 00:01:00 crc kubenswrapper[4696]: I1203 00:01:00.343304 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45fww\" (UniqueName: \"kubernetes.io/projected/f9b76748-e694-4766-a355-d01c0fc857e0-kube-api-access-45fww\") pod \"cinder-db-purge-29412001-jvcl6\" (UID: \"f9b76748-e694-4766-a355-d01c0fc857e0\") " pod="openstack/cinder-db-purge-29412001-jvcl6" Dec 03 00:01:00 crc kubenswrapper[4696]: I1203 00:01:00.343457 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-purge-config-data\" (UniqueName: \"kubernetes.io/secret/2cc2c412-c05d-4914-b5a4-e0a0e40b8a59-db-purge-config-data\") pod \"glance-db-purge-29412001-htr7n\" (UID: 
\"2cc2c412-c05d-4914-b5a4-e0a0e40b8a59\") " pod="openstack/glance-db-purge-29412001-htr7n" Dec 03 00:01:00 crc kubenswrapper[4696]: I1203 00:01:00.344564 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7v5g\" (UniqueName: \"kubernetes.io/projected/c211c59c-65a7-4672-8a9e-7b9d20220ef5-kube-api-access-q7v5g\") pod \"keystone-cron-29412001-b2pg6\" (UID: \"c211c59c-65a7-4672-8a9e-7b9d20220ef5\") " pod="openstack/keystone-cron-29412001-b2pg6" Dec 03 00:01:00 crc kubenswrapper[4696]: I1203 00:01:00.344653 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cc2c412-c05d-4914-b5a4-e0a0e40b8a59-combined-ca-bundle\") pod \"glance-db-purge-29412001-htr7n\" (UID: \"2cc2c412-c05d-4914-b5a4-e0a0e40b8a59\") " pod="openstack/glance-db-purge-29412001-htr7n" Dec 03 00:01:00 crc kubenswrapper[4696]: I1203 00:01:00.344697 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvtp8\" (UniqueName: \"kubernetes.io/projected/2cc2c412-c05d-4914-b5a4-e0a0e40b8a59-kube-api-access-jvtp8\") pod \"glance-db-purge-29412001-htr7n\" (UID: \"2cc2c412-c05d-4914-b5a4-e0a0e40b8a59\") " pod="openstack/glance-db-purge-29412001-htr7n" Dec 03 00:01:00 crc kubenswrapper[4696]: I1203 00:01:00.344881 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c211c59c-65a7-4672-8a9e-7b9d20220ef5-combined-ca-bundle\") pod \"keystone-cron-29412001-b2pg6\" (UID: \"c211c59c-65a7-4672-8a9e-7b9d20220ef5\") " pod="openstack/keystone-cron-29412001-b2pg6" Dec 03 00:01:00 crc kubenswrapper[4696]: I1203 00:01:00.347382 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c211c59c-65a7-4672-8a9e-7b9d20220ef5-fernet-keys\") pod \"keystone-cron-29412001-b2pg6\" (UID: 
\"c211c59c-65a7-4672-8a9e-7b9d20220ef5\") " pod="openstack/keystone-cron-29412001-b2pg6" Dec 03 00:01:00 crc kubenswrapper[4696]: I1203 00:01:00.347504 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cc2c412-c05d-4914-b5a4-e0a0e40b8a59-config-data\") pod \"glance-db-purge-29412001-htr7n\" (UID: \"2cc2c412-c05d-4914-b5a4-e0a0e40b8a59\") " pod="openstack/glance-db-purge-29412001-htr7n" Dec 03 00:01:00 crc kubenswrapper[4696]: I1203 00:01:00.347544 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c211c59c-65a7-4672-8a9e-7b9d20220ef5-config-data\") pod \"keystone-cron-29412001-b2pg6\" (UID: \"c211c59c-65a7-4672-8a9e-7b9d20220ef5\") " pod="openstack/keystone-cron-29412001-b2pg6" Dec 03 00:01:00 crc kubenswrapper[4696]: I1203 00:01:00.351239 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-purge-config-data\" (UniqueName: \"kubernetes.io/secret/f9b76748-e694-4766-a355-d01c0fc857e0-db-purge-config-data\") pod \"cinder-db-purge-29412001-jvcl6\" (UID: \"f9b76748-e694-4766-a355-d01c0fc857e0\") " pod="openstack/cinder-db-purge-29412001-jvcl6" Dec 03 00:01:00 crc kubenswrapper[4696]: I1203 00:01:00.352409 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9b76748-e694-4766-a355-d01c0fc857e0-config-data\") pod \"cinder-db-purge-29412001-jvcl6\" (UID: \"f9b76748-e694-4766-a355-d01c0fc857e0\") " pod="openstack/cinder-db-purge-29412001-jvcl6" Dec 03 00:01:00 crc kubenswrapper[4696]: I1203 00:01:00.352570 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cc2c412-c05d-4914-b5a4-e0a0e40b8a59-combined-ca-bundle\") pod \"glance-db-purge-29412001-htr7n\" (UID: \"2cc2c412-c05d-4914-b5a4-e0a0e40b8a59\") " 
pod="openstack/glance-db-purge-29412001-htr7n" Dec 03 00:01:00 crc kubenswrapper[4696]: I1203 00:01:00.353339 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cc2c412-c05d-4914-b5a4-e0a0e40b8a59-config-data\") pod \"glance-db-purge-29412001-htr7n\" (UID: \"2cc2c412-c05d-4914-b5a4-e0a0e40b8a59\") " pod="openstack/glance-db-purge-29412001-htr7n" Dec 03 00:01:00 crc kubenswrapper[4696]: I1203 00:01:00.353345 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-purge-config-data\" (UniqueName: \"kubernetes.io/secret/2cc2c412-c05d-4914-b5a4-e0a0e40b8a59-db-purge-config-data\") pod \"glance-db-purge-29412001-htr7n\" (UID: \"2cc2c412-c05d-4914-b5a4-e0a0e40b8a59\") " pod="openstack/glance-db-purge-29412001-htr7n" Dec 03 00:01:00 crc kubenswrapper[4696]: I1203 00:01:00.354604 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c211c59c-65a7-4672-8a9e-7b9d20220ef5-config-data\") pod \"keystone-cron-29412001-b2pg6\" (UID: \"c211c59c-65a7-4672-8a9e-7b9d20220ef5\") " pod="openstack/keystone-cron-29412001-b2pg6" Dec 03 00:01:00 crc kubenswrapper[4696]: I1203 00:01:00.355625 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c211c59c-65a7-4672-8a9e-7b9d20220ef5-combined-ca-bundle\") pod \"keystone-cron-29412001-b2pg6\" (UID: \"c211c59c-65a7-4672-8a9e-7b9d20220ef5\") " pod="openstack/keystone-cron-29412001-b2pg6" Dec 03 00:01:00 crc kubenswrapper[4696]: I1203 00:01:00.356395 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9b76748-e694-4766-a355-d01c0fc857e0-combined-ca-bundle\") pod \"cinder-db-purge-29412001-jvcl6\" (UID: \"f9b76748-e694-4766-a355-d01c0fc857e0\") " pod="openstack/cinder-db-purge-29412001-jvcl6" Dec 03 00:01:00 crc 
kubenswrapper[4696]: I1203 00:01:00.358521 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c211c59c-65a7-4672-8a9e-7b9d20220ef5-fernet-keys\") pod \"keystone-cron-29412001-b2pg6\" (UID: \"c211c59c-65a7-4672-8a9e-7b9d20220ef5\") " pod="openstack/keystone-cron-29412001-b2pg6" Dec 03 00:01:00 crc kubenswrapper[4696]: I1203 00:01:00.362032 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45fww\" (UniqueName: \"kubernetes.io/projected/f9b76748-e694-4766-a355-d01c0fc857e0-kube-api-access-45fww\") pod \"cinder-db-purge-29412001-jvcl6\" (UID: \"f9b76748-e694-4766-a355-d01c0fc857e0\") " pod="openstack/cinder-db-purge-29412001-jvcl6" Dec 03 00:01:00 crc kubenswrapper[4696]: I1203 00:01:00.375695 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7v5g\" (UniqueName: \"kubernetes.io/projected/c211c59c-65a7-4672-8a9e-7b9d20220ef5-kube-api-access-q7v5g\") pod \"keystone-cron-29412001-b2pg6\" (UID: \"c211c59c-65a7-4672-8a9e-7b9d20220ef5\") " pod="openstack/keystone-cron-29412001-b2pg6" Dec 03 00:01:00 crc kubenswrapper[4696]: I1203 00:01:00.376842 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvtp8\" (UniqueName: \"kubernetes.io/projected/2cc2c412-c05d-4914-b5a4-e0a0e40b8a59-kube-api-access-jvtp8\") pod \"glance-db-purge-29412001-htr7n\" (UID: \"2cc2c412-c05d-4914-b5a4-e0a0e40b8a59\") " pod="openstack/glance-db-purge-29412001-htr7n" Dec 03 00:01:00 crc kubenswrapper[4696]: I1203 00:01:00.510871 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-purge-29412001-jvcl6" Dec 03 00:01:00 crc kubenswrapper[4696]: I1203 00:01:00.529426 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29412001-b2pg6" Dec 03 00:01:00 crc kubenswrapper[4696]: I1203 00:01:00.545647 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-purge-29412001-htr7n" Dec 03 00:01:01 crc kubenswrapper[4696]: I1203 00:01:01.206518 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-purge-29412001-jvcl6"] Dec 03 00:01:01 crc kubenswrapper[4696]: I1203 00:01:01.217063 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-purge-29412001-htr7n"] Dec 03 00:01:01 crc kubenswrapper[4696]: W1203 00:01:01.218860 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2cc2c412_c05d_4914_b5a4_e0a0e40b8a59.slice/crio-76187b7258dd2a2d77adead8f76574451391afcb639eb5ff06530ae7e9aca591 WatchSource:0}: Error finding container 76187b7258dd2a2d77adead8f76574451391afcb639eb5ff06530ae7e9aca591: Status 404 returned error can't find the container with id 76187b7258dd2a2d77adead8f76574451391afcb639eb5ff06530ae7e9aca591 Dec 03 00:01:01 crc kubenswrapper[4696]: W1203 00:01:01.223627 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc211c59c_65a7_4672_8a9e_7b9d20220ef5.slice/crio-b8d1e3b84f98e2cdedaf1df683b1d9974b2069a7835efc9855f49f620195cd84 WatchSource:0}: Error finding container b8d1e3b84f98e2cdedaf1df683b1d9974b2069a7835efc9855f49f620195cd84: Status 404 returned error can't find the container with id b8d1e3b84f98e2cdedaf1df683b1d9974b2069a7835efc9855f49f620195cd84 Dec 03 00:01:01 crc kubenswrapper[4696]: I1203 00:01:01.227592 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29412001-b2pg6"] Dec 03 00:01:02 crc kubenswrapper[4696]: I1203 00:01:02.008255 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29412001-b2pg6" 
event={"ID":"c211c59c-65a7-4672-8a9e-7b9d20220ef5","Type":"ContainerStarted","Data":"bfbfb9860563bbd969ab9559a645f068e07aaf6f5c8358d1cb144f2c64a6ad6e"} Dec 03 00:01:02 crc kubenswrapper[4696]: I1203 00:01:02.009058 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29412001-b2pg6" event={"ID":"c211c59c-65a7-4672-8a9e-7b9d20220ef5","Type":"ContainerStarted","Data":"b8d1e3b84f98e2cdedaf1df683b1d9974b2069a7835efc9855f49f620195cd84"} Dec 03 00:01:02 crc kubenswrapper[4696]: I1203 00:01:02.011517 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-purge-29412001-jvcl6" event={"ID":"f9b76748-e694-4766-a355-d01c0fc857e0","Type":"ContainerStarted","Data":"920f64e6199418acc45aaafdaa8abcbbbe658b79ceae759b308d562868c9728b"} Dec 03 00:01:02 crc kubenswrapper[4696]: I1203 00:01:02.011607 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-purge-29412001-jvcl6" event={"ID":"f9b76748-e694-4766-a355-d01c0fc857e0","Type":"ContainerStarted","Data":"1c26bc10d06b27214810536e59176341f9fcd335fb756b2ef05de31f248872a1"} Dec 03 00:01:02 crc kubenswrapper[4696]: I1203 00:01:02.013099 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-purge-29412001-htr7n" event={"ID":"2cc2c412-c05d-4914-b5a4-e0a0e40b8a59","Type":"ContainerStarted","Data":"e84350bcb3a87cab43d07b5c203dcd998cb5b81a8fa523fcf2adc4523377d415"} Dec 03 00:01:02 crc kubenswrapper[4696]: I1203 00:01:02.013153 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-purge-29412001-htr7n" event={"ID":"2cc2c412-c05d-4914-b5a4-e0a0e40b8a59","Type":"ContainerStarted","Data":"76187b7258dd2a2d77adead8f76574451391afcb639eb5ff06530ae7e9aca591"} Dec 03 00:01:02 crc kubenswrapper[4696]: I1203 00:01:02.056425 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-purge-29412001-jvcl6" podStartSLOduration=2.056404846 podStartE2EDuration="2.056404846s" 
podCreationTimestamp="2025-12-03 00:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 00:01:02.052911337 +0000 UTC m=+4724.933591338" watchObservedRunningTime="2025-12-03 00:01:02.056404846 +0000 UTC m=+4724.937084847" Dec 03 00:01:02 crc kubenswrapper[4696]: I1203 00:01:02.056880 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29412001-b2pg6" podStartSLOduration=2.056876099 podStartE2EDuration="2.056876099s" podCreationTimestamp="2025-12-03 00:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 00:01:02.035206556 +0000 UTC m=+4724.915886567" watchObservedRunningTime="2025-12-03 00:01:02.056876099 +0000 UTC m=+4724.937556100" Dec 03 00:01:02 crc kubenswrapper[4696]: I1203 00:01:02.080592 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-purge-29412001-htr7n" podStartSLOduration=2.08056993 podStartE2EDuration="2.08056993s" podCreationTimestamp="2025-12-03 00:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 00:01:02.074274162 +0000 UTC m=+4724.954954163" watchObservedRunningTime="2025-12-03 00:01:02.08056993 +0000 UTC m=+4724.961249931" Dec 03 00:01:02 crc kubenswrapper[4696]: I1203 00:01:02.432592 4696 scope.go:117] "RemoveContainer" containerID="4e9c28618ad46c6b1bc6c5edf0e736f35d65060ade5cc08206e332e18207549b" Dec 03 00:01:02 crc kubenswrapper[4696]: E1203 00:01:02.432984 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 03 00:01:04 crc kubenswrapper[4696]: I1203 00:01:04.037819 4696 generic.go:334] "Generic (PLEG): container finished" podID="c211c59c-65a7-4672-8a9e-7b9d20220ef5" containerID="bfbfb9860563bbd969ab9559a645f068e07aaf6f5c8358d1cb144f2c64a6ad6e" exitCode=0 Dec 03 00:01:04 crc kubenswrapper[4696]: I1203 00:01:04.037875 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29412001-b2pg6" event={"ID":"c211c59c-65a7-4672-8a9e-7b9d20220ef5","Type":"ContainerDied","Data":"bfbfb9860563bbd969ab9559a645f068e07aaf6f5c8358d1cb144f2c64a6ad6e"} Dec 03 00:01:04 crc kubenswrapper[4696]: I1203 00:01:04.043907 4696 generic.go:334] "Generic (PLEG): container finished" podID="2cc2c412-c05d-4914-b5a4-e0a0e40b8a59" containerID="e84350bcb3a87cab43d07b5c203dcd998cb5b81a8fa523fcf2adc4523377d415" exitCode=0 Dec 03 00:01:04 crc kubenswrapper[4696]: I1203 00:01:04.043976 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-purge-29412001-htr7n" event={"ID":"2cc2c412-c05d-4914-b5a4-e0a0e40b8a59","Type":"ContainerDied","Data":"e84350bcb3a87cab43d07b5c203dcd998cb5b81a8fa523fcf2adc4523377d415"} Dec 03 00:01:05 crc kubenswrapper[4696]: I1203 00:01:05.057706 4696 generic.go:334] "Generic (PLEG): container finished" podID="f9b76748-e694-4766-a355-d01c0fc857e0" containerID="920f64e6199418acc45aaafdaa8abcbbbe658b79ceae759b308d562868c9728b" exitCode=0 Dec 03 00:01:05 crc kubenswrapper[4696]: I1203 00:01:05.057822 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-purge-29412001-jvcl6" event={"ID":"f9b76748-e694-4766-a355-d01c0fc857e0","Type":"ContainerDied","Data":"920f64e6199418acc45aaafdaa8abcbbbe658b79ceae759b308d562868c9728b"} Dec 03 00:01:05 crc kubenswrapper[4696]: I1203 00:01:05.669445 4696 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-purge-29412001-htr7n" Dec 03 00:01:05 crc kubenswrapper[4696]: I1203 00:01:05.674940 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29412001-b2pg6" Dec 03 00:01:05 crc kubenswrapper[4696]: I1203 00:01:05.797970 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jvtp8\" (UniqueName: \"kubernetes.io/projected/2cc2c412-c05d-4914-b5a4-e0a0e40b8a59-kube-api-access-jvtp8\") pod \"2cc2c412-c05d-4914-b5a4-e0a0e40b8a59\" (UID: \"2cc2c412-c05d-4914-b5a4-e0a0e40b8a59\") " Dec 03 00:01:05 crc kubenswrapper[4696]: I1203 00:01:05.798963 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c211c59c-65a7-4672-8a9e-7b9d20220ef5-fernet-keys\") pod \"c211c59c-65a7-4672-8a9e-7b9d20220ef5\" (UID: \"c211c59c-65a7-4672-8a9e-7b9d20220ef5\") " Dec 03 00:01:05 crc kubenswrapper[4696]: I1203 00:01:05.800145 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c211c59c-65a7-4672-8a9e-7b9d20220ef5-combined-ca-bundle\") pod \"c211c59c-65a7-4672-8a9e-7b9d20220ef5\" (UID: \"c211c59c-65a7-4672-8a9e-7b9d20220ef5\") " Dec 03 00:01:05 crc kubenswrapper[4696]: I1203 00:01:05.800219 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c211c59c-65a7-4672-8a9e-7b9d20220ef5-config-data\") pod \"c211c59c-65a7-4672-8a9e-7b9d20220ef5\" (UID: \"c211c59c-65a7-4672-8a9e-7b9d20220ef5\") " Dec 03 00:01:05 crc kubenswrapper[4696]: I1203 00:01:05.800373 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cc2c412-c05d-4914-b5a4-e0a0e40b8a59-config-data\") pod 
\"2cc2c412-c05d-4914-b5a4-e0a0e40b8a59\" (UID: \"2cc2c412-c05d-4914-b5a4-e0a0e40b8a59\") " Dec 03 00:01:05 crc kubenswrapper[4696]: I1203 00:01:05.800398 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q7v5g\" (UniqueName: \"kubernetes.io/projected/c211c59c-65a7-4672-8a9e-7b9d20220ef5-kube-api-access-q7v5g\") pod \"c211c59c-65a7-4672-8a9e-7b9d20220ef5\" (UID: \"c211c59c-65a7-4672-8a9e-7b9d20220ef5\") " Dec 03 00:01:05 crc kubenswrapper[4696]: I1203 00:01:05.800417 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-purge-config-data\" (UniqueName: \"kubernetes.io/secret/2cc2c412-c05d-4914-b5a4-e0a0e40b8a59-db-purge-config-data\") pod \"2cc2c412-c05d-4914-b5a4-e0a0e40b8a59\" (UID: \"2cc2c412-c05d-4914-b5a4-e0a0e40b8a59\") " Dec 03 00:01:05 crc kubenswrapper[4696]: I1203 00:01:05.800497 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cc2c412-c05d-4914-b5a4-e0a0e40b8a59-combined-ca-bundle\") pod \"2cc2c412-c05d-4914-b5a4-e0a0e40b8a59\" (UID: \"2cc2c412-c05d-4914-b5a4-e0a0e40b8a59\") " Dec 03 00:01:05 crc kubenswrapper[4696]: I1203 00:01:05.806380 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cc2c412-c05d-4914-b5a4-e0a0e40b8a59-db-purge-config-data" (OuterVolumeSpecName: "db-purge-config-data") pod "2cc2c412-c05d-4914-b5a4-e0a0e40b8a59" (UID: "2cc2c412-c05d-4914-b5a4-e0a0e40b8a59"). InnerVolumeSpecName "db-purge-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 00:01:05 crc kubenswrapper[4696]: I1203 00:01:05.809105 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2cc2c412-c05d-4914-b5a4-e0a0e40b8a59-kube-api-access-jvtp8" (OuterVolumeSpecName: "kube-api-access-jvtp8") pod "2cc2c412-c05d-4914-b5a4-e0a0e40b8a59" (UID: "2cc2c412-c05d-4914-b5a4-e0a0e40b8a59"). 
InnerVolumeSpecName "kube-api-access-jvtp8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 00:01:05 crc kubenswrapper[4696]: I1203 00:01:05.809223 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c211c59c-65a7-4672-8a9e-7b9d20220ef5-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "c211c59c-65a7-4672-8a9e-7b9d20220ef5" (UID: "c211c59c-65a7-4672-8a9e-7b9d20220ef5"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 00:01:05 crc kubenswrapper[4696]: I1203 00:01:05.809904 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c211c59c-65a7-4672-8a9e-7b9d20220ef5-kube-api-access-q7v5g" (OuterVolumeSpecName: "kube-api-access-q7v5g") pod "c211c59c-65a7-4672-8a9e-7b9d20220ef5" (UID: "c211c59c-65a7-4672-8a9e-7b9d20220ef5"). InnerVolumeSpecName "kube-api-access-q7v5g". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 00:01:05 crc kubenswrapper[4696]: I1203 00:01:05.837306 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cc2c412-c05d-4914-b5a4-e0a0e40b8a59-config-data" (OuterVolumeSpecName: "config-data") pod "2cc2c412-c05d-4914-b5a4-e0a0e40b8a59" (UID: "2cc2c412-c05d-4914-b5a4-e0a0e40b8a59"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 00:01:05 crc kubenswrapper[4696]: I1203 00:01:05.837553 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cc2c412-c05d-4914-b5a4-e0a0e40b8a59-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2cc2c412-c05d-4914-b5a4-e0a0e40b8a59" (UID: "2cc2c412-c05d-4914-b5a4-e0a0e40b8a59"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 00:01:05 crc kubenswrapper[4696]: I1203 00:01:05.838792 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c211c59c-65a7-4672-8a9e-7b9d20220ef5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c211c59c-65a7-4672-8a9e-7b9d20220ef5" (UID: "c211c59c-65a7-4672-8a9e-7b9d20220ef5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 00:01:05 crc kubenswrapper[4696]: I1203 00:01:05.877983 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c211c59c-65a7-4672-8a9e-7b9d20220ef5-config-data" (OuterVolumeSpecName: "config-data") pod "c211c59c-65a7-4672-8a9e-7b9d20220ef5" (UID: "c211c59c-65a7-4672-8a9e-7b9d20220ef5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 00:01:05 crc kubenswrapper[4696]: I1203 00:01:05.904343 4696 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c211c59c-65a7-4672-8a9e-7b9d20220ef5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 00:01:05 crc kubenswrapper[4696]: I1203 00:01:05.904392 4696 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c211c59c-65a7-4672-8a9e-7b9d20220ef5-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 00:01:05 crc kubenswrapper[4696]: I1203 00:01:05.904401 4696 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cc2c412-c05d-4914-b5a4-e0a0e40b8a59-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 00:01:05 crc kubenswrapper[4696]: I1203 00:01:05.904411 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q7v5g\" (UniqueName: \"kubernetes.io/projected/c211c59c-65a7-4672-8a9e-7b9d20220ef5-kube-api-access-q7v5g\") on node \"crc\" DevicePath 
\"\"" Dec 03 00:01:05 crc kubenswrapper[4696]: I1203 00:01:05.904422 4696 reconciler_common.go:293] "Volume detached for volume \"db-purge-config-data\" (UniqueName: \"kubernetes.io/secret/2cc2c412-c05d-4914-b5a4-e0a0e40b8a59-db-purge-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 00:01:05 crc kubenswrapper[4696]: I1203 00:01:05.904430 4696 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cc2c412-c05d-4914-b5a4-e0a0e40b8a59-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 00:01:05 crc kubenswrapper[4696]: I1203 00:01:05.904439 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jvtp8\" (UniqueName: \"kubernetes.io/projected/2cc2c412-c05d-4914-b5a4-e0a0e40b8a59-kube-api-access-jvtp8\") on node \"crc\" DevicePath \"\"" Dec 03 00:01:05 crc kubenswrapper[4696]: I1203 00:01:05.904447 4696 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c211c59c-65a7-4672-8a9e-7b9d20220ef5-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 03 00:01:06 crc kubenswrapper[4696]: I1203 00:01:06.085963 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29412001-b2pg6" event={"ID":"c211c59c-65a7-4672-8a9e-7b9d20220ef5","Type":"ContainerDied","Data":"b8d1e3b84f98e2cdedaf1df683b1d9974b2069a7835efc9855f49f620195cd84"} Dec 03 00:01:06 crc kubenswrapper[4696]: I1203 00:01:06.086020 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b8d1e3b84f98e2cdedaf1df683b1d9974b2069a7835efc9855f49f620195cd84" Dec 03 00:01:06 crc kubenswrapper[4696]: I1203 00:01:06.086116 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29412001-b2pg6" Dec 03 00:01:06 crc kubenswrapper[4696]: I1203 00:01:06.089932 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-purge-29412001-htr7n" event={"ID":"2cc2c412-c05d-4914-b5a4-e0a0e40b8a59","Type":"ContainerDied","Data":"76187b7258dd2a2d77adead8f76574451391afcb639eb5ff06530ae7e9aca591"} Dec 03 00:01:06 crc kubenswrapper[4696]: I1203 00:01:06.089978 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="76187b7258dd2a2d77adead8f76574451391afcb639eb5ff06530ae7e9aca591" Dec 03 00:01:06 crc kubenswrapper[4696]: I1203 00:01:06.090009 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-purge-29412001-htr7n" Dec 03 00:01:06 crc kubenswrapper[4696]: I1203 00:01:06.399367 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-purge-29412001-jvcl6" Dec 03 00:01:06 crc kubenswrapper[4696]: I1203 00:01:06.517129 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-purge-config-data\" (UniqueName: \"kubernetes.io/secret/f9b76748-e694-4766-a355-d01c0fc857e0-db-purge-config-data\") pod \"f9b76748-e694-4766-a355-d01c0fc857e0\" (UID: \"f9b76748-e694-4766-a355-d01c0fc857e0\") " Dec 03 00:01:06 crc kubenswrapper[4696]: I1203 00:01:06.517253 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-45fww\" (UniqueName: \"kubernetes.io/projected/f9b76748-e694-4766-a355-d01c0fc857e0-kube-api-access-45fww\") pod \"f9b76748-e694-4766-a355-d01c0fc857e0\" (UID: \"f9b76748-e694-4766-a355-d01c0fc857e0\") " Dec 03 00:01:06 crc kubenswrapper[4696]: I1203 00:01:06.517287 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9b76748-e694-4766-a355-d01c0fc857e0-config-data\") pod 
\"f9b76748-e694-4766-a355-d01c0fc857e0\" (UID: \"f9b76748-e694-4766-a355-d01c0fc857e0\") " Dec 03 00:01:06 crc kubenswrapper[4696]: I1203 00:01:06.517368 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9b76748-e694-4766-a355-d01c0fc857e0-combined-ca-bundle\") pod \"f9b76748-e694-4766-a355-d01c0fc857e0\" (UID: \"f9b76748-e694-4766-a355-d01c0fc857e0\") " Dec 03 00:01:06 crc kubenswrapper[4696]: I1203 00:01:06.524106 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9b76748-e694-4766-a355-d01c0fc857e0-kube-api-access-45fww" (OuterVolumeSpecName: "kube-api-access-45fww") pod "f9b76748-e694-4766-a355-d01c0fc857e0" (UID: "f9b76748-e694-4766-a355-d01c0fc857e0"). InnerVolumeSpecName "kube-api-access-45fww". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 00:01:06 crc kubenswrapper[4696]: I1203 00:01:06.524145 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9b76748-e694-4766-a355-d01c0fc857e0-db-purge-config-data" (OuterVolumeSpecName: "db-purge-config-data") pod "f9b76748-e694-4766-a355-d01c0fc857e0" (UID: "f9b76748-e694-4766-a355-d01c0fc857e0"). InnerVolumeSpecName "db-purge-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 00:01:06 crc kubenswrapper[4696]: I1203 00:01:06.551394 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9b76748-e694-4766-a355-d01c0fc857e0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f9b76748-e694-4766-a355-d01c0fc857e0" (UID: "f9b76748-e694-4766-a355-d01c0fc857e0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 00:01:06 crc kubenswrapper[4696]: I1203 00:01:06.553400 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9b76748-e694-4766-a355-d01c0fc857e0-config-data" (OuterVolumeSpecName: "config-data") pod "f9b76748-e694-4766-a355-d01c0fc857e0" (UID: "f9b76748-e694-4766-a355-d01c0fc857e0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 00:01:06 crc kubenswrapper[4696]: I1203 00:01:06.620233 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-45fww\" (UniqueName: \"kubernetes.io/projected/f9b76748-e694-4766-a355-d01c0fc857e0-kube-api-access-45fww\") on node \"crc\" DevicePath \"\"" Dec 03 00:01:06 crc kubenswrapper[4696]: I1203 00:01:06.620265 4696 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9b76748-e694-4766-a355-d01c0fc857e0-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 00:01:06 crc kubenswrapper[4696]: I1203 00:01:06.620274 4696 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9b76748-e694-4766-a355-d01c0fc857e0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 00:01:06 crc kubenswrapper[4696]: I1203 00:01:06.620289 4696 reconciler_common.go:293] "Volume detached for volume \"db-purge-config-data\" (UniqueName: \"kubernetes.io/secret/f9b76748-e694-4766-a355-d01c0fc857e0-db-purge-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 00:01:07 crc kubenswrapper[4696]: I1203 00:01:07.104795 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-purge-29412001-jvcl6" event={"ID":"f9b76748-e694-4766-a355-d01c0fc857e0","Type":"ContainerDied","Data":"1c26bc10d06b27214810536e59176341f9fcd335fb756b2ef05de31f248872a1"} Dec 03 00:01:07 crc kubenswrapper[4696]: I1203 00:01:07.104847 4696 pod_container_deletor.go:80] "Container not 
found in pod's containers" containerID="1c26bc10d06b27214810536e59176341f9fcd335fb756b2ef05de31f248872a1" Dec 03 00:01:07 crc kubenswrapper[4696]: I1203 00:01:07.105935 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-purge-29412001-jvcl6" Dec 03 00:01:13 crc kubenswrapper[4696]: I1203 00:01:13.433628 4696 scope.go:117] "RemoveContainer" containerID="4e9c28618ad46c6b1bc6c5edf0e736f35d65060ade5cc08206e332e18207549b" Dec 03 00:01:13 crc kubenswrapper[4696]: E1203 00:01:13.435095 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 03 00:01:28 crc kubenswrapper[4696]: I1203 00:01:28.433067 4696 scope.go:117] "RemoveContainer" containerID="4e9c28618ad46c6b1bc6c5edf0e736f35d65060ade5cc08206e332e18207549b" Dec 03 00:01:28 crc kubenswrapper[4696]: E1203 00:01:28.433947 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 03 00:01:42 crc kubenswrapper[4696]: I1203 00:01:42.431905 4696 scope.go:117] "RemoveContainer" containerID="4e9c28618ad46c6b1bc6c5edf0e736f35d65060ade5cc08206e332e18207549b" Dec 03 00:01:42 crc kubenswrapper[4696]: E1203 00:01:42.433118 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 03 00:01:53 crc kubenswrapper[4696]: I1203 00:01:53.432833 4696 scope.go:117] "RemoveContainer" containerID="4e9c28618ad46c6b1bc6c5edf0e736f35d65060ade5cc08206e332e18207549b" Dec 03 00:01:53 crc kubenswrapper[4696]: E1203 00:01:53.433956 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 03 00:02:05 crc kubenswrapper[4696]: I1203 00:02:05.432927 4696 scope.go:117] "RemoveContainer" containerID="4e9c28618ad46c6b1bc6c5edf0e736f35d65060ade5cc08206e332e18207549b" Dec 03 00:02:05 crc kubenswrapper[4696]: E1203 00:02:05.434415 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 03 00:02:17 crc kubenswrapper[4696]: I1203 00:02:17.432308 4696 scope.go:117] "RemoveContainer" containerID="4e9c28618ad46c6b1bc6c5edf0e736f35d65060ade5cc08206e332e18207549b" Dec 03 00:02:17 crc kubenswrapper[4696]: E1203 00:02:17.433867 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 03 00:02:22 crc kubenswrapper[4696]: I1203 00:02:22.468281 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-h7fhz"] Dec 03 00:02:22 crc kubenswrapper[4696]: E1203 00:02:22.469443 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c211c59c-65a7-4672-8a9e-7b9d20220ef5" containerName="keystone-cron" Dec 03 00:02:22 crc kubenswrapper[4696]: I1203 00:02:22.469556 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="c211c59c-65a7-4672-8a9e-7b9d20220ef5" containerName="keystone-cron" Dec 03 00:02:22 crc kubenswrapper[4696]: E1203 00:02:22.469574 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9b76748-e694-4766-a355-d01c0fc857e0" containerName="cinder-db-purge" Dec 03 00:02:22 crc kubenswrapper[4696]: I1203 00:02:22.469581 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9b76748-e694-4766-a355-d01c0fc857e0" containerName="cinder-db-purge" Dec 03 00:02:22 crc kubenswrapper[4696]: E1203 00:02:22.469602 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cc2c412-c05d-4914-b5a4-e0a0e40b8a59" containerName="glance-dbpurge" Dec 03 00:02:22 crc kubenswrapper[4696]: I1203 00:02:22.469611 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cc2c412-c05d-4914-b5a4-e0a0e40b8a59" containerName="glance-dbpurge" Dec 03 00:02:22 crc kubenswrapper[4696]: I1203 00:02:22.469854 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9b76748-e694-4766-a355-d01c0fc857e0" containerName="cinder-db-purge" Dec 03 00:02:22 crc kubenswrapper[4696]: I1203 00:02:22.469885 4696 
memory_manager.go:354] "RemoveStaleState removing state" podUID="2cc2c412-c05d-4914-b5a4-e0a0e40b8a59" containerName="glance-dbpurge" Dec 03 00:02:22 crc kubenswrapper[4696]: I1203 00:02:22.469895 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="c211c59c-65a7-4672-8a9e-7b9d20220ef5" containerName="keystone-cron" Dec 03 00:02:22 crc kubenswrapper[4696]: I1203 00:02:22.471437 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h7fhz" Dec 03 00:02:22 crc kubenswrapper[4696]: I1203 00:02:22.516457 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-h7fhz"] Dec 03 00:02:22 crc kubenswrapper[4696]: I1203 00:02:22.547893 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad04dc9a-83d0-48bc-bfdf-53a7baa57a60-catalog-content\") pod \"redhat-marketplace-h7fhz\" (UID: \"ad04dc9a-83d0-48bc-bfdf-53a7baa57a60\") " pod="openshift-marketplace/redhat-marketplace-h7fhz" Dec 03 00:02:22 crc kubenswrapper[4696]: I1203 00:02:22.548006 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad04dc9a-83d0-48bc-bfdf-53a7baa57a60-utilities\") pod \"redhat-marketplace-h7fhz\" (UID: \"ad04dc9a-83d0-48bc-bfdf-53a7baa57a60\") " pod="openshift-marketplace/redhat-marketplace-h7fhz" Dec 03 00:02:22 crc kubenswrapper[4696]: I1203 00:02:22.548258 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xxrq\" (UniqueName: \"kubernetes.io/projected/ad04dc9a-83d0-48bc-bfdf-53a7baa57a60-kube-api-access-8xxrq\") pod \"redhat-marketplace-h7fhz\" (UID: \"ad04dc9a-83d0-48bc-bfdf-53a7baa57a60\") " pod="openshift-marketplace/redhat-marketplace-h7fhz" Dec 03 00:02:22 crc kubenswrapper[4696]: I1203 
00:02:22.650690 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xxrq\" (UniqueName: \"kubernetes.io/projected/ad04dc9a-83d0-48bc-bfdf-53a7baa57a60-kube-api-access-8xxrq\") pod \"redhat-marketplace-h7fhz\" (UID: \"ad04dc9a-83d0-48bc-bfdf-53a7baa57a60\") " pod="openshift-marketplace/redhat-marketplace-h7fhz" Dec 03 00:02:22 crc kubenswrapper[4696]: I1203 00:02:22.650838 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad04dc9a-83d0-48bc-bfdf-53a7baa57a60-catalog-content\") pod \"redhat-marketplace-h7fhz\" (UID: \"ad04dc9a-83d0-48bc-bfdf-53a7baa57a60\") " pod="openshift-marketplace/redhat-marketplace-h7fhz" Dec 03 00:02:22 crc kubenswrapper[4696]: I1203 00:02:22.650892 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad04dc9a-83d0-48bc-bfdf-53a7baa57a60-utilities\") pod \"redhat-marketplace-h7fhz\" (UID: \"ad04dc9a-83d0-48bc-bfdf-53a7baa57a60\") " pod="openshift-marketplace/redhat-marketplace-h7fhz" Dec 03 00:02:22 crc kubenswrapper[4696]: I1203 00:02:22.651520 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad04dc9a-83d0-48bc-bfdf-53a7baa57a60-utilities\") pod \"redhat-marketplace-h7fhz\" (UID: \"ad04dc9a-83d0-48bc-bfdf-53a7baa57a60\") " pod="openshift-marketplace/redhat-marketplace-h7fhz" Dec 03 00:02:22 crc kubenswrapper[4696]: I1203 00:02:22.651618 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad04dc9a-83d0-48bc-bfdf-53a7baa57a60-catalog-content\") pod \"redhat-marketplace-h7fhz\" (UID: \"ad04dc9a-83d0-48bc-bfdf-53a7baa57a60\") " pod="openshift-marketplace/redhat-marketplace-h7fhz" Dec 03 00:02:22 crc kubenswrapper[4696]: I1203 00:02:22.677675 4696 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-8xxrq\" (UniqueName: \"kubernetes.io/projected/ad04dc9a-83d0-48bc-bfdf-53a7baa57a60-kube-api-access-8xxrq\") pod \"redhat-marketplace-h7fhz\" (UID: \"ad04dc9a-83d0-48bc-bfdf-53a7baa57a60\") " pod="openshift-marketplace/redhat-marketplace-h7fhz" Dec 03 00:02:22 crc kubenswrapper[4696]: I1203 00:02:22.808681 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h7fhz" Dec 03 00:02:23 crc kubenswrapper[4696]: I1203 00:02:23.333401 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-h7fhz"] Dec 03 00:02:23 crc kubenswrapper[4696]: E1203 00:02:23.794825 4696 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad04dc9a_83d0_48bc_bfdf_53a7baa57a60.slice/crio-conmon-f86e787267707b2f12008fcf03860ff9261da7c51d8d0b023ecd38e297f66deb.scope\": RecentStats: unable to find data in memory cache]" Dec 03 00:02:23 crc kubenswrapper[4696]: I1203 00:02:23.999487 4696 generic.go:334] "Generic (PLEG): container finished" podID="ad04dc9a-83d0-48bc-bfdf-53a7baa57a60" containerID="f86e787267707b2f12008fcf03860ff9261da7c51d8d0b023ecd38e297f66deb" exitCode=0 Dec 03 00:02:23 crc kubenswrapper[4696]: I1203 00:02:23.999552 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h7fhz" event={"ID":"ad04dc9a-83d0-48bc-bfdf-53a7baa57a60","Type":"ContainerDied","Data":"f86e787267707b2f12008fcf03860ff9261da7c51d8d0b023ecd38e297f66deb"} Dec 03 00:02:24 crc kubenswrapper[4696]: I1203 00:02:23.999589 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h7fhz" event={"ID":"ad04dc9a-83d0-48bc-bfdf-53a7baa57a60","Type":"ContainerStarted","Data":"040d435d512c6ef410e7b74ae8cb6a35198979f7f14cb3239d20cea00d17c206"} Dec 03 
00:02:25 crc kubenswrapper[4696]: I1203 00:02:25.013418 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h7fhz" event={"ID":"ad04dc9a-83d0-48bc-bfdf-53a7baa57a60","Type":"ContainerStarted","Data":"30da0046d26c88d049a4524800ed5f9364cbe766711066e72e2ecbba42b6d37e"} Dec 03 00:02:26 crc kubenswrapper[4696]: I1203 00:02:26.028489 4696 generic.go:334] "Generic (PLEG): container finished" podID="ad04dc9a-83d0-48bc-bfdf-53a7baa57a60" containerID="30da0046d26c88d049a4524800ed5f9364cbe766711066e72e2ecbba42b6d37e" exitCode=0 Dec 03 00:02:26 crc kubenswrapper[4696]: I1203 00:02:26.028539 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h7fhz" event={"ID":"ad04dc9a-83d0-48bc-bfdf-53a7baa57a60","Type":"ContainerDied","Data":"30da0046d26c88d049a4524800ed5f9364cbe766711066e72e2ecbba42b6d37e"} Dec 03 00:02:27 crc kubenswrapper[4696]: I1203 00:02:27.042168 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h7fhz" event={"ID":"ad04dc9a-83d0-48bc-bfdf-53a7baa57a60","Type":"ContainerStarted","Data":"35fa8a34dcc654b8f3edaa70e380f521138e7e9729a86894c5bc35ab66fb20f2"} Dec 03 00:02:27 crc kubenswrapper[4696]: I1203 00:02:27.067308 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-h7fhz" podStartSLOduration=2.669898027 podStartE2EDuration="5.067284431s" podCreationTimestamp="2025-12-03 00:02:22 +0000 UTC" firstStartedPulling="2025-12-03 00:02:24.001998445 +0000 UTC m=+4806.882678446" lastFinishedPulling="2025-12-03 00:02:26.399384819 +0000 UTC m=+4809.280064850" observedRunningTime="2025-12-03 00:02:27.060061967 +0000 UTC m=+4809.940741968" watchObservedRunningTime="2025-12-03 00:02:27.067284431 +0000 UTC m=+4809.947964432" Dec 03 00:02:29 crc kubenswrapper[4696]: I1203 00:02:29.431929 4696 scope.go:117] "RemoveContainer" 
containerID="4e9c28618ad46c6b1bc6c5edf0e736f35d65060ade5cc08206e332e18207549b" Dec 03 00:02:29 crc kubenswrapper[4696]: E1203 00:02:29.433116 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 03 00:02:32 crc kubenswrapper[4696]: I1203 00:02:32.809731 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-h7fhz" Dec 03 00:02:32 crc kubenswrapper[4696]: I1203 00:02:32.811080 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-h7fhz" Dec 03 00:02:32 crc kubenswrapper[4696]: I1203 00:02:32.887766 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-h7fhz" Dec 03 00:02:33 crc kubenswrapper[4696]: I1203 00:02:33.179869 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-h7fhz" Dec 03 00:02:33 crc kubenswrapper[4696]: I1203 00:02:33.240148 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-h7fhz"] Dec 03 00:02:35 crc kubenswrapper[4696]: I1203 00:02:35.150374 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-h7fhz" podUID="ad04dc9a-83d0-48bc-bfdf-53a7baa57a60" containerName="registry-server" containerID="cri-o://35fa8a34dcc654b8f3edaa70e380f521138e7e9729a86894c5bc35ab66fb20f2" gracePeriod=2 Dec 03 00:02:36 crc kubenswrapper[4696]: I1203 00:02:36.128912 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h7fhz" Dec 03 00:02:36 crc kubenswrapper[4696]: I1203 00:02:36.164458 4696 generic.go:334] "Generic (PLEG): container finished" podID="ad04dc9a-83d0-48bc-bfdf-53a7baa57a60" containerID="35fa8a34dcc654b8f3edaa70e380f521138e7e9729a86894c5bc35ab66fb20f2" exitCode=0 Dec 03 00:02:36 crc kubenswrapper[4696]: I1203 00:02:36.164514 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h7fhz" event={"ID":"ad04dc9a-83d0-48bc-bfdf-53a7baa57a60","Type":"ContainerDied","Data":"35fa8a34dcc654b8f3edaa70e380f521138e7e9729a86894c5bc35ab66fb20f2"} Dec 03 00:02:36 crc kubenswrapper[4696]: I1203 00:02:36.164553 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h7fhz" Dec 03 00:02:36 crc kubenswrapper[4696]: I1203 00:02:36.164583 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h7fhz" event={"ID":"ad04dc9a-83d0-48bc-bfdf-53a7baa57a60","Type":"ContainerDied","Data":"040d435d512c6ef410e7b74ae8cb6a35198979f7f14cb3239d20cea00d17c206"} Dec 03 00:02:36 crc kubenswrapper[4696]: I1203 00:02:36.164614 4696 scope.go:117] "RemoveContainer" containerID="35fa8a34dcc654b8f3edaa70e380f521138e7e9729a86894c5bc35ab66fb20f2" Dec 03 00:02:36 crc kubenswrapper[4696]: I1203 00:02:36.197499 4696 scope.go:117] "RemoveContainer" containerID="30da0046d26c88d049a4524800ed5f9364cbe766711066e72e2ecbba42b6d37e" Dec 03 00:02:36 crc kubenswrapper[4696]: I1203 00:02:36.230629 4696 scope.go:117] "RemoveContainer" containerID="f86e787267707b2f12008fcf03860ff9261da7c51d8d0b023ecd38e297f66deb" Dec 03 00:02:36 crc kubenswrapper[4696]: I1203 00:02:36.254301 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad04dc9a-83d0-48bc-bfdf-53a7baa57a60-utilities\") pod 
\"ad04dc9a-83d0-48bc-bfdf-53a7baa57a60\" (UID: \"ad04dc9a-83d0-48bc-bfdf-53a7baa57a60\") " Dec 03 00:02:36 crc kubenswrapper[4696]: I1203 00:02:36.254689 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad04dc9a-83d0-48bc-bfdf-53a7baa57a60-catalog-content\") pod \"ad04dc9a-83d0-48bc-bfdf-53a7baa57a60\" (UID: \"ad04dc9a-83d0-48bc-bfdf-53a7baa57a60\") " Dec 03 00:02:36 crc kubenswrapper[4696]: I1203 00:02:36.254805 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8xxrq\" (UniqueName: \"kubernetes.io/projected/ad04dc9a-83d0-48bc-bfdf-53a7baa57a60-kube-api-access-8xxrq\") pod \"ad04dc9a-83d0-48bc-bfdf-53a7baa57a60\" (UID: \"ad04dc9a-83d0-48bc-bfdf-53a7baa57a60\") " Dec 03 00:02:36 crc kubenswrapper[4696]: I1203 00:02:36.255500 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad04dc9a-83d0-48bc-bfdf-53a7baa57a60-utilities" (OuterVolumeSpecName: "utilities") pod "ad04dc9a-83d0-48bc-bfdf-53a7baa57a60" (UID: "ad04dc9a-83d0-48bc-bfdf-53a7baa57a60"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 00:02:36 crc kubenswrapper[4696]: I1203 00:02:36.256123 4696 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad04dc9a-83d0-48bc-bfdf-53a7baa57a60-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 00:02:36 crc kubenswrapper[4696]: I1203 00:02:36.273110 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad04dc9a-83d0-48bc-bfdf-53a7baa57a60-kube-api-access-8xxrq" (OuterVolumeSpecName: "kube-api-access-8xxrq") pod "ad04dc9a-83d0-48bc-bfdf-53a7baa57a60" (UID: "ad04dc9a-83d0-48bc-bfdf-53a7baa57a60"). InnerVolumeSpecName "kube-api-access-8xxrq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 00:02:36 crc kubenswrapper[4696]: I1203 00:02:36.284293 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad04dc9a-83d0-48bc-bfdf-53a7baa57a60-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ad04dc9a-83d0-48bc-bfdf-53a7baa57a60" (UID: "ad04dc9a-83d0-48bc-bfdf-53a7baa57a60"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 00:02:36 crc kubenswrapper[4696]: I1203 00:02:36.286188 4696 scope.go:117] "RemoveContainer" containerID="35fa8a34dcc654b8f3edaa70e380f521138e7e9729a86894c5bc35ab66fb20f2" Dec 03 00:02:36 crc kubenswrapper[4696]: E1203 00:02:36.287127 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35fa8a34dcc654b8f3edaa70e380f521138e7e9729a86894c5bc35ab66fb20f2\": container with ID starting with 35fa8a34dcc654b8f3edaa70e380f521138e7e9729a86894c5bc35ab66fb20f2 not found: ID does not exist" containerID="35fa8a34dcc654b8f3edaa70e380f521138e7e9729a86894c5bc35ab66fb20f2" Dec 03 00:02:36 crc kubenswrapper[4696]: I1203 00:02:36.287181 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35fa8a34dcc654b8f3edaa70e380f521138e7e9729a86894c5bc35ab66fb20f2"} err="failed to get container status \"35fa8a34dcc654b8f3edaa70e380f521138e7e9729a86894c5bc35ab66fb20f2\": rpc error: code = NotFound desc = could not find container \"35fa8a34dcc654b8f3edaa70e380f521138e7e9729a86894c5bc35ab66fb20f2\": container with ID starting with 35fa8a34dcc654b8f3edaa70e380f521138e7e9729a86894c5bc35ab66fb20f2 not found: ID does not exist" Dec 03 00:02:36 crc kubenswrapper[4696]: I1203 00:02:36.287221 4696 scope.go:117] "RemoveContainer" containerID="30da0046d26c88d049a4524800ed5f9364cbe766711066e72e2ecbba42b6d37e" Dec 03 00:02:36 crc kubenswrapper[4696]: E1203 00:02:36.287872 4696 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"30da0046d26c88d049a4524800ed5f9364cbe766711066e72e2ecbba42b6d37e\": container with ID starting with 30da0046d26c88d049a4524800ed5f9364cbe766711066e72e2ecbba42b6d37e not found: ID does not exist" containerID="30da0046d26c88d049a4524800ed5f9364cbe766711066e72e2ecbba42b6d37e" Dec 03 00:02:36 crc kubenswrapper[4696]: I1203 00:02:36.287917 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30da0046d26c88d049a4524800ed5f9364cbe766711066e72e2ecbba42b6d37e"} err="failed to get container status \"30da0046d26c88d049a4524800ed5f9364cbe766711066e72e2ecbba42b6d37e\": rpc error: code = NotFound desc = could not find container \"30da0046d26c88d049a4524800ed5f9364cbe766711066e72e2ecbba42b6d37e\": container with ID starting with 30da0046d26c88d049a4524800ed5f9364cbe766711066e72e2ecbba42b6d37e not found: ID does not exist" Dec 03 00:02:36 crc kubenswrapper[4696]: I1203 00:02:36.287937 4696 scope.go:117] "RemoveContainer" containerID="f86e787267707b2f12008fcf03860ff9261da7c51d8d0b023ecd38e297f66deb" Dec 03 00:02:36 crc kubenswrapper[4696]: E1203 00:02:36.288391 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f86e787267707b2f12008fcf03860ff9261da7c51d8d0b023ecd38e297f66deb\": container with ID starting with f86e787267707b2f12008fcf03860ff9261da7c51d8d0b023ecd38e297f66deb not found: ID does not exist" containerID="f86e787267707b2f12008fcf03860ff9261da7c51d8d0b023ecd38e297f66deb" Dec 03 00:02:36 crc kubenswrapper[4696]: I1203 00:02:36.288460 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f86e787267707b2f12008fcf03860ff9261da7c51d8d0b023ecd38e297f66deb"} err="failed to get container status \"f86e787267707b2f12008fcf03860ff9261da7c51d8d0b023ecd38e297f66deb\": rpc error: code = NotFound desc = could 
not find container \"f86e787267707b2f12008fcf03860ff9261da7c51d8d0b023ecd38e297f66deb\": container with ID starting with f86e787267707b2f12008fcf03860ff9261da7c51d8d0b023ecd38e297f66deb not found: ID does not exist" Dec 03 00:02:36 crc kubenswrapper[4696]: I1203 00:02:36.358600 4696 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad04dc9a-83d0-48bc-bfdf-53a7baa57a60-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 00:02:36 crc kubenswrapper[4696]: I1203 00:02:36.358858 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8xxrq\" (UniqueName: \"kubernetes.io/projected/ad04dc9a-83d0-48bc-bfdf-53a7baa57a60-kube-api-access-8xxrq\") on node \"crc\" DevicePath \"\"" Dec 03 00:02:36 crc kubenswrapper[4696]: I1203 00:02:36.509278 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-h7fhz"] Dec 03 00:02:36 crc kubenswrapper[4696]: I1203 00:02:36.518597 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-h7fhz"] Dec 03 00:02:37 crc kubenswrapper[4696]: I1203 00:02:37.442599 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad04dc9a-83d0-48bc-bfdf-53a7baa57a60" path="/var/lib/kubelet/pods/ad04dc9a-83d0-48bc-bfdf-53a7baa57a60/volumes" Dec 03 00:02:41 crc kubenswrapper[4696]: I1203 00:02:41.432976 4696 scope.go:117] "RemoveContainer" containerID="4e9c28618ad46c6b1bc6c5edf0e736f35d65060ade5cc08206e332e18207549b" Dec 03 00:02:41 crc kubenswrapper[4696]: E1203 00:02:41.434232 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" 
podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 03 00:02:54 crc kubenswrapper[4696]: I1203 00:02:54.432264 4696 scope.go:117] "RemoveContainer" containerID="4e9c28618ad46c6b1bc6c5edf0e736f35d65060ade5cc08206e332e18207549b" Dec 03 00:02:54 crc kubenswrapper[4696]: E1203 00:02:54.435355 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 03 00:03:06 crc kubenswrapper[4696]: I1203 00:03:06.432606 4696 scope.go:117] "RemoveContainer" containerID="4e9c28618ad46c6b1bc6c5edf0e736f35d65060ade5cc08206e332e18207549b" Dec 03 00:03:06 crc kubenswrapper[4696]: E1203 00:03:06.433679 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 03 00:03:20 crc kubenswrapper[4696]: I1203 00:03:20.432005 4696 scope.go:117] "RemoveContainer" containerID="4e9c28618ad46c6b1bc6c5edf0e736f35d65060ade5cc08206e332e18207549b" Dec 03 00:03:20 crc kubenswrapper[4696]: E1203 00:03:20.433197 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 03 00:03:35 crc kubenswrapper[4696]: I1203 00:03:35.432275 4696 scope.go:117] "RemoveContainer" containerID="4e9c28618ad46c6b1bc6c5edf0e736f35d65060ade5cc08206e332e18207549b" Dec 03 00:03:36 crc kubenswrapper[4696]: I1203 00:03:36.849680 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-chq65" event={"ID":"53353260-c7c9-435c-91eb-3d5a1b441c4a","Type":"ContainerStarted","Data":"9d5f9a1c0849341a66480d386db6fcfb4622b7d47a5ae725775e9e060adb3fbf"} Dec 03 00:05:52 crc kubenswrapper[4696]: I1203 00:05:52.974268 4696 patch_prober.go:28] interesting pod/machine-config-daemon-chq65 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 00:05:52 crc kubenswrapper[4696]: I1203 00:05:52.975284 4696 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 00:06:22 crc kubenswrapper[4696]: I1203 00:06:22.974605 4696 patch_prober.go:28] interesting pod/machine-config-daemon-chq65 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 00:06:22 crc kubenswrapper[4696]: I1203 00:06:22.975633 4696 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 00:06:52 crc kubenswrapper[4696]: I1203 00:06:52.974047 4696 patch_prober.go:28] interesting pod/machine-config-daemon-chq65 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 00:06:52 crc kubenswrapper[4696]: I1203 00:06:52.974952 4696 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 00:06:52 crc kubenswrapper[4696]: I1203 00:06:52.975029 4696 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-chq65" Dec 03 00:06:52 crc kubenswrapper[4696]: I1203 00:06:52.976085 4696 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9d5f9a1c0849341a66480d386db6fcfb4622b7d47a5ae725775e9e060adb3fbf"} pod="openshift-machine-config-operator/machine-config-daemon-chq65" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 00:06:52 crc kubenswrapper[4696]: I1203 00:06:52.976185 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" containerName="machine-config-daemon" containerID="cri-o://9d5f9a1c0849341a66480d386db6fcfb4622b7d47a5ae725775e9e060adb3fbf" gracePeriod=600 Dec 03 00:06:54 crc kubenswrapper[4696]: I1203 
00:06:54.147435 4696 generic.go:334] "Generic (PLEG): container finished" podID="53353260-c7c9-435c-91eb-3d5a1b441c4a" containerID="9d5f9a1c0849341a66480d386db6fcfb4622b7d47a5ae725775e9e060adb3fbf" exitCode=0 Dec 03 00:06:54 crc kubenswrapper[4696]: I1203 00:06:54.147531 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-chq65" event={"ID":"53353260-c7c9-435c-91eb-3d5a1b441c4a","Type":"ContainerDied","Data":"9d5f9a1c0849341a66480d386db6fcfb4622b7d47a5ae725775e9e060adb3fbf"} Dec 03 00:06:54 crc kubenswrapper[4696]: I1203 00:06:54.147924 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-chq65" event={"ID":"53353260-c7c9-435c-91eb-3d5a1b441c4a","Type":"ContainerStarted","Data":"c73d4f21ed0c1de0df7ef84929a3066f92fdce038bec491807959001f24c4048"} Dec 03 00:06:54 crc kubenswrapper[4696]: I1203 00:06:54.147957 4696 scope.go:117] "RemoveContainer" containerID="4e9c28618ad46c6b1bc6c5edf0e736f35d65060ade5cc08206e332e18207549b" Dec 03 00:09:07 crc kubenswrapper[4696]: I1203 00:09:07.312354 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-zj27z"] Dec 03 00:09:07 crc kubenswrapper[4696]: E1203 00:09:07.313807 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad04dc9a-83d0-48bc-bfdf-53a7baa57a60" containerName="extract-content" Dec 03 00:09:07 crc kubenswrapper[4696]: I1203 00:09:07.313836 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad04dc9a-83d0-48bc-bfdf-53a7baa57a60" containerName="extract-content" Dec 03 00:09:07 crc kubenswrapper[4696]: E1203 00:09:07.313872 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad04dc9a-83d0-48bc-bfdf-53a7baa57a60" containerName="registry-server" Dec 03 00:09:07 crc kubenswrapper[4696]: I1203 00:09:07.313881 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad04dc9a-83d0-48bc-bfdf-53a7baa57a60" 
containerName="registry-server" Dec 03 00:09:07 crc kubenswrapper[4696]: E1203 00:09:07.313899 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad04dc9a-83d0-48bc-bfdf-53a7baa57a60" containerName="extract-utilities" Dec 03 00:09:07 crc kubenswrapper[4696]: I1203 00:09:07.313906 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad04dc9a-83d0-48bc-bfdf-53a7baa57a60" containerName="extract-utilities" Dec 03 00:09:07 crc kubenswrapper[4696]: I1203 00:09:07.314190 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad04dc9a-83d0-48bc-bfdf-53a7baa57a60" containerName="registry-server" Dec 03 00:09:07 crc kubenswrapper[4696]: I1203 00:09:07.316030 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zj27z" Dec 03 00:09:07 crc kubenswrapper[4696]: I1203 00:09:07.332312 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zj27z"] Dec 03 00:09:07 crc kubenswrapper[4696]: I1203 00:09:07.423901 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9f17db7-316c-4f33-870e-ebbb5543cf01-catalog-content\") pod \"certified-operators-zj27z\" (UID: \"f9f17db7-316c-4f33-870e-ebbb5543cf01\") " pod="openshift-marketplace/certified-operators-zj27z" Dec 03 00:09:07 crc kubenswrapper[4696]: I1203 00:09:07.424023 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9f17db7-316c-4f33-870e-ebbb5543cf01-utilities\") pod \"certified-operators-zj27z\" (UID: \"f9f17db7-316c-4f33-870e-ebbb5543cf01\") " pod="openshift-marketplace/certified-operators-zj27z" Dec 03 00:09:07 crc kubenswrapper[4696]: I1203 00:09:07.424055 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-b65f9\" (UniqueName: \"kubernetes.io/projected/f9f17db7-316c-4f33-870e-ebbb5543cf01-kube-api-access-b65f9\") pod \"certified-operators-zj27z\" (UID: \"f9f17db7-316c-4f33-870e-ebbb5543cf01\") " pod="openshift-marketplace/certified-operators-zj27z" Dec 03 00:09:07 crc kubenswrapper[4696]: I1203 00:09:07.526325 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9f17db7-316c-4f33-870e-ebbb5543cf01-catalog-content\") pod \"certified-operators-zj27z\" (UID: \"f9f17db7-316c-4f33-870e-ebbb5543cf01\") " pod="openshift-marketplace/certified-operators-zj27z" Dec 03 00:09:07 crc kubenswrapper[4696]: I1203 00:09:07.527186 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9f17db7-316c-4f33-870e-ebbb5543cf01-catalog-content\") pod \"certified-operators-zj27z\" (UID: \"f9f17db7-316c-4f33-870e-ebbb5543cf01\") " pod="openshift-marketplace/certified-operators-zj27z" Dec 03 00:09:07 crc kubenswrapper[4696]: I1203 00:09:07.527614 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9f17db7-316c-4f33-870e-ebbb5543cf01-utilities\") pod \"certified-operators-zj27z\" (UID: \"f9f17db7-316c-4f33-870e-ebbb5543cf01\") " pod="openshift-marketplace/certified-operators-zj27z" Dec 03 00:09:07 crc kubenswrapper[4696]: I1203 00:09:07.527926 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b65f9\" (UniqueName: \"kubernetes.io/projected/f9f17db7-316c-4f33-870e-ebbb5543cf01-kube-api-access-b65f9\") pod \"certified-operators-zj27z\" (UID: \"f9f17db7-316c-4f33-870e-ebbb5543cf01\") " pod="openshift-marketplace/certified-operators-zj27z" Dec 03 00:09:07 crc kubenswrapper[4696]: I1203 00:09:07.528295 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/f9f17db7-316c-4f33-870e-ebbb5543cf01-utilities\") pod \"certified-operators-zj27z\" (UID: \"f9f17db7-316c-4f33-870e-ebbb5543cf01\") " pod="openshift-marketplace/certified-operators-zj27z" Dec 03 00:09:07 crc kubenswrapper[4696]: I1203 00:09:07.554620 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b65f9\" (UniqueName: \"kubernetes.io/projected/f9f17db7-316c-4f33-870e-ebbb5543cf01-kube-api-access-b65f9\") pod \"certified-operators-zj27z\" (UID: \"f9f17db7-316c-4f33-870e-ebbb5543cf01\") " pod="openshift-marketplace/certified-operators-zj27z" Dec 03 00:09:07 crc kubenswrapper[4696]: I1203 00:09:07.654451 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zj27z" Dec 03 00:09:08 crc kubenswrapper[4696]: I1203 00:09:08.271366 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zj27z"] Dec 03 00:09:08 crc kubenswrapper[4696]: I1203 00:09:08.597465 4696 generic.go:334] "Generic (PLEG): container finished" podID="f9f17db7-316c-4f33-870e-ebbb5543cf01" containerID="846e878c3898b5159388624c95bc3cbe0f696cd7f6ffe78ed9df9e28383b7646" exitCode=0 Dec 03 00:09:08 crc kubenswrapper[4696]: I1203 00:09:08.597584 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zj27z" event={"ID":"f9f17db7-316c-4f33-870e-ebbb5543cf01","Type":"ContainerDied","Data":"846e878c3898b5159388624c95bc3cbe0f696cd7f6ffe78ed9df9e28383b7646"} Dec 03 00:09:08 crc kubenswrapper[4696]: I1203 00:09:08.597970 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zj27z" event={"ID":"f9f17db7-316c-4f33-870e-ebbb5543cf01","Type":"ContainerStarted","Data":"56e4ac84312a5111fbfcb632f09902f33ebed907f2874a8ad108a437ad65dce2"} Dec 03 00:09:08 crc kubenswrapper[4696]: I1203 00:09:08.600297 4696 provider.go:102] Refreshing cache 
for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 00:09:09 crc kubenswrapper[4696]: I1203 00:09:09.613540 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zj27z" event={"ID":"f9f17db7-316c-4f33-870e-ebbb5543cf01","Type":"ContainerStarted","Data":"03fe02c62dd4eff4e34b8b50cf66b4a0c377ac8e2113ae2a3424ff1fd687b516"} Dec 03 00:09:11 crc kubenswrapper[4696]: I1203 00:09:11.639200 4696 generic.go:334] "Generic (PLEG): container finished" podID="f9f17db7-316c-4f33-870e-ebbb5543cf01" containerID="03fe02c62dd4eff4e34b8b50cf66b4a0c377ac8e2113ae2a3424ff1fd687b516" exitCode=0 Dec 03 00:09:11 crc kubenswrapper[4696]: I1203 00:09:11.639335 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zj27z" event={"ID":"f9f17db7-316c-4f33-870e-ebbb5543cf01","Type":"ContainerDied","Data":"03fe02c62dd4eff4e34b8b50cf66b4a0c377ac8e2113ae2a3424ff1fd687b516"} Dec 03 00:09:12 crc kubenswrapper[4696]: I1203 00:09:12.653318 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zj27z" event={"ID":"f9f17db7-316c-4f33-870e-ebbb5543cf01","Type":"ContainerStarted","Data":"0065554f0686c9d684466e26954370c2107a430b9f36217c1177e5af0313aeab"} Dec 03 00:09:12 crc kubenswrapper[4696]: I1203 00:09:12.683548 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-zj27z" podStartSLOduration=2.1987379049999998 podStartE2EDuration="5.683520816s" podCreationTimestamp="2025-12-03 00:09:07 +0000 UTC" firstStartedPulling="2025-12-03 00:09:08.600011528 +0000 UTC m=+5211.480691529" lastFinishedPulling="2025-12-03 00:09:12.084794439 +0000 UTC m=+5214.965474440" observedRunningTime="2025-12-03 00:09:12.680037958 +0000 UTC m=+5215.560717959" watchObservedRunningTime="2025-12-03 00:09:12.683520816 +0000 UTC m=+5215.564200817" Dec 03 00:09:17 crc kubenswrapper[4696]: I1203 00:09:17.654954 4696 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-zj27z" Dec 03 00:09:17 crc kubenswrapper[4696]: I1203 00:09:17.655562 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-zj27z" Dec 03 00:09:17 crc kubenswrapper[4696]: I1203 00:09:17.821874 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-zj27z" Dec 03 00:09:17 crc kubenswrapper[4696]: I1203 00:09:17.888228 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-zj27z" Dec 03 00:09:18 crc kubenswrapper[4696]: I1203 00:09:18.068242 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zj27z"] Dec 03 00:09:19 crc kubenswrapper[4696]: I1203 00:09:19.739225 4696 generic.go:334] "Generic (PLEG): container finished" podID="4881d1aa-7494-45fe-b21b-5cae7bfe2f41" containerID="60700ed9991efb7d98d1cd75467245e2863222c007e4d6ba953c590fd95638f0" exitCode=0 Dec 03 00:09:19 crc kubenswrapper[4696]: I1203 00:09:19.739804 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"4881d1aa-7494-45fe-b21b-5cae7bfe2f41","Type":"ContainerDied","Data":"60700ed9991efb7d98d1cd75467245e2863222c007e4d6ba953c590fd95638f0"} Dec 03 00:09:19 crc kubenswrapper[4696]: I1203 00:09:19.740005 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-zj27z" podUID="f9f17db7-316c-4f33-870e-ebbb5543cf01" containerName="registry-server" containerID="cri-o://0065554f0686c9d684466e26954370c2107a430b9f36217c1177e5af0313aeab" gracePeriod=2 Dec 03 00:09:20 crc kubenswrapper[4696]: I1203 00:09:20.266452 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zj27z" Dec 03 00:09:20 crc kubenswrapper[4696]: I1203 00:09:20.346435 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9f17db7-316c-4f33-870e-ebbb5543cf01-catalog-content\") pod \"f9f17db7-316c-4f33-870e-ebbb5543cf01\" (UID: \"f9f17db7-316c-4f33-870e-ebbb5543cf01\") " Dec 03 00:09:20 crc kubenswrapper[4696]: I1203 00:09:20.346708 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b65f9\" (UniqueName: \"kubernetes.io/projected/f9f17db7-316c-4f33-870e-ebbb5543cf01-kube-api-access-b65f9\") pod \"f9f17db7-316c-4f33-870e-ebbb5543cf01\" (UID: \"f9f17db7-316c-4f33-870e-ebbb5543cf01\") " Dec 03 00:09:20 crc kubenswrapper[4696]: I1203 00:09:20.346845 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9f17db7-316c-4f33-870e-ebbb5543cf01-utilities\") pod \"f9f17db7-316c-4f33-870e-ebbb5543cf01\" (UID: \"f9f17db7-316c-4f33-870e-ebbb5543cf01\") " Dec 03 00:09:20 crc kubenswrapper[4696]: I1203 00:09:20.348253 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9f17db7-316c-4f33-870e-ebbb5543cf01-utilities" (OuterVolumeSpecName: "utilities") pod "f9f17db7-316c-4f33-870e-ebbb5543cf01" (UID: "f9f17db7-316c-4f33-870e-ebbb5543cf01"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 00:09:20 crc kubenswrapper[4696]: I1203 00:09:20.349296 4696 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9f17db7-316c-4f33-870e-ebbb5543cf01-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 00:09:20 crc kubenswrapper[4696]: I1203 00:09:20.361106 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9f17db7-316c-4f33-870e-ebbb5543cf01-kube-api-access-b65f9" (OuterVolumeSpecName: "kube-api-access-b65f9") pod "f9f17db7-316c-4f33-870e-ebbb5543cf01" (UID: "f9f17db7-316c-4f33-870e-ebbb5543cf01"). InnerVolumeSpecName "kube-api-access-b65f9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 00:09:20 crc kubenswrapper[4696]: I1203 00:09:20.403181 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9f17db7-316c-4f33-870e-ebbb5543cf01-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f9f17db7-316c-4f33-870e-ebbb5543cf01" (UID: "f9f17db7-316c-4f33-870e-ebbb5543cf01"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 00:09:20 crc kubenswrapper[4696]: I1203 00:09:20.451629 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b65f9\" (UniqueName: \"kubernetes.io/projected/f9f17db7-316c-4f33-870e-ebbb5543cf01-kube-api-access-b65f9\") on node \"crc\" DevicePath \"\"" Dec 03 00:09:20 crc kubenswrapper[4696]: I1203 00:09:20.451679 4696 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9f17db7-316c-4f33-870e-ebbb5543cf01-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 00:09:20 crc kubenswrapper[4696]: I1203 00:09:20.770469 4696 generic.go:334] "Generic (PLEG): container finished" podID="f9f17db7-316c-4f33-870e-ebbb5543cf01" containerID="0065554f0686c9d684466e26954370c2107a430b9f36217c1177e5af0313aeab" exitCode=0 Dec 03 00:09:20 crc kubenswrapper[4696]: I1203 00:09:20.770654 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zj27z" event={"ID":"f9f17db7-316c-4f33-870e-ebbb5543cf01","Type":"ContainerDied","Data":"0065554f0686c9d684466e26954370c2107a430b9f36217c1177e5af0313aeab"} Dec 03 00:09:20 crc kubenswrapper[4696]: I1203 00:09:20.770700 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zj27z" event={"ID":"f9f17db7-316c-4f33-870e-ebbb5543cf01","Type":"ContainerDied","Data":"56e4ac84312a5111fbfcb632f09902f33ebed907f2874a8ad108a437ad65dce2"} Dec 03 00:09:20 crc kubenswrapper[4696]: I1203 00:09:20.770726 4696 scope.go:117] "RemoveContainer" containerID="0065554f0686c9d684466e26954370c2107a430b9f36217c1177e5af0313aeab" Dec 03 00:09:20 crc kubenswrapper[4696]: I1203 00:09:20.770955 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zj27z" Dec 03 00:09:20 crc kubenswrapper[4696]: I1203 00:09:20.815457 4696 scope.go:117] "RemoveContainer" containerID="03fe02c62dd4eff4e34b8b50cf66b4a0c377ac8e2113ae2a3424ff1fd687b516" Dec 03 00:09:20 crc kubenswrapper[4696]: I1203 00:09:20.825598 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zj27z"] Dec 03 00:09:20 crc kubenswrapper[4696]: I1203 00:09:20.834266 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-zj27z"] Dec 03 00:09:20 crc kubenswrapper[4696]: I1203 00:09:20.862529 4696 scope.go:117] "RemoveContainer" containerID="846e878c3898b5159388624c95bc3cbe0f696cd7f6ffe78ed9df9e28383b7646" Dec 03 00:09:20 crc kubenswrapper[4696]: I1203 00:09:20.903953 4696 scope.go:117] "RemoveContainer" containerID="0065554f0686c9d684466e26954370c2107a430b9f36217c1177e5af0313aeab" Dec 03 00:09:20 crc kubenswrapper[4696]: E1203 00:09:20.905617 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0065554f0686c9d684466e26954370c2107a430b9f36217c1177e5af0313aeab\": container with ID starting with 0065554f0686c9d684466e26954370c2107a430b9f36217c1177e5af0313aeab not found: ID does not exist" containerID="0065554f0686c9d684466e26954370c2107a430b9f36217c1177e5af0313aeab" Dec 03 00:09:20 crc kubenswrapper[4696]: I1203 00:09:20.905659 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0065554f0686c9d684466e26954370c2107a430b9f36217c1177e5af0313aeab"} err="failed to get container status \"0065554f0686c9d684466e26954370c2107a430b9f36217c1177e5af0313aeab\": rpc error: code = NotFound desc = could not find container \"0065554f0686c9d684466e26954370c2107a430b9f36217c1177e5af0313aeab\": container with ID starting with 0065554f0686c9d684466e26954370c2107a430b9f36217c1177e5af0313aeab not 
found: ID does not exist" Dec 03 00:09:20 crc kubenswrapper[4696]: I1203 00:09:20.905688 4696 scope.go:117] "RemoveContainer" containerID="03fe02c62dd4eff4e34b8b50cf66b4a0c377ac8e2113ae2a3424ff1fd687b516" Dec 03 00:09:20 crc kubenswrapper[4696]: E1203 00:09:20.906065 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03fe02c62dd4eff4e34b8b50cf66b4a0c377ac8e2113ae2a3424ff1fd687b516\": container with ID starting with 03fe02c62dd4eff4e34b8b50cf66b4a0c377ac8e2113ae2a3424ff1fd687b516 not found: ID does not exist" containerID="03fe02c62dd4eff4e34b8b50cf66b4a0c377ac8e2113ae2a3424ff1fd687b516" Dec 03 00:09:20 crc kubenswrapper[4696]: I1203 00:09:20.906103 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03fe02c62dd4eff4e34b8b50cf66b4a0c377ac8e2113ae2a3424ff1fd687b516"} err="failed to get container status \"03fe02c62dd4eff4e34b8b50cf66b4a0c377ac8e2113ae2a3424ff1fd687b516\": rpc error: code = NotFound desc = could not find container \"03fe02c62dd4eff4e34b8b50cf66b4a0c377ac8e2113ae2a3424ff1fd687b516\": container with ID starting with 03fe02c62dd4eff4e34b8b50cf66b4a0c377ac8e2113ae2a3424ff1fd687b516 not found: ID does not exist" Dec 03 00:09:20 crc kubenswrapper[4696]: I1203 00:09:20.906124 4696 scope.go:117] "RemoveContainer" containerID="846e878c3898b5159388624c95bc3cbe0f696cd7f6ffe78ed9df9e28383b7646" Dec 03 00:09:20 crc kubenswrapper[4696]: E1203 00:09:20.906333 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"846e878c3898b5159388624c95bc3cbe0f696cd7f6ffe78ed9df9e28383b7646\": container with ID starting with 846e878c3898b5159388624c95bc3cbe0f696cd7f6ffe78ed9df9e28383b7646 not found: ID does not exist" containerID="846e878c3898b5159388624c95bc3cbe0f696cd7f6ffe78ed9df9e28383b7646" Dec 03 00:09:20 crc kubenswrapper[4696]: I1203 00:09:20.906363 4696 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"846e878c3898b5159388624c95bc3cbe0f696cd7f6ffe78ed9df9e28383b7646"} err="failed to get container status \"846e878c3898b5159388624c95bc3cbe0f696cd7f6ffe78ed9df9e28383b7646\": rpc error: code = NotFound desc = could not find container \"846e878c3898b5159388624c95bc3cbe0f696cd7f6ffe78ed9df9e28383b7646\": container with ID starting with 846e878c3898b5159388624c95bc3cbe0f696cd7f6ffe78ed9df9e28383b7646 not found: ID does not exist" Dec 03 00:09:21 crc kubenswrapper[4696]: I1203 00:09:21.220984 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 03 00:09:21 crc kubenswrapper[4696]: I1203 00:09:21.372337 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4881d1aa-7494-45fe-b21b-5cae7bfe2f41-ssh-key\") pod \"4881d1aa-7494-45fe-b21b-5cae7bfe2f41\" (UID: \"4881d1aa-7494-45fe-b21b-5cae7bfe2f41\") " Dec 03 00:09:21 crc kubenswrapper[4696]: I1203 00:09:21.372443 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4881d1aa-7494-45fe-b21b-5cae7bfe2f41-openstack-config\") pod \"4881d1aa-7494-45fe-b21b-5cae7bfe2f41\" (UID: \"4881d1aa-7494-45fe-b21b-5cae7bfe2f41\") " Dec 03 00:09:21 crc kubenswrapper[4696]: I1203 00:09:21.372624 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/4881d1aa-7494-45fe-b21b-5cae7bfe2f41-test-operator-ephemeral-temporary\") pod \"4881d1aa-7494-45fe-b21b-5cae7bfe2f41\" (UID: \"4881d1aa-7494-45fe-b21b-5cae7bfe2f41\") " Dec 03 00:09:21 crc kubenswrapper[4696]: I1203 00:09:21.372927 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: 
\"kubernetes.io/secret/4881d1aa-7494-45fe-b21b-5cae7bfe2f41-openstack-config-secret\") pod \"4881d1aa-7494-45fe-b21b-5cae7bfe2f41\" (UID: \"4881d1aa-7494-45fe-b21b-5cae7bfe2f41\") " Dec 03 00:09:21 crc kubenswrapper[4696]: I1203 00:09:21.372963 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/4881d1aa-7494-45fe-b21b-5cae7bfe2f41-test-operator-ephemeral-workdir\") pod \"4881d1aa-7494-45fe-b21b-5cae7bfe2f41\" (UID: \"4881d1aa-7494-45fe-b21b-5cae7bfe2f41\") " Dec 03 00:09:21 crc kubenswrapper[4696]: I1203 00:09:21.373018 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hls5b\" (UniqueName: \"kubernetes.io/projected/4881d1aa-7494-45fe-b21b-5cae7bfe2f41-kube-api-access-hls5b\") pod \"4881d1aa-7494-45fe-b21b-5cae7bfe2f41\" (UID: \"4881d1aa-7494-45fe-b21b-5cae7bfe2f41\") " Dec 03 00:09:21 crc kubenswrapper[4696]: I1203 00:09:21.373058 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/4881d1aa-7494-45fe-b21b-5cae7bfe2f41-ca-certs\") pod \"4881d1aa-7494-45fe-b21b-5cae7bfe2f41\" (UID: \"4881d1aa-7494-45fe-b21b-5cae7bfe2f41\") " Dec 03 00:09:21 crc kubenswrapper[4696]: I1203 00:09:21.373185 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"4881d1aa-7494-45fe-b21b-5cae7bfe2f41\" (UID: \"4881d1aa-7494-45fe-b21b-5cae7bfe2f41\") " Dec 03 00:09:21 crc kubenswrapper[4696]: I1203 00:09:21.373264 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4881d1aa-7494-45fe-b21b-5cae7bfe2f41-config-data\") pod \"4881d1aa-7494-45fe-b21b-5cae7bfe2f41\" (UID: \"4881d1aa-7494-45fe-b21b-5cae7bfe2f41\") " Dec 03 00:09:21 crc kubenswrapper[4696]: 
I1203 00:09:21.373580 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4881d1aa-7494-45fe-b21b-5cae7bfe2f41-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "4881d1aa-7494-45fe-b21b-5cae7bfe2f41" (UID: "4881d1aa-7494-45fe-b21b-5cae7bfe2f41"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 00:09:21 crc kubenswrapper[4696]: I1203 00:09:21.374095 4696 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/4881d1aa-7494-45fe-b21b-5cae7bfe2f41-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Dec 03 00:09:21 crc kubenswrapper[4696]: I1203 00:09:21.375184 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4881d1aa-7494-45fe-b21b-5cae7bfe2f41-config-data" (OuterVolumeSpecName: "config-data") pod "4881d1aa-7494-45fe-b21b-5cae7bfe2f41" (UID: "4881d1aa-7494-45fe-b21b-5cae7bfe2f41"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 00:09:21 crc kubenswrapper[4696]: I1203 00:09:21.382255 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "test-operator-logs") pod "4881d1aa-7494-45fe-b21b-5cae7bfe2f41" (UID: "4881d1aa-7494-45fe-b21b-5cae7bfe2f41"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 00:09:21 crc kubenswrapper[4696]: I1203 00:09:21.390374 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4881d1aa-7494-45fe-b21b-5cae7bfe2f41-kube-api-access-hls5b" (OuterVolumeSpecName: "kube-api-access-hls5b") pod "4881d1aa-7494-45fe-b21b-5cae7bfe2f41" (UID: "4881d1aa-7494-45fe-b21b-5cae7bfe2f41"). 
InnerVolumeSpecName "kube-api-access-hls5b". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 00:09:21 crc kubenswrapper[4696]: I1203 00:09:21.406534 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4881d1aa-7494-45fe-b21b-5cae7bfe2f41-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "4881d1aa-7494-45fe-b21b-5cae7bfe2f41" (UID: "4881d1aa-7494-45fe-b21b-5cae7bfe2f41"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 00:09:21 crc kubenswrapper[4696]: I1203 00:09:21.407493 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4881d1aa-7494-45fe-b21b-5cae7bfe2f41-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "4881d1aa-7494-45fe-b21b-5cae7bfe2f41" (UID: "4881d1aa-7494-45fe-b21b-5cae7bfe2f41"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 00:09:21 crc kubenswrapper[4696]: I1203 00:09:21.411217 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4881d1aa-7494-45fe-b21b-5cae7bfe2f41-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "4881d1aa-7494-45fe-b21b-5cae7bfe2f41" (UID: "4881d1aa-7494-45fe-b21b-5cae7bfe2f41"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 00:09:21 crc kubenswrapper[4696]: I1203 00:09:21.443050 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4881d1aa-7494-45fe-b21b-5cae7bfe2f41-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "4881d1aa-7494-45fe-b21b-5cae7bfe2f41" (UID: "4881d1aa-7494-45fe-b21b-5cae7bfe2f41"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 00:09:21 crc kubenswrapper[4696]: I1203 00:09:21.447871 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9f17db7-316c-4f33-870e-ebbb5543cf01" path="/var/lib/kubelet/pods/f9f17db7-316c-4f33-870e-ebbb5543cf01/volumes" Dec 03 00:09:21 crc kubenswrapper[4696]: I1203 00:09:21.468498 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4881d1aa-7494-45fe-b21b-5cae7bfe2f41-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "4881d1aa-7494-45fe-b21b-5cae7bfe2f41" (UID: "4881d1aa-7494-45fe-b21b-5cae7bfe2f41"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 00:09:21 crc kubenswrapper[4696]: I1203 00:09:21.476976 4696 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4881d1aa-7494-45fe-b21b-5cae7bfe2f41-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Dec 03 00:09:21 crc kubenswrapper[4696]: I1203 00:09:21.477033 4696 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/4881d1aa-7494-45fe-b21b-5cae7bfe2f41-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Dec 03 00:09:21 crc kubenswrapper[4696]: I1203 00:09:21.477045 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hls5b\" (UniqueName: \"kubernetes.io/projected/4881d1aa-7494-45fe-b21b-5cae7bfe2f41-kube-api-access-hls5b\") on node \"crc\" DevicePath \"\"" Dec 03 00:09:21 crc kubenswrapper[4696]: I1203 00:09:21.477059 4696 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/4881d1aa-7494-45fe-b21b-5cae7bfe2f41-ca-certs\") on node \"crc\" DevicePath \"\"" Dec 03 00:09:21 crc kubenswrapper[4696]: I1203 00:09:21.477124 4696 
reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Dec 03 00:09:21 crc kubenswrapper[4696]: I1203 00:09:21.477140 4696 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4881d1aa-7494-45fe-b21b-5cae7bfe2f41-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 00:09:21 crc kubenswrapper[4696]: I1203 00:09:21.477150 4696 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4881d1aa-7494-45fe-b21b-5cae7bfe2f41-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 00:09:21 crc kubenswrapper[4696]: I1203 00:09:21.477182 4696 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4881d1aa-7494-45fe-b21b-5cae7bfe2f41-openstack-config\") on node \"crc\" DevicePath \"\"" Dec 03 00:09:21 crc kubenswrapper[4696]: I1203 00:09:21.504899 4696 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Dec 03 00:09:21 crc kubenswrapper[4696]: I1203 00:09:21.579856 4696 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Dec 03 00:09:21 crc kubenswrapper[4696]: I1203 00:09:21.781846 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"4881d1aa-7494-45fe-b21b-5cae7bfe2f41","Type":"ContainerDied","Data":"e80116cf4964c80bd3f3d2af6576647888f0966a6ae74ef91fa217d9c1200079"} Dec 03 00:09:21 crc kubenswrapper[4696]: I1203 00:09:21.782570 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e80116cf4964c80bd3f3d2af6576647888f0966a6ae74ef91fa217d9c1200079" Dec 03 00:09:21 crc 
kubenswrapper[4696]: I1203 00:09:21.781858 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 03 00:09:22 crc kubenswrapper[4696]: I1203 00:09:22.973710 4696 patch_prober.go:28] interesting pod/machine-config-daemon-chq65 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 00:09:22 crc kubenswrapper[4696]: I1203 00:09:22.973893 4696 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 00:09:28 crc kubenswrapper[4696]: I1203 00:09:28.326514 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 03 00:09:28 crc kubenswrapper[4696]: E1203 00:09:28.327587 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4881d1aa-7494-45fe-b21b-5cae7bfe2f41" containerName="tempest-tests-tempest-tests-runner" Dec 03 00:09:28 crc kubenswrapper[4696]: I1203 00:09:28.327605 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="4881d1aa-7494-45fe-b21b-5cae7bfe2f41" containerName="tempest-tests-tempest-tests-runner" Dec 03 00:09:28 crc kubenswrapper[4696]: E1203 00:09:28.327622 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9f17db7-316c-4f33-870e-ebbb5543cf01" containerName="extract-utilities" Dec 03 00:09:28 crc kubenswrapper[4696]: I1203 00:09:28.327629 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9f17db7-316c-4f33-870e-ebbb5543cf01" containerName="extract-utilities" Dec 03 00:09:28 crc kubenswrapper[4696]: E1203 00:09:28.327661 4696 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9f17db7-316c-4f33-870e-ebbb5543cf01" containerName="extract-content" Dec 03 00:09:28 crc kubenswrapper[4696]: I1203 00:09:28.327707 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9f17db7-316c-4f33-870e-ebbb5543cf01" containerName="extract-content" Dec 03 00:09:28 crc kubenswrapper[4696]: E1203 00:09:28.327727 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9f17db7-316c-4f33-870e-ebbb5543cf01" containerName="registry-server" Dec 03 00:09:28 crc kubenswrapper[4696]: I1203 00:09:28.327733 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9f17db7-316c-4f33-870e-ebbb5543cf01" containerName="registry-server" Dec 03 00:09:28 crc kubenswrapper[4696]: I1203 00:09:28.332621 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9f17db7-316c-4f33-870e-ebbb5543cf01" containerName="registry-server" Dec 03 00:09:28 crc kubenswrapper[4696]: I1203 00:09:28.332721 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="4881d1aa-7494-45fe-b21b-5cae7bfe2f41" containerName="tempest-tests-tempest-tests-runner" Dec 03 00:09:28 crc kubenswrapper[4696]: I1203 00:09:28.334078 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 03 00:09:28 crc kubenswrapper[4696]: I1203 00:09:28.349001 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-w6blf" Dec 03 00:09:28 crc kubenswrapper[4696]: I1203 00:09:28.367668 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 03 00:09:28 crc kubenswrapper[4696]: I1203 00:09:28.437978 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4vjh\" (UniqueName: \"kubernetes.io/projected/26aca1cb-cb0c-4cf6-8f7d-833b8da6ed88-kube-api-access-p4vjh\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"26aca1cb-cb0c-4cf6-8f7d-833b8da6ed88\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 03 00:09:28 crc kubenswrapper[4696]: I1203 00:09:28.438147 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"26aca1cb-cb0c-4cf6-8f7d-833b8da6ed88\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 03 00:09:28 crc kubenswrapper[4696]: I1203 00:09:28.539907 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4vjh\" (UniqueName: \"kubernetes.io/projected/26aca1cb-cb0c-4cf6-8f7d-833b8da6ed88-kube-api-access-p4vjh\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"26aca1cb-cb0c-4cf6-8f7d-833b8da6ed88\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 03 00:09:28 crc kubenswrapper[4696]: I1203 00:09:28.540098 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage03-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"26aca1cb-cb0c-4cf6-8f7d-833b8da6ed88\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 03 00:09:28 crc kubenswrapper[4696]: I1203 00:09:28.540687 4696 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"26aca1cb-cb0c-4cf6-8f7d-833b8da6ed88\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 03 00:09:28 crc kubenswrapper[4696]: I1203 00:09:28.565877 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4vjh\" (UniqueName: \"kubernetes.io/projected/26aca1cb-cb0c-4cf6-8f7d-833b8da6ed88-kube-api-access-p4vjh\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"26aca1cb-cb0c-4cf6-8f7d-833b8da6ed88\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 03 00:09:28 crc kubenswrapper[4696]: I1203 00:09:28.569893 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"26aca1cb-cb0c-4cf6-8f7d-833b8da6ed88\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 03 00:09:28 crc kubenswrapper[4696]: I1203 00:09:28.673895 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 03 00:09:29 crc kubenswrapper[4696]: I1203 00:09:29.803347 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 03 00:09:29 crc kubenswrapper[4696]: I1203 00:09:29.867161 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"26aca1cb-cb0c-4cf6-8f7d-833b8da6ed88","Type":"ContainerStarted","Data":"3e6ec315d2cb356e6c9fde98d4240f8a08f9e56d59464627bd55942155ab4d90"} Dec 03 00:09:34 crc kubenswrapper[4696]: I1203 00:09:34.160401 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"26aca1cb-cb0c-4cf6-8f7d-833b8da6ed88","Type":"ContainerStarted","Data":"c800403369b6468fbf8e724fc81a95b63f400612baa9a48637a178c36d3c07c7"} Dec 03 00:09:34 crc kubenswrapper[4696]: I1203 00:09:34.186914 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=3.318112367 podStartE2EDuration="6.186886443s" podCreationTimestamp="2025-12-03 00:09:28 +0000 UTC" firstStartedPulling="2025-12-03 00:09:29.811776205 +0000 UTC m=+5232.692456206" lastFinishedPulling="2025-12-03 00:09:32.680550291 +0000 UTC m=+5235.561230282" observedRunningTime="2025-12-03 00:09:34.179622358 +0000 UTC m=+5237.060302359" watchObservedRunningTime="2025-12-03 00:09:34.186886443 +0000 UTC m=+5237.067566444" Dec 03 00:09:52 crc kubenswrapper[4696]: I1203 00:09:52.973643 4696 patch_prober.go:28] interesting pod/machine-config-daemon-chq65 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 00:09:52 crc 
kubenswrapper[4696]: I1203 00:09:52.974504 4696 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 00:10:00 crc kubenswrapper[4696]: I1203 00:10:00.486586 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-6s7rp/must-gather-9r5nt"] Dec 03 00:10:00 crc kubenswrapper[4696]: I1203 00:10:00.490331 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6s7rp/must-gather-9r5nt" Dec 03 00:10:00 crc kubenswrapper[4696]: I1203 00:10:00.493156 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-6s7rp"/"openshift-service-ca.crt" Dec 03 00:10:00 crc kubenswrapper[4696]: I1203 00:10:00.493761 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-6s7rp"/"kube-root-ca.crt" Dec 03 00:10:00 crc kubenswrapper[4696]: I1203 00:10:00.495444 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-6s7rp"/"default-dockercfg-pbcwn" Dec 03 00:10:00 crc kubenswrapper[4696]: I1203 00:10:00.509672 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6s7rp/must-gather-9r5nt"] Dec 03 00:10:00 crc kubenswrapper[4696]: I1203 00:10:00.558597 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zb9wq\" (UniqueName: \"kubernetes.io/projected/f800a297-5f59-4fae-9a3a-b326cc8a29e4-kube-api-access-zb9wq\") pod \"must-gather-9r5nt\" (UID: \"f800a297-5f59-4fae-9a3a-b326cc8a29e4\") " pod="openshift-must-gather-6s7rp/must-gather-9r5nt" Dec 03 00:10:00 crc kubenswrapper[4696]: I1203 00:10:00.560082 4696 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f800a297-5f59-4fae-9a3a-b326cc8a29e4-must-gather-output\") pod \"must-gather-9r5nt\" (UID: \"f800a297-5f59-4fae-9a3a-b326cc8a29e4\") " pod="openshift-must-gather-6s7rp/must-gather-9r5nt" Dec 03 00:10:00 crc kubenswrapper[4696]: I1203 00:10:00.662696 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zb9wq\" (UniqueName: \"kubernetes.io/projected/f800a297-5f59-4fae-9a3a-b326cc8a29e4-kube-api-access-zb9wq\") pod \"must-gather-9r5nt\" (UID: \"f800a297-5f59-4fae-9a3a-b326cc8a29e4\") " pod="openshift-must-gather-6s7rp/must-gather-9r5nt" Dec 03 00:10:00 crc kubenswrapper[4696]: I1203 00:10:00.662900 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f800a297-5f59-4fae-9a3a-b326cc8a29e4-must-gather-output\") pod \"must-gather-9r5nt\" (UID: \"f800a297-5f59-4fae-9a3a-b326cc8a29e4\") " pod="openshift-must-gather-6s7rp/must-gather-9r5nt" Dec 03 00:10:00 crc kubenswrapper[4696]: I1203 00:10:00.663382 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f800a297-5f59-4fae-9a3a-b326cc8a29e4-must-gather-output\") pod \"must-gather-9r5nt\" (UID: \"f800a297-5f59-4fae-9a3a-b326cc8a29e4\") " pod="openshift-must-gather-6s7rp/must-gather-9r5nt" Dec 03 00:10:00 crc kubenswrapper[4696]: I1203 00:10:00.682820 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zb9wq\" (UniqueName: \"kubernetes.io/projected/f800a297-5f59-4fae-9a3a-b326cc8a29e4-kube-api-access-zb9wq\") pod \"must-gather-9r5nt\" (UID: \"f800a297-5f59-4fae-9a3a-b326cc8a29e4\") " pod="openshift-must-gather-6s7rp/must-gather-9r5nt" Dec 03 00:10:00 crc kubenswrapper[4696]: I1203 00:10:00.818072 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6s7rp/must-gather-9r5nt" Dec 03 00:10:01 crc kubenswrapper[4696]: I1203 00:10:01.302683 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6s7rp/must-gather-9r5nt"] Dec 03 00:10:01 crc kubenswrapper[4696]: I1203 00:10:01.448356 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6s7rp/must-gather-9r5nt" event={"ID":"f800a297-5f59-4fae-9a3a-b326cc8a29e4","Type":"ContainerStarted","Data":"664d065d70561d5e8077ac91b2934bc5b0ae6ff468c9cea3f3738efac3cfa92f"} Dec 03 00:10:11 crc kubenswrapper[4696]: I1203 00:10:11.565782 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6s7rp/must-gather-9r5nt" event={"ID":"f800a297-5f59-4fae-9a3a-b326cc8a29e4","Type":"ContainerStarted","Data":"73126af098db408bc4653e5b46bc14137de0de5e06d40c579638455c34dd4108"} Dec 03 00:10:11 crc kubenswrapper[4696]: I1203 00:10:11.566673 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6s7rp/must-gather-9r5nt" event={"ID":"f800a297-5f59-4fae-9a3a-b326cc8a29e4","Type":"ContainerStarted","Data":"ad75b1bf28574cae626e24eddad4d8c5cb2b282253797069ab512a4193416116"} Dec 03 00:10:11 crc kubenswrapper[4696]: I1203 00:10:11.590047 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-6s7rp/must-gather-9r5nt" podStartSLOduration=2.451040656 podStartE2EDuration="11.590019978s" podCreationTimestamp="2025-12-03 00:10:00 +0000 UTC" firstStartedPulling="2025-12-03 00:10:01.308983948 +0000 UTC m=+5264.189663949" lastFinishedPulling="2025-12-03 00:10:10.44796327 +0000 UTC m=+5273.328643271" observedRunningTime="2025-12-03 00:10:11.583293127 +0000 UTC m=+5274.463973128" watchObservedRunningTime="2025-12-03 00:10:11.590019978 +0000 UTC m=+5274.470699979" Dec 03 00:10:15 crc kubenswrapper[4696]: I1203 00:10:15.202920 4696 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-must-gather-6s7rp/crc-debug-jbstx"] Dec 03 00:10:15 crc kubenswrapper[4696]: I1203 00:10:15.205591 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6s7rp/crc-debug-jbstx" Dec 03 00:10:15 crc kubenswrapper[4696]: I1203 00:10:15.317844 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xl2zv\" (UniqueName: \"kubernetes.io/projected/9c9cb066-72fe-4ac0-9bef-e4a8c37d6539-kube-api-access-xl2zv\") pod \"crc-debug-jbstx\" (UID: \"9c9cb066-72fe-4ac0-9bef-e4a8c37d6539\") " pod="openshift-must-gather-6s7rp/crc-debug-jbstx" Dec 03 00:10:15 crc kubenswrapper[4696]: I1203 00:10:15.317899 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9c9cb066-72fe-4ac0-9bef-e4a8c37d6539-host\") pod \"crc-debug-jbstx\" (UID: \"9c9cb066-72fe-4ac0-9bef-e4a8c37d6539\") " pod="openshift-must-gather-6s7rp/crc-debug-jbstx" Dec 03 00:10:15 crc kubenswrapper[4696]: I1203 00:10:15.420207 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xl2zv\" (UniqueName: \"kubernetes.io/projected/9c9cb066-72fe-4ac0-9bef-e4a8c37d6539-kube-api-access-xl2zv\") pod \"crc-debug-jbstx\" (UID: \"9c9cb066-72fe-4ac0-9bef-e4a8c37d6539\") " pod="openshift-must-gather-6s7rp/crc-debug-jbstx" Dec 03 00:10:15 crc kubenswrapper[4696]: I1203 00:10:15.420929 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9c9cb066-72fe-4ac0-9bef-e4a8c37d6539-host\") pod \"crc-debug-jbstx\" (UID: \"9c9cb066-72fe-4ac0-9bef-e4a8c37d6539\") " pod="openshift-must-gather-6s7rp/crc-debug-jbstx" Dec 03 00:10:15 crc kubenswrapper[4696]: I1203 00:10:15.421021 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/9c9cb066-72fe-4ac0-9bef-e4a8c37d6539-host\") pod \"crc-debug-jbstx\" (UID: \"9c9cb066-72fe-4ac0-9bef-e4a8c37d6539\") " pod="openshift-must-gather-6s7rp/crc-debug-jbstx" Dec 03 00:10:15 crc kubenswrapper[4696]: I1203 00:10:15.442051 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xl2zv\" (UniqueName: \"kubernetes.io/projected/9c9cb066-72fe-4ac0-9bef-e4a8c37d6539-kube-api-access-xl2zv\") pod \"crc-debug-jbstx\" (UID: \"9c9cb066-72fe-4ac0-9bef-e4a8c37d6539\") " pod="openshift-must-gather-6s7rp/crc-debug-jbstx" Dec 03 00:10:15 crc kubenswrapper[4696]: I1203 00:10:15.538717 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6s7rp/crc-debug-jbstx" Dec 03 00:10:15 crc kubenswrapper[4696]: I1203 00:10:15.611362 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6s7rp/crc-debug-jbstx" event={"ID":"9c9cb066-72fe-4ac0-9bef-e4a8c37d6539","Type":"ContainerStarted","Data":"24f0eb92a56289aac6d202875047c8940e351d0339934d03a71fe4243b3221a5"} Dec 03 00:10:22 crc kubenswrapper[4696]: I1203 00:10:22.973966 4696 patch_prober.go:28] interesting pod/machine-config-daemon-chq65 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 00:10:22 crc kubenswrapper[4696]: I1203 00:10:22.974764 4696 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 00:10:22 crc kubenswrapper[4696]: I1203 00:10:22.974827 4696 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-chq65" Dec 03 00:10:22 crc kubenswrapper[4696]: I1203 00:10:22.975882 4696 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c73d4f21ed0c1de0df7ef84929a3066f92fdce038bec491807959001f24c4048"} pod="openshift-machine-config-operator/machine-config-daemon-chq65" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 00:10:22 crc kubenswrapper[4696]: I1203 00:10:22.975942 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" containerName="machine-config-daemon" containerID="cri-o://c73d4f21ed0c1de0df7ef84929a3066f92fdce038bec491807959001f24c4048" gracePeriod=600 Dec 03 00:10:23 crc kubenswrapper[4696]: I1203 00:10:23.714475 4696 generic.go:334] "Generic (PLEG): container finished" podID="53353260-c7c9-435c-91eb-3d5a1b441c4a" containerID="c73d4f21ed0c1de0df7ef84929a3066f92fdce038bec491807959001f24c4048" exitCode=0 Dec 03 00:10:23 crc kubenswrapper[4696]: I1203 00:10:23.714559 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-chq65" event={"ID":"53353260-c7c9-435c-91eb-3d5a1b441c4a","Type":"ContainerDied","Data":"c73d4f21ed0c1de0df7ef84929a3066f92fdce038bec491807959001f24c4048"} Dec 03 00:10:23 crc kubenswrapper[4696]: I1203 00:10:23.715097 4696 scope.go:117] "RemoveContainer" containerID="9d5f9a1c0849341a66480d386db6fcfb4622b7d47a5ae725775e9e060adb3fbf" Dec 03 00:10:26 crc kubenswrapper[4696]: E1203 00:10:26.763329 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 03 00:10:27 crc kubenswrapper[4696]: I1203 00:10:27.803802 4696 scope.go:117] "RemoveContainer" containerID="c73d4f21ed0c1de0df7ef84929a3066f92fdce038bec491807959001f24c4048" Dec 03 00:10:27 crc kubenswrapper[4696]: E1203 00:10:27.805501 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 03 00:10:31 crc kubenswrapper[4696]: I1203 00:10:31.844488 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6s7rp/crc-debug-jbstx" event={"ID":"9c9cb066-72fe-4ac0-9bef-e4a8c37d6539","Type":"ContainerStarted","Data":"2b3551ec7ec099667aa393d2049da789b67c5101d1bfb9550b63a3419842db66"} Dec 03 00:10:32 crc kubenswrapper[4696]: I1203 00:10:32.891625 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-6s7rp/crc-debug-jbstx" podStartSLOduration=4.159562701 podStartE2EDuration="17.891600285s" podCreationTimestamp="2025-12-03 00:10:15 +0000 UTC" firstStartedPulling="2025-12-03 00:10:15.580755621 +0000 UTC m=+5278.461435622" lastFinishedPulling="2025-12-03 00:10:29.312793205 +0000 UTC m=+5292.193473206" observedRunningTime="2025-12-03 00:10:32.881797118 +0000 UTC m=+5295.762477119" watchObservedRunningTime="2025-12-03 00:10:32.891600285 +0000 UTC m=+5295.772280286" Dec 03 00:10:40 crc kubenswrapper[4696]: I1203 00:10:40.431485 4696 scope.go:117] "RemoveContainer" 
containerID="c73d4f21ed0c1de0df7ef84929a3066f92fdce038bec491807959001f24c4048"
Dec 03 00:10:40 crc kubenswrapper[4696]: E1203 00:10:40.432442 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a"
Dec 03 00:10:55 crc kubenswrapper[4696]: I1203 00:10:55.433200 4696 scope.go:117] "RemoveContainer" containerID="c73d4f21ed0c1de0df7ef84929a3066f92fdce038bec491807959001f24c4048"
Dec 03 00:10:55 crc kubenswrapper[4696]: E1203 00:10:55.434342 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a"
Dec 03 00:11:07 crc kubenswrapper[4696]: I1203 00:11:07.440595 4696 scope.go:117] "RemoveContainer" containerID="c73d4f21ed0c1de0df7ef84929a3066f92fdce038bec491807959001f24c4048"
Dec 03 00:11:07 crc kubenswrapper[4696]: E1203 00:11:07.441568 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a"
Dec 03 00:11:12 crc kubenswrapper[4696]: I1203 00:11:12.750327 4696
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-5bkfl"]
Dec 03 00:11:12 crc kubenswrapper[4696]: I1203 00:11:12.753611 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5bkfl"
Dec 03 00:11:12 crc kubenswrapper[4696]: I1203 00:11:12.769048 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5bkfl"]
Dec 03 00:11:12 crc kubenswrapper[4696]: I1203 00:11:12.847106 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/721c91fd-5d68-4627-b723-34591ff2e704-utilities\") pod \"community-operators-5bkfl\" (UID: \"721c91fd-5d68-4627-b723-34591ff2e704\") " pod="openshift-marketplace/community-operators-5bkfl"
Dec 03 00:11:12 crc kubenswrapper[4696]: I1203 00:11:12.847191 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6nf9\" (UniqueName: \"kubernetes.io/projected/721c91fd-5d68-4627-b723-34591ff2e704-kube-api-access-v6nf9\") pod \"community-operators-5bkfl\" (UID: \"721c91fd-5d68-4627-b723-34591ff2e704\") " pod="openshift-marketplace/community-operators-5bkfl"
Dec 03 00:11:12 crc kubenswrapper[4696]: I1203 00:11:12.847678 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/721c91fd-5d68-4627-b723-34591ff2e704-catalog-content\") pod \"community-operators-5bkfl\" (UID: \"721c91fd-5d68-4627-b723-34591ff2e704\") " pod="openshift-marketplace/community-operators-5bkfl"
Dec 03 00:11:12 crc kubenswrapper[4696]: I1203 00:11:12.950086 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6nf9\" (UniqueName: \"kubernetes.io/projected/721c91fd-5d68-4627-b723-34591ff2e704-kube-api-access-v6nf9\") pod
\"community-operators-5bkfl\" (UID: \"721c91fd-5d68-4627-b723-34591ff2e704\") " pod="openshift-marketplace/community-operators-5bkfl"
Dec 03 00:11:12 crc kubenswrapper[4696]: I1203 00:11:12.950226 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/721c91fd-5d68-4627-b723-34591ff2e704-catalog-content\") pod \"community-operators-5bkfl\" (UID: \"721c91fd-5d68-4627-b723-34591ff2e704\") " pod="openshift-marketplace/community-operators-5bkfl"
Dec 03 00:11:12 crc kubenswrapper[4696]: I1203 00:11:12.950307 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/721c91fd-5d68-4627-b723-34591ff2e704-utilities\") pod \"community-operators-5bkfl\" (UID: \"721c91fd-5d68-4627-b723-34591ff2e704\") " pod="openshift-marketplace/community-operators-5bkfl"
Dec 03 00:11:12 crc kubenswrapper[4696]: I1203 00:11:12.950828 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/721c91fd-5d68-4627-b723-34591ff2e704-utilities\") pod \"community-operators-5bkfl\" (UID: \"721c91fd-5d68-4627-b723-34591ff2e704\") " pod="openshift-marketplace/community-operators-5bkfl"
Dec 03 00:11:12 crc kubenswrapper[4696]: I1203 00:11:12.951423 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/721c91fd-5d68-4627-b723-34591ff2e704-catalog-content\") pod \"community-operators-5bkfl\" (UID: \"721c91fd-5d68-4627-b723-34591ff2e704\") " pod="openshift-marketplace/community-operators-5bkfl"
Dec 03 00:11:12 crc kubenswrapper[4696]: I1203 00:11:12.983934 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6nf9\" (UniqueName: \"kubernetes.io/projected/721c91fd-5d68-4627-b723-34591ff2e704-kube-api-access-v6nf9\") pod \"community-operators-5bkfl\" (UID:
\"721c91fd-5d68-4627-b723-34591ff2e704\") " pod="openshift-marketplace/community-operators-5bkfl"
Dec 03 00:11:13 crc kubenswrapper[4696]: I1203 00:11:13.095111 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5bkfl"
Dec 03 00:11:13 crc kubenswrapper[4696]: I1203 00:11:13.697986 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5bkfl"]
Dec 03 00:11:14 crc kubenswrapper[4696]: I1203 00:11:14.346210 4696 generic.go:334] "Generic (PLEG): container finished" podID="721c91fd-5d68-4627-b723-34591ff2e704" containerID="c5b81d90daf2163443178f1dc9a1d6ba558a9019121d974af605d266946ccb25" exitCode=0
Dec 03 00:11:14 crc kubenswrapper[4696]: I1203 00:11:14.346296 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5bkfl" event={"ID":"721c91fd-5d68-4627-b723-34591ff2e704","Type":"ContainerDied","Data":"c5b81d90daf2163443178f1dc9a1d6ba558a9019121d974af605d266946ccb25"}
Dec 03 00:11:14 crc kubenswrapper[4696]: I1203 00:11:14.346653 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5bkfl" event={"ID":"721c91fd-5d68-4627-b723-34591ff2e704","Type":"ContainerStarted","Data":"db67b555488f50c57c1afa4d5cb1654b94c71c46b642b22c9af68514871865f3"}
Dec 03 00:11:16 crc kubenswrapper[4696]: I1203 00:11:16.376486 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5bkfl" event={"ID":"721c91fd-5d68-4627-b723-34591ff2e704","Type":"ContainerStarted","Data":"c69a0f128f1c713fe95cebf493632965fcddbea86de1999d223fa3f9f68d7040"}
Dec 03 00:11:20 crc kubenswrapper[4696]: I1203 00:11:20.418241 4696 generic.go:334] "Generic (PLEG): container finished" podID="721c91fd-5d68-4627-b723-34591ff2e704" containerID="c69a0f128f1c713fe95cebf493632965fcddbea86de1999d223fa3f9f68d7040" exitCode=0
Dec 03 00:11:20 crc kubenswrapper[4696]: I1203
00:11:20.418489 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5bkfl" event={"ID":"721c91fd-5d68-4627-b723-34591ff2e704","Type":"ContainerDied","Data":"c69a0f128f1c713fe95cebf493632965fcddbea86de1999d223fa3f9f68d7040"}
Dec 03 00:11:21 crc kubenswrapper[4696]: I1203 00:11:21.431248 4696 scope.go:117] "RemoveContainer" containerID="c73d4f21ed0c1de0df7ef84929a3066f92fdce038bec491807959001f24c4048"
Dec 03 00:11:21 crc kubenswrapper[4696]: E1203 00:11:21.431825 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a"
Dec 03 00:11:21 crc kubenswrapper[4696]: I1203 00:11:21.441484 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5bkfl" event={"ID":"721c91fd-5d68-4627-b723-34591ff2e704","Type":"ContainerStarted","Data":"3e3cf2a1e2d758507a74abeb18070ce41f70f2128f91af4826c64ca9de99dedc"}
Dec 03 00:11:21 crc kubenswrapper[4696]: I1203 00:11:21.474030 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-5bkfl" podStartSLOduration=2.820328721 podStartE2EDuration="9.474006336s" podCreationTimestamp="2025-12-03 00:11:12 +0000 UTC" firstStartedPulling="2025-12-03 00:11:14.348411951 +0000 UTC m=+5337.229091952" lastFinishedPulling="2025-12-03 00:11:21.002089556 +0000 UTC m=+5343.882769567" observedRunningTime="2025-12-03 00:11:21.464333002 +0000 UTC m=+5344.345012993" watchObservedRunningTime="2025-12-03 00:11:21.474006336 +0000 UTC m=+5344.354686327"
Dec 03 00:11:23 crc kubenswrapper[4696]: I1203 00:11:23.095330 4696 kubelet.go:2542] "SyncLoop
(probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-5bkfl"
Dec 03 00:11:23 crc kubenswrapper[4696]: I1203 00:11:23.095681 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-5bkfl"
Dec 03 00:11:24 crc kubenswrapper[4696]: I1203 00:11:24.147768 4696 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-5bkfl" podUID="721c91fd-5d68-4627-b723-34591ff2e704" containerName="registry-server" probeResult="failure" output=<
Dec 03 00:11:24 crc kubenswrapper[4696]: timeout: failed to connect service ":50051" within 1s
Dec 03 00:11:24 crc kubenswrapper[4696]: >
Dec 03 00:11:27 crc kubenswrapper[4696]: I1203 00:11:27.508299 4696 generic.go:334] "Generic (PLEG): container finished" podID="9c9cb066-72fe-4ac0-9bef-e4a8c37d6539" containerID="2b3551ec7ec099667aa393d2049da789b67c5101d1bfb9550b63a3419842db66" exitCode=0
Dec 03 00:11:27 crc kubenswrapper[4696]: I1203 00:11:27.508395 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6s7rp/crc-debug-jbstx" event={"ID":"9c9cb066-72fe-4ac0-9bef-e4a8c37d6539","Type":"ContainerDied","Data":"2b3551ec7ec099667aa393d2049da789b67c5101d1bfb9550b63a3419842db66"}
Dec 03 00:11:28 crc kubenswrapper[4696]: I1203 00:11:28.633292 4696 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-must-gather-6s7rp/crc-debug-jbstx"
Dec 03 00:11:28 crc kubenswrapper[4696]: I1203 00:11:28.676000 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-6s7rp/crc-debug-jbstx"]
Dec 03 00:11:28 crc kubenswrapper[4696]: I1203 00:11:28.687154 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-6s7rp/crc-debug-jbstx"]
Dec 03 00:11:28 crc kubenswrapper[4696]: I1203 00:11:28.834487 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9c9cb066-72fe-4ac0-9bef-e4a8c37d6539-host\") pod \"9c9cb066-72fe-4ac0-9bef-e4a8c37d6539\" (UID: \"9c9cb066-72fe-4ac0-9bef-e4a8c37d6539\") "
Dec 03 00:11:28 crc kubenswrapper[4696]: I1203 00:11:28.834648 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xl2zv\" (UniqueName: \"kubernetes.io/projected/9c9cb066-72fe-4ac0-9bef-e4a8c37d6539-kube-api-access-xl2zv\") pod \"9c9cb066-72fe-4ac0-9bef-e4a8c37d6539\" (UID: \"9c9cb066-72fe-4ac0-9bef-e4a8c37d6539\") "
Dec 03 00:11:28 crc kubenswrapper[4696]: I1203 00:11:28.834771 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9c9cb066-72fe-4ac0-9bef-e4a8c37d6539-host" (OuterVolumeSpecName: "host") pod "9c9cb066-72fe-4ac0-9bef-e4a8c37d6539" (UID: "9c9cb066-72fe-4ac0-9bef-e4a8c37d6539"). InnerVolumeSpecName "host".
PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 00:11:28 crc kubenswrapper[4696]: I1203 00:11:28.835265 4696 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9c9cb066-72fe-4ac0-9bef-e4a8c37d6539-host\") on node \"crc\" DevicePath \"\""
Dec 03 00:11:28 crc kubenswrapper[4696]: I1203 00:11:28.842052 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c9cb066-72fe-4ac0-9bef-e4a8c37d6539-kube-api-access-xl2zv" (OuterVolumeSpecName: "kube-api-access-xl2zv") pod "9c9cb066-72fe-4ac0-9bef-e4a8c37d6539" (UID: "9c9cb066-72fe-4ac0-9bef-e4a8c37d6539"). InnerVolumeSpecName "kube-api-access-xl2zv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 00:11:28 crc kubenswrapper[4696]: I1203 00:11:28.936857 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xl2zv\" (UniqueName: \"kubernetes.io/projected/9c9cb066-72fe-4ac0-9bef-e4a8c37d6539-kube-api-access-xl2zv\") on node \"crc\" DevicePath \"\""
Dec 03 00:11:29 crc kubenswrapper[4696]: I1203 00:11:29.446118 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c9cb066-72fe-4ac0-9bef-e4a8c37d6539" path="/var/lib/kubelet/pods/9c9cb066-72fe-4ac0-9bef-e4a8c37d6539/volumes"
Dec 03 00:11:29 crc kubenswrapper[4696]: I1203 00:11:29.534884 4696 scope.go:117] "RemoveContainer" containerID="2b3551ec7ec099667aa393d2049da789b67c5101d1bfb9550b63a3419842db66"
Dec 03 00:11:29 crc kubenswrapper[4696]: I1203 00:11:29.534947 4696 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-must-gather-6s7rp/crc-debug-jbstx"
Dec 03 00:11:29 crc kubenswrapper[4696]: I1203 00:11:29.878220 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-6s7rp/crc-debug-xqchg"]
Dec 03 00:11:29 crc kubenswrapper[4696]: E1203 00:11:29.878728 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c9cb066-72fe-4ac0-9bef-e4a8c37d6539" containerName="container-00"
Dec 03 00:11:29 crc kubenswrapper[4696]: I1203 00:11:29.878763 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c9cb066-72fe-4ac0-9bef-e4a8c37d6539" containerName="container-00"
Dec 03 00:11:29 crc kubenswrapper[4696]: I1203 00:11:29.879036 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c9cb066-72fe-4ac0-9bef-e4a8c37d6539" containerName="container-00"
Dec 03 00:11:29 crc kubenswrapper[4696]: I1203 00:11:29.879793 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6s7rp/crc-debug-xqchg"
Dec 03 00:11:29 crc kubenswrapper[4696]: I1203 00:11:29.983929 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/22330be3-9693-4a9d-90e9-7fce4210d47b-host\") pod \"crc-debug-xqchg\" (UID: \"22330be3-9693-4a9d-90e9-7fce4210d47b\") " pod="openshift-must-gather-6s7rp/crc-debug-xqchg"
Dec 03 00:11:29 crc kubenswrapper[4696]: I1203 00:11:29.986133 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vsqkp\" (UniqueName: \"kubernetes.io/projected/22330be3-9693-4a9d-90e9-7fce4210d47b-kube-api-access-vsqkp\") pod \"crc-debug-xqchg\" (UID: \"22330be3-9693-4a9d-90e9-7fce4210d47b\") " pod="openshift-must-gather-6s7rp/crc-debug-xqchg"
Dec 03 00:11:30 crc kubenswrapper[4696]: I1203 00:11:30.094299 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vsqkp\" (UniqueName:
\"kubernetes.io/projected/22330be3-9693-4a9d-90e9-7fce4210d47b-kube-api-access-vsqkp\") pod \"crc-debug-xqchg\" (UID: \"22330be3-9693-4a9d-90e9-7fce4210d47b\") " pod="openshift-must-gather-6s7rp/crc-debug-xqchg"
Dec 03 00:11:30 crc kubenswrapper[4696]: I1203 00:11:30.094444 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/22330be3-9693-4a9d-90e9-7fce4210d47b-host\") pod \"crc-debug-xqchg\" (UID: \"22330be3-9693-4a9d-90e9-7fce4210d47b\") " pod="openshift-must-gather-6s7rp/crc-debug-xqchg"
Dec 03 00:11:30 crc kubenswrapper[4696]: I1203 00:11:30.094574 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/22330be3-9693-4a9d-90e9-7fce4210d47b-host\") pod \"crc-debug-xqchg\" (UID: \"22330be3-9693-4a9d-90e9-7fce4210d47b\") " pod="openshift-must-gather-6s7rp/crc-debug-xqchg"
Dec 03 00:11:30 crc kubenswrapper[4696]: I1203 00:11:30.127719 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vsqkp\" (UniqueName: \"kubernetes.io/projected/22330be3-9693-4a9d-90e9-7fce4210d47b-kube-api-access-vsqkp\") pod \"crc-debug-xqchg\" (UID: \"22330be3-9693-4a9d-90e9-7fce4210d47b\") " pod="openshift-must-gather-6s7rp/crc-debug-xqchg"
Dec 03 00:11:30 crc kubenswrapper[4696]: I1203 00:11:30.201945 4696 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-must-gather-6s7rp/crc-debug-xqchg"
Dec 03 00:11:30 crc kubenswrapper[4696]: I1203 00:11:30.552133 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6s7rp/crc-debug-xqchg" event={"ID":"22330be3-9693-4a9d-90e9-7fce4210d47b","Type":"ContainerStarted","Data":"720a12430aa7ec09e3e7ece4379704a8607d4a03fcd5a6572a84f519ae961b2b"}
Dec 03 00:11:30 crc kubenswrapper[4696]: I1203 00:11:30.552572 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6s7rp/crc-debug-xqchg" event={"ID":"22330be3-9693-4a9d-90e9-7fce4210d47b","Type":"ContainerStarted","Data":"881f3c6112c6e0758e612ff77afc7903c7235cd012c481d0738f90bf9ac555d7"}
Dec 03 00:11:30 crc kubenswrapper[4696]: I1203 00:11:30.575247 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-6s7rp/crc-debug-xqchg" podStartSLOduration=1.575219219 podStartE2EDuration="1.575219219s" podCreationTimestamp="2025-12-03 00:11:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 00:11:30.570686151 +0000 UTC m=+5353.451366152" watchObservedRunningTime="2025-12-03 00:11:30.575219219 +0000 UTC m=+5353.455899210"
Dec 03 00:11:31 crc kubenswrapper[4696]: I1203 00:11:31.570295 4696 generic.go:334] "Generic (PLEG): container finished" podID="22330be3-9693-4a9d-90e9-7fce4210d47b" containerID="720a12430aa7ec09e3e7ece4379704a8607d4a03fcd5a6572a84f519ae961b2b" exitCode=0
Dec 03 00:11:31 crc kubenswrapper[4696]: I1203 00:11:31.570362 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6s7rp/crc-debug-xqchg" event={"ID":"22330be3-9693-4a9d-90e9-7fce4210d47b","Type":"ContainerDied","Data":"720a12430aa7ec09e3e7ece4379704a8607d4a03fcd5a6572a84f519ae961b2b"}
Dec 03 00:11:32 crc kubenswrapper[4696]: I1203 00:11:32.693330 4696 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-must-gather-6s7rp/crc-debug-xqchg"
Dec 03 00:11:32 crc kubenswrapper[4696]: I1203 00:11:32.740677 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/22330be3-9693-4a9d-90e9-7fce4210d47b-host\") pod \"22330be3-9693-4a9d-90e9-7fce4210d47b\" (UID: \"22330be3-9693-4a9d-90e9-7fce4210d47b\") "
Dec 03 00:11:32 crc kubenswrapper[4696]: I1203 00:11:32.740851 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vsqkp\" (UniqueName: \"kubernetes.io/projected/22330be3-9693-4a9d-90e9-7fce4210d47b-kube-api-access-vsqkp\") pod \"22330be3-9693-4a9d-90e9-7fce4210d47b\" (UID: \"22330be3-9693-4a9d-90e9-7fce4210d47b\") "
Dec 03 00:11:32 crc kubenswrapper[4696]: I1203 00:11:32.740909 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/22330be3-9693-4a9d-90e9-7fce4210d47b-host" (OuterVolumeSpecName: "host") pod "22330be3-9693-4a9d-90e9-7fce4210d47b" (UID: "22330be3-9693-4a9d-90e9-7fce4210d47b"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 00:11:32 crc kubenswrapper[4696]: I1203 00:11:32.741486 4696 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/22330be3-9693-4a9d-90e9-7fce4210d47b-host\") on node \"crc\" DevicePath \"\""
Dec 03 00:11:32 crc kubenswrapper[4696]: I1203 00:11:32.752915 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22330be3-9693-4a9d-90e9-7fce4210d47b-kube-api-access-vsqkp" (OuterVolumeSpecName: "kube-api-access-vsqkp") pod "22330be3-9693-4a9d-90e9-7fce4210d47b" (UID: "22330be3-9693-4a9d-90e9-7fce4210d47b"). InnerVolumeSpecName "kube-api-access-vsqkp".
PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 00:11:32 crc kubenswrapper[4696]: I1203 00:11:32.842909 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vsqkp\" (UniqueName: \"kubernetes.io/projected/22330be3-9693-4a9d-90e9-7fce4210d47b-kube-api-access-vsqkp\") on node \"crc\" DevicePath \"\""
Dec 03 00:11:33 crc kubenswrapper[4696]: I1203 00:11:33.175449 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-5bkfl"
Dec 03 00:11:33 crc kubenswrapper[4696]: I1203 00:11:33.228679 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-5bkfl"
Dec 03 00:11:33 crc kubenswrapper[4696]: I1203 00:11:33.468724 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5bkfl"]
Dec 03 00:11:33 crc kubenswrapper[4696]: I1203 00:11:33.592874 4696 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-must-gather-6s7rp/crc-debug-xqchg"
Dec 03 00:11:33 crc kubenswrapper[4696]: I1203 00:11:33.593419 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6s7rp/crc-debug-xqchg" event={"ID":"22330be3-9693-4a9d-90e9-7fce4210d47b","Type":"ContainerDied","Data":"881f3c6112c6e0758e612ff77afc7903c7235cd012c481d0738f90bf9ac555d7"}
Dec 03 00:11:33 crc kubenswrapper[4696]: I1203 00:11:33.593465 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="881f3c6112c6e0758e612ff77afc7903c7235cd012c481d0738f90bf9ac555d7"
Dec 03 00:11:33 crc kubenswrapper[4696]: I1203 00:11:33.795430 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-6s7rp/crc-debug-xqchg"]
Dec 03 00:11:33 crc kubenswrapper[4696]: I1203 00:11:33.805216 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-6s7rp/crc-debug-xqchg"]
Dec 03 00:11:34 crc kubenswrapper[4696]: I1203 00:11:34.601542 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-5bkfl" podUID="721c91fd-5d68-4627-b723-34591ff2e704" containerName="registry-server" containerID="cri-o://3e3cf2a1e2d758507a74abeb18070ce41f70f2128f91af4826c64ca9de99dedc" gracePeriod=2
Dec 03 00:11:34 crc kubenswrapper[4696]: I1203 00:11:34.969679 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-6s7rp/crc-debug-cqq59"]
Dec 03 00:11:34 crc kubenswrapper[4696]: E1203 00:11:34.971173 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22330be3-9693-4a9d-90e9-7fce4210d47b" containerName="container-00"
Dec 03 00:11:34 crc kubenswrapper[4696]: I1203 00:11:34.971198 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="22330be3-9693-4a9d-90e9-7fce4210d47b" containerName="container-00"
Dec 03 00:11:34 crc kubenswrapper[4696]: I1203 00:11:34.971475 4696 memory_manager.go:354] "RemoveStaleState removing
state" podUID="22330be3-9693-4a9d-90e9-7fce4210d47b" containerName="container-00"
Dec 03 00:11:34 crc kubenswrapper[4696]: I1203 00:11:34.972287 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6s7rp/crc-debug-cqq59"
Dec 03 00:11:35 crc kubenswrapper[4696]: I1203 00:11:35.095903 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9256144a-07ee-4413-a5f4-dcf196be9171-host\") pod \"crc-debug-cqq59\" (UID: \"9256144a-07ee-4413-a5f4-dcf196be9171\") " pod="openshift-must-gather-6s7rp/crc-debug-cqq59"
Dec 03 00:11:35 crc kubenswrapper[4696]: I1203 00:11:35.096446 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kxlt\" (UniqueName: \"kubernetes.io/projected/9256144a-07ee-4413-a5f4-dcf196be9171-kube-api-access-9kxlt\") pod \"crc-debug-cqq59\" (UID: \"9256144a-07ee-4413-a5f4-dcf196be9171\") " pod="openshift-must-gather-6s7rp/crc-debug-cqq59"
Dec 03 00:11:35 crc kubenswrapper[4696]: I1203 00:11:35.199312 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9kxlt\" (UniqueName: \"kubernetes.io/projected/9256144a-07ee-4413-a5f4-dcf196be9171-kube-api-access-9kxlt\") pod \"crc-debug-cqq59\" (UID: \"9256144a-07ee-4413-a5f4-dcf196be9171\") " pod="openshift-must-gather-6s7rp/crc-debug-cqq59"
Dec 03 00:11:35 crc kubenswrapper[4696]: I1203 00:11:35.199456 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9256144a-07ee-4413-a5f4-dcf196be9171-host\") pod \"crc-debug-cqq59\" (UID: \"9256144a-07ee-4413-a5f4-dcf196be9171\") " pod="openshift-must-gather-6s7rp/crc-debug-cqq59"
Dec 03 00:11:35 crc kubenswrapper[4696]: I1203 00:11:35.199635 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName:
\"kubernetes.io/host-path/9256144a-07ee-4413-a5f4-dcf196be9171-host\") pod \"crc-debug-cqq59\" (UID: \"9256144a-07ee-4413-a5f4-dcf196be9171\") " pod="openshift-must-gather-6s7rp/crc-debug-cqq59"
Dec 03 00:11:35 crc kubenswrapper[4696]: I1203 00:11:35.226816 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kxlt\" (UniqueName: \"kubernetes.io/projected/9256144a-07ee-4413-a5f4-dcf196be9171-kube-api-access-9kxlt\") pod \"crc-debug-cqq59\" (UID: \"9256144a-07ee-4413-a5f4-dcf196be9171\") " pod="openshift-must-gather-6s7rp/crc-debug-cqq59"
Dec 03 00:11:35 crc kubenswrapper[4696]: I1203 00:11:35.295201 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6s7rp/crc-debug-cqq59"
Dec 03 00:11:35 crc kubenswrapper[4696]: W1203 00:11:35.332425 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9256144a_07ee_4413_a5f4_dcf196be9171.slice/crio-6203be5ea42d3a91bd2b951139af3b9d8f1d4131f9ecba162377bf35a70dd08a WatchSource:0}: Error finding container 6203be5ea42d3a91bd2b951139af3b9d8f1d4131f9ecba162377bf35a70dd08a: Status 404 returned error can't find the container with id 6203be5ea42d3a91bd2b951139af3b9d8f1d4131f9ecba162377bf35a70dd08a
Dec 03 00:11:35 crc kubenswrapper[4696]: I1203 00:11:35.459795 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22330be3-9693-4a9d-90e9-7fce4210d47b" path="/var/lib/kubelet/pods/22330be3-9693-4a9d-90e9-7fce4210d47b/volumes"
Dec 03 00:11:35 crc kubenswrapper[4696]: I1203 00:11:35.544848 4696 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/community-operators-5bkfl"
Dec 03 00:11:35 crc kubenswrapper[4696]: I1203 00:11:35.615884 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6s7rp/crc-debug-cqq59" event={"ID":"9256144a-07ee-4413-a5f4-dcf196be9171","Type":"ContainerStarted","Data":"6203be5ea42d3a91bd2b951139af3b9d8f1d4131f9ecba162377bf35a70dd08a"}
Dec 03 00:11:35 crc kubenswrapper[4696]: I1203 00:11:35.619801 4696 generic.go:334] "Generic (PLEG): container finished" podID="721c91fd-5d68-4627-b723-34591ff2e704" containerID="3e3cf2a1e2d758507a74abeb18070ce41f70f2128f91af4826c64ca9de99dedc" exitCode=0
Dec 03 00:11:35 crc kubenswrapper[4696]: I1203 00:11:35.619858 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5bkfl" event={"ID":"721c91fd-5d68-4627-b723-34591ff2e704","Type":"ContainerDied","Data":"3e3cf2a1e2d758507a74abeb18070ce41f70f2128f91af4826c64ca9de99dedc"}
Dec 03 00:11:35 crc kubenswrapper[4696]: I1203 00:11:35.619893 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5bkfl" event={"ID":"721c91fd-5d68-4627-b723-34591ff2e704","Type":"ContainerDied","Data":"db67b555488f50c57c1afa4d5cb1654b94c71c46b642b22c9af68514871865f3"}
Dec 03 00:11:35 crc kubenswrapper[4696]: I1203 00:11:35.619897 4696 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/community-operators-5bkfl"
Dec 03 00:11:35 crc kubenswrapper[4696]: I1203 00:11:35.619912 4696 scope.go:117] "RemoveContainer" containerID="3e3cf2a1e2d758507a74abeb18070ce41f70f2128f91af4826c64ca9de99dedc"
Dec 03 00:11:35 crc kubenswrapper[4696]: I1203 00:11:35.645888 4696 scope.go:117] "RemoveContainer" containerID="c69a0f128f1c713fe95cebf493632965fcddbea86de1999d223fa3f9f68d7040"
Dec 03 00:11:35 crc kubenswrapper[4696]: I1203 00:11:35.669305 4696 scope.go:117] "RemoveContainer" containerID="c5b81d90daf2163443178f1dc9a1d6ba558a9019121d974af605d266946ccb25"
Dec 03 00:11:35 crc kubenswrapper[4696]: I1203 00:11:35.688848 4696 scope.go:117] "RemoveContainer" containerID="3e3cf2a1e2d758507a74abeb18070ce41f70f2128f91af4826c64ca9de99dedc"
Dec 03 00:11:35 crc kubenswrapper[4696]: E1203 00:11:35.689639 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e3cf2a1e2d758507a74abeb18070ce41f70f2128f91af4826c64ca9de99dedc\": container with ID starting with 3e3cf2a1e2d758507a74abeb18070ce41f70f2128f91af4826c64ca9de99dedc not found: ID does not exist" containerID="3e3cf2a1e2d758507a74abeb18070ce41f70f2128f91af4826c64ca9de99dedc"
Dec 03 00:11:35 crc kubenswrapper[4696]: I1203 00:11:35.689713 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e3cf2a1e2d758507a74abeb18070ce41f70f2128f91af4826c64ca9de99dedc"} err="failed to get container status \"3e3cf2a1e2d758507a74abeb18070ce41f70f2128f91af4826c64ca9de99dedc\": rpc error: code = NotFound desc = could not find container \"3e3cf2a1e2d758507a74abeb18070ce41f70f2128f91af4826c64ca9de99dedc\": container with ID starting with 3e3cf2a1e2d758507a74abeb18070ce41f70f2128f91af4826c64ca9de99dedc not found: ID does not exist"
Dec 03 00:11:35 crc kubenswrapper[4696]: I1203 00:11:35.689798 4696 scope.go:117] "RemoveContainer"
containerID="c69a0f128f1c713fe95cebf493632965fcddbea86de1999d223fa3f9f68d7040"
Dec 03 00:11:35 crc kubenswrapper[4696]: E1203 00:11:35.690343 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c69a0f128f1c713fe95cebf493632965fcddbea86de1999d223fa3f9f68d7040\": container with ID starting with c69a0f128f1c713fe95cebf493632965fcddbea86de1999d223fa3f9f68d7040 not found: ID does not exist" containerID="c69a0f128f1c713fe95cebf493632965fcddbea86de1999d223fa3f9f68d7040"
Dec 03 00:11:35 crc kubenswrapper[4696]: I1203 00:11:35.690404 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c69a0f128f1c713fe95cebf493632965fcddbea86de1999d223fa3f9f68d7040"} err="failed to get container status \"c69a0f128f1c713fe95cebf493632965fcddbea86de1999d223fa3f9f68d7040\": rpc error: code = NotFound desc = could not find container \"c69a0f128f1c713fe95cebf493632965fcddbea86de1999d223fa3f9f68d7040\": container with ID starting with c69a0f128f1c713fe95cebf493632965fcddbea86de1999d223fa3f9f68d7040 not found: ID does not exist"
Dec 03 00:11:35 crc kubenswrapper[4696]: I1203 00:11:35.690438 4696 scope.go:117] "RemoveContainer" containerID="c5b81d90daf2163443178f1dc9a1d6ba558a9019121d974af605d266946ccb25"
Dec 03 00:11:35 crc kubenswrapper[4696]: E1203 00:11:35.691078 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5b81d90daf2163443178f1dc9a1d6ba558a9019121d974af605d266946ccb25\": container with ID starting with c5b81d90daf2163443178f1dc9a1d6ba558a9019121d974af605d266946ccb25 not found: ID does not exist" containerID="c5b81d90daf2163443178f1dc9a1d6ba558a9019121d974af605d266946ccb25"
Dec 03 00:11:35 crc kubenswrapper[4696]: I1203 00:11:35.691113 4696 pod_container_deletor.go:53] "DeleteContainer returned error"
containerID={"Type":"cri-o","ID":"c5b81d90daf2163443178f1dc9a1d6ba558a9019121d974af605d266946ccb25"} err="failed to get container status \"c5b81d90daf2163443178f1dc9a1d6ba558a9019121d974af605d266946ccb25\": rpc error: code = NotFound desc = could not find container \"c5b81d90daf2163443178f1dc9a1d6ba558a9019121d974af605d266946ccb25\": container with ID starting with c5b81d90daf2163443178f1dc9a1d6ba558a9019121d974af605d266946ccb25 not found: ID does not exist" Dec 03 00:11:35 crc kubenswrapper[4696]: I1203 00:11:35.711914 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/721c91fd-5d68-4627-b723-34591ff2e704-utilities\") pod \"721c91fd-5d68-4627-b723-34591ff2e704\" (UID: \"721c91fd-5d68-4627-b723-34591ff2e704\") " Dec 03 00:11:35 crc kubenswrapper[4696]: I1203 00:11:35.712035 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v6nf9\" (UniqueName: \"kubernetes.io/projected/721c91fd-5d68-4627-b723-34591ff2e704-kube-api-access-v6nf9\") pod \"721c91fd-5d68-4627-b723-34591ff2e704\" (UID: \"721c91fd-5d68-4627-b723-34591ff2e704\") " Dec 03 00:11:35 crc kubenswrapper[4696]: I1203 00:11:35.712240 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/721c91fd-5d68-4627-b723-34591ff2e704-catalog-content\") pod \"721c91fd-5d68-4627-b723-34591ff2e704\" (UID: \"721c91fd-5d68-4627-b723-34591ff2e704\") " Dec 03 00:11:35 crc kubenswrapper[4696]: I1203 00:11:35.714634 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/721c91fd-5d68-4627-b723-34591ff2e704-utilities" (OuterVolumeSpecName: "utilities") pod "721c91fd-5d68-4627-b723-34591ff2e704" (UID: "721c91fd-5d68-4627-b723-34591ff2e704"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 00:11:35 crc kubenswrapper[4696]: I1203 00:11:35.721025 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/721c91fd-5d68-4627-b723-34591ff2e704-kube-api-access-v6nf9" (OuterVolumeSpecName: "kube-api-access-v6nf9") pod "721c91fd-5d68-4627-b723-34591ff2e704" (UID: "721c91fd-5d68-4627-b723-34591ff2e704"). InnerVolumeSpecName "kube-api-access-v6nf9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 00:11:35 crc kubenswrapper[4696]: I1203 00:11:35.780215 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/721c91fd-5d68-4627-b723-34591ff2e704-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "721c91fd-5d68-4627-b723-34591ff2e704" (UID: "721c91fd-5d68-4627-b723-34591ff2e704"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 00:11:35 crc kubenswrapper[4696]: I1203 00:11:35.815053 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v6nf9\" (UniqueName: \"kubernetes.io/projected/721c91fd-5d68-4627-b723-34591ff2e704-kube-api-access-v6nf9\") on node \"crc\" DevicePath \"\"" Dec 03 00:11:35 crc kubenswrapper[4696]: I1203 00:11:35.815094 4696 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/721c91fd-5d68-4627-b723-34591ff2e704-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 00:11:35 crc kubenswrapper[4696]: I1203 00:11:35.815104 4696 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/721c91fd-5d68-4627-b723-34591ff2e704-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 00:11:35 crc kubenswrapper[4696]: I1203 00:11:35.953337 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5bkfl"] Dec 03 00:11:35 crc kubenswrapper[4696]: I1203 
00:11:35.963482 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-5bkfl"] Dec 03 00:11:36 crc kubenswrapper[4696]: I1203 00:11:36.431996 4696 scope.go:117] "RemoveContainer" containerID="c73d4f21ed0c1de0df7ef84929a3066f92fdce038bec491807959001f24c4048" Dec 03 00:11:36 crc kubenswrapper[4696]: E1203 00:11:36.432906 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 03 00:11:36 crc kubenswrapper[4696]: I1203 00:11:36.634785 4696 generic.go:334] "Generic (PLEG): container finished" podID="9256144a-07ee-4413-a5f4-dcf196be9171" containerID="c37b924ce874baeea8985f7ca1820663e5ceb523bc8bb1ff677b90d14fd6aea5" exitCode=0 Dec 03 00:11:36 crc kubenswrapper[4696]: I1203 00:11:36.634865 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6s7rp/crc-debug-cqq59" event={"ID":"9256144a-07ee-4413-a5f4-dcf196be9171","Type":"ContainerDied","Data":"c37b924ce874baeea8985f7ca1820663e5ceb523bc8bb1ff677b90d14fd6aea5"} Dec 03 00:11:36 crc kubenswrapper[4696]: I1203 00:11:36.685664 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-6s7rp/crc-debug-cqq59"] Dec 03 00:11:36 crc kubenswrapper[4696]: I1203 00:11:36.708594 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-6s7rp/crc-debug-cqq59"] Dec 03 00:11:37 crc kubenswrapper[4696]: I1203 00:11:37.466572 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="721c91fd-5d68-4627-b723-34591ff2e704" path="/var/lib/kubelet/pods/721c91fd-5d68-4627-b723-34591ff2e704/volumes" Dec 03 00:11:37 crc kubenswrapper[4696]: 
I1203 00:11:37.765335 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6s7rp/crc-debug-cqq59" Dec 03 00:11:37 crc kubenswrapper[4696]: I1203 00:11:37.864722 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9kxlt\" (UniqueName: \"kubernetes.io/projected/9256144a-07ee-4413-a5f4-dcf196be9171-kube-api-access-9kxlt\") pod \"9256144a-07ee-4413-a5f4-dcf196be9171\" (UID: \"9256144a-07ee-4413-a5f4-dcf196be9171\") " Dec 03 00:11:37 crc kubenswrapper[4696]: I1203 00:11:37.865028 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9256144a-07ee-4413-a5f4-dcf196be9171-host\") pod \"9256144a-07ee-4413-a5f4-dcf196be9171\" (UID: \"9256144a-07ee-4413-a5f4-dcf196be9171\") " Dec 03 00:11:37 crc kubenswrapper[4696]: I1203 00:11:37.865144 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9256144a-07ee-4413-a5f4-dcf196be9171-host" (OuterVolumeSpecName: "host") pod "9256144a-07ee-4413-a5f4-dcf196be9171" (UID: "9256144a-07ee-4413-a5f4-dcf196be9171"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 00:11:37 crc kubenswrapper[4696]: I1203 00:11:37.865528 4696 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9256144a-07ee-4413-a5f4-dcf196be9171-host\") on node \"crc\" DevicePath \"\"" Dec 03 00:11:37 crc kubenswrapper[4696]: I1203 00:11:37.871846 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9256144a-07ee-4413-a5f4-dcf196be9171-kube-api-access-9kxlt" (OuterVolumeSpecName: "kube-api-access-9kxlt") pod "9256144a-07ee-4413-a5f4-dcf196be9171" (UID: "9256144a-07ee-4413-a5f4-dcf196be9171"). InnerVolumeSpecName "kube-api-access-9kxlt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 00:11:37 crc kubenswrapper[4696]: I1203 00:11:37.968054 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9kxlt\" (UniqueName: \"kubernetes.io/projected/9256144a-07ee-4413-a5f4-dcf196be9171-kube-api-access-9kxlt\") on node \"crc\" DevicePath \"\"" Dec 03 00:11:38 crc kubenswrapper[4696]: I1203 00:11:38.656865 4696 scope.go:117] "RemoveContainer" containerID="c37b924ce874baeea8985f7ca1820663e5ceb523bc8bb1ff677b90d14fd6aea5" Dec 03 00:11:38 crc kubenswrapper[4696]: I1203 00:11:38.656893 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6s7rp/crc-debug-cqq59" Dec 03 00:11:38 crc kubenswrapper[4696]: I1203 00:11:38.824621 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-vnfkr"] Dec 03 00:11:38 crc kubenswrapper[4696]: E1203 00:11:38.825596 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="721c91fd-5d68-4627-b723-34591ff2e704" containerName="registry-server" Dec 03 00:11:38 crc kubenswrapper[4696]: I1203 00:11:38.825617 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="721c91fd-5d68-4627-b723-34591ff2e704" containerName="registry-server" Dec 03 00:11:38 crc kubenswrapper[4696]: E1203 00:11:38.825635 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="721c91fd-5d68-4627-b723-34591ff2e704" containerName="extract-content" Dec 03 00:11:38 crc kubenswrapper[4696]: I1203 00:11:38.825642 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="721c91fd-5d68-4627-b723-34591ff2e704" containerName="extract-content" Dec 03 00:11:38 crc kubenswrapper[4696]: E1203 00:11:38.825666 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9256144a-07ee-4413-a5f4-dcf196be9171" containerName="container-00" Dec 03 00:11:38 crc kubenswrapper[4696]: I1203 00:11:38.825673 4696 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="9256144a-07ee-4413-a5f4-dcf196be9171" containerName="container-00" Dec 03 00:11:38 crc kubenswrapper[4696]: E1203 00:11:38.825685 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="721c91fd-5d68-4627-b723-34591ff2e704" containerName="extract-utilities" Dec 03 00:11:38 crc kubenswrapper[4696]: I1203 00:11:38.825692 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="721c91fd-5d68-4627-b723-34591ff2e704" containerName="extract-utilities" Dec 03 00:11:38 crc kubenswrapper[4696]: I1203 00:11:38.825914 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="721c91fd-5d68-4627-b723-34591ff2e704" containerName="registry-server" Dec 03 00:11:38 crc kubenswrapper[4696]: I1203 00:11:38.825934 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="9256144a-07ee-4413-a5f4-dcf196be9171" containerName="container-00" Dec 03 00:11:38 crc kubenswrapper[4696]: I1203 00:11:38.827879 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vnfkr" Dec 03 00:11:38 crc kubenswrapper[4696]: I1203 00:11:38.840388 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vnfkr"] Dec 03 00:11:38 crc kubenswrapper[4696]: I1203 00:11:38.988358 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/422c40cd-1ecd-4fff-b74e-3e306b8e3b95-catalog-content\") pod \"redhat-operators-vnfkr\" (UID: \"422c40cd-1ecd-4fff-b74e-3e306b8e3b95\") " pod="openshift-marketplace/redhat-operators-vnfkr" Dec 03 00:11:38 crc kubenswrapper[4696]: I1203 00:11:38.988819 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqq28\" (UniqueName: \"kubernetes.io/projected/422c40cd-1ecd-4fff-b74e-3e306b8e3b95-kube-api-access-wqq28\") pod \"redhat-operators-vnfkr\" (UID: 
\"422c40cd-1ecd-4fff-b74e-3e306b8e3b95\") " pod="openshift-marketplace/redhat-operators-vnfkr" Dec 03 00:11:38 crc kubenswrapper[4696]: I1203 00:11:38.988947 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/422c40cd-1ecd-4fff-b74e-3e306b8e3b95-utilities\") pod \"redhat-operators-vnfkr\" (UID: \"422c40cd-1ecd-4fff-b74e-3e306b8e3b95\") " pod="openshift-marketplace/redhat-operators-vnfkr" Dec 03 00:11:39 crc kubenswrapper[4696]: I1203 00:11:39.091190 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/422c40cd-1ecd-4fff-b74e-3e306b8e3b95-catalog-content\") pod \"redhat-operators-vnfkr\" (UID: \"422c40cd-1ecd-4fff-b74e-3e306b8e3b95\") " pod="openshift-marketplace/redhat-operators-vnfkr" Dec 03 00:11:39 crc kubenswrapper[4696]: I1203 00:11:39.091275 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wqq28\" (UniqueName: \"kubernetes.io/projected/422c40cd-1ecd-4fff-b74e-3e306b8e3b95-kube-api-access-wqq28\") pod \"redhat-operators-vnfkr\" (UID: \"422c40cd-1ecd-4fff-b74e-3e306b8e3b95\") " pod="openshift-marketplace/redhat-operators-vnfkr" Dec 03 00:11:39 crc kubenswrapper[4696]: I1203 00:11:39.091306 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/422c40cd-1ecd-4fff-b74e-3e306b8e3b95-utilities\") pod \"redhat-operators-vnfkr\" (UID: \"422c40cd-1ecd-4fff-b74e-3e306b8e3b95\") " pod="openshift-marketplace/redhat-operators-vnfkr" Dec 03 00:11:39 crc kubenswrapper[4696]: I1203 00:11:39.091872 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/422c40cd-1ecd-4fff-b74e-3e306b8e3b95-utilities\") pod \"redhat-operators-vnfkr\" (UID: \"422c40cd-1ecd-4fff-b74e-3e306b8e3b95\") " 
pod="openshift-marketplace/redhat-operators-vnfkr" Dec 03 00:11:39 crc kubenswrapper[4696]: I1203 00:11:39.092264 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/422c40cd-1ecd-4fff-b74e-3e306b8e3b95-catalog-content\") pod \"redhat-operators-vnfkr\" (UID: \"422c40cd-1ecd-4fff-b74e-3e306b8e3b95\") " pod="openshift-marketplace/redhat-operators-vnfkr" Dec 03 00:11:39 crc kubenswrapper[4696]: I1203 00:11:39.111514 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqq28\" (UniqueName: \"kubernetes.io/projected/422c40cd-1ecd-4fff-b74e-3e306b8e3b95-kube-api-access-wqq28\") pod \"redhat-operators-vnfkr\" (UID: \"422c40cd-1ecd-4fff-b74e-3e306b8e3b95\") " pod="openshift-marketplace/redhat-operators-vnfkr" Dec 03 00:11:39 crc kubenswrapper[4696]: I1203 00:11:39.150908 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vnfkr" Dec 03 00:11:39 crc kubenswrapper[4696]: I1203 00:11:39.461987 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9256144a-07ee-4413-a5f4-dcf196be9171" path="/var/lib/kubelet/pods/9256144a-07ee-4413-a5f4-dcf196be9171/volumes" Dec 03 00:11:39 crc kubenswrapper[4696]: I1203 00:11:39.730715 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vnfkr"] Dec 03 00:11:40 crc kubenswrapper[4696]: I1203 00:11:40.701909 4696 generic.go:334] "Generic (PLEG): container finished" podID="422c40cd-1ecd-4fff-b74e-3e306b8e3b95" containerID="780e40d51476365b86a3d73419edae58e223810d51d908cbe95c60e9e13df9da" exitCode=0 Dec 03 00:11:40 crc kubenswrapper[4696]: I1203 00:11:40.702301 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vnfkr" 
event={"ID":"422c40cd-1ecd-4fff-b74e-3e306b8e3b95","Type":"ContainerDied","Data":"780e40d51476365b86a3d73419edae58e223810d51d908cbe95c60e9e13df9da"} Dec 03 00:11:40 crc kubenswrapper[4696]: I1203 00:11:40.702335 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vnfkr" event={"ID":"422c40cd-1ecd-4fff-b74e-3e306b8e3b95","Type":"ContainerStarted","Data":"ea088dc3fe9bc25e4a7cfd6baf103c46b318431fde6ee0978520d9e0210b335b"} Dec 03 00:11:42 crc kubenswrapper[4696]: I1203 00:11:42.723563 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vnfkr" event={"ID":"422c40cd-1ecd-4fff-b74e-3e306b8e3b95","Type":"ContainerStarted","Data":"7b4eb6ab2bde2dd523b784a45bf6c962c7fa2f3f5f2909322b77165ee09d4162"} Dec 03 00:11:45 crc kubenswrapper[4696]: I1203 00:11:45.766098 4696 generic.go:334] "Generic (PLEG): container finished" podID="422c40cd-1ecd-4fff-b74e-3e306b8e3b95" containerID="7b4eb6ab2bde2dd523b784a45bf6c962c7fa2f3f5f2909322b77165ee09d4162" exitCode=0 Dec 03 00:11:45 crc kubenswrapper[4696]: I1203 00:11:45.766301 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vnfkr" event={"ID":"422c40cd-1ecd-4fff-b74e-3e306b8e3b95","Type":"ContainerDied","Data":"7b4eb6ab2bde2dd523b784a45bf6c962c7fa2f3f5f2909322b77165ee09d4162"} Dec 03 00:11:47 crc kubenswrapper[4696]: I1203 00:11:47.790079 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vnfkr" event={"ID":"422c40cd-1ecd-4fff-b74e-3e306b8e3b95","Type":"ContainerStarted","Data":"c13148787d81caf472c56006e3fc8829135170a5cb47efffe782132a3235f6e6"} Dec 03 00:11:47 crc kubenswrapper[4696]: I1203 00:11:47.817807 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-vnfkr" podStartSLOduration=3.539328461 podStartE2EDuration="9.817781331s" podCreationTimestamp="2025-12-03 00:11:38 +0000 UTC" 
firstStartedPulling="2025-12-03 00:11:40.704884746 +0000 UTC m=+5363.585564747" lastFinishedPulling="2025-12-03 00:11:46.983337616 +0000 UTC m=+5369.864017617" observedRunningTime="2025-12-03 00:11:47.809863747 +0000 UTC m=+5370.690543808" watchObservedRunningTime="2025-12-03 00:11:47.817781331 +0000 UTC m=+5370.698461332" Dec 03 00:11:49 crc kubenswrapper[4696]: I1203 00:11:49.151858 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-vnfkr" Dec 03 00:11:49 crc kubenswrapper[4696]: I1203 00:11:49.152305 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-vnfkr" Dec 03 00:11:50 crc kubenswrapper[4696]: I1203 00:11:50.221983 4696 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-vnfkr" podUID="422c40cd-1ecd-4fff-b74e-3e306b8e3b95" containerName="registry-server" probeResult="failure" output=< Dec 03 00:11:50 crc kubenswrapper[4696]: timeout: failed to connect service ":50051" within 1s Dec 03 00:11:50 crc kubenswrapper[4696]: > Dec 03 00:11:51 crc kubenswrapper[4696]: I1203 00:11:51.431927 4696 scope.go:117] "RemoveContainer" containerID="c73d4f21ed0c1de0df7ef84929a3066f92fdce038bec491807959001f24c4048" Dec 03 00:11:51 crc kubenswrapper[4696]: E1203 00:11:51.432422 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 03 00:11:59 crc kubenswrapper[4696]: I1203 00:11:59.320503 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-vnfkr" Dec 03 00:11:59 crc 
kubenswrapper[4696]: I1203 00:11:59.371161 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-vnfkr" Dec 03 00:11:59 crc kubenswrapper[4696]: I1203 00:11:59.561604 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vnfkr"] Dec 03 00:12:00 crc kubenswrapper[4696]: I1203 00:12:00.944927 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-vnfkr" podUID="422c40cd-1ecd-4fff-b74e-3e306b8e3b95" containerName="registry-server" containerID="cri-o://c13148787d81caf472c56006e3fc8829135170a5cb47efffe782132a3235f6e6" gracePeriod=2 Dec 03 00:12:01 crc kubenswrapper[4696]: I1203 00:12:01.957768 4696 generic.go:334] "Generic (PLEG): container finished" podID="422c40cd-1ecd-4fff-b74e-3e306b8e3b95" containerID="c13148787d81caf472c56006e3fc8829135170a5cb47efffe782132a3235f6e6" exitCode=0 Dec 03 00:12:01 crc kubenswrapper[4696]: I1203 00:12:01.957866 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vnfkr" event={"ID":"422c40cd-1ecd-4fff-b74e-3e306b8e3b95","Type":"ContainerDied","Data":"c13148787d81caf472c56006e3fc8829135170a5cb47efffe782132a3235f6e6"} Dec 03 00:12:01 crc kubenswrapper[4696]: I1203 00:12:01.958178 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vnfkr" event={"ID":"422c40cd-1ecd-4fff-b74e-3e306b8e3b95","Type":"ContainerDied","Data":"ea088dc3fe9bc25e4a7cfd6baf103c46b318431fde6ee0978520d9e0210b335b"} Dec 03 00:12:01 crc kubenswrapper[4696]: I1203 00:12:01.958198 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ea088dc3fe9bc25e4a7cfd6baf103c46b318431fde6ee0978520d9e0210b335b" Dec 03 00:12:01 crc kubenswrapper[4696]: I1203 00:12:01.966177 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vnfkr" Dec 03 00:12:02 crc kubenswrapper[4696]: I1203 00:12:02.142048 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wqq28\" (UniqueName: \"kubernetes.io/projected/422c40cd-1ecd-4fff-b74e-3e306b8e3b95-kube-api-access-wqq28\") pod \"422c40cd-1ecd-4fff-b74e-3e306b8e3b95\" (UID: \"422c40cd-1ecd-4fff-b74e-3e306b8e3b95\") " Dec 03 00:12:02 crc kubenswrapper[4696]: I1203 00:12:02.142815 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/422c40cd-1ecd-4fff-b74e-3e306b8e3b95-catalog-content\") pod \"422c40cd-1ecd-4fff-b74e-3e306b8e3b95\" (UID: \"422c40cd-1ecd-4fff-b74e-3e306b8e3b95\") " Dec 03 00:12:02 crc kubenswrapper[4696]: I1203 00:12:02.142860 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/422c40cd-1ecd-4fff-b74e-3e306b8e3b95-utilities\") pod \"422c40cd-1ecd-4fff-b74e-3e306b8e3b95\" (UID: \"422c40cd-1ecd-4fff-b74e-3e306b8e3b95\") " Dec 03 00:12:02 crc kubenswrapper[4696]: I1203 00:12:02.144386 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/422c40cd-1ecd-4fff-b74e-3e306b8e3b95-utilities" (OuterVolumeSpecName: "utilities") pod "422c40cd-1ecd-4fff-b74e-3e306b8e3b95" (UID: "422c40cd-1ecd-4fff-b74e-3e306b8e3b95"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 00:12:02 crc kubenswrapper[4696]: I1203 00:12:02.150305 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/422c40cd-1ecd-4fff-b74e-3e306b8e3b95-kube-api-access-wqq28" (OuterVolumeSpecName: "kube-api-access-wqq28") pod "422c40cd-1ecd-4fff-b74e-3e306b8e3b95" (UID: "422c40cd-1ecd-4fff-b74e-3e306b8e3b95"). InnerVolumeSpecName "kube-api-access-wqq28". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 00:12:02 crc kubenswrapper[4696]: I1203 00:12:02.245292 4696 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/422c40cd-1ecd-4fff-b74e-3e306b8e3b95-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 00:12:02 crc kubenswrapper[4696]: I1203 00:12:02.245340 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wqq28\" (UniqueName: \"kubernetes.io/projected/422c40cd-1ecd-4fff-b74e-3e306b8e3b95-kube-api-access-wqq28\") on node \"crc\" DevicePath \"\"" Dec 03 00:12:02 crc kubenswrapper[4696]: I1203 00:12:02.289669 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/422c40cd-1ecd-4fff-b74e-3e306b8e3b95-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "422c40cd-1ecd-4fff-b74e-3e306b8e3b95" (UID: "422c40cd-1ecd-4fff-b74e-3e306b8e3b95"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 00:12:02 crc kubenswrapper[4696]: I1203 00:12:02.347205 4696 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/422c40cd-1ecd-4fff-b74e-3e306b8e3b95-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 00:12:02 crc kubenswrapper[4696]: I1203 00:12:02.432055 4696 scope.go:117] "RemoveContainer" containerID="c73d4f21ed0c1de0df7ef84929a3066f92fdce038bec491807959001f24c4048" Dec 03 00:12:02 crc kubenswrapper[4696]: E1203 00:12:02.432440 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 03 00:12:03 
crc kubenswrapper[4696]: I1203 00:12:02.967830 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vnfkr" Dec 03 00:12:03 crc kubenswrapper[4696]: I1203 00:12:03.046076 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vnfkr"] Dec 03 00:12:03 crc kubenswrapper[4696]: I1203 00:12:03.058437 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-vnfkr"] Dec 03 00:12:03 crc kubenswrapper[4696]: I1203 00:12:03.442886 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="422c40cd-1ecd-4fff-b74e-3e306b8e3b95" path="/var/lib/kubelet/pods/422c40cd-1ecd-4fff-b74e-3e306b8e3b95/volumes" Dec 03 00:12:03 crc kubenswrapper[4696]: I1203 00:12:03.736658 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-748c8487f8-gqxg9_4c68604e-222e-4a20-b829-c2f4e3c6923e/barbican-api/0.log" Dec 03 00:12:03 crc kubenswrapper[4696]: I1203 00:12:03.861578 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-748c8487f8-gqxg9_4c68604e-222e-4a20-b829-c2f4e3c6923e/barbican-api-log/0.log" Dec 03 00:12:03 crc kubenswrapper[4696]: I1203 00:12:03.963906 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7d6f88f57d-fkfhk_6a2e48f4-820d-4199-883c-f7d93f5f12c6/barbican-keystone-listener/0.log" Dec 03 00:12:04 crc kubenswrapper[4696]: I1203 00:12:04.027180 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7d6f88f57d-fkfhk_6a2e48f4-820d-4199-883c-f7d93f5f12c6/barbican-keystone-listener-log/0.log" Dec 03 00:12:04 crc kubenswrapper[4696]: I1203 00:12:04.165200 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-95b48849f-64t8k_314505e6-5f55-4c07-9692-c5698c6e3ff1/barbican-worker/0.log" Dec 03 00:12:04 crc kubenswrapper[4696]: 
I1203 00:12:04.223696 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-95b48849f-64t8k_314505e6-5f55-4c07-9692-c5698c6e3ff1/barbican-worker-log/0.log" Dec 03 00:12:04 crc kubenswrapper[4696]: I1203 00:12:04.410699 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-rmcht_10db9578-c367-420b-ba4f-93729e4d9483/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 00:12:04 crc kubenswrapper[4696]: I1203 00:12:04.485623 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_0947b5ae-aeca-481d-a2b9-3bd3db5a33c0/ceilometer-central-agent/0.log" Dec 03 00:12:04 crc kubenswrapper[4696]: I1203 00:12:04.542595 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_0947b5ae-aeca-481d-a2b9-3bd3db5a33c0/ceilometer-notification-agent/0.log" Dec 03 00:12:04 crc kubenswrapper[4696]: I1203 00:12:04.622346 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_0947b5ae-aeca-481d-a2b9-3bd3db5a33c0/sg-core/0.log" Dec 03 00:12:04 crc kubenswrapper[4696]: I1203 00:12:04.635334 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_0947b5ae-aeca-481d-a2b9-3bd3db5a33c0/proxy-httpd/0.log" Dec 03 00:12:04 crc kubenswrapper[4696]: I1203 00:12:04.822709 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_e0a0bd09-55c1-4eb0-bed1-76a920e67875/cinder-api/0.log" Dec 03 00:12:04 crc kubenswrapper[4696]: I1203 00:12:04.904311 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-db-purge-29412001-jvcl6_f9b76748-e694-4766-a355-d01c0fc857e0/cinder-db-purge/0.log" Dec 03 00:12:04 crc kubenswrapper[4696]: I1203 00:12:04.914594 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_e0a0bd09-55c1-4eb0-bed1-76a920e67875/cinder-api-log/0.log" Dec 03 00:12:05 crc 
kubenswrapper[4696]: I1203 00:12:05.163800 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_17a7a7a1-8e9c-4b77-8a09-783b8b465cf5/cinder-scheduler/0.log" Dec 03 00:12:05 crc kubenswrapper[4696]: I1203 00:12:05.177205 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_17a7a7a1-8e9c-4b77-8a09-783b8b465cf5/probe/0.log" Dec 03 00:12:05 crc kubenswrapper[4696]: I1203 00:12:05.308272 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-vlcsp_38969cda-9a4e-4bf7-bbc0-2a6a1849bf2e/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 00:12:05 crc kubenswrapper[4696]: I1203 00:12:05.390720 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-d8wqz_4e57d59f-2b48-457e-92dd-d0585bab85b5/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 00:12:05 crc kubenswrapper[4696]: I1203 00:12:05.707134 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6b6dc74c5-2zjwn_2300da34-d1de-4f62-a360-4d9cb16d48b7/init/0.log" Dec 03 00:12:05 crc kubenswrapper[4696]: I1203 00:12:05.750828 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6b6dc74c5-2zjwn_2300da34-d1de-4f62-a360-4d9cb16d48b7/init/0.log" Dec 03 00:12:05 crc kubenswrapper[4696]: I1203 00:12:05.850343 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6b6dc74c5-2zjwn_2300da34-d1de-4f62-a360-4d9cb16d48b7/dnsmasq-dns/0.log" Dec 03 00:12:05 crc kubenswrapper[4696]: I1203 00:12:05.923591 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-p4zx2_a29810cf-fd6b-4021-8ae5-52612fb63cfc/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 00:12:06 crc kubenswrapper[4696]: I1203 00:12:06.083499 4696 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_glance-db-purge-29412001-htr7n_2cc2c412-c05d-4914-b5a4-e0a0e40b8a59/glance-dbpurge/0.log" Dec 03 00:12:06 crc kubenswrapper[4696]: I1203 00:12:06.167347 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_f3e14f14-0774-4eb2-aff3-231d72e6136f/glance-httpd/0.log" Dec 03 00:12:06 crc kubenswrapper[4696]: I1203 00:12:06.274864 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_f3e14f14-0774-4eb2-aff3-231d72e6136f/glance-log/0.log" Dec 03 00:12:06 crc kubenswrapper[4696]: I1203 00:12:06.401995 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_7d71c8ce-55a6-4bbc-a450-128443762f36/glance-httpd/0.log" Dec 03 00:12:06 crc kubenswrapper[4696]: I1203 00:12:06.423534 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_7d71c8ce-55a6-4bbc-a450-128443762f36/glance-log/0.log" Dec 03 00:12:06 crc kubenswrapper[4696]: I1203 00:12:06.668637 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-7c784657c6-hdbrw_b414fc10-9d51-456b-aaa9-d6b4dd08af99/horizon/0.log" Dec 03 00:12:06 crc kubenswrapper[4696]: I1203 00:12:06.805546 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-l8j4b_d3d600ee-2200-406f-8d8b-f093851161fd/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 00:12:07 crc kubenswrapper[4696]: I1203 00:12:07.013286 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-2sw6r_f616e70d-4131-4ed5-b891-33dcad6a8827/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 00:12:07 crc kubenswrapper[4696]: I1203 00:12:07.199633 4696 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_horizon-7c784657c6-hdbrw_b414fc10-9d51-456b-aaa9-d6b4dd08af99/horizon-log/0.log" Dec 03 00:12:07 crc kubenswrapper[4696]: I1203 00:12:07.486971 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29412001-b2pg6_c211c59c-65a7-4672-8a9e-7b9d20220ef5/keystone-cron/0.log" Dec 03 00:12:07 crc kubenswrapper[4696]: I1203 00:12:07.505011 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_c200fd15-55ea-4c23-a8d4-22c362deedee/kube-state-metrics/0.log" Dec 03 00:12:07 crc kubenswrapper[4696]: I1203 00:12:07.681698 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-76d45f5d76-ptzqb_6cc29833-0849-46ec-bc06-1c980ec2dc02/keystone-api/0.log" Dec 03 00:12:07 crc kubenswrapper[4696]: I1203 00:12:07.966627 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-526g5_5697ae7a-9589-4939-a3a5-5613ee6094ab/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 00:12:08 crc kubenswrapper[4696]: I1203 00:12:08.357323 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-f94wg_81a46783-f2f6-464b-a1cd-d859d59e0c99/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 00:12:08 crc kubenswrapper[4696]: I1203 00:12:08.381715 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-67dbcf9bdf-2hr7m_797ad679-555c-4599-bc0c-21c0254a3a5a/neutron-httpd/0.log" Dec 03 00:12:08 crc kubenswrapper[4696]: I1203 00:12:08.437556 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-67dbcf9bdf-2hr7m_797ad679-555c-4599-bc0c-21c0254a3a5a/neutron-api/0.log" Dec 03 00:12:08 crc kubenswrapper[4696]: I1203 00:12:08.992892 4696 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-cell0-conductor-0_5f458554-1460-4379-95af-2313d4df2320/nova-cell0-conductor-conductor/0.log" Dec 03 00:12:09 crc kubenswrapper[4696]: I1203 00:12:09.128635 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-db-purge-29412000-ws52s_8f44b5e9-136f-4cba-9f87-6bb1d73fb496/nova-manage/0.log" Dec 03 00:12:09 crc kubenswrapper[4696]: I1203 00:12:09.497677 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_4c7be3d4-52ad-4671-8b48-8cc19cf98b4c/nova-cell1-conductor-conductor/0.log" Dec 03 00:12:09 crc kubenswrapper[4696]: I1203 00:12:09.697469 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-db-purge-29412000-fd8ck_0a2082a5-4293-40c5-ad8d-bb7a4bc43626/nova-manage/0.log" Dec 03 00:12:09 crc kubenswrapper[4696]: I1203 00:12:09.733546 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_6cb033f1-9348-4822-b022-daef2e06af49/nova-api-log/0.log" Dec 03 00:12:10 crc kubenswrapper[4696]: I1203 00:12:10.071751 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_2ed66bd7-4f5d-4501-b81f-51939db42c64/nova-cell1-novncproxy-novncproxy/0.log" Dec 03 00:12:10 crc kubenswrapper[4696]: I1203 00:12:10.150379 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_6cb033f1-9348-4822-b022-daef2e06af49/nova-api-api/0.log" Dec 03 00:12:10 crc kubenswrapper[4696]: I1203 00:12:10.158491 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-2dd9x_7b36c51f-9889-4191-a7ea-b54a79542e0b/nova-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 00:12:10 crc kubenswrapper[4696]: I1203 00:12:10.395956 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_60554cd9-644e-40c0-90c9-57610b92846e/nova-metadata-log/0.log" Dec 03 00:12:10 crc kubenswrapper[4696]: I1203 
00:12:10.734802 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_dff19c4d-2106-4034-8c29-39429553a062/mysql-bootstrap/0.log" Dec 03 00:12:10 crc kubenswrapper[4696]: I1203 00:12:10.891140 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_dff19c4d-2106-4034-8c29-39429553a062/mysql-bootstrap/0.log" Dec 03 00:12:10 crc kubenswrapper[4696]: I1203 00:12:10.946417 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_dff19c4d-2106-4034-8c29-39429553a062/galera/0.log" Dec 03 00:12:11 crc kubenswrapper[4696]: I1203 00:12:11.049778 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_90b9eaeb-1865-4418-8665-1e65f0fb8151/nova-scheduler-scheduler/0.log" Dec 03 00:12:11 crc kubenswrapper[4696]: I1203 00:12:11.174452 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_af9932e1-c721-45b3-a213-93da4e130d05/mysql-bootstrap/0.log" Dec 03 00:12:11 crc kubenswrapper[4696]: I1203 00:12:11.449555 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_af9932e1-c721-45b3-a213-93da4e130d05/mysql-bootstrap/0.log" Dec 03 00:12:11 crc kubenswrapper[4696]: I1203 00:12:11.492368 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_af9932e1-c721-45b3-a213-93da4e130d05/galera/0.log" Dec 03 00:12:11 crc kubenswrapper[4696]: I1203 00:12:11.665940 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_3e0a050b-c652-4ef2-8f1a-19c8f4732a0c/openstackclient/0.log" Dec 03 00:12:11 crc kubenswrapper[4696]: I1203 00:12:11.782679 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-btsm6_b13b6998-c04a-4ac8-9615-5078f1169ecb/ovn-controller/0.log" Dec 03 00:12:11 crc kubenswrapper[4696]: I1203 00:12:11.966682 4696 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack_ovn-controller-metrics-f6gwc_5f108f43-c0d4-4026-9f97-3a2fc3698626/openstack-network-exporter/0.log" Dec 03 00:12:12 crc kubenswrapper[4696]: I1203 00:12:12.171297 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-h54st_d004daa2-5ad8-49b8-9f27-cc0552d409de/ovsdb-server-init/0.log" Dec 03 00:12:12 crc kubenswrapper[4696]: I1203 00:12:12.461464 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-h54st_d004daa2-5ad8-49b8-9f27-cc0552d409de/ovsdb-server/0.log" Dec 03 00:12:12 crc kubenswrapper[4696]: I1203 00:12:12.464424 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-h54st_d004daa2-5ad8-49b8-9f27-cc0552d409de/ovs-vswitchd/0.log" Dec 03 00:12:12 crc kubenswrapper[4696]: I1203 00:12:12.466031 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-h54st_d004daa2-5ad8-49b8-9f27-cc0552d409de/ovsdb-server-init/0.log" Dec 03 00:12:12 crc kubenswrapper[4696]: I1203 00:12:12.727061 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-d7r49_9ac484be-e201-4b74-a21e-502131efc1e3/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 00:12:12 crc kubenswrapper[4696]: I1203 00:12:12.735213 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_60554cd9-644e-40c0-90c9-57610b92846e/nova-metadata-metadata/0.log" Dec 03 00:12:12 crc kubenswrapper[4696]: I1203 00:12:12.939476 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_87dfe190-5b7f-48c2-bfa0-97ca227eabb2/openstack-network-exporter/0.log" Dec 03 00:12:13 crc kubenswrapper[4696]: I1203 00:12:13.040801 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_bd18044c-bd73-4166-83ac-e555f2a587b3/openstack-network-exporter/0.log" Dec 03 00:12:13 crc kubenswrapper[4696]: 
I1203 00:12:13.054735 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_87dfe190-5b7f-48c2-bfa0-97ca227eabb2/ovn-northd/0.log" Dec 03 00:12:13 crc kubenswrapper[4696]: I1203 00:12:13.181404 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_bd18044c-bd73-4166-83ac-e555f2a587b3/ovsdbserver-nb/0.log" Dec 03 00:12:13 crc kubenswrapper[4696]: I1203 00:12:13.290401 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_bd863731-0190-4818-90bb-a7b5b781e616/openstack-network-exporter/0.log" Dec 03 00:12:13 crc kubenswrapper[4696]: I1203 00:12:13.351009 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_bd863731-0190-4818-90bb-a7b5b781e616/ovsdbserver-sb/0.log" Dec 03 00:12:13 crc kubenswrapper[4696]: I1203 00:12:13.731294 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-599746d6dd-mg2dx_fe44184e-95f9-4a2e-a6a4-e2534c44e933/placement-api/0.log" Dec 03 00:12:13 crc kubenswrapper[4696]: I1203 00:12:13.735973 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_ef5ae851-44bb-46fa-9245-abc5b46b1771/init-config-reloader/0.log" Dec 03 00:12:13 crc kubenswrapper[4696]: I1203 00:12:13.839432 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-599746d6dd-mg2dx_fe44184e-95f9-4a2e-a6a4-e2534c44e933/placement-log/0.log" Dec 03 00:12:13 crc kubenswrapper[4696]: I1203 00:12:13.957025 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_ef5ae851-44bb-46fa-9245-abc5b46b1771/init-config-reloader/0.log" Dec 03 00:12:13 crc kubenswrapper[4696]: I1203 00:12:13.990924 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_ef5ae851-44bb-46fa-9245-abc5b46b1771/config-reloader/0.log" Dec 03 00:12:14 crc kubenswrapper[4696]: I1203 
00:12:14.062302 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_ef5ae851-44bb-46fa-9245-abc5b46b1771/prometheus/0.log" Dec 03 00:12:14 crc kubenswrapper[4696]: I1203 00:12:14.092092 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_ef5ae851-44bb-46fa-9245-abc5b46b1771/thanos-sidecar/0.log" Dec 03 00:12:14 crc kubenswrapper[4696]: I1203 00:12:14.266075 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_fcde5666-44ba-4867-a0ed-afb36ecfafc9/setup-container/0.log" Dec 03 00:12:14 crc kubenswrapper[4696]: I1203 00:12:14.485459 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_fcde5666-44ba-4867-a0ed-afb36ecfafc9/rabbitmq/0.log" Dec 03 00:12:14 crc kubenswrapper[4696]: I1203 00:12:14.494559 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_fcde5666-44ba-4867-a0ed-afb36ecfafc9/setup-container/0.log" Dec 03 00:12:14 crc kubenswrapper[4696]: I1203 00:12:14.510962 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_fbc07453-3ac7-469b-ab0e-23ca695250e6/setup-container/0.log" Dec 03 00:12:14 crc kubenswrapper[4696]: I1203 00:12:14.720926 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_fbc07453-3ac7-469b-ab0e-23ca695250e6/setup-container/0.log" Dec 03 00:12:14 crc kubenswrapper[4696]: I1203 00:12:14.756162 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_fbc07453-3ac7-469b-ab0e-23ca695250e6/rabbitmq/0.log" Dec 03 00:12:14 crc kubenswrapper[4696]: I1203 00:12:14.829176 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-sttwb_00408801-09ea-4d50-a657-b01117a2f51b/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 00:12:15 crc 
kubenswrapper[4696]: I1203 00:12:15.044593 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-fjhhb_ba5e6341-d5c7-41b8-adf8-59f3036d3838/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 00:12:15 crc kubenswrapper[4696]: I1203 00:12:15.080044 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-ndfd4_d6b40a87-ecaf-4f50-a3e0-04235ce0f029/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 00:12:15 crc kubenswrapper[4696]: I1203 00:12:15.342393 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-6qlp7_96d33f70-c859-4df1-9e0c-94fa64d60a41/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 00:12:15 crc kubenswrapper[4696]: I1203 00:12:15.348443 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-dkdq9_540fd942-5964-4e7f-a40f-66102876bd8c/ssh-known-hosts-edpm-deployment/0.log" Dec 03 00:12:15 crc kubenswrapper[4696]: I1203 00:12:15.555906 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-5964f98dd9-7q2kj_a76ff35f-36d6-48df-94ed-337199547cd5/proxy-server/0.log" Dec 03 00:12:15 crc kubenswrapper[4696]: I1203 00:12:15.799349 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-kq468_955c99b3-ad42-4e65-a391-47eda1c4130a/swift-ring-rebalance/0.log" Dec 03 00:12:15 crc kubenswrapper[4696]: I1203 00:12:15.809503 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-5964f98dd9-7q2kj_a76ff35f-36d6-48df-94ed-337199547cd5/proxy-httpd/0.log" Dec 03 00:12:15 crc kubenswrapper[4696]: I1203 00:12:15.937867 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d5594f21-8f1d-4105-ad47-c065a9fc468b/account-auditor/0.log" Dec 03 00:12:16 crc kubenswrapper[4696]: I1203 
00:12:16.036642 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d5594f21-8f1d-4105-ad47-c065a9fc468b/account-replicator/0.log" Dec 03 00:12:16 crc kubenswrapper[4696]: I1203 00:12:16.072403 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d5594f21-8f1d-4105-ad47-c065a9fc468b/account-reaper/0.log" Dec 03 00:12:16 crc kubenswrapper[4696]: I1203 00:12:16.127067 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d5594f21-8f1d-4105-ad47-c065a9fc468b/account-server/0.log" Dec 03 00:12:16 crc kubenswrapper[4696]: I1203 00:12:16.177887 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d5594f21-8f1d-4105-ad47-c065a9fc468b/container-auditor/0.log" Dec 03 00:12:16 crc kubenswrapper[4696]: I1203 00:12:16.327039 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d5594f21-8f1d-4105-ad47-c065a9fc468b/container-server/0.log" Dec 03 00:12:16 crc kubenswrapper[4696]: I1203 00:12:16.341655 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d5594f21-8f1d-4105-ad47-c065a9fc468b/container-replicator/0.log" Dec 03 00:12:16 crc kubenswrapper[4696]: I1203 00:12:16.380573 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d5594f21-8f1d-4105-ad47-c065a9fc468b/container-updater/0.log" Dec 03 00:12:16 crc kubenswrapper[4696]: I1203 00:12:16.416781 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d5594f21-8f1d-4105-ad47-c065a9fc468b/object-auditor/0.log" Dec 03 00:12:16 crc kubenswrapper[4696]: I1203 00:12:16.431447 4696 scope.go:117] "RemoveContainer" containerID="c73d4f21ed0c1de0df7ef84929a3066f92fdce038bec491807959001f24c4048" Dec 03 00:12:16 crc kubenswrapper[4696]: E1203 00:12:16.431878 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" 
for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 03 00:12:16 crc kubenswrapper[4696]: I1203 00:12:16.560559 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d5594f21-8f1d-4105-ad47-c065a9fc468b/object-expirer/0.log" Dec 03 00:12:16 crc kubenswrapper[4696]: I1203 00:12:16.588319 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d5594f21-8f1d-4105-ad47-c065a9fc468b/object-server/0.log" Dec 03 00:12:16 crc kubenswrapper[4696]: I1203 00:12:16.676257 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d5594f21-8f1d-4105-ad47-c065a9fc468b/object-replicator/0.log" Dec 03 00:12:16 crc kubenswrapper[4696]: I1203 00:12:16.730496 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d5594f21-8f1d-4105-ad47-c065a9fc468b/object-updater/0.log" Dec 03 00:12:16 crc kubenswrapper[4696]: I1203 00:12:16.802131 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d5594f21-8f1d-4105-ad47-c065a9fc468b/rsync/0.log" Dec 03 00:12:16 crc kubenswrapper[4696]: I1203 00:12:16.832203 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d5594f21-8f1d-4105-ad47-c065a9fc468b/swift-recon-cron/0.log" Dec 03 00:12:17 crc kubenswrapper[4696]: I1203 00:12:17.040001 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-4ljbq_3c9ec356-4712-4484-9b78-9e5d4831dac1/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 00:12:17 crc kubenswrapper[4696]: I1203 00:12:17.740206 4696 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_26aca1cb-cb0c-4cf6-8f7d-833b8da6ed88/test-operator-logs-container/0.log" Dec 03 00:12:17 crc kubenswrapper[4696]: I1203 00:12:17.779426 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_4881d1aa-7494-45fe-b21b-5cae7bfe2f41/tempest-tests-tempest-tests-runner/0.log" Dec 03 00:12:18 crc kubenswrapper[4696]: I1203 00:12:18.070762 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-tfsrw_130877a2-12e6-4731-9f64-675fcfd8a1ce/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 00:12:18 crc kubenswrapper[4696]: I1203 00:12:18.760142 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-applier-0_88764a17-d8c0-447f-923a-4afd6c522e43/watcher-applier/0.log" Dec 03 00:12:19 crc kubenswrapper[4696]: I1203 00:12:19.320955 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_3ae28daa-ab18-478f-ac27-6be4b2d632d3/watcher-api-log/0.log" Dec 03 00:12:20 crc kubenswrapper[4696]: I1203 00:12:20.221298 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-decision-engine-0_9703e3e9-39e6-4c7f-a1ea-324a4f26c18a/watcher-decision-engine/0.log" Dec 03 00:12:22 crc kubenswrapper[4696]: I1203 00:12:22.683129 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_3ae28daa-ab18-478f-ac27-6be4b2d632d3/watcher-api/0.log" Dec 03 00:12:27 crc kubenswrapper[4696]: I1203 00:12:27.442836 4696 scope.go:117] "RemoveContainer" containerID="c73d4f21ed0c1de0df7ef84929a3066f92fdce038bec491807959001f24c4048" Dec 03 00:12:27 crc kubenswrapper[4696]: E1203 00:12:27.443732 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 03 00:12:27 crc kubenswrapper[4696]: I1203 00:12:27.969183 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_87c53ac9-38ab-43a7-b99e-29c47a69f818/memcached/0.log" Dec 03 00:12:38 crc kubenswrapper[4696]: I1203 00:12:38.432090 4696 scope.go:117] "RemoveContainer" containerID="c73d4f21ed0c1de0df7ef84929a3066f92fdce038bec491807959001f24c4048" Dec 03 00:12:38 crc kubenswrapper[4696]: E1203 00:12:38.433257 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 03 00:12:49 crc kubenswrapper[4696]: I1203 00:12:49.432370 4696 scope.go:117] "RemoveContainer" containerID="c73d4f21ed0c1de0df7ef84929a3066f92fdce038bec491807959001f24c4048" Dec 03 00:12:49 crc kubenswrapper[4696]: E1203 00:12:49.433516 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 03 00:12:51 crc kubenswrapper[4696]: I1203 00:12:51.507617 4696 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_4b44fd46acbfb58b875116701dfb531286e4b634e6faeebc661bfd385ezk8mc_e0a517f4-97e3-42f4-a78c-4ba5b8bfcba7/util/0.log" Dec 03 00:12:51 crc kubenswrapper[4696]: I1203 00:12:51.704496 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4b44fd46acbfb58b875116701dfb531286e4b634e6faeebc661bfd385ezk8mc_e0a517f4-97e3-42f4-a78c-4ba5b8bfcba7/pull/0.log" Dec 03 00:12:51 crc kubenswrapper[4696]: I1203 00:12:51.704658 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4b44fd46acbfb58b875116701dfb531286e4b634e6faeebc661bfd385ezk8mc_e0a517f4-97e3-42f4-a78c-4ba5b8bfcba7/pull/0.log" Dec 03 00:12:51 crc kubenswrapper[4696]: I1203 00:12:51.769264 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4b44fd46acbfb58b875116701dfb531286e4b634e6faeebc661bfd385ezk8mc_e0a517f4-97e3-42f4-a78c-4ba5b8bfcba7/util/0.log" Dec 03 00:12:51 crc kubenswrapper[4696]: I1203 00:12:51.960552 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4b44fd46acbfb58b875116701dfb531286e4b634e6faeebc661bfd385ezk8mc_e0a517f4-97e3-42f4-a78c-4ba5b8bfcba7/extract/0.log" Dec 03 00:12:51 crc kubenswrapper[4696]: I1203 00:12:51.964672 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4b44fd46acbfb58b875116701dfb531286e4b634e6faeebc661bfd385ezk8mc_e0a517f4-97e3-42f4-a78c-4ba5b8bfcba7/util/0.log" Dec 03 00:12:51 crc kubenswrapper[4696]: I1203 00:12:51.978565 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4b44fd46acbfb58b875116701dfb531286e4b634e6faeebc661bfd385ezk8mc_e0a517f4-97e3-42f4-a78c-4ba5b8bfcba7/pull/0.log" Dec 03 00:12:52 crc kubenswrapper[4696]: I1203 00:12:52.156173 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-xxm5w_351a13fb-8e8e-4393-adef-28523ab05ccb/kube-rbac-proxy/0.log" Dec 03 00:12:52 crc 
kubenswrapper[4696]: I1203 00:12:52.201356 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-rjlxh_755f9574-a31b-430c-a2a2-92554020d96b/kube-rbac-proxy/0.log" Dec 03 00:12:52 crc kubenswrapper[4696]: I1203 00:12:52.272348 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-xxm5w_351a13fb-8e8e-4393-adef-28523ab05ccb/manager/0.log" Dec 03 00:12:52 crc kubenswrapper[4696]: I1203 00:12:52.430253 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-rjlxh_755f9574-a31b-430c-a2a2-92554020d96b/manager/0.log" Dec 03 00:12:52 crc kubenswrapper[4696]: I1203 00:12:52.473955 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-tzgnf_5706d5c2-8bbe-40b3-8820-0d547363fa96/kube-rbac-proxy/0.log" Dec 03 00:12:52 crc kubenswrapper[4696]: I1203 00:12:52.491352 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-tzgnf_5706d5c2-8bbe-40b3-8820-0d547363fa96/manager/0.log" Dec 03 00:12:52 crc kubenswrapper[4696]: I1203 00:12:52.740977 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987cd8cd-nvqf8_e693a226-52c3-413c-b607-c0050ab5e553/kube-rbac-proxy/0.log" Dec 03 00:12:52 crc kubenswrapper[4696]: I1203 00:12:52.783448 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987cd8cd-nvqf8_e693a226-52c3-413c-b607-c0050ab5e553/manager/0.log" Dec 03 00:12:53 crc kubenswrapper[4696]: I1203 00:12:53.075938 4696 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-s9pk4_5315d589-3bb7-4776-b842-ffc18e1a89e1/kube-rbac-proxy/0.log" Dec 03 00:12:53 crc kubenswrapper[4696]: I1203 00:12:53.190836 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-s9pk4_5315d589-3bb7-4776-b842-ffc18e1a89e1/manager/0.log" Dec 03 00:12:53 crc kubenswrapper[4696]: I1203 00:12:53.196452 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-tng2n_7d7b7caa-1ec3-4e66-9273-36cae02cbe8e/kube-rbac-proxy/0.log" Dec 03 00:12:53 crc kubenswrapper[4696]: I1203 00:12:53.342185 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-tng2n_7d7b7caa-1ec3-4e66-9273-36cae02cbe8e/manager/0.log" Dec 03 00:12:53 crc kubenswrapper[4696]: I1203 00:12:53.426429 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-kz6bs_66d51ef3-89ba-4653-ae46-5469bfc5232e/kube-rbac-proxy/0.log" Dec 03 00:12:53 crc kubenswrapper[4696]: I1203 00:12:53.672913 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-4crn9_bbbbf87d-fa5c-4e9b-9bbc-b19ed043e094/kube-rbac-proxy/0.log" Dec 03 00:12:53 crc kubenswrapper[4696]: I1203 00:12:53.686430 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-4crn9_bbbbf87d-fa5c-4e9b-9bbc-b19ed043e094/manager/0.log" Dec 03 00:12:53 crc kubenswrapper[4696]: I1203 00:12:53.760652 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-kz6bs_66d51ef3-89ba-4653-ae46-5469bfc5232e/manager/0.log" Dec 03 00:12:53 crc kubenswrapper[4696]: I1203 00:12:53.938489 4696 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-vg9kf_9207b2f0-999a-45e4-8234-982f796f7801/kube-rbac-proxy/0.log" Dec 03 00:12:53 crc kubenswrapper[4696]: I1203 00:12:53.987342 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-vg9kf_9207b2f0-999a-45e4-8234-982f796f7801/manager/0.log" Dec 03 00:12:54 crc kubenswrapper[4696]: I1203 00:12:54.127050 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7c79b5df47-k2p9j_5436ce3c-34d6-47eb-81b1-3b4dc1c2d794/kube-rbac-proxy/0.log" Dec 03 00:12:54 crc kubenswrapper[4696]: I1203 00:12:54.161665 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7c79b5df47-k2p9j_5436ce3c-34d6-47eb-81b1-3b4dc1c2d794/manager/0.log" Dec 03 00:12:54 crc kubenswrapper[4696]: I1203 00:12:54.230379 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-tp7td_baad852a-374a-460e-9d5c-cb5418291849/kube-rbac-proxy/0.log" Dec 03 00:12:54 crc kubenswrapper[4696]: I1203 00:12:54.337010 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-tp7td_baad852a-374a-460e-9d5c-cb5418291849/manager/0.log" Dec 03 00:12:54 crc kubenswrapper[4696]: I1203 00:12:54.444608 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-d6qww_1beb3e53-4faf-475f-b5b0-57b8cd32c529/kube-rbac-proxy/0.log" Dec 03 00:12:54 crc kubenswrapper[4696]: I1203 00:12:54.520500 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-d6qww_1beb3e53-4faf-475f-b5b0-57b8cd32c529/manager/0.log" Dec 03 00:12:54 crc 
kubenswrapper[4696]: I1203 00:12:54.591069 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-hfcc6_9c1744f3-fc58-4653-a7e0-4fcdfdfca485/kube-rbac-proxy/0.log" Dec 03 00:12:54 crc kubenswrapper[4696]: I1203 00:12:54.774855 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-hfcc6_9c1744f3-fc58-4653-a7e0-4fcdfdfca485/manager/0.log" Dec 03 00:12:54 crc kubenswrapper[4696]: I1203 00:12:54.799237 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-bpk25_060c8046-7775-413d-9797-ef0edcee01dd/kube-rbac-proxy/0.log" Dec 03 00:12:54 crc kubenswrapper[4696]: I1203 00:12:54.800706 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-bpk25_060c8046-7775-413d-9797-ef0edcee01dd/manager/0.log" Dec 03 00:12:54 crc kubenswrapper[4696]: I1203 00:12:54.972594 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd4jnwwg_6e335d65-9d0f-4ace-97cc-70a4a2bb2291/manager/0.log" Dec 03 00:12:54 crc kubenswrapper[4696]: I1203 00:12:54.980640 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd4jnwwg_6e335d65-9d0f-4ace-97cc-70a4a2bb2291/kube-rbac-proxy/0.log" Dec 03 00:12:55 crc kubenswrapper[4696]: I1203 00:12:55.347856 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-dfb58c988-g96v2_f7e88453-0fd4-401a-92cd-f75809f14f21/operator/0.log" Dec 03 00:12:55 crc kubenswrapper[4696]: I1203 00:12:55.357771 4696 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_openstack-operator-index-k25sp_2bbe83e8-36bc-401e-84b6-917b6aeb6398/registry-server/0.log" Dec 03 00:12:55 crc kubenswrapper[4696]: I1203 00:12:55.571125 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-pwblb_77131fa7-a611-46bf-b0fe-d05d909dfd4c/kube-rbac-proxy/0.log" Dec 03 00:12:55 crc kubenswrapper[4696]: I1203 00:12:55.648677 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-pwblb_77131fa7-a611-46bf-b0fe-d05d909dfd4c/manager/0.log" Dec 03 00:12:55 crc kubenswrapper[4696]: I1203 00:12:55.680418 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-g5pxw_3d6a00c3-b537-414a-8ba4-2797d7bc88f8/kube-rbac-proxy/0.log" Dec 03 00:12:55 crc kubenswrapper[4696]: I1203 00:12:55.846619 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-g5pxw_3d6a00c3-b537-414a-8ba4-2797d7bc88f8/manager/0.log" Dec 03 00:12:55 crc kubenswrapper[4696]: I1203 00:12:55.984865 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-b2wbs_e53eb416-2701-4080-b0a3-bbeae35013a4/operator/0.log" Dec 03 00:12:56 crc kubenswrapper[4696]: I1203 00:12:56.069635 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-m472w_9ab3fbd9-1610-4d0c-aa5e-2c298e5dcc3e/kube-rbac-proxy/0.log" Dec 03 00:12:56 crc kubenswrapper[4696]: I1203 00:12:56.158971 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-744c6b777f-bjtk5_0508daaa-b26a-4f05-9abc-f63ac69fd1d5/manager/0.log" Dec 03 00:12:56 crc kubenswrapper[4696]: I1203 00:12:56.204477 4696 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-m472w_9ab3fbd9-1610-4d0c-aa5e-2c298e5dcc3e/manager/0.log" Dec 03 00:12:56 crc kubenswrapper[4696]: I1203 00:12:56.411422 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-58fzc_45f0f590-24f6-4f01-98a0-a41508a59f5a/kube-rbac-proxy/0.log" Dec 03 00:12:56 crc kubenswrapper[4696]: I1203 00:12:56.455831 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-58fzc_45f0f590-24f6-4f01-98a0-a41508a59f5a/manager/0.log" Dec 03 00:12:56 crc kubenswrapper[4696]: I1203 00:12:56.730565 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-jfcsj_796c18e3-0c33-4393-aba8-2ad03aad4b93/manager/0.log" Dec 03 00:12:56 crc kubenswrapper[4696]: I1203 00:12:56.733685 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-jfcsj_796c18e3-0c33-4393-aba8-2ad03aad4b93/kube-rbac-proxy/0.log" Dec 03 00:12:56 crc kubenswrapper[4696]: I1203 00:12:56.874865 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-d4477bdf4-lxz2l_d417ecee-aebb-4154-ac0c-2c321bd78182/kube-rbac-proxy/0.log" Dec 03 00:12:56 crc kubenswrapper[4696]: I1203 00:12:56.961590 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-d4477bdf4-lxz2l_d417ecee-aebb-4154-ac0c-2c321bd78182/manager/0.log" Dec 03 00:13:01 crc kubenswrapper[4696]: I1203 00:13:01.432911 4696 scope.go:117] "RemoveContainer" containerID="c73d4f21ed0c1de0df7ef84929a3066f92fdce038bec491807959001f24c4048" Dec 03 00:13:01 crc kubenswrapper[4696]: E1203 00:13:01.434334 4696 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 03 00:13:13 crc kubenswrapper[4696]: I1203 00:13:13.431854 4696 scope.go:117] "RemoveContainer" containerID="c73d4f21ed0c1de0df7ef84929a3066f92fdce038bec491807959001f24c4048" Dec 03 00:13:13 crc kubenswrapper[4696]: E1203 00:13:13.433035 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 03 00:13:18 crc kubenswrapper[4696]: I1203 00:13:18.396696 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-58dc7_bfd55522-63bd-40f3-a429-eb0c85fe5b9c/control-plane-machine-set-operator/0.log" Dec 03 00:13:18 crc kubenswrapper[4696]: I1203 00:13:18.635337 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-rttcw_ffa64292-b071-4bfc-93d6-70d65b00847d/machine-api-operator/0.log" Dec 03 00:13:18 crc kubenswrapper[4696]: I1203 00:13:18.689273 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-rttcw_ffa64292-b071-4bfc-93d6-70d65b00847d/kube-rbac-proxy/0.log" Dec 03 00:13:27 crc kubenswrapper[4696]: I1203 00:13:27.439117 4696 scope.go:117] "RemoveContainer" 
containerID="c73d4f21ed0c1de0df7ef84929a3066f92fdce038bec491807959001f24c4048" Dec 03 00:13:27 crc kubenswrapper[4696]: E1203 00:13:27.440139 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 03 00:13:33 crc kubenswrapper[4696]: I1203 00:13:33.459655 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-jbtws_cbe42e42-7252-40cb-bfe8-7484eb822ff9/cert-manager-controller/0.log" Dec 03 00:13:33 crc kubenswrapper[4696]: I1203 00:13:33.737084 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-jlqs7_72d4a613-3c9c-4b7d-a840-3c76247572f6/cert-manager-webhook/0.log" Dec 03 00:13:33 crc kubenswrapper[4696]: I1203 00:13:33.742359 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-n8ntr_2af9e90d-fb84-4f01-9ed3-c0c1eaef6369/cert-manager-cainjector/0.log" Dec 03 00:13:40 crc kubenswrapper[4696]: I1203 00:13:40.433017 4696 scope.go:117] "RemoveContainer" containerID="c73d4f21ed0c1de0df7ef84929a3066f92fdce038bec491807959001f24c4048" Dec 03 00:13:40 crc kubenswrapper[4696]: E1203 00:13:40.434115 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 03 00:13:42 crc 
kubenswrapper[4696]: I1203 00:13:42.785780 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-jwpt9"] Dec 03 00:13:42 crc kubenswrapper[4696]: E1203 00:13:42.787617 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="422c40cd-1ecd-4fff-b74e-3e306b8e3b95" containerName="extract-content" Dec 03 00:13:42 crc kubenswrapper[4696]: I1203 00:13:42.787638 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="422c40cd-1ecd-4fff-b74e-3e306b8e3b95" containerName="extract-content" Dec 03 00:13:42 crc kubenswrapper[4696]: E1203 00:13:42.787697 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="422c40cd-1ecd-4fff-b74e-3e306b8e3b95" containerName="extract-utilities" Dec 03 00:13:42 crc kubenswrapper[4696]: I1203 00:13:42.787711 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="422c40cd-1ecd-4fff-b74e-3e306b8e3b95" containerName="extract-utilities" Dec 03 00:13:42 crc kubenswrapper[4696]: E1203 00:13:42.787775 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="422c40cd-1ecd-4fff-b74e-3e306b8e3b95" containerName="registry-server" Dec 03 00:13:42 crc kubenswrapper[4696]: I1203 00:13:42.787788 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="422c40cd-1ecd-4fff-b74e-3e306b8e3b95" containerName="registry-server" Dec 03 00:13:42 crc kubenswrapper[4696]: I1203 00:13:42.788262 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="422c40cd-1ecd-4fff-b74e-3e306b8e3b95" containerName="registry-server" Dec 03 00:13:42 crc kubenswrapper[4696]: I1203 00:13:42.790658 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jwpt9" Dec 03 00:13:42 crc kubenswrapper[4696]: I1203 00:13:42.800067 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jwpt9"] Dec 03 00:13:42 crc kubenswrapper[4696]: I1203 00:13:42.865108 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0a13cb2-d2e8-4aff-a259-c309f2862df3-catalog-content\") pod \"redhat-marketplace-jwpt9\" (UID: \"d0a13cb2-d2e8-4aff-a259-c309f2862df3\") " pod="openshift-marketplace/redhat-marketplace-jwpt9" Dec 03 00:13:42 crc kubenswrapper[4696]: I1203 00:13:42.865264 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0a13cb2-d2e8-4aff-a259-c309f2862df3-utilities\") pod \"redhat-marketplace-jwpt9\" (UID: \"d0a13cb2-d2e8-4aff-a259-c309f2862df3\") " pod="openshift-marketplace/redhat-marketplace-jwpt9" Dec 03 00:13:42 crc kubenswrapper[4696]: I1203 00:13:42.865356 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mc5l\" (UniqueName: \"kubernetes.io/projected/d0a13cb2-d2e8-4aff-a259-c309f2862df3-kube-api-access-4mc5l\") pod \"redhat-marketplace-jwpt9\" (UID: \"d0a13cb2-d2e8-4aff-a259-c309f2862df3\") " pod="openshift-marketplace/redhat-marketplace-jwpt9" Dec 03 00:13:42 crc kubenswrapper[4696]: I1203 00:13:42.966818 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0a13cb2-d2e8-4aff-a259-c309f2862df3-catalog-content\") pod \"redhat-marketplace-jwpt9\" (UID: \"d0a13cb2-d2e8-4aff-a259-c309f2862df3\") " pod="openshift-marketplace/redhat-marketplace-jwpt9" Dec 03 00:13:42 crc kubenswrapper[4696]: I1203 00:13:42.967211 4696 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0a13cb2-d2e8-4aff-a259-c309f2862df3-utilities\") pod \"redhat-marketplace-jwpt9\" (UID: \"d0a13cb2-d2e8-4aff-a259-c309f2862df3\") " pod="openshift-marketplace/redhat-marketplace-jwpt9" Dec 03 00:13:42 crc kubenswrapper[4696]: I1203 00:13:42.967340 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4mc5l\" (UniqueName: \"kubernetes.io/projected/d0a13cb2-d2e8-4aff-a259-c309f2862df3-kube-api-access-4mc5l\") pod \"redhat-marketplace-jwpt9\" (UID: \"d0a13cb2-d2e8-4aff-a259-c309f2862df3\") " pod="openshift-marketplace/redhat-marketplace-jwpt9" Dec 03 00:13:42 crc kubenswrapper[4696]: I1203 00:13:42.968776 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0a13cb2-d2e8-4aff-a259-c309f2862df3-catalog-content\") pod \"redhat-marketplace-jwpt9\" (UID: \"d0a13cb2-d2e8-4aff-a259-c309f2862df3\") " pod="openshift-marketplace/redhat-marketplace-jwpt9" Dec 03 00:13:42 crc kubenswrapper[4696]: I1203 00:13:42.969211 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0a13cb2-d2e8-4aff-a259-c309f2862df3-utilities\") pod \"redhat-marketplace-jwpt9\" (UID: \"d0a13cb2-d2e8-4aff-a259-c309f2862df3\") " pod="openshift-marketplace/redhat-marketplace-jwpt9" Dec 03 00:13:43 crc kubenswrapper[4696]: I1203 00:13:43.584921 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mc5l\" (UniqueName: \"kubernetes.io/projected/d0a13cb2-d2e8-4aff-a259-c309f2862df3-kube-api-access-4mc5l\") pod \"redhat-marketplace-jwpt9\" (UID: \"d0a13cb2-d2e8-4aff-a259-c309f2862df3\") " pod="openshift-marketplace/redhat-marketplace-jwpt9" Dec 03 00:13:43 crc kubenswrapper[4696]: I1203 00:13:43.725959 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jwpt9" Dec 03 00:13:44 crc kubenswrapper[4696]: I1203 00:13:44.202359 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jwpt9"] Dec 03 00:13:44 crc kubenswrapper[4696]: W1203 00:13:44.206794 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd0a13cb2_d2e8_4aff_a259_c309f2862df3.slice/crio-5aab8a2b2f409933d84343efa4ea7ddfa28489764a4ea16a9e3bfc28cbaadda8 WatchSource:0}: Error finding container 5aab8a2b2f409933d84343efa4ea7ddfa28489764a4ea16a9e3bfc28cbaadda8: Status 404 returned error can't find the container with id 5aab8a2b2f409933d84343efa4ea7ddfa28489764a4ea16a9e3bfc28cbaadda8 Dec 03 00:13:44 crc kubenswrapper[4696]: I1203 00:13:44.583293 4696 generic.go:334] "Generic (PLEG): container finished" podID="d0a13cb2-d2e8-4aff-a259-c309f2862df3" containerID="71655ddaf2bb7fd5056168e15cd39e96165cad417079d2972d79f80388775302" exitCode=0 Dec 03 00:13:44 crc kubenswrapper[4696]: I1203 00:13:44.583399 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jwpt9" event={"ID":"d0a13cb2-d2e8-4aff-a259-c309f2862df3","Type":"ContainerDied","Data":"71655ddaf2bb7fd5056168e15cd39e96165cad417079d2972d79f80388775302"} Dec 03 00:13:44 crc kubenswrapper[4696]: I1203 00:13:44.583912 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jwpt9" event={"ID":"d0a13cb2-d2e8-4aff-a259-c309f2862df3","Type":"ContainerStarted","Data":"5aab8a2b2f409933d84343efa4ea7ddfa28489764a4ea16a9e3bfc28cbaadda8"} Dec 03 00:13:45 crc kubenswrapper[4696]: I1203 00:13:45.597026 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jwpt9" 
event={"ID":"d0a13cb2-d2e8-4aff-a259-c309f2862df3","Type":"ContainerStarted","Data":"7d112d178a87b7f0054fed5eedbc8e1baff55b1dbd23939f8ef00763335c1966"} Dec 03 00:13:46 crc kubenswrapper[4696]: I1203 00:13:46.609880 4696 generic.go:334] "Generic (PLEG): container finished" podID="d0a13cb2-d2e8-4aff-a259-c309f2862df3" containerID="7d112d178a87b7f0054fed5eedbc8e1baff55b1dbd23939f8ef00763335c1966" exitCode=0 Dec 03 00:13:46 crc kubenswrapper[4696]: I1203 00:13:46.609979 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jwpt9" event={"ID":"d0a13cb2-d2e8-4aff-a259-c309f2862df3","Type":"ContainerDied","Data":"7d112d178a87b7f0054fed5eedbc8e1baff55b1dbd23939f8ef00763335c1966"} Dec 03 00:13:47 crc kubenswrapper[4696]: I1203 00:13:47.623995 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jwpt9" event={"ID":"d0a13cb2-d2e8-4aff-a259-c309f2862df3","Type":"ContainerStarted","Data":"ae63fc5c4b3e52bac6d43053e10514c4f9da10c4691e350d290dec717f92821b"} Dec 03 00:13:47 crc kubenswrapper[4696]: I1203 00:13:47.668697 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-jwpt9" podStartSLOduration=3.206602695 podStartE2EDuration="5.668672963s" podCreationTimestamp="2025-12-03 00:13:42 +0000 UTC" firstStartedPulling="2025-12-03 00:13:44.585190525 +0000 UTC m=+5487.465870526" lastFinishedPulling="2025-12-03 00:13:47.047260793 +0000 UTC m=+5489.927940794" observedRunningTime="2025-12-03 00:13:47.656618922 +0000 UTC m=+5490.537298923" watchObservedRunningTime="2025-12-03 00:13:47.668672963 +0000 UTC m=+5490.549352964" Dec 03 00:13:47 crc kubenswrapper[4696]: I1203 00:13:47.901467 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7fbb5f6569-nl682_16ae587c-763d-46f6-b211-e9b3752339c9/nmstate-console-plugin/0.log" Dec 03 00:13:48 crc kubenswrapper[4696]: I1203 00:13:48.068794 
4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-8mz82_e69b657f-75dd-418a-80f8-1e3820f1ff88/nmstate-handler/0.log" Dec 03 00:13:48 crc kubenswrapper[4696]: I1203 00:13:48.147260 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-lcb5l_52790af0-09aa-4b8f-8350-054135e80896/nmstate-metrics/0.log" Dec 03 00:13:48 crc kubenswrapper[4696]: I1203 00:13:48.148001 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-lcb5l_52790af0-09aa-4b8f-8350-054135e80896/kube-rbac-proxy/0.log" Dec 03 00:13:48 crc kubenswrapper[4696]: I1203 00:13:48.348207 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f6d4c5ccb-srjgk_0301b6ea-801b-41a5-b96a-018412c37fc8/nmstate-webhook/0.log" Dec 03 00:13:48 crc kubenswrapper[4696]: I1203 00:13:48.350607 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5b5b58f5c8-cts9d_98262b9a-2be3-48d1-becc-84c3e9585c46/nmstate-operator/0.log" Dec 03 00:13:52 crc kubenswrapper[4696]: I1203 00:13:52.432570 4696 scope.go:117] "RemoveContainer" containerID="c73d4f21ed0c1de0df7ef84929a3066f92fdce038bec491807959001f24c4048" Dec 03 00:13:52 crc kubenswrapper[4696]: E1203 00:13:52.433635 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 03 00:13:53 crc kubenswrapper[4696]: I1203 00:13:53.726322 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-jwpt9" Dec 03 00:13:53 crc 
kubenswrapper[4696]: I1203 00:13:53.726841 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-jwpt9" Dec 03 00:13:53 crc kubenswrapper[4696]: I1203 00:13:53.778264 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-jwpt9" Dec 03 00:13:54 crc kubenswrapper[4696]: I1203 00:13:54.751158 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-jwpt9" Dec 03 00:13:54 crc kubenswrapper[4696]: I1203 00:13:54.807235 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jwpt9"] Dec 03 00:13:56 crc kubenswrapper[4696]: I1203 00:13:56.720389 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-jwpt9" podUID="d0a13cb2-d2e8-4aff-a259-c309f2862df3" containerName="registry-server" containerID="cri-o://ae63fc5c4b3e52bac6d43053e10514c4f9da10c4691e350d290dec717f92821b" gracePeriod=2 Dec 03 00:13:57 crc kubenswrapper[4696]: I1203 00:13:57.250932 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jwpt9" Dec 03 00:13:57 crc kubenswrapper[4696]: I1203 00:13:57.407142 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0a13cb2-d2e8-4aff-a259-c309f2862df3-utilities\") pod \"d0a13cb2-d2e8-4aff-a259-c309f2862df3\" (UID: \"d0a13cb2-d2e8-4aff-a259-c309f2862df3\") " Dec 03 00:13:57 crc kubenswrapper[4696]: I1203 00:13:57.408208 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d0a13cb2-d2e8-4aff-a259-c309f2862df3-utilities" (OuterVolumeSpecName: "utilities") pod "d0a13cb2-d2e8-4aff-a259-c309f2862df3" (UID: "d0a13cb2-d2e8-4aff-a259-c309f2862df3"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 00:13:57 crc kubenswrapper[4696]: I1203 00:13:57.421723 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0a13cb2-d2e8-4aff-a259-c309f2862df3-catalog-content\") pod \"d0a13cb2-d2e8-4aff-a259-c309f2862df3\" (UID: \"d0a13cb2-d2e8-4aff-a259-c309f2862df3\") " Dec 03 00:13:57 crc kubenswrapper[4696]: I1203 00:13:57.422843 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4mc5l\" (UniqueName: \"kubernetes.io/projected/d0a13cb2-d2e8-4aff-a259-c309f2862df3-kube-api-access-4mc5l\") pod \"d0a13cb2-d2e8-4aff-a259-c309f2862df3\" (UID: \"d0a13cb2-d2e8-4aff-a259-c309f2862df3\") " Dec 03 00:13:57 crc kubenswrapper[4696]: I1203 00:13:57.424572 4696 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0a13cb2-d2e8-4aff-a259-c309f2862df3-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 00:13:57 crc kubenswrapper[4696]: I1203 00:13:57.446497 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0a13cb2-d2e8-4aff-a259-c309f2862df3-kube-api-access-4mc5l" (OuterVolumeSpecName: "kube-api-access-4mc5l") pod "d0a13cb2-d2e8-4aff-a259-c309f2862df3" (UID: "d0a13cb2-d2e8-4aff-a259-c309f2862df3"). InnerVolumeSpecName "kube-api-access-4mc5l". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 00:13:57 crc kubenswrapper[4696]: I1203 00:13:57.483161 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d0a13cb2-d2e8-4aff-a259-c309f2862df3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d0a13cb2-d2e8-4aff-a259-c309f2862df3" (UID: "d0a13cb2-d2e8-4aff-a259-c309f2862df3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 00:13:57 crc kubenswrapper[4696]: I1203 00:13:57.526879 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4mc5l\" (UniqueName: \"kubernetes.io/projected/d0a13cb2-d2e8-4aff-a259-c309f2862df3-kube-api-access-4mc5l\") on node \"crc\" DevicePath \"\"" Dec 03 00:13:57 crc kubenswrapper[4696]: I1203 00:13:57.527191 4696 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0a13cb2-d2e8-4aff-a259-c309f2862df3-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 00:13:57 crc kubenswrapper[4696]: I1203 00:13:57.738273 4696 generic.go:334] "Generic (PLEG): container finished" podID="d0a13cb2-d2e8-4aff-a259-c309f2862df3" containerID="ae63fc5c4b3e52bac6d43053e10514c4f9da10c4691e350d290dec717f92821b" exitCode=0 Dec 03 00:13:57 crc kubenswrapper[4696]: I1203 00:13:57.738442 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jwpt9" event={"ID":"d0a13cb2-d2e8-4aff-a259-c309f2862df3","Type":"ContainerDied","Data":"ae63fc5c4b3e52bac6d43053e10514c4f9da10c4691e350d290dec717f92821b"} Dec 03 00:13:57 crc kubenswrapper[4696]: I1203 00:13:57.738578 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jwpt9" Dec 03 00:13:57 crc kubenswrapper[4696]: I1203 00:13:57.738772 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jwpt9" event={"ID":"d0a13cb2-d2e8-4aff-a259-c309f2862df3","Type":"ContainerDied","Data":"5aab8a2b2f409933d84343efa4ea7ddfa28489764a4ea16a9e3bfc28cbaadda8"} Dec 03 00:13:57 crc kubenswrapper[4696]: I1203 00:13:57.738794 4696 scope.go:117] "RemoveContainer" containerID="ae63fc5c4b3e52bac6d43053e10514c4f9da10c4691e350d290dec717f92821b" Dec 03 00:13:57 crc kubenswrapper[4696]: I1203 00:13:57.764398 4696 scope.go:117] "RemoveContainer" containerID="7d112d178a87b7f0054fed5eedbc8e1baff55b1dbd23939f8ef00763335c1966" Dec 03 00:13:57 crc kubenswrapper[4696]: I1203 00:13:57.767359 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jwpt9"] Dec 03 00:13:57 crc kubenswrapper[4696]: I1203 00:13:57.777931 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-jwpt9"] Dec 03 00:13:57 crc kubenswrapper[4696]: I1203 00:13:57.789048 4696 scope.go:117] "RemoveContainer" containerID="71655ddaf2bb7fd5056168e15cd39e96165cad417079d2972d79f80388775302" Dec 03 00:13:58 crc kubenswrapper[4696]: I1203 00:13:58.544780 4696 scope.go:117] "RemoveContainer" containerID="ae63fc5c4b3e52bac6d43053e10514c4f9da10c4691e350d290dec717f92821b" Dec 03 00:13:58 crc kubenswrapper[4696]: E1203 00:13:58.545856 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae63fc5c4b3e52bac6d43053e10514c4f9da10c4691e350d290dec717f92821b\": container with ID starting with ae63fc5c4b3e52bac6d43053e10514c4f9da10c4691e350d290dec717f92821b not found: ID does not exist" containerID="ae63fc5c4b3e52bac6d43053e10514c4f9da10c4691e350d290dec717f92821b" Dec 03 00:13:58 crc kubenswrapper[4696]: I1203 00:13:58.545922 4696 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae63fc5c4b3e52bac6d43053e10514c4f9da10c4691e350d290dec717f92821b"} err="failed to get container status \"ae63fc5c4b3e52bac6d43053e10514c4f9da10c4691e350d290dec717f92821b\": rpc error: code = NotFound desc = could not find container \"ae63fc5c4b3e52bac6d43053e10514c4f9da10c4691e350d290dec717f92821b\": container with ID starting with ae63fc5c4b3e52bac6d43053e10514c4f9da10c4691e350d290dec717f92821b not found: ID does not exist" Dec 03 00:13:58 crc kubenswrapper[4696]: I1203 00:13:58.545965 4696 scope.go:117] "RemoveContainer" containerID="7d112d178a87b7f0054fed5eedbc8e1baff55b1dbd23939f8ef00763335c1966" Dec 03 00:13:58 crc kubenswrapper[4696]: E1203 00:13:58.546663 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d112d178a87b7f0054fed5eedbc8e1baff55b1dbd23939f8ef00763335c1966\": container with ID starting with 7d112d178a87b7f0054fed5eedbc8e1baff55b1dbd23939f8ef00763335c1966 not found: ID does not exist" containerID="7d112d178a87b7f0054fed5eedbc8e1baff55b1dbd23939f8ef00763335c1966" Dec 03 00:13:58 crc kubenswrapper[4696]: I1203 00:13:58.546696 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d112d178a87b7f0054fed5eedbc8e1baff55b1dbd23939f8ef00763335c1966"} err="failed to get container status \"7d112d178a87b7f0054fed5eedbc8e1baff55b1dbd23939f8ef00763335c1966\": rpc error: code = NotFound desc = could not find container \"7d112d178a87b7f0054fed5eedbc8e1baff55b1dbd23939f8ef00763335c1966\": container with ID starting with 7d112d178a87b7f0054fed5eedbc8e1baff55b1dbd23939f8ef00763335c1966 not found: ID does not exist" Dec 03 00:13:58 crc kubenswrapper[4696]: I1203 00:13:58.546714 4696 scope.go:117] "RemoveContainer" containerID="71655ddaf2bb7fd5056168e15cd39e96165cad417079d2972d79f80388775302" Dec 03 00:13:58 crc kubenswrapper[4696]: E1203 
00:13:58.548107 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71655ddaf2bb7fd5056168e15cd39e96165cad417079d2972d79f80388775302\": container with ID starting with 71655ddaf2bb7fd5056168e15cd39e96165cad417079d2972d79f80388775302 not found: ID does not exist" containerID="71655ddaf2bb7fd5056168e15cd39e96165cad417079d2972d79f80388775302" Dec 03 00:13:58 crc kubenswrapper[4696]: I1203 00:13:58.548139 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71655ddaf2bb7fd5056168e15cd39e96165cad417079d2972d79f80388775302"} err="failed to get container status \"71655ddaf2bb7fd5056168e15cd39e96165cad417079d2972d79f80388775302\": rpc error: code = NotFound desc = could not find container \"71655ddaf2bb7fd5056168e15cd39e96165cad417079d2972d79f80388775302\": container with ID starting with 71655ddaf2bb7fd5056168e15cd39e96165cad417079d2972d79f80388775302 not found: ID does not exist" Dec 03 00:13:59 crc kubenswrapper[4696]: I1203 00:13:59.444513 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0a13cb2-d2e8-4aff-a259-c309f2862df3" path="/var/lib/kubelet/pods/d0a13cb2-d2e8-4aff-a259-c309f2862df3/volumes" Dec 03 00:14:04 crc kubenswrapper[4696]: I1203 00:14:04.708700 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-86pps_e4a5e393-9801-4de5-86b3-ac2cb60bcdae/kube-rbac-proxy/0.log" Dec 03 00:14:04 crc kubenswrapper[4696]: I1203 00:14:04.873940 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-86pps_e4a5e393-9801-4de5-86b3-ac2cb60bcdae/controller/0.log" Dec 03 00:14:04 crc kubenswrapper[4696]: I1203 00:14:04.953927 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jwtm6_a4a3825b-89ac-43dc-b2cf-6f9df48d98d9/cp-frr-files/0.log" Dec 03 00:14:05 crc kubenswrapper[4696]: I1203 00:14:05.144298 4696 
log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jwtm6_a4a3825b-89ac-43dc-b2cf-6f9df48d98d9/cp-reloader/0.log" Dec 03 00:14:05 crc kubenswrapper[4696]: I1203 00:14:05.155877 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jwtm6_a4a3825b-89ac-43dc-b2cf-6f9df48d98d9/cp-frr-files/0.log" Dec 03 00:14:05 crc kubenswrapper[4696]: I1203 00:14:05.158376 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jwtm6_a4a3825b-89ac-43dc-b2cf-6f9df48d98d9/cp-metrics/0.log" Dec 03 00:14:05 crc kubenswrapper[4696]: I1203 00:14:05.199425 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jwtm6_a4a3825b-89ac-43dc-b2cf-6f9df48d98d9/cp-reloader/0.log" Dec 03 00:14:05 crc kubenswrapper[4696]: I1203 00:14:05.434472 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jwtm6_a4a3825b-89ac-43dc-b2cf-6f9df48d98d9/cp-frr-files/0.log" Dec 03 00:14:05 crc kubenswrapper[4696]: I1203 00:14:05.449770 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jwtm6_a4a3825b-89ac-43dc-b2cf-6f9df48d98d9/cp-metrics/0.log" Dec 03 00:14:05 crc kubenswrapper[4696]: I1203 00:14:05.452498 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jwtm6_a4a3825b-89ac-43dc-b2cf-6f9df48d98d9/cp-reloader/0.log" Dec 03 00:14:05 crc kubenswrapper[4696]: I1203 00:14:05.505332 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jwtm6_a4a3825b-89ac-43dc-b2cf-6f9df48d98d9/cp-metrics/0.log" Dec 03 00:14:05 crc kubenswrapper[4696]: I1203 00:14:05.686589 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jwtm6_a4a3825b-89ac-43dc-b2cf-6f9df48d98d9/cp-metrics/0.log" Dec 03 00:14:05 crc kubenswrapper[4696]: I1203 00:14:05.690063 4696 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-jwtm6_a4a3825b-89ac-43dc-b2cf-6f9df48d98d9/controller/0.log" Dec 03 00:14:05 crc kubenswrapper[4696]: I1203 00:14:05.694125 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jwtm6_a4a3825b-89ac-43dc-b2cf-6f9df48d98d9/cp-frr-files/0.log" Dec 03 00:14:05 crc kubenswrapper[4696]: I1203 00:14:05.732989 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jwtm6_a4a3825b-89ac-43dc-b2cf-6f9df48d98d9/cp-reloader/0.log" Dec 03 00:14:05 crc kubenswrapper[4696]: I1203 00:14:05.915632 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jwtm6_a4a3825b-89ac-43dc-b2cf-6f9df48d98d9/kube-rbac-proxy/0.log" Dec 03 00:14:05 crc kubenswrapper[4696]: I1203 00:14:05.943395 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jwtm6_a4a3825b-89ac-43dc-b2cf-6f9df48d98d9/frr-metrics/0.log" Dec 03 00:14:06 crc kubenswrapper[4696]: I1203 00:14:06.007101 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jwtm6_a4a3825b-89ac-43dc-b2cf-6f9df48d98d9/kube-rbac-proxy-frr/0.log" Dec 03 00:14:06 crc kubenswrapper[4696]: I1203 00:14:06.129972 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jwtm6_a4a3825b-89ac-43dc-b2cf-6f9df48d98d9/reloader/0.log" Dec 03 00:14:06 crc kubenswrapper[4696]: I1203 00:14:06.259413 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7fcb986d4-p5cdf_bc16ac0a-e284-468e-b6a9-a8b78572ac06/frr-k8s-webhook-server/0.log" Dec 03 00:14:06 crc kubenswrapper[4696]: I1203 00:14:06.427259 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-c67fd5d6c-gjrcg_978a6167-34da-4d05-a693-a9f7f4d865b2/manager/0.log" Dec 03 00:14:06 crc kubenswrapper[4696]: I1203 00:14:06.639811 4696 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_metallb-operator-webhook-server-6c7867ffbb-nxw6n_4e9c6038-441a-483a-b7e3-ff298010cf18/webhook-server/0.log" Dec 03 00:14:06 crc kubenswrapper[4696]: I1203 00:14:06.993233 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-7xqrp_58e7c36f-4f09-4ae1-99ce-e18c2612b6ec/kube-rbac-proxy/0.log" Dec 03 00:14:07 crc kubenswrapper[4696]: I1203 00:14:07.443287 4696 scope.go:117] "RemoveContainer" containerID="c73d4f21ed0c1de0df7ef84929a3066f92fdce038bec491807959001f24c4048" Dec 03 00:14:07 crc kubenswrapper[4696]: E1203 00:14:07.443989 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 03 00:14:07 crc kubenswrapper[4696]: I1203 00:14:07.644281 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-7xqrp_58e7c36f-4f09-4ae1-99ce-e18c2612b6ec/speaker/0.log" Dec 03 00:14:07 crc kubenswrapper[4696]: I1203 00:14:07.803307 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jwtm6_a4a3825b-89ac-43dc-b2cf-6f9df48d98d9/frr/0.log" Dec 03 00:14:19 crc kubenswrapper[4696]: I1203 00:14:19.432329 4696 scope.go:117] "RemoveContainer" containerID="c73d4f21ed0c1de0df7ef84929a3066f92fdce038bec491807959001f24c4048" Dec 03 00:14:19 crc kubenswrapper[4696]: E1203 00:14:19.433568 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 03 00:14:22 crc kubenswrapper[4696]: I1203 00:14:22.478846 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7f559_5c7fcd36-0d45-4703-89ab-df95e3ff5804/util/0.log" Dec 03 00:14:22 crc kubenswrapper[4696]: I1203 00:14:22.648878 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7f559_5c7fcd36-0d45-4703-89ab-df95e3ff5804/util/0.log" Dec 03 00:14:22 crc kubenswrapper[4696]: I1203 00:14:22.651645 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7f559_5c7fcd36-0d45-4703-89ab-df95e3ff5804/pull/0.log" Dec 03 00:14:22 crc kubenswrapper[4696]: I1203 00:14:22.679776 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7f559_5c7fcd36-0d45-4703-89ab-df95e3ff5804/pull/0.log" Dec 03 00:14:22 crc kubenswrapper[4696]: I1203 00:14:22.849863 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7f559_5c7fcd36-0d45-4703-89ab-df95e3ff5804/util/0.log" Dec 03 00:14:22 crc kubenswrapper[4696]: I1203 00:14:22.855721 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7f559_5c7fcd36-0d45-4703-89ab-df95e3ff5804/pull/0.log" Dec 03 00:14:22 crc kubenswrapper[4696]: I1203 00:14:22.871073 4696 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7f559_5c7fcd36-0d45-4703-89ab-df95e3ff5804/extract/0.log" Dec 03 00:14:23 crc kubenswrapper[4696]: I1203 00:14:23.039640 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210tj6d4_62846e70-8410-4122-8d1a-f05e0ac36cc9/util/0.log" Dec 03 00:14:23 crc kubenswrapper[4696]: I1203 00:14:23.212132 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210tj6d4_62846e70-8410-4122-8d1a-f05e0ac36cc9/util/0.log" Dec 03 00:14:23 crc kubenswrapper[4696]: I1203 00:14:23.220080 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210tj6d4_62846e70-8410-4122-8d1a-f05e0ac36cc9/pull/0.log" Dec 03 00:14:23 crc kubenswrapper[4696]: I1203 00:14:23.252311 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210tj6d4_62846e70-8410-4122-8d1a-f05e0ac36cc9/pull/0.log" Dec 03 00:14:23 crc kubenswrapper[4696]: I1203 00:14:23.418993 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210tj6d4_62846e70-8410-4122-8d1a-f05e0ac36cc9/util/0.log" Dec 03 00:14:23 crc kubenswrapper[4696]: I1203 00:14:23.420998 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210tj6d4_62846e70-8410-4122-8d1a-f05e0ac36cc9/pull/0.log" Dec 03 00:14:23 crc kubenswrapper[4696]: I1203 00:14:23.455348 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210tj6d4_62846e70-8410-4122-8d1a-f05e0ac36cc9/extract/0.log" Dec 
03 00:14:23 crc kubenswrapper[4696]: I1203 00:14:23.602678 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83pt4w2_97c440c0-e159-4a54-a3b4-eb53d72ae698/util/0.log" Dec 03 00:14:23 crc kubenswrapper[4696]: I1203 00:14:23.760282 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83pt4w2_97c440c0-e159-4a54-a3b4-eb53d72ae698/util/0.log" Dec 03 00:14:23 crc kubenswrapper[4696]: I1203 00:14:23.790088 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83pt4w2_97c440c0-e159-4a54-a3b4-eb53d72ae698/pull/0.log" Dec 03 00:14:23 crc kubenswrapper[4696]: I1203 00:14:23.794890 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83pt4w2_97c440c0-e159-4a54-a3b4-eb53d72ae698/pull/0.log" Dec 03 00:14:23 crc kubenswrapper[4696]: I1203 00:14:23.986073 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83pt4w2_97c440c0-e159-4a54-a3b4-eb53d72ae698/util/0.log" Dec 03 00:14:23 crc kubenswrapper[4696]: I1203 00:14:23.990664 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83pt4w2_97c440c0-e159-4a54-a3b4-eb53d72ae698/extract/0.log" Dec 03 00:14:24 crc kubenswrapper[4696]: I1203 00:14:24.021321 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83pt4w2_97c440c0-e159-4a54-a3b4-eb53d72ae698/pull/0.log" Dec 03 00:14:24 crc kubenswrapper[4696]: I1203 00:14:24.206360 4696 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-wq2sm_74551327-8467-4679-9951-5dd7042e2a45/extract-utilities/0.log" Dec 03 00:14:24 crc kubenswrapper[4696]: I1203 00:14:24.383164 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wq2sm_74551327-8467-4679-9951-5dd7042e2a45/extract-content/0.log" Dec 03 00:14:24 crc kubenswrapper[4696]: I1203 00:14:24.396907 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wq2sm_74551327-8467-4679-9951-5dd7042e2a45/extract-content/0.log" Dec 03 00:14:24 crc kubenswrapper[4696]: I1203 00:14:24.402102 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wq2sm_74551327-8467-4679-9951-5dd7042e2a45/extract-utilities/0.log" Dec 03 00:14:24 crc kubenswrapper[4696]: I1203 00:14:24.555184 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wq2sm_74551327-8467-4679-9951-5dd7042e2a45/extract-utilities/0.log" Dec 03 00:14:24 crc kubenswrapper[4696]: I1203 00:14:24.574588 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wq2sm_74551327-8467-4679-9951-5dd7042e2a45/extract-content/0.log" Dec 03 00:14:24 crc kubenswrapper[4696]: I1203 00:14:24.804824 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-lq2n7_288c2740-d410-436e-a43f-e9522208e1f1/extract-utilities/0.log" Dec 03 00:14:25 crc kubenswrapper[4696]: I1203 00:14:25.082946 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-lq2n7_288c2740-d410-436e-a43f-e9522208e1f1/extract-content/0.log" Dec 03 00:14:25 crc kubenswrapper[4696]: I1203 00:14:25.083063 4696 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-lq2n7_288c2740-d410-436e-a43f-e9522208e1f1/extract-utilities/0.log" Dec 03 00:14:25 crc kubenswrapper[4696]: I1203 00:14:25.138336 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-lq2n7_288c2740-d410-436e-a43f-e9522208e1f1/extract-content/0.log" Dec 03 00:14:25 crc kubenswrapper[4696]: I1203 00:14:25.310391 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-lq2n7_288c2740-d410-436e-a43f-e9522208e1f1/extract-utilities/0.log" Dec 03 00:14:25 crc kubenswrapper[4696]: I1203 00:14:25.329169 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-lq2n7_288c2740-d410-436e-a43f-e9522208e1f1/extract-content/0.log" Dec 03 00:14:25 crc kubenswrapper[4696]: I1203 00:14:25.333486 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wq2sm_74551327-8467-4679-9951-5dd7042e2a45/registry-server/0.log" Dec 03 00:14:25 crc kubenswrapper[4696]: I1203 00:14:25.573275 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-tnm82_f3c3ea23-4c15-4817-a777-29afe63f580f/extract-utilities/0.log" Dec 03 00:14:25 crc kubenswrapper[4696]: I1203 00:14:25.622309 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-fqzv2_09917c11-8312-4f5a-9597-ad0570d0aeb0/marketplace-operator/0.log" Dec 03 00:14:25 crc kubenswrapper[4696]: I1203 00:14:25.918173 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-tnm82_f3c3ea23-4c15-4817-a777-29afe63f580f/extract-content/0.log" Dec 03 00:14:25 crc kubenswrapper[4696]: I1203 00:14:25.944727 4696 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-tnm82_f3c3ea23-4c15-4817-a777-29afe63f580f/extract-utilities/0.log" Dec 03 00:14:25 crc kubenswrapper[4696]: I1203 00:14:25.963642 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-tnm82_f3c3ea23-4c15-4817-a777-29afe63f580f/extract-content/0.log" Dec 03 00:14:26 crc kubenswrapper[4696]: I1203 00:14:26.133160 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-lq2n7_288c2740-d410-436e-a43f-e9522208e1f1/registry-server/0.log" Dec 03 00:14:26 crc kubenswrapper[4696]: I1203 00:14:26.155260 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-tnm82_f3c3ea23-4c15-4817-a777-29afe63f580f/extract-content/0.log" Dec 03 00:14:26 crc kubenswrapper[4696]: I1203 00:14:26.186779 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-tnm82_f3c3ea23-4c15-4817-a777-29afe63f580f/extract-utilities/0.log" Dec 03 00:14:26 crc kubenswrapper[4696]: I1203 00:14:26.368533 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-tnm82_f3c3ea23-4c15-4817-a777-29afe63f580f/registry-server/0.log" Dec 03 00:14:26 crc kubenswrapper[4696]: I1203 00:14:26.400056 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-nl95f_cae1c862-711b-4fe3-b6a3-f2fefa39b14c/extract-utilities/0.log" Dec 03 00:14:26 crc kubenswrapper[4696]: I1203 00:14:26.535370 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-nl95f_cae1c862-711b-4fe3-b6a3-f2fefa39b14c/extract-utilities/0.log" Dec 03 00:14:26 crc kubenswrapper[4696]: I1203 00:14:26.541326 4696 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-nl95f_cae1c862-711b-4fe3-b6a3-f2fefa39b14c/extract-content/0.log" Dec 03 00:14:26 crc kubenswrapper[4696]: I1203 00:14:26.575788 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-nl95f_cae1c862-711b-4fe3-b6a3-f2fefa39b14c/extract-content/0.log" Dec 03 00:14:26 crc kubenswrapper[4696]: I1203 00:14:26.750351 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-nl95f_cae1c862-711b-4fe3-b6a3-f2fefa39b14c/extract-content/0.log" Dec 03 00:14:26 crc kubenswrapper[4696]: I1203 00:14:26.752099 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-nl95f_cae1c862-711b-4fe3-b6a3-f2fefa39b14c/extract-utilities/0.log" Dec 03 00:14:27 crc kubenswrapper[4696]: I1203 00:14:27.391352 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-nl95f_cae1c862-711b-4fe3-b6a3-f2fefa39b14c/registry-server/0.log" Dec 03 00:14:31 crc kubenswrapper[4696]: I1203 00:14:31.431835 4696 scope.go:117] "RemoveContainer" containerID="c73d4f21ed0c1de0df7ef84929a3066f92fdce038bec491807959001f24c4048" Dec 03 00:14:31 crc kubenswrapper[4696]: E1203 00:14:31.433032 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 03 00:14:40 crc kubenswrapper[4696]: I1203 00:14:40.167525 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-668cf9dfbb-4rjbb_ea569052-61d0-4847-90f1-3e085d6a5363/prometheus-operator/0.log" Dec 03 00:14:40 crc 
kubenswrapper[4696]: I1203 00:14:40.377076 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-bfc855f8c-28cln_e25bae4c-bb72-4fe7-8f1b-f6e61100727c/prometheus-operator-admission-webhook/0.log" Dec 03 00:14:40 crc kubenswrapper[4696]: I1203 00:14:40.403150 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-bfc855f8c-b9nzn_6cef468e-8250-42c5-8ae4-75dccc1b10a5/prometheus-operator-admission-webhook/0.log" Dec 03 00:14:40 crc kubenswrapper[4696]: I1203 00:14:40.607396 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-d8bb48f5d-rc522_9ee2e5e1-ad58-448a-973e-2207d5cde11b/operator/0.log" Dec 03 00:14:40 crc kubenswrapper[4696]: I1203 00:14:40.643218 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5446b9c989-t8wvk_20a6ffb2-7272-4be4-9ed2-ba78389166d6/perses-operator/0.log" Dec 03 00:14:45 crc kubenswrapper[4696]: I1203 00:14:45.431452 4696 scope.go:117] "RemoveContainer" containerID="c73d4f21ed0c1de0df7ef84929a3066f92fdce038bec491807959001f24c4048" Dec 03 00:14:45 crc kubenswrapper[4696]: E1203 00:14:45.432699 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 03 00:14:59 crc kubenswrapper[4696]: I1203 00:14:59.432739 4696 scope.go:117] "RemoveContainer" containerID="c73d4f21ed0c1de0df7ef84929a3066f92fdce038bec491807959001f24c4048" Dec 03 00:14:59 crc kubenswrapper[4696]: E1203 00:14:59.433869 4696 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 03 00:15:00 crc kubenswrapper[4696]: I1203 00:15:00.158658 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412015-pp4nn"] Dec 03 00:15:00 crc kubenswrapper[4696]: E1203 00:15:00.159890 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0a13cb2-d2e8-4aff-a259-c309f2862df3" containerName="registry-server" Dec 03 00:15:00 crc kubenswrapper[4696]: I1203 00:15:00.159918 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0a13cb2-d2e8-4aff-a259-c309f2862df3" containerName="registry-server" Dec 03 00:15:00 crc kubenswrapper[4696]: E1203 00:15:00.159943 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0a13cb2-d2e8-4aff-a259-c309f2862df3" containerName="extract-content" Dec 03 00:15:00 crc kubenswrapper[4696]: I1203 00:15:00.159951 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0a13cb2-d2e8-4aff-a259-c309f2862df3" containerName="extract-content" Dec 03 00:15:00 crc kubenswrapper[4696]: E1203 00:15:00.159980 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0a13cb2-d2e8-4aff-a259-c309f2862df3" containerName="extract-utilities" Dec 03 00:15:00 crc kubenswrapper[4696]: I1203 00:15:00.159987 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0a13cb2-d2e8-4aff-a259-c309f2862df3" containerName="extract-utilities" Dec 03 00:15:00 crc kubenswrapper[4696]: I1203 00:15:00.160207 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0a13cb2-d2e8-4aff-a259-c309f2862df3" containerName="registry-server" Dec 03 00:15:00 crc 
kubenswrapper[4696]: I1203 00:15:00.161265 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412015-pp4nn" Dec 03 00:15:00 crc kubenswrapper[4696]: I1203 00:15:00.166996 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 03 00:15:00 crc kubenswrapper[4696]: I1203 00:15:00.168443 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 03 00:15:00 crc kubenswrapper[4696]: I1203 00:15:00.175271 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412015-pp4nn"] Dec 03 00:15:00 crc kubenswrapper[4696]: I1203 00:15:00.315976 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/be5e360f-8129-40fb-8757-b5e061445745-config-volume\") pod \"collect-profiles-29412015-pp4nn\" (UID: \"be5e360f-8129-40fb-8757-b5e061445745\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412015-pp4nn" Dec 03 00:15:00 crc kubenswrapper[4696]: I1203 00:15:00.316081 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qhnz\" (UniqueName: \"kubernetes.io/projected/be5e360f-8129-40fb-8757-b5e061445745-kube-api-access-5qhnz\") pod \"collect-profiles-29412015-pp4nn\" (UID: \"be5e360f-8129-40fb-8757-b5e061445745\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412015-pp4nn" Dec 03 00:15:00 crc kubenswrapper[4696]: I1203 00:15:00.316199 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/be5e360f-8129-40fb-8757-b5e061445745-secret-volume\") pod \"collect-profiles-29412015-pp4nn\" 
(UID: \"be5e360f-8129-40fb-8757-b5e061445745\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412015-pp4nn" Dec 03 00:15:00 crc kubenswrapper[4696]: I1203 00:15:00.418210 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/be5e360f-8129-40fb-8757-b5e061445745-config-volume\") pod \"collect-profiles-29412015-pp4nn\" (UID: \"be5e360f-8129-40fb-8757-b5e061445745\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412015-pp4nn" Dec 03 00:15:00 crc kubenswrapper[4696]: I1203 00:15:00.418286 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5qhnz\" (UniqueName: \"kubernetes.io/projected/be5e360f-8129-40fb-8757-b5e061445745-kube-api-access-5qhnz\") pod \"collect-profiles-29412015-pp4nn\" (UID: \"be5e360f-8129-40fb-8757-b5e061445745\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412015-pp4nn" Dec 03 00:15:00 crc kubenswrapper[4696]: I1203 00:15:00.418313 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/be5e360f-8129-40fb-8757-b5e061445745-secret-volume\") pod \"collect-profiles-29412015-pp4nn\" (UID: \"be5e360f-8129-40fb-8757-b5e061445745\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412015-pp4nn" Dec 03 00:15:00 crc kubenswrapper[4696]: I1203 00:15:00.419546 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/be5e360f-8129-40fb-8757-b5e061445745-config-volume\") pod \"collect-profiles-29412015-pp4nn\" (UID: \"be5e360f-8129-40fb-8757-b5e061445745\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412015-pp4nn" Dec 03 00:15:00 crc kubenswrapper[4696]: I1203 00:15:00.443340 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/be5e360f-8129-40fb-8757-b5e061445745-secret-volume\") pod \"collect-profiles-29412015-pp4nn\" (UID: \"be5e360f-8129-40fb-8757-b5e061445745\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412015-pp4nn" Dec 03 00:15:00 crc kubenswrapper[4696]: I1203 00:15:00.446685 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qhnz\" (UniqueName: \"kubernetes.io/projected/be5e360f-8129-40fb-8757-b5e061445745-kube-api-access-5qhnz\") pod \"collect-profiles-29412015-pp4nn\" (UID: \"be5e360f-8129-40fb-8757-b5e061445745\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412015-pp4nn" Dec 03 00:15:00 crc kubenswrapper[4696]: I1203 00:15:00.496653 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412015-pp4nn" Dec 03 00:15:01 crc kubenswrapper[4696]: I1203 00:15:01.429865 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412015-pp4nn"] Dec 03 00:15:02 crc kubenswrapper[4696]: I1203 00:15:02.432888 4696 generic.go:334] "Generic (PLEG): container finished" podID="be5e360f-8129-40fb-8757-b5e061445745" containerID="c06094ec829a0f7c7bafcd5d5576b22dd257a7c7b9c3a4d5c34a8fe8cd488b53" exitCode=0 Dec 03 00:15:02 crc kubenswrapper[4696]: I1203 00:15:02.432944 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412015-pp4nn" event={"ID":"be5e360f-8129-40fb-8757-b5e061445745","Type":"ContainerDied","Data":"c06094ec829a0f7c7bafcd5d5576b22dd257a7c7b9c3a4d5c34a8fe8cd488b53"} Dec 03 00:15:02 crc kubenswrapper[4696]: I1203 00:15:02.432984 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412015-pp4nn" 
event={"ID":"be5e360f-8129-40fb-8757-b5e061445745","Type":"ContainerStarted","Data":"9e84e81f9335a36225e481ac2e5c647e112acec9d91b94984e30eb48cdb9ed51"} Dec 03 00:15:03 crc kubenswrapper[4696]: I1203 00:15:03.843175 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412015-pp4nn" Dec 03 00:15:04 crc kubenswrapper[4696]: I1203 00:15:04.006132 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/be5e360f-8129-40fb-8757-b5e061445745-secret-volume\") pod \"be5e360f-8129-40fb-8757-b5e061445745\" (UID: \"be5e360f-8129-40fb-8757-b5e061445745\") " Dec 03 00:15:04 crc kubenswrapper[4696]: I1203 00:15:04.006210 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/be5e360f-8129-40fb-8757-b5e061445745-config-volume\") pod \"be5e360f-8129-40fb-8757-b5e061445745\" (UID: \"be5e360f-8129-40fb-8757-b5e061445745\") " Dec 03 00:15:04 crc kubenswrapper[4696]: I1203 00:15:04.006283 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5qhnz\" (UniqueName: \"kubernetes.io/projected/be5e360f-8129-40fb-8757-b5e061445745-kube-api-access-5qhnz\") pod \"be5e360f-8129-40fb-8757-b5e061445745\" (UID: \"be5e360f-8129-40fb-8757-b5e061445745\") " Dec 03 00:15:04 crc kubenswrapper[4696]: I1203 00:15:04.008049 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be5e360f-8129-40fb-8757-b5e061445745-config-volume" (OuterVolumeSpecName: "config-volume") pod "be5e360f-8129-40fb-8757-b5e061445745" (UID: "be5e360f-8129-40fb-8757-b5e061445745"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 00:15:04 crc kubenswrapper[4696]: I1203 00:15:04.014637 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be5e360f-8129-40fb-8757-b5e061445745-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "be5e360f-8129-40fb-8757-b5e061445745" (UID: "be5e360f-8129-40fb-8757-b5e061445745"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 00:15:04 crc kubenswrapper[4696]: I1203 00:15:04.023731 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be5e360f-8129-40fb-8757-b5e061445745-kube-api-access-5qhnz" (OuterVolumeSpecName: "kube-api-access-5qhnz") pod "be5e360f-8129-40fb-8757-b5e061445745" (UID: "be5e360f-8129-40fb-8757-b5e061445745"). InnerVolumeSpecName "kube-api-access-5qhnz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 00:15:04 crc kubenswrapper[4696]: I1203 00:15:04.108994 4696 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/be5e360f-8129-40fb-8757-b5e061445745-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 03 00:15:04 crc kubenswrapper[4696]: I1203 00:15:04.109039 4696 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/be5e360f-8129-40fb-8757-b5e061445745-config-volume\") on node \"crc\" DevicePath \"\"" Dec 03 00:15:04 crc kubenswrapper[4696]: I1203 00:15:04.109056 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5qhnz\" (UniqueName: \"kubernetes.io/projected/be5e360f-8129-40fb-8757-b5e061445745-kube-api-access-5qhnz\") on node \"crc\" DevicePath \"\"" Dec 03 00:15:04 crc kubenswrapper[4696]: I1203 00:15:04.461376 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412015-pp4nn" 
event={"ID":"be5e360f-8129-40fb-8757-b5e061445745","Type":"ContainerDied","Data":"9e84e81f9335a36225e481ac2e5c647e112acec9d91b94984e30eb48cdb9ed51"} Dec 03 00:15:04 crc kubenswrapper[4696]: I1203 00:15:04.461432 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9e84e81f9335a36225e481ac2e5c647e112acec9d91b94984e30eb48cdb9ed51" Dec 03 00:15:04 crc kubenswrapper[4696]: I1203 00:15:04.461473 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412015-pp4nn" Dec 03 00:15:04 crc kubenswrapper[4696]: I1203 00:15:04.941633 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411970-d2zkl"] Dec 03 00:15:04 crc kubenswrapper[4696]: I1203 00:15:04.957453 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411970-d2zkl"] Dec 03 00:15:05 crc kubenswrapper[4696]: I1203 00:15:05.446600 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9deb9e1b-418a-413b-8329-fbbfadd5660d" path="/var/lib/kubelet/pods/9deb9e1b-418a-413b-8329-fbbfadd5660d/volumes" Dec 03 00:15:11 crc kubenswrapper[4696]: I1203 00:15:11.433630 4696 scope.go:117] "RemoveContainer" containerID="c73d4f21ed0c1de0df7ef84929a3066f92fdce038bec491807959001f24c4048" Dec 03 00:15:11 crc kubenswrapper[4696]: E1203 00:15:11.435342 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 03 00:15:26 crc kubenswrapper[4696]: I1203 00:15:26.432356 4696 scope.go:117] "RemoveContainer" 
containerID="c73d4f21ed0c1de0df7ef84929a3066f92fdce038bec491807959001f24c4048" Dec 03 00:15:26 crc kubenswrapper[4696]: I1203 00:15:26.745130 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-chq65" event={"ID":"53353260-c7c9-435c-91eb-3d5a1b441c4a","Type":"ContainerStarted","Data":"945ab243f0a97339bf35954545bf3af34070f99574e9ea3276f59a116fa55000"} Dec 03 00:15:39 crc kubenswrapper[4696]: I1203 00:15:39.370897 4696 scope.go:117] "RemoveContainer" containerID="c91076e2a94cdd9aedf04eb78e7256e8304d3460a835d5d53f35d30170a06b1f" Dec 03 00:16:47 crc kubenswrapper[4696]: I1203 00:16:47.663291 4696 generic.go:334] "Generic (PLEG): container finished" podID="f800a297-5f59-4fae-9a3a-b326cc8a29e4" containerID="ad75b1bf28574cae626e24eddad4d8c5cb2b282253797069ab512a4193416116" exitCode=0 Dec 03 00:16:47 crc kubenswrapper[4696]: I1203 00:16:47.663388 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6s7rp/must-gather-9r5nt" event={"ID":"f800a297-5f59-4fae-9a3a-b326cc8a29e4","Type":"ContainerDied","Data":"ad75b1bf28574cae626e24eddad4d8c5cb2b282253797069ab512a4193416116"} Dec 03 00:16:47 crc kubenswrapper[4696]: I1203 00:16:47.664857 4696 scope.go:117] "RemoveContainer" containerID="ad75b1bf28574cae626e24eddad4d8c5cb2b282253797069ab512a4193416116" Dec 03 00:16:47 crc kubenswrapper[4696]: I1203 00:16:47.899487 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-6s7rp_must-gather-9r5nt_f800a297-5f59-4fae-9a3a-b326cc8a29e4/gather/0.log" Dec 03 00:16:57 crc kubenswrapper[4696]: I1203 00:16:57.077159 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-6s7rp/must-gather-9r5nt"] Dec 03 00:16:57 crc kubenswrapper[4696]: I1203 00:16:57.078403 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-6s7rp/must-gather-9r5nt" podUID="f800a297-5f59-4fae-9a3a-b326cc8a29e4" 
containerName="copy" containerID="cri-o://73126af098db408bc4653e5b46bc14137de0de5e06d40c579638455c34dd4108" gracePeriod=2 Dec 03 00:16:57 crc kubenswrapper[4696]: I1203 00:16:57.090703 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-6s7rp/must-gather-9r5nt"] Dec 03 00:16:57 crc kubenswrapper[4696]: I1203 00:16:57.558886 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-6s7rp_must-gather-9r5nt_f800a297-5f59-4fae-9a3a-b326cc8a29e4/copy/0.log" Dec 03 00:16:57 crc kubenswrapper[4696]: I1203 00:16:57.559794 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6s7rp/must-gather-9r5nt" Dec 03 00:16:57 crc kubenswrapper[4696]: I1203 00:16:57.749294 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f800a297-5f59-4fae-9a3a-b326cc8a29e4-must-gather-output\") pod \"f800a297-5f59-4fae-9a3a-b326cc8a29e4\" (UID: \"f800a297-5f59-4fae-9a3a-b326cc8a29e4\") " Dec 03 00:16:57 crc kubenswrapper[4696]: I1203 00:16:57.749391 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zb9wq\" (UniqueName: \"kubernetes.io/projected/f800a297-5f59-4fae-9a3a-b326cc8a29e4-kube-api-access-zb9wq\") pod \"f800a297-5f59-4fae-9a3a-b326cc8a29e4\" (UID: \"f800a297-5f59-4fae-9a3a-b326cc8a29e4\") " Dec 03 00:16:57 crc kubenswrapper[4696]: I1203 00:16:57.757914 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f800a297-5f59-4fae-9a3a-b326cc8a29e4-kube-api-access-zb9wq" (OuterVolumeSpecName: "kube-api-access-zb9wq") pod "f800a297-5f59-4fae-9a3a-b326cc8a29e4" (UID: "f800a297-5f59-4fae-9a3a-b326cc8a29e4"). InnerVolumeSpecName "kube-api-access-zb9wq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 00:16:57 crc kubenswrapper[4696]: I1203 00:16:57.771663 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-6s7rp_must-gather-9r5nt_f800a297-5f59-4fae-9a3a-b326cc8a29e4/copy/0.log" Dec 03 00:16:57 crc kubenswrapper[4696]: I1203 00:16:57.772219 4696 generic.go:334] "Generic (PLEG): container finished" podID="f800a297-5f59-4fae-9a3a-b326cc8a29e4" containerID="73126af098db408bc4653e5b46bc14137de0de5e06d40c579638455c34dd4108" exitCode=143 Dec 03 00:16:57 crc kubenswrapper[4696]: I1203 00:16:57.772282 4696 scope.go:117] "RemoveContainer" containerID="73126af098db408bc4653e5b46bc14137de0de5e06d40c579638455c34dd4108" Dec 03 00:16:57 crc kubenswrapper[4696]: I1203 00:16:57.772336 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6s7rp/must-gather-9r5nt" Dec 03 00:16:57 crc kubenswrapper[4696]: I1203 00:16:57.824662 4696 scope.go:117] "RemoveContainer" containerID="ad75b1bf28574cae626e24eddad4d8c5cb2b282253797069ab512a4193416116" Dec 03 00:16:57 crc kubenswrapper[4696]: I1203 00:16:57.851896 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zb9wq\" (UniqueName: \"kubernetes.io/projected/f800a297-5f59-4fae-9a3a-b326cc8a29e4-kube-api-access-zb9wq\") on node \"crc\" DevicePath \"\"" Dec 03 00:16:57 crc kubenswrapper[4696]: I1203 00:16:57.926581 4696 scope.go:117] "RemoveContainer" containerID="73126af098db408bc4653e5b46bc14137de0de5e06d40c579638455c34dd4108" Dec 03 00:16:57 crc kubenswrapper[4696]: E1203 00:16:57.927511 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73126af098db408bc4653e5b46bc14137de0de5e06d40c579638455c34dd4108\": container with ID starting with 73126af098db408bc4653e5b46bc14137de0de5e06d40c579638455c34dd4108 not found: ID does not exist" 
containerID="73126af098db408bc4653e5b46bc14137de0de5e06d40c579638455c34dd4108" Dec 03 00:16:57 crc kubenswrapper[4696]: I1203 00:16:57.927583 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73126af098db408bc4653e5b46bc14137de0de5e06d40c579638455c34dd4108"} err="failed to get container status \"73126af098db408bc4653e5b46bc14137de0de5e06d40c579638455c34dd4108\": rpc error: code = NotFound desc = could not find container \"73126af098db408bc4653e5b46bc14137de0de5e06d40c579638455c34dd4108\": container with ID starting with 73126af098db408bc4653e5b46bc14137de0de5e06d40c579638455c34dd4108 not found: ID does not exist" Dec 03 00:16:57 crc kubenswrapper[4696]: I1203 00:16:57.927633 4696 scope.go:117] "RemoveContainer" containerID="ad75b1bf28574cae626e24eddad4d8c5cb2b282253797069ab512a4193416116" Dec 03 00:16:57 crc kubenswrapper[4696]: E1203 00:16:57.927952 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad75b1bf28574cae626e24eddad4d8c5cb2b282253797069ab512a4193416116\": container with ID starting with ad75b1bf28574cae626e24eddad4d8c5cb2b282253797069ab512a4193416116 not found: ID does not exist" containerID="ad75b1bf28574cae626e24eddad4d8c5cb2b282253797069ab512a4193416116" Dec 03 00:16:57 crc kubenswrapper[4696]: I1203 00:16:57.927985 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad75b1bf28574cae626e24eddad4d8c5cb2b282253797069ab512a4193416116"} err="failed to get container status \"ad75b1bf28574cae626e24eddad4d8c5cb2b282253797069ab512a4193416116\": rpc error: code = NotFound desc = could not find container \"ad75b1bf28574cae626e24eddad4d8c5cb2b282253797069ab512a4193416116\": container with ID starting with ad75b1bf28574cae626e24eddad4d8c5cb2b282253797069ab512a4193416116 not found: ID does not exist" Dec 03 00:16:57 crc kubenswrapper[4696]: I1203 00:16:57.961298 4696 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f800a297-5f59-4fae-9a3a-b326cc8a29e4-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "f800a297-5f59-4fae-9a3a-b326cc8a29e4" (UID: "f800a297-5f59-4fae-9a3a-b326cc8a29e4"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 00:16:58 crc kubenswrapper[4696]: I1203 00:16:58.056056 4696 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f800a297-5f59-4fae-9a3a-b326cc8a29e4-must-gather-output\") on node \"crc\" DevicePath \"\"" Dec 03 00:16:59 crc kubenswrapper[4696]: I1203 00:16:59.451147 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f800a297-5f59-4fae-9a3a-b326cc8a29e4" path="/var/lib/kubelet/pods/f800a297-5f59-4fae-9a3a-b326cc8a29e4/volumes" Dec 03 00:17:39 crc kubenswrapper[4696]: I1203 00:17:39.509972 4696 scope.go:117] "RemoveContainer" containerID="720a12430aa7ec09e3e7ece4379704a8607d4a03fcd5a6572a84f519ae961b2b" Dec 03 00:17:52 crc kubenswrapper[4696]: I1203 00:17:52.974577 4696 patch_prober.go:28] interesting pod/machine-config-daemon-chq65 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 00:17:52 crc kubenswrapper[4696]: I1203 00:17:52.975321 4696 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 00:18:22 crc kubenswrapper[4696]: I1203 00:18:22.973882 4696 patch_prober.go:28] interesting pod/machine-config-daemon-chq65 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 00:18:22 crc kubenswrapper[4696]: I1203 00:18:22.974715 4696 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 00:18:39 crc kubenswrapper[4696]: I1203 00:18:39.601452 4696 scope.go:117] "RemoveContainer" containerID="7b4eb6ab2bde2dd523b784a45bf6c962c7fa2f3f5f2909322b77165ee09d4162" Dec 03 00:18:39 crc kubenswrapper[4696]: I1203 00:18:39.631241 4696 scope.go:117] "RemoveContainer" containerID="c13148787d81caf472c56006e3fc8829135170a5cb47efffe782132a3235f6e6" Dec 03 00:18:39 crc kubenswrapper[4696]: I1203 00:18:39.685580 4696 scope.go:117] "RemoveContainer" containerID="780e40d51476365b86a3d73419edae58e223810d51d908cbe95c60e9e13df9da" Dec 03 00:18:52 crc kubenswrapper[4696]: I1203 00:18:52.974015 4696 patch_prober.go:28] interesting pod/machine-config-daemon-chq65 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 00:18:52 crc kubenswrapper[4696]: I1203 00:18:52.975000 4696 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 00:18:52 crc kubenswrapper[4696]: I1203 00:18:52.975073 4696 kubelet.go:2542] "SyncLoop 
(probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-chq65" Dec 03 00:18:52 crc kubenswrapper[4696]: I1203 00:18:52.976223 4696 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"945ab243f0a97339bf35954545bf3af34070f99574e9ea3276f59a116fa55000"} pod="openshift-machine-config-operator/machine-config-daemon-chq65" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 00:18:52 crc kubenswrapper[4696]: I1203 00:18:52.976298 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" containerName="machine-config-daemon" containerID="cri-o://945ab243f0a97339bf35954545bf3af34070f99574e9ea3276f59a116fa55000" gracePeriod=600 Dec 03 00:18:53 crc kubenswrapper[4696]: I1203 00:18:53.256590 4696 generic.go:334] "Generic (PLEG): container finished" podID="53353260-c7c9-435c-91eb-3d5a1b441c4a" containerID="945ab243f0a97339bf35954545bf3af34070f99574e9ea3276f59a116fa55000" exitCode=0 Dec 03 00:18:53 crc kubenswrapper[4696]: I1203 00:18:53.256658 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-chq65" event={"ID":"53353260-c7c9-435c-91eb-3d5a1b441c4a","Type":"ContainerDied","Data":"945ab243f0a97339bf35954545bf3af34070f99574e9ea3276f59a116fa55000"} Dec 03 00:18:53 crc kubenswrapper[4696]: I1203 00:18:53.256772 4696 scope.go:117] "RemoveContainer" containerID="c73d4f21ed0c1de0df7ef84929a3066f92fdce038bec491807959001f24c4048" Dec 03 00:18:54 crc kubenswrapper[4696]: I1203 00:18:54.271130 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-chq65" 
event={"ID":"53353260-c7c9-435c-91eb-3d5a1b441c4a","Type":"ContainerStarted","Data":"1063405401fd3e6749f5e5ad03d2ffa1ed3069796fcfb6e3add08f006a670077"} Dec 03 00:19:18 crc kubenswrapper[4696]: I1203 00:19:18.418309 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vbqg2"] Dec 03 00:19:18 crc kubenswrapper[4696]: E1203 00:19:18.419985 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be5e360f-8129-40fb-8757-b5e061445745" containerName="collect-profiles" Dec 03 00:19:18 crc kubenswrapper[4696]: I1203 00:19:18.420011 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="be5e360f-8129-40fb-8757-b5e061445745" containerName="collect-profiles" Dec 03 00:19:18 crc kubenswrapper[4696]: E1203 00:19:18.420055 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f800a297-5f59-4fae-9a3a-b326cc8a29e4" containerName="gather" Dec 03 00:19:18 crc kubenswrapper[4696]: I1203 00:19:18.420070 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="f800a297-5f59-4fae-9a3a-b326cc8a29e4" containerName="gather" Dec 03 00:19:18 crc kubenswrapper[4696]: E1203 00:19:18.420118 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f800a297-5f59-4fae-9a3a-b326cc8a29e4" containerName="copy" Dec 03 00:19:18 crc kubenswrapper[4696]: I1203 00:19:18.420129 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="f800a297-5f59-4fae-9a3a-b326cc8a29e4" containerName="copy" Dec 03 00:19:18 crc kubenswrapper[4696]: I1203 00:19:18.420481 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="f800a297-5f59-4fae-9a3a-b326cc8a29e4" containerName="gather" Dec 03 00:19:18 crc kubenswrapper[4696]: I1203 00:19:18.420515 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="f800a297-5f59-4fae-9a3a-b326cc8a29e4" containerName="copy" Dec 03 00:19:18 crc kubenswrapper[4696]: I1203 00:19:18.420530 4696 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="be5e360f-8129-40fb-8757-b5e061445745" containerName="collect-profiles" Dec 03 00:19:18 crc kubenswrapper[4696]: I1203 00:19:18.423583 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vbqg2" Dec 03 00:19:18 crc kubenswrapper[4696]: I1203 00:19:18.479804 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vbqg2"] Dec 03 00:19:18 crc kubenswrapper[4696]: I1203 00:19:18.502241 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjq5q\" (UniqueName: \"kubernetes.io/projected/94876c41-1f8e-4287-be58-19b9376e71db-kube-api-access-mjq5q\") pod \"certified-operators-vbqg2\" (UID: \"94876c41-1f8e-4287-be58-19b9376e71db\") " pod="openshift-marketplace/certified-operators-vbqg2" Dec 03 00:19:18 crc kubenswrapper[4696]: I1203 00:19:18.502312 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94876c41-1f8e-4287-be58-19b9376e71db-catalog-content\") pod \"certified-operators-vbqg2\" (UID: \"94876c41-1f8e-4287-be58-19b9376e71db\") " pod="openshift-marketplace/certified-operators-vbqg2" Dec 03 00:19:18 crc kubenswrapper[4696]: I1203 00:19:18.502470 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94876c41-1f8e-4287-be58-19b9376e71db-utilities\") pod \"certified-operators-vbqg2\" (UID: \"94876c41-1f8e-4287-be58-19b9376e71db\") " pod="openshift-marketplace/certified-operators-vbqg2" Dec 03 00:19:18 crc kubenswrapper[4696]: I1203 00:19:18.607313 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94876c41-1f8e-4287-be58-19b9376e71db-catalog-content\") pod \"certified-operators-vbqg2\" (UID: 
\"94876c41-1f8e-4287-be58-19b9376e71db\") " pod="openshift-marketplace/certified-operators-vbqg2" Dec 03 00:19:18 crc kubenswrapper[4696]: I1203 00:19:18.607879 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94876c41-1f8e-4287-be58-19b9376e71db-utilities\") pod \"certified-operators-vbqg2\" (UID: \"94876c41-1f8e-4287-be58-19b9376e71db\") " pod="openshift-marketplace/certified-operators-vbqg2" Dec 03 00:19:18 crc kubenswrapper[4696]: I1203 00:19:18.608110 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjq5q\" (UniqueName: \"kubernetes.io/projected/94876c41-1f8e-4287-be58-19b9376e71db-kube-api-access-mjq5q\") pod \"certified-operators-vbqg2\" (UID: \"94876c41-1f8e-4287-be58-19b9376e71db\") " pod="openshift-marketplace/certified-operators-vbqg2" Dec 03 00:19:18 crc kubenswrapper[4696]: I1203 00:19:18.609269 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94876c41-1f8e-4287-be58-19b9376e71db-catalog-content\") pod \"certified-operators-vbqg2\" (UID: \"94876c41-1f8e-4287-be58-19b9376e71db\") " pod="openshift-marketplace/certified-operators-vbqg2" Dec 03 00:19:18 crc kubenswrapper[4696]: I1203 00:19:18.609678 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94876c41-1f8e-4287-be58-19b9376e71db-utilities\") pod \"certified-operators-vbqg2\" (UID: \"94876c41-1f8e-4287-be58-19b9376e71db\") " pod="openshift-marketplace/certified-operators-vbqg2" Dec 03 00:19:18 crc kubenswrapper[4696]: I1203 00:19:18.637958 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjq5q\" (UniqueName: \"kubernetes.io/projected/94876c41-1f8e-4287-be58-19b9376e71db-kube-api-access-mjq5q\") pod \"certified-operators-vbqg2\" (UID: 
\"94876c41-1f8e-4287-be58-19b9376e71db\") " pod="openshift-marketplace/certified-operators-vbqg2" Dec 03 00:19:18 crc kubenswrapper[4696]: I1203 00:19:18.773073 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vbqg2" Dec 03 00:19:19 crc kubenswrapper[4696]: I1203 00:19:19.313192 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vbqg2"] Dec 03 00:19:19 crc kubenswrapper[4696]: I1203 00:19:19.568447 4696 generic.go:334] "Generic (PLEG): container finished" podID="94876c41-1f8e-4287-be58-19b9376e71db" containerID="33e81684966a6f4d39c5efb04e86ce9bd5290556106eeb33674831b2752c22f8" exitCode=0 Dec 03 00:19:19 crc kubenswrapper[4696]: I1203 00:19:19.568508 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vbqg2" event={"ID":"94876c41-1f8e-4287-be58-19b9376e71db","Type":"ContainerDied","Data":"33e81684966a6f4d39c5efb04e86ce9bd5290556106eeb33674831b2752c22f8"} Dec 03 00:19:19 crc kubenswrapper[4696]: I1203 00:19:19.568546 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vbqg2" event={"ID":"94876c41-1f8e-4287-be58-19b9376e71db","Type":"ContainerStarted","Data":"33227a43005616495274b109ed9389afc9494a95c505e8876f5889dd55d2d282"} Dec 03 00:19:19 crc kubenswrapper[4696]: I1203 00:19:19.570760 4696 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 00:19:20 crc kubenswrapper[4696]: I1203 00:19:20.583829 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vbqg2" event={"ID":"94876c41-1f8e-4287-be58-19b9376e71db","Type":"ContainerStarted","Data":"4dddf83c50c044604357a5a58d9338b11a19e3afb8411bf34feee5787af09535"} Dec 03 00:19:21 crc kubenswrapper[4696]: I1203 00:19:21.594443 4696 generic.go:334] "Generic (PLEG): container finished" 
podID="94876c41-1f8e-4287-be58-19b9376e71db" containerID="4dddf83c50c044604357a5a58d9338b11a19e3afb8411bf34feee5787af09535" exitCode=0 Dec 03 00:19:21 crc kubenswrapper[4696]: I1203 00:19:21.594564 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vbqg2" event={"ID":"94876c41-1f8e-4287-be58-19b9376e71db","Type":"ContainerDied","Data":"4dddf83c50c044604357a5a58d9338b11a19e3afb8411bf34feee5787af09535"} Dec 03 00:19:22 crc kubenswrapper[4696]: I1203 00:19:22.619127 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vbqg2" event={"ID":"94876c41-1f8e-4287-be58-19b9376e71db","Type":"ContainerStarted","Data":"8475183aa2ea2a74aa8517f271f54effa5b76794351f07a4e278d137f8e14791"} Dec 03 00:19:22 crc kubenswrapper[4696]: I1203 00:19:22.679101 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vbqg2" podStartSLOduration=2.245447733 podStartE2EDuration="4.679074797s" podCreationTimestamp="2025-12-03 00:19:18 +0000 UTC" firstStartedPulling="2025-12-03 00:19:19.570435117 +0000 UTC m=+5822.451115118" lastFinishedPulling="2025-12-03 00:19:22.004062191 +0000 UTC m=+5824.884742182" observedRunningTime="2025-12-03 00:19:22.662384195 +0000 UTC m=+5825.543064186" watchObservedRunningTime="2025-12-03 00:19:22.679074797 +0000 UTC m=+5825.559754818" Dec 03 00:19:28 crc kubenswrapper[4696]: I1203 00:19:28.774231 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-vbqg2" Dec 03 00:19:28 crc kubenswrapper[4696]: I1203 00:19:28.775132 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-vbqg2" Dec 03 00:19:29 crc kubenswrapper[4696]: I1203 00:19:29.116708 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vbqg2" Dec 03 
00:19:29 crc kubenswrapper[4696]: I1203 00:19:29.756377 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-vbqg2" Dec 03 00:19:29 crc kubenswrapper[4696]: I1203 00:19:29.816765 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vbqg2"] Dec 03 00:19:31 crc kubenswrapper[4696]: I1203 00:19:31.708245 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-vbqg2" podUID="94876c41-1f8e-4287-be58-19b9376e71db" containerName="registry-server" containerID="cri-o://8475183aa2ea2a74aa8517f271f54effa5b76794351f07a4e278d137f8e14791" gracePeriod=2 Dec 03 00:19:32 crc kubenswrapper[4696]: I1203 00:19:32.234457 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vbqg2" Dec 03 00:19:32 crc kubenswrapper[4696]: I1203 00:19:32.404118 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94876c41-1f8e-4287-be58-19b9376e71db-catalog-content\") pod \"94876c41-1f8e-4287-be58-19b9376e71db\" (UID: \"94876c41-1f8e-4287-be58-19b9376e71db\") " Dec 03 00:19:32 crc kubenswrapper[4696]: I1203 00:19:32.404494 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94876c41-1f8e-4287-be58-19b9376e71db-utilities\") pod \"94876c41-1f8e-4287-be58-19b9376e71db\" (UID: \"94876c41-1f8e-4287-be58-19b9376e71db\") " Dec 03 00:19:32 crc kubenswrapper[4696]: I1203 00:19:32.404663 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mjq5q\" (UniqueName: \"kubernetes.io/projected/94876c41-1f8e-4287-be58-19b9376e71db-kube-api-access-mjq5q\") pod \"94876c41-1f8e-4287-be58-19b9376e71db\" (UID: \"94876c41-1f8e-4287-be58-19b9376e71db\") " Dec 
03 00:19:32 crc kubenswrapper[4696]: I1203 00:19:32.405956 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94876c41-1f8e-4287-be58-19b9376e71db-utilities" (OuterVolumeSpecName: "utilities") pod "94876c41-1f8e-4287-be58-19b9376e71db" (UID: "94876c41-1f8e-4287-be58-19b9376e71db"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 00:19:32 crc kubenswrapper[4696]: I1203 00:19:32.412343 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94876c41-1f8e-4287-be58-19b9376e71db-kube-api-access-mjq5q" (OuterVolumeSpecName: "kube-api-access-mjq5q") pod "94876c41-1f8e-4287-be58-19b9376e71db" (UID: "94876c41-1f8e-4287-be58-19b9376e71db"). InnerVolumeSpecName "kube-api-access-mjq5q". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 00:19:32 crc kubenswrapper[4696]: I1203 00:19:32.466291 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94876c41-1f8e-4287-be58-19b9376e71db-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "94876c41-1f8e-4287-be58-19b9376e71db" (UID: "94876c41-1f8e-4287-be58-19b9376e71db"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 00:19:32 crc kubenswrapper[4696]: I1203 00:19:32.507988 4696 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94876c41-1f8e-4287-be58-19b9376e71db-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 00:19:32 crc kubenswrapper[4696]: I1203 00:19:32.508034 4696 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94876c41-1f8e-4287-be58-19b9376e71db-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 00:19:32 crc kubenswrapper[4696]: I1203 00:19:32.508049 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mjq5q\" (UniqueName: \"kubernetes.io/projected/94876c41-1f8e-4287-be58-19b9376e71db-kube-api-access-mjq5q\") on node \"crc\" DevicePath \"\"" Dec 03 00:19:32 crc kubenswrapper[4696]: I1203 00:19:32.722142 4696 generic.go:334] "Generic (PLEG): container finished" podID="94876c41-1f8e-4287-be58-19b9376e71db" containerID="8475183aa2ea2a74aa8517f271f54effa5b76794351f07a4e278d137f8e14791" exitCode=0 Dec 03 00:19:32 crc kubenswrapper[4696]: I1203 00:19:32.722233 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vbqg2" event={"ID":"94876c41-1f8e-4287-be58-19b9376e71db","Type":"ContainerDied","Data":"8475183aa2ea2a74aa8517f271f54effa5b76794351f07a4e278d137f8e14791"} Dec 03 00:19:32 crc kubenswrapper[4696]: I1203 00:19:32.722315 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vbqg2" Dec 03 00:19:32 crc kubenswrapper[4696]: I1203 00:19:32.722360 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vbqg2" event={"ID":"94876c41-1f8e-4287-be58-19b9376e71db","Type":"ContainerDied","Data":"33227a43005616495274b109ed9389afc9494a95c505e8876f5889dd55d2d282"} Dec 03 00:19:32 crc kubenswrapper[4696]: I1203 00:19:32.722395 4696 scope.go:117] "RemoveContainer" containerID="8475183aa2ea2a74aa8517f271f54effa5b76794351f07a4e278d137f8e14791" Dec 03 00:19:32 crc kubenswrapper[4696]: I1203 00:19:32.748543 4696 scope.go:117] "RemoveContainer" containerID="4dddf83c50c044604357a5a58d9338b11a19e3afb8411bf34feee5787af09535" Dec 03 00:19:32 crc kubenswrapper[4696]: I1203 00:19:32.760446 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vbqg2"] Dec 03 00:19:32 crc kubenswrapper[4696]: I1203 00:19:32.773889 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-vbqg2"] Dec 03 00:19:32 crc kubenswrapper[4696]: I1203 00:19:32.781462 4696 scope.go:117] "RemoveContainer" containerID="33e81684966a6f4d39c5efb04e86ce9bd5290556106eeb33674831b2752c22f8" Dec 03 00:19:32 crc kubenswrapper[4696]: I1203 00:19:32.842848 4696 scope.go:117] "RemoveContainer" containerID="8475183aa2ea2a74aa8517f271f54effa5b76794351f07a4e278d137f8e14791" Dec 03 00:19:32 crc kubenswrapper[4696]: E1203 00:19:32.843424 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8475183aa2ea2a74aa8517f271f54effa5b76794351f07a4e278d137f8e14791\": container with ID starting with 8475183aa2ea2a74aa8517f271f54effa5b76794351f07a4e278d137f8e14791 not found: ID does not exist" containerID="8475183aa2ea2a74aa8517f271f54effa5b76794351f07a4e278d137f8e14791" Dec 03 00:19:32 crc kubenswrapper[4696]: I1203 00:19:32.843470 4696 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8475183aa2ea2a74aa8517f271f54effa5b76794351f07a4e278d137f8e14791"} err="failed to get container status \"8475183aa2ea2a74aa8517f271f54effa5b76794351f07a4e278d137f8e14791\": rpc error: code = NotFound desc = could not find container \"8475183aa2ea2a74aa8517f271f54effa5b76794351f07a4e278d137f8e14791\": container with ID starting with 8475183aa2ea2a74aa8517f271f54effa5b76794351f07a4e278d137f8e14791 not found: ID does not exist" Dec 03 00:19:32 crc kubenswrapper[4696]: I1203 00:19:32.843503 4696 scope.go:117] "RemoveContainer" containerID="4dddf83c50c044604357a5a58d9338b11a19e3afb8411bf34feee5787af09535" Dec 03 00:19:32 crc kubenswrapper[4696]: E1203 00:19:32.844006 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4dddf83c50c044604357a5a58d9338b11a19e3afb8411bf34feee5787af09535\": container with ID starting with 4dddf83c50c044604357a5a58d9338b11a19e3afb8411bf34feee5787af09535 not found: ID does not exist" containerID="4dddf83c50c044604357a5a58d9338b11a19e3afb8411bf34feee5787af09535" Dec 03 00:19:32 crc kubenswrapper[4696]: I1203 00:19:32.844067 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4dddf83c50c044604357a5a58d9338b11a19e3afb8411bf34feee5787af09535"} err="failed to get container status \"4dddf83c50c044604357a5a58d9338b11a19e3afb8411bf34feee5787af09535\": rpc error: code = NotFound desc = could not find container \"4dddf83c50c044604357a5a58d9338b11a19e3afb8411bf34feee5787af09535\": container with ID starting with 4dddf83c50c044604357a5a58d9338b11a19e3afb8411bf34feee5787af09535 not found: ID does not exist" Dec 03 00:19:32 crc kubenswrapper[4696]: I1203 00:19:32.844106 4696 scope.go:117] "RemoveContainer" containerID="33e81684966a6f4d39c5efb04e86ce9bd5290556106eeb33674831b2752c22f8" Dec 03 00:19:32 crc kubenswrapper[4696]: E1203 
00:19:32.844505 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33e81684966a6f4d39c5efb04e86ce9bd5290556106eeb33674831b2752c22f8\": container with ID starting with 33e81684966a6f4d39c5efb04e86ce9bd5290556106eeb33674831b2752c22f8 not found: ID does not exist" containerID="33e81684966a6f4d39c5efb04e86ce9bd5290556106eeb33674831b2752c22f8" Dec 03 00:19:32 crc kubenswrapper[4696]: I1203 00:19:32.844544 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33e81684966a6f4d39c5efb04e86ce9bd5290556106eeb33674831b2752c22f8"} err="failed to get container status \"33e81684966a6f4d39c5efb04e86ce9bd5290556106eeb33674831b2752c22f8\": rpc error: code = NotFound desc = could not find container \"33e81684966a6f4d39c5efb04e86ce9bd5290556106eeb33674831b2752c22f8\": container with ID starting with 33e81684966a6f4d39c5efb04e86ce9bd5290556106eeb33674831b2752c22f8 not found: ID does not exist" Dec 03 00:19:33 crc kubenswrapper[4696]: I1203 00:19:33.445480 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94876c41-1f8e-4287-be58-19b9376e71db" path="/var/lib/kubelet/pods/94876c41-1f8e-4287-be58-19b9376e71db/volumes" Dec 03 00:20:00 crc kubenswrapper[4696]: I1203 00:20:00.208171 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-85248/must-gather-mjfr4"] Dec 03 00:20:00 crc kubenswrapper[4696]: E1203 00:20:00.209546 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94876c41-1f8e-4287-be58-19b9376e71db" containerName="extract-content" Dec 03 00:20:00 crc kubenswrapper[4696]: I1203 00:20:00.209566 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="94876c41-1f8e-4287-be58-19b9376e71db" containerName="extract-content" Dec 03 00:20:00 crc kubenswrapper[4696]: E1203 00:20:00.209580 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94876c41-1f8e-4287-be58-19b9376e71db" 
containerName="registry-server" Dec 03 00:20:00 crc kubenswrapper[4696]: I1203 00:20:00.209588 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="94876c41-1f8e-4287-be58-19b9376e71db" containerName="registry-server" Dec 03 00:20:00 crc kubenswrapper[4696]: E1203 00:20:00.209631 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94876c41-1f8e-4287-be58-19b9376e71db" containerName="extract-utilities" Dec 03 00:20:00 crc kubenswrapper[4696]: I1203 00:20:00.209640 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="94876c41-1f8e-4287-be58-19b9376e71db" containerName="extract-utilities" Dec 03 00:20:00 crc kubenswrapper[4696]: I1203 00:20:00.211693 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="94876c41-1f8e-4287-be58-19b9376e71db" containerName="registry-server" Dec 03 00:20:00 crc kubenswrapper[4696]: I1203 00:20:00.213222 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-85248/must-gather-mjfr4" Dec 03 00:20:00 crc kubenswrapper[4696]: I1203 00:20:00.215568 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-85248"/"kube-root-ca.crt" Dec 03 00:20:00 crc kubenswrapper[4696]: I1203 00:20:00.215827 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-85248"/"openshift-service-ca.crt" Dec 03 00:20:00 crc kubenswrapper[4696]: I1203 00:20:00.239479 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-85248/must-gather-mjfr4"] Dec 03 00:20:00 crc kubenswrapper[4696]: I1203 00:20:00.380413 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25khs\" (UniqueName: \"kubernetes.io/projected/9c7e85f5-b3bb-4e2b-b8a9-61a4a7a0993a-kube-api-access-25khs\") pod \"must-gather-mjfr4\" (UID: \"9c7e85f5-b3bb-4e2b-b8a9-61a4a7a0993a\") " pod="openshift-must-gather-85248/must-gather-mjfr4" Dec 03 00:20:00 crc 
kubenswrapper[4696]: I1203 00:20:00.380943 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9c7e85f5-b3bb-4e2b-b8a9-61a4a7a0993a-must-gather-output\") pod \"must-gather-mjfr4\" (UID: \"9c7e85f5-b3bb-4e2b-b8a9-61a4a7a0993a\") " pod="openshift-must-gather-85248/must-gather-mjfr4" Dec 03 00:20:00 crc kubenswrapper[4696]: I1203 00:20:00.484159 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25khs\" (UniqueName: \"kubernetes.io/projected/9c7e85f5-b3bb-4e2b-b8a9-61a4a7a0993a-kube-api-access-25khs\") pod \"must-gather-mjfr4\" (UID: \"9c7e85f5-b3bb-4e2b-b8a9-61a4a7a0993a\") " pod="openshift-must-gather-85248/must-gather-mjfr4" Dec 03 00:20:00 crc kubenswrapper[4696]: I1203 00:20:00.484513 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9c7e85f5-b3bb-4e2b-b8a9-61a4a7a0993a-must-gather-output\") pod \"must-gather-mjfr4\" (UID: \"9c7e85f5-b3bb-4e2b-b8a9-61a4a7a0993a\") " pod="openshift-must-gather-85248/must-gather-mjfr4" Dec 03 00:20:00 crc kubenswrapper[4696]: I1203 00:20:00.485176 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9c7e85f5-b3bb-4e2b-b8a9-61a4a7a0993a-must-gather-output\") pod \"must-gather-mjfr4\" (UID: \"9c7e85f5-b3bb-4e2b-b8a9-61a4a7a0993a\") " pod="openshift-must-gather-85248/must-gather-mjfr4" Dec 03 00:20:00 crc kubenswrapper[4696]: I1203 00:20:00.506783 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25khs\" (UniqueName: \"kubernetes.io/projected/9c7e85f5-b3bb-4e2b-b8a9-61a4a7a0993a-kube-api-access-25khs\") pod \"must-gather-mjfr4\" (UID: \"9c7e85f5-b3bb-4e2b-b8a9-61a4a7a0993a\") " pod="openshift-must-gather-85248/must-gather-mjfr4" Dec 03 00:20:00 crc 
kubenswrapper[4696]: I1203 00:20:00.547583 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-85248/must-gather-mjfr4" Dec 03 00:20:01 crc kubenswrapper[4696]: I1203 00:20:01.064424 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-85248/must-gather-mjfr4"] Dec 03 00:20:02 crc kubenswrapper[4696]: I1203 00:20:02.032068 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-85248/must-gather-mjfr4" event={"ID":"9c7e85f5-b3bb-4e2b-b8a9-61a4a7a0993a","Type":"ContainerStarted","Data":"fd73ac95957c34e8e1be107936aa4ed1efcd4e585891b3475163fb2e00ce4aec"} Dec 03 00:20:02 crc kubenswrapper[4696]: I1203 00:20:02.032143 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-85248/must-gather-mjfr4" event={"ID":"9c7e85f5-b3bb-4e2b-b8a9-61a4a7a0993a","Type":"ContainerStarted","Data":"4efb3f55a69d97207ef5806c36d94320a3f1394ee243ae0d10e956d73401d7ff"} Dec 03 00:20:02 crc kubenswrapper[4696]: I1203 00:20:02.032156 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-85248/must-gather-mjfr4" event={"ID":"9c7e85f5-b3bb-4e2b-b8a9-61a4a7a0993a","Type":"ContainerStarted","Data":"5540062498d7ff5ff1db68e1806d7a98716b763e149e0bbc392c9a07191e4c52"} Dec 03 00:20:02 crc kubenswrapper[4696]: I1203 00:20:02.057832 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-85248/must-gather-mjfr4" podStartSLOduration=2.057804838 podStartE2EDuration="2.057804838s" podCreationTimestamp="2025-12-03 00:20:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 00:20:02.049486413 +0000 UTC m=+5864.930166434" watchObservedRunningTime="2025-12-03 00:20:02.057804838 +0000 UTC m=+5864.938484839" Dec 03 00:20:06 crc kubenswrapper[4696]: I1203 00:20:06.030366 4696 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-must-gather-85248/crc-debug-rchcq"] Dec 03 00:20:06 crc kubenswrapper[4696]: I1203 00:20:06.033663 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-85248/crc-debug-rchcq" Dec 03 00:20:06 crc kubenswrapper[4696]: I1203 00:20:06.036383 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-85248"/"default-dockercfg-5b7kj" Dec 03 00:20:06 crc kubenswrapper[4696]: I1203 00:20:06.221248 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2fae8a02-6689-4f86-915e-2b5b0b9f1d85-host\") pod \"crc-debug-rchcq\" (UID: \"2fae8a02-6689-4f86-915e-2b5b0b9f1d85\") " pod="openshift-must-gather-85248/crc-debug-rchcq" Dec 03 00:20:06 crc kubenswrapper[4696]: I1203 00:20:06.221684 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbq5b\" (UniqueName: \"kubernetes.io/projected/2fae8a02-6689-4f86-915e-2b5b0b9f1d85-kube-api-access-dbq5b\") pod \"crc-debug-rchcq\" (UID: \"2fae8a02-6689-4f86-915e-2b5b0b9f1d85\") " pod="openshift-must-gather-85248/crc-debug-rchcq" Dec 03 00:20:06 crc kubenswrapper[4696]: I1203 00:20:06.323737 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2fae8a02-6689-4f86-915e-2b5b0b9f1d85-host\") pod \"crc-debug-rchcq\" (UID: \"2fae8a02-6689-4f86-915e-2b5b0b9f1d85\") " pod="openshift-must-gather-85248/crc-debug-rchcq" Dec 03 00:20:06 crc kubenswrapper[4696]: I1203 00:20:06.323998 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2fae8a02-6689-4f86-915e-2b5b0b9f1d85-host\") pod \"crc-debug-rchcq\" (UID: \"2fae8a02-6689-4f86-915e-2b5b0b9f1d85\") " pod="openshift-must-gather-85248/crc-debug-rchcq" Dec 03 00:20:06 crc kubenswrapper[4696]: I1203 00:20:06.324483 
4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbq5b\" (UniqueName: \"kubernetes.io/projected/2fae8a02-6689-4f86-915e-2b5b0b9f1d85-kube-api-access-dbq5b\") pod \"crc-debug-rchcq\" (UID: \"2fae8a02-6689-4f86-915e-2b5b0b9f1d85\") " pod="openshift-must-gather-85248/crc-debug-rchcq" Dec 03 00:20:06 crc kubenswrapper[4696]: I1203 00:20:06.352989 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbq5b\" (UniqueName: \"kubernetes.io/projected/2fae8a02-6689-4f86-915e-2b5b0b9f1d85-kube-api-access-dbq5b\") pod \"crc-debug-rchcq\" (UID: \"2fae8a02-6689-4f86-915e-2b5b0b9f1d85\") " pod="openshift-must-gather-85248/crc-debug-rchcq" Dec 03 00:20:06 crc kubenswrapper[4696]: I1203 00:20:06.366279 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-85248/crc-debug-rchcq" Dec 03 00:20:07 crc kubenswrapper[4696]: I1203 00:20:07.087553 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-85248/crc-debug-rchcq" event={"ID":"2fae8a02-6689-4f86-915e-2b5b0b9f1d85","Type":"ContainerStarted","Data":"f2c1ac2b8a20135badaa3fccd1c2aa56f295871b700907caf8e205237c5fbeaf"} Dec 03 00:20:07 crc kubenswrapper[4696]: I1203 00:20:07.088469 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-85248/crc-debug-rchcq" event={"ID":"2fae8a02-6689-4f86-915e-2b5b0b9f1d85","Type":"ContainerStarted","Data":"8cec4da42be40e9e4f4c9c3e0fda437d5cd07a8ab58bc335677c3a9d48a9adc6"} Dec 03 00:20:07 crc kubenswrapper[4696]: I1203 00:20:07.121284 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-85248/crc-debug-rchcq" podStartSLOduration=1.121210906 podStartE2EDuration="1.121210906s" podCreationTimestamp="2025-12-03 00:20:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 
00:20:07.110203825 +0000 UTC m=+5869.990883816" watchObservedRunningTime="2025-12-03 00:20:07.121210906 +0000 UTC m=+5870.001890907" Dec 03 00:20:56 crc kubenswrapper[4696]: I1203 00:20:56.608725 4696 generic.go:334] "Generic (PLEG): container finished" podID="2fae8a02-6689-4f86-915e-2b5b0b9f1d85" containerID="f2c1ac2b8a20135badaa3fccd1c2aa56f295871b700907caf8e205237c5fbeaf" exitCode=0 Dec 03 00:20:56 crc kubenswrapper[4696]: I1203 00:20:56.608910 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-85248/crc-debug-rchcq" event={"ID":"2fae8a02-6689-4f86-915e-2b5b0b9f1d85","Type":"ContainerDied","Data":"f2c1ac2b8a20135badaa3fccd1c2aa56f295871b700907caf8e205237c5fbeaf"} Dec 03 00:20:57 crc kubenswrapper[4696]: I1203 00:20:57.775909 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-85248/crc-debug-rchcq" Dec 03 00:20:57 crc kubenswrapper[4696]: I1203 00:20:57.818431 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-85248/crc-debug-rchcq"] Dec 03 00:20:57 crc kubenswrapper[4696]: I1203 00:20:57.827725 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-85248/crc-debug-rchcq"] Dec 03 00:20:57 crc kubenswrapper[4696]: I1203 00:20:57.881446 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbq5b\" (UniqueName: \"kubernetes.io/projected/2fae8a02-6689-4f86-915e-2b5b0b9f1d85-kube-api-access-dbq5b\") pod \"2fae8a02-6689-4f86-915e-2b5b0b9f1d85\" (UID: \"2fae8a02-6689-4f86-915e-2b5b0b9f1d85\") " Dec 03 00:20:57 crc kubenswrapper[4696]: I1203 00:20:57.881624 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2fae8a02-6689-4f86-915e-2b5b0b9f1d85-host\") pod \"2fae8a02-6689-4f86-915e-2b5b0b9f1d85\" (UID: \"2fae8a02-6689-4f86-915e-2b5b0b9f1d85\") " Dec 03 00:20:57 crc kubenswrapper[4696]: I1203 
00:20:57.881763 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2fae8a02-6689-4f86-915e-2b5b0b9f1d85-host" (OuterVolumeSpecName: "host") pod "2fae8a02-6689-4f86-915e-2b5b0b9f1d85" (UID: "2fae8a02-6689-4f86-915e-2b5b0b9f1d85"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 00:20:57 crc kubenswrapper[4696]: I1203 00:20:57.882111 4696 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2fae8a02-6689-4f86-915e-2b5b0b9f1d85-host\") on node \"crc\" DevicePath \"\"" Dec 03 00:20:57 crc kubenswrapper[4696]: I1203 00:20:57.898821 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2fae8a02-6689-4f86-915e-2b5b0b9f1d85-kube-api-access-dbq5b" (OuterVolumeSpecName: "kube-api-access-dbq5b") pod "2fae8a02-6689-4f86-915e-2b5b0b9f1d85" (UID: "2fae8a02-6689-4f86-915e-2b5b0b9f1d85"). InnerVolumeSpecName "kube-api-access-dbq5b". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 00:20:57 crc kubenswrapper[4696]: I1203 00:20:57.983912 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbq5b\" (UniqueName: \"kubernetes.io/projected/2fae8a02-6689-4f86-915e-2b5b0b9f1d85-kube-api-access-dbq5b\") on node \"crc\" DevicePath \"\"" Dec 03 00:20:58 crc kubenswrapper[4696]: I1203 00:20:58.631577 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8cec4da42be40e9e4f4c9c3e0fda437d5cd07a8ab58bc335677c3a9d48a9adc6" Dec 03 00:20:58 crc kubenswrapper[4696]: I1203 00:20:58.631664 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-85248/crc-debug-rchcq" Dec 03 00:20:59 crc kubenswrapper[4696]: I1203 00:20:59.037397 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-85248/crc-debug-bxtxv"] Dec 03 00:20:59 crc kubenswrapper[4696]: E1203 00:20:59.039720 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fae8a02-6689-4f86-915e-2b5b0b9f1d85" containerName="container-00" Dec 03 00:20:59 crc kubenswrapper[4696]: I1203 00:20:59.039763 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fae8a02-6689-4f86-915e-2b5b0b9f1d85" containerName="container-00" Dec 03 00:20:59 crc kubenswrapper[4696]: I1203 00:20:59.039999 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="2fae8a02-6689-4f86-915e-2b5b0b9f1d85" containerName="container-00" Dec 03 00:20:59 crc kubenswrapper[4696]: I1203 00:20:59.042706 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-85248/crc-debug-bxtxv" Dec 03 00:20:59 crc kubenswrapper[4696]: I1203 00:20:59.045286 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-85248"/"default-dockercfg-5b7kj" Dec 03 00:20:59 crc kubenswrapper[4696]: I1203 00:20:59.210569 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7ca7a139-0cca-4d0d-bda1-36637056a4f2-host\") pod \"crc-debug-bxtxv\" (UID: \"7ca7a139-0cca-4d0d-bda1-36637056a4f2\") " pod="openshift-must-gather-85248/crc-debug-bxtxv" Dec 03 00:20:59 crc kubenswrapper[4696]: I1203 00:20:59.210672 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxdh2\" (UniqueName: \"kubernetes.io/projected/7ca7a139-0cca-4d0d-bda1-36637056a4f2-kube-api-access-qxdh2\") pod \"crc-debug-bxtxv\" (UID: \"7ca7a139-0cca-4d0d-bda1-36637056a4f2\") " 
pod="openshift-must-gather-85248/crc-debug-bxtxv" Dec 03 00:20:59 crc kubenswrapper[4696]: I1203 00:20:59.313081 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxdh2\" (UniqueName: \"kubernetes.io/projected/7ca7a139-0cca-4d0d-bda1-36637056a4f2-kube-api-access-qxdh2\") pod \"crc-debug-bxtxv\" (UID: \"7ca7a139-0cca-4d0d-bda1-36637056a4f2\") " pod="openshift-must-gather-85248/crc-debug-bxtxv" Dec 03 00:20:59 crc kubenswrapper[4696]: I1203 00:20:59.313339 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7ca7a139-0cca-4d0d-bda1-36637056a4f2-host\") pod \"crc-debug-bxtxv\" (UID: \"7ca7a139-0cca-4d0d-bda1-36637056a4f2\") " pod="openshift-must-gather-85248/crc-debug-bxtxv" Dec 03 00:20:59 crc kubenswrapper[4696]: I1203 00:20:59.313464 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7ca7a139-0cca-4d0d-bda1-36637056a4f2-host\") pod \"crc-debug-bxtxv\" (UID: \"7ca7a139-0cca-4d0d-bda1-36637056a4f2\") " pod="openshift-must-gather-85248/crc-debug-bxtxv" Dec 03 00:20:59 crc kubenswrapper[4696]: I1203 00:20:59.337799 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxdh2\" (UniqueName: \"kubernetes.io/projected/7ca7a139-0cca-4d0d-bda1-36637056a4f2-kube-api-access-qxdh2\") pod \"crc-debug-bxtxv\" (UID: \"7ca7a139-0cca-4d0d-bda1-36637056a4f2\") " pod="openshift-must-gather-85248/crc-debug-bxtxv" Dec 03 00:20:59 crc kubenswrapper[4696]: I1203 00:20:59.366078 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-85248/crc-debug-bxtxv" Dec 03 00:20:59 crc kubenswrapper[4696]: I1203 00:20:59.468471 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2fae8a02-6689-4f86-915e-2b5b0b9f1d85" path="/var/lib/kubelet/pods/2fae8a02-6689-4f86-915e-2b5b0b9f1d85/volumes" Dec 03 00:20:59 crc kubenswrapper[4696]: I1203 00:20:59.645372 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-85248/crc-debug-bxtxv" event={"ID":"7ca7a139-0cca-4d0d-bda1-36637056a4f2","Type":"ContainerStarted","Data":"ea2aa4ce29d8d09ec8aff808ec3412f19e9e9668e5142282b4db6d82c81ccbdc"} Dec 03 00:21:00 crc kubenswrapper[4696]: I1203 00:21:00.655869 4696 generic.go:334] "Generic (PLEG): container finished" podID="7ca7a139-0cca-4d0d-bda1-36637056a4f2" containerID="0b62c3d535756f8b952fb5b4f775e98cc65784afcf2f0c3153dc94b3b9e3c5bb" exitCode=0 Dec 03 00:21:00 crc kubenswrapper[4696]: I1203 00:21:00.656079 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-85248/crc-debug-bxtxv" event={"ID":"7ca7a139-0cca-4d0d-bda1-36637056a4f2","Type":"ContainerDied","Data":"0b62c3d535756f8b952fb5b4f775e98cc65784afcf2f0c3153dc94b3b9e3c5bb"} Dec 03 00:21:01 crc kubenswrapper[4696]: I1203 00:21:01.808518 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-85248/crc-debug-bxtxv" Dec 03 00:21:01 crc kubenswrapper[4696]: I1203 00:21:01.966579 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7ca7a139-0cca-4d0d-bda1-36637056a4f2-host\") pod \"7ca7a139-0cca-4d0d-bda1-36637056a4f2\" (UID: \"7ca7a139-0cca-4d0d-bda1-36637056a4f2\") " Dec 03 00:21:01 crc kubenswrapper[4696]: I1203 00:21:01.966651 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7ca7a139-0cca-4d0d-bda1-36637056a4f2-host" (OuterVolumeSpecName: "host") pod "7ca7a139-0cca-4d0d-bda1-36637056a4f2" (UID: "7ca7a139-0cca-4d0d-bda1-36637056a4f2"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 00:21:01 crc kubenswrapper[4696]: I1203 00:21:01.966825 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qxdh2\" (UniqueName: \"kubernetes.io/projected/7ca7a139-0cca-4d0d-bda1-36637056a4f2-kube-api-access-qxdh2\") pod \"7ca7a139-0cca-4d0d-bda1-36637056a4f2\" (UID: \"7ca7a139-0cca-4d0d-bda1-36637056a4f2\") " Dec 03 00:21:01 crc kubenswrapper[4696]: I1203 00:21:01.967501 4696 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7ca7a139-0cca-4d0d-bda1-36637056a4f2-host\") on node \"crc\" DevicePath \"\"" Dec 03 00:21:01 crc kubenswrapper[4696]: I1203 00:21:01.980112 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ca7a139-0cca-4d0d-bda1-36637056a4f2-kube-api-access-qxdh2" (OuterVolumeSpecName: "kube-api-access-qxdh2") pod "7ca7a139-0cca-4d0d-bda1-36637056a4f2" (UID: "7ca7a139-0cca-4d0d-bda1-36637056a4f2"). InnerVolumeSpecName "kube-api-access-qxdh2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 00:21:02 crc kubenswrapper[4696]: I1203 00:21:02.069028 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qxdh2\" (UniqueName: \"kubernetes.io/projected/7ca7a139-0cca-4d0d-bda1-36637056a4f2-kube-api-access-qxdh2\") on node \"crc\" DevicePath \"\"" Dec 03 00:21:02 crc kubenswrapper[4696]: I1203 00:21:02.684551 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-85248/crc-debug-bxtxv" event={"ID":"7ca7a139-0cca-4d0d-bda1-36637056a4f2","Type":"ContainerDied","Data":"ea2aa4ce29d8d09ec8aff808ec3412f19e9e9668e5142282b4db6d82c81ccbdc"} Dec 03 00:21:02 crc kubenswrapper[4696]: I1203 00:21:02.684965 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ea2aa4ce29d8d09ec8aff808ec3412f19e9e9668e5142282b4db6d82c81ccbdc" Dec 03 00:21:02 crc kubenswrapper[4696]: I1203 00:21:02.684858 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-85248/crc-debug-bxtxv" Dec 03 00:21:03 crc kubenswrapper[4696]: I1203 00:21:03.288172 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-85248/crc-debug-bxtxv"] Dec 03 00:21:03 crc kubenswrapper[4696]: I1203 00:21:03.304938 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-85248/crc-debug-bxtxv"] Dec 03 00:21:03 crc kubenswrapper[4696]: I1203 00:21:03.444421 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ca7a139-0cca-4d0d-bda1-36637056a4f2" path="/var/lib/kubelet/pods/7ca7a139-0cca-4d0d-bda1-36637056a4f2/volumes" Dec 03 00:21:04 crc kubenswrapper[4696]: I1203 00:21:04.489079 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-85248/crc-debug-5xd8b"] Dec 03 00:21:04 crc kubenswrapper[4696]: E1203 00:21:04.489577 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ca7a139-0cca-4d0d-bda1-36637056a4f2" 
containerName="container-00" Dec 03 00:21:04 crc kubenswrapper[4696]: I1203 00:21:04.489595 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ca7a139-0cca-4d0d-bda1-36637056a4f2" containerName="container-00" Dec 03 00:21:04 crc kubenswrapper[4696]: I1203 00:21:04.489807 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ca7a139-0cca-4d0d-bda1-36637056a4f2" containerName="container-00" Dec 03 00:21:04 crc kubenswrapper[4696]: I1203 00:21:04.490554 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-85248/crc-debug-5xd8b" Dec 03 00:21:04 crc kubenswrapper[4696]: I1203 00:21:04.492895 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-85248"/"default-dockercfg-5b7kj" Dec 03 00:21:04 crc kubenswrapper[4696]: I1203 00:21:04.627261 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a791a193-528d-4d3e-81d9-101299290965-host\") pod \"crc-debug-5xd8b\" (UID: \"a791a193-528d-4d3e-81d9-101299290965\") " pod="openshift-must-gather-85248/crc-debug-5xd8b" Dec 03 00:21:04 crc kubenswrapper[4696]: I1203 00:21:04.627369 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djw9d\" (UniqueName: \"kubernetes.io/projected/a791a193-528d-4d3e-81d9-101299290965-kube-api-access-djw9d\") pod \"crc-debug-5xd8b\" (UID: \"a791a193-528d-4d3e-81d9-101299290965\") " pod="openshift-must-gather-85248/crc-debug-5xd8b" Dec 03 00:21:04 crc kubenswrapper[4696]: I1203 00:21:04.730447 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a791a193-528d-4d3e-81d9-101299290965-host\") pod \"crc-debug-5xd8b\" (UID: \"a791a193-528d-4d3e-81d9-101299290965\") " pod="openshift-must-gather-85248/crc-debug-5xd8b" Dec 03 00:21:04 crc kubenswrapper[4696]: I1203 
00:21:04.730647 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a791a193-528d-4d3e-81d9-101299290965-host\") pod \"crc-debug-5xd8b\" (UID: \"a791a193-528d-4d3e-81d9-101299290965\") " pod="openshift-must-gather-85248/crc-debug-5xd8b" Dec 03 00:21:04 crc kubenswrapper[4696]: I1203 00:21:04.730931 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djw9d\" (UniqueName: \"kubernetes.io/projected/a791a193-528d-4d3e-81d9-101299290965-kube-api-access-djw9d\") pod \"crc-debug-5xd8b\" (UID: \"a791a193-528d-4d3e-81d9-101299290965\") " pod="openshift-must-gather-85248/crc-debug-5xd8b" Dec 03 00:21:04 crc kubenswrapper[4696]: I1203 00:21:04.758375 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djw9d\" (UniqueName: \"kubernetes.io/projected/a791a193-528d-4d3e-81d9-101299290965-kube-api-access-djw9d\") pod \"crc-debug-5xd8b\" (UID: \"a791a193-528d-4d3e-81d9-101299290965\") " pod="openshift-must-gather-85248/crc-debug-5xd8b" Dec 03 00:21:04 crc kubenswrapper[4696]: I1203 00:21:04.821142 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-85248/crc-debug-5xd8b" Dec 03 00:21:05 crc kubenswrapper[4696]: I1203 00:21:05.735139 4696 generic.go:334] "Generic (PLEG): container finished" podID="a791a193-528d-4d3e-81d9-101299290965" containerID="4311f5b79c0fe9596969db5c1357795fcdcbc4efd87b61365908ec2677dcb6a9" exitCode=0 Dec 03 00:21:05 crc kubenswrapper[4696]: I1203 00:21:05.735212 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-85248/crc-debug-5xd8b" event={"ID":"a791a193-528d-4d3e-81d9-101299290965","Type":"ContainerDied","Data":"4311f5b79c0fe9596969db5c1357795fcdcbc4efd87b61365908ec2677dcb6a9"} Dec 03 00:21:05 crc kubenswrapper[4696]: I1203 00:21:05.735252 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-85248/crc-debug-5xd8b" event={"ID":"a791a193-528d-4d3e-81d9-101299290965","Type":"ContainerStarted","Data":"f99801c4839f6ed0fac2c15c206a7bbf0cfecdb9fd039f4399eba4ea9feed6ee"} Dec 03 00:21:05 crc kubenswrapper[4696]: I1203 00:21:05.880431 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-85248/crc-debug-5xd8b"] Dec 03 00:21:05 crc kubenswrapper[4696]: I1203 00:21:05.891644 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-85248/crc-debug-5xd8b"] Dec 03 00:21:06 crc kubenswrapper[4696]: I1203 00:21:06.879772 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-85248/crc-debug-5xd8b" Dec 03 00:21:06 crc kubenswrapper[4696]: I1203 00:21:06.998970 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-djw9d\" (UniqueName: \"kubernetes.io/projected/a791a193-528d-4d3e-81d9-101299290965-kube-api-access-djw9d\") pod \"a791a193-528d-4d3e-81d9-101299290965\" (UID: \"a791a193-528d-4d3e-81d9-101299290965\") " Dec 03 00:21:06 crc kubenswrapper[4696]: I1203 00:21:06.999222 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a791a193-528d-4d3e-81d9-101299290965-host\") pod \"a791a193-528d-4d3e-81d9-101299290965\" (UID: \"a791a193-528d-4d3e-81d9-101299290965\") " Dec 03 00:21:06 crc kubenswrapper[4696]: I1203 00:21:06.999348 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a791a193-528d-4d3e-81d9-101299290965-host" (OuterVolumeSpecName: "host") pod "a791a193-528d-4d3e-81d9-101299290965" (UID: "a791a193-528d-4d3e-81d9-101299290965"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 00:21:07 crc kubenswrapper[4696]: I1203 00:21:07.000105 4696 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a791a193-528d-4d3e-81d9-101299290965-host\") on node \"crc\" DevicePath \"\"" Dec 03 00:21:07 crc kubenswrapper[4696]: I1203 00:21:07.008828 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a791a193-528d-4d3e-81d9-101299290965-kube-api-access-djw9d" (OuterVolumeSpecName: "kube-api-access-djw9d") pod "a791a193-528d-4d3e-81d9-101299290965" (UID: "a791a193-528d-4d3e-81d9-101299290965"). InnerVolumeSpecName "kube-api-access-djw9d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 00:21:07 crc kubenswrapper[4696]: I1203 00:21:07.102379 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-djw9d\" (UniqueName: \"kubernetes.io/projected/a791a193-528d-4d3e-81d9-101299290965-kube-api-access-djw9d\") on node \"crc\" DevicePath \"\"" Dec 03 00:21:07 crc kubenswrapper[4696]: I1203 00:21:07.445914 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a791a193-528d-4d3e-81d9-101299290965" path="/var/lib/kubelet/pods/a791a193-528d-4d3e-81d9-101299290965/volumes" Dec 03 00:21:07 crc kubenswrapper[4696]: I1203 00:21:07.756119 4696 scope.go:117] "RemoveContainer" containerID="4311f5b79c0fe9596969db5c1357795fcdcbc4efd87b61365908ec2677dcb6a9" Dec 03 00:21:07 crc kubenswrapper[4696]: I1203 00:21:07.756318 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-85248/crc-debug-5xd8b" Dec 03 00:21:22 crc kubenswrapper[4696]: I1203 00:21:22.973868 4696 patch_prober.go:28] interesting pod/machine-config-daemon-chq65 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 00:21:22 crc kubenswrapper[4696]: I1203 00:21:22.974630 4696 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 00:21:39 crc kubenswrapper[4696]: I1203 00:21:39.233873 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-748c8487f8-gqxg9_4c68604e-222e-4a20-b829-c2f4e3c6923e/barbican-api/0.log" Dec 03 00:21:39 crc kubenswrapper[4696]: I1203 00:21:39.456959 
4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7d6f88f57d-fkfhk_6a2e48f4-820d-4199-883c-f7d93f5f12c6/barbican-keystone-listener/0.log" Dec 03 00:21:39 crc kubenswrapper[4696]: I1203 00:21:39.468882 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-748c8487f8-gqxg9_4c68604e-222e-4a20-b829-c2f4e3c6923e/barbican-api-log/0.log" Dec 03 00:21:39 crc kubenswrapper[4696]: I1203 00:21:39.594573 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7d6f88f57d-fkfhk_6a2e48f4-820d-4199-883c-f7d93f5f12c6/barbican-keystone-listener-log/0.log" Dec 03 00:21:39 crc kubenswrapper[4696]: I1203 00:21:39.693013 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-95b48849f-64t8k_314505e6-5f55-4c07-9692-c5698c6e3ff1/barbican-worker/0.log" Dec 03 00:21:39 crc kubenswrapper[4696]: I1203 00:21:39.710023 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-95b48849f-64t8k_314505e6-5f55-4c07-9692-c5698c6e3ff1/barbican-worker-log/0.log" Dec 03 00:21:39 crc kubenswrapper[4696]: I1203 00:21:39.991226 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-rmcht_10db9578-c367-420b-ba4f-93729e4d9483/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 00:21:40 crc kubenswrapper[4696]: I1203 00:21:40.104233 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_0947b5ae-aeca-481d-a2b9-3bd3db5a33c0/ceilometer-central-agent/0.log" Dec 03 00:21:40 crc kubenswrapper[4696]: I1203 00:21:40.176362 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_0947b5ae-aeca-481d-a2b9-3bd3db5a33c0/ceilometer-notification-agent/0.log" Dec 03 00:21:40 crc kubenswrapper[4696]: I1203 00:21:40.258274 4696 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ceilometer-0_0947b5ae-aeca-481d-a2b9-3bd3db5a33c0/proxy-httpd/0.log" Dec 03 00:21:40 crc kubenswrapper[4696]: I1203 00:21:40.318063 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_0947b5ae-aeca-481d-a2b9-3bd3db5a33c0/sg-core/0.log" Dec 03 00:21:40 crc kubenswrapper[4696]: I1203 00:21:40.477439 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_e0a0bd09-55c1-4eb0-bed1-76a920e67875/cinder-api/0.log" Dec 03 00:21:40 crc kubenswrapper[4696]: I1203 00:21:40.518268 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_e0a0bd09-55c1-4eb0-bed1-76a920e67875/cinder-api-log/0.log" Dec 03 00:21:40 crc kubenswrapper[4696]: I1203 00:21:40.697078 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-db-purge-29412001-jvcl6_f9b76748-e694-4766-a355-d01c0fc857e0/cinder-db-purge/0.log" Dec 03 00:21:40 crc kubenswrapper[4696]: I1203 00:21:40.822560 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_17a7a7a1-8e9c-4b77-8a09-783b8b465cf5/cinder-scheduler/0.log" Dec 03 00:21:40 crc kubenswrapper[4696]: I1203 00:21:40.877552 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_17a7a7a1-8e9c-4b77-8a09-783b8b465cf5/probe/0.log" Dec 03 00:21:41 crc kubenswrapper[4696]: I1203 00:21:41.000435 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-vlcsp_38969cda-9a4e-4bf7-bbc0-2a6a1849bf2e/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 00:21:41 crc kubenswrapper[4696]: I1203 00:21:41.153128 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-d8wqz_4e57d59f-2b48-457e-92dd-d0585bab85b5/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 00:21:41 crc kubenswrapper[4696]: I1203 
00:21:41.236321 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6b6dc74c5-2zjwn_2300da34-d1de-4f62-a360-4d9cb16d48b7/init/0.log" Dec 03 00:21:41 crc kubenswrapper[4696]: I1203 00:21:41.463117 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6b6dc74c5-2zjwn_2300da34-d1de-4f62-a360-4d9cb16d48b7/init/0.log" Dec 03 00:21:41 crc kubenswrapper[4696]: I1203 00:21:41.562789 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-p4zx2_a29810cf-fd6b-4021-8ae5-52612fb63cfc/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 00:21:41 crc kubenswrapper[4696]: I1203 00:21:41.622033 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6b6dc74c5-2zjwn_2300da34-d1de-4f62-a360-4d9cb16d48b7/dnsmasq-dns/0.log" Dec 03 00:21:41 crc kubenswrapper[4696]: I1203 00:21:41.789939 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-db-purge-29412001-htr7n_2cc2c412-c05d-4914-b5a4-e0a0e40b8a59/glance-dbpurge/0.log" Dec 03 00:21:41 crc kubenswrapper[4696]: I1203 00:21:41.857617 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_f3e14f14-0774-4eb2-aff3-231d72e6136f/glance-httpd/0.log" Dec 03 00:21:41 crc kubenswrapper[4696]: I1203 00:21:41.955803 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_f3e14f14-0774-4eb2-aff3-231d72e6136f/glance-log/0.log" Dec 03 00:21:42 crc kubenswrapper[4696]: I1203 00:21:42.100474 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_7d71c8ce-55a6-4bbc-a450-128443762f36/glance-httpd/0.log" Dec 03 00:21:42 crc kubenswrapper[4696]: I1203 00:21:42.114248 4696 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-internal-api-0_7d71c8ce-55a6-4bbc-a450-128443762f36/glance-log/0.log" Dec 03 00:21:42 crc kubenswrapper[4696]: I1203 00:21:42.435758 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-7c784657c6-hdbrw_b414fc10-9d51-456b-aaa9-d6b4dd08af99/horizon/0.log" Dec 03 00:21:42 crc kubenswrapper[4696]: I1203 00:21:42.521919 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-l8j4b_d3d600ee-2200-406f-8d8b-f093851161fd/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 00:21:42 crc kubenswrapper[4696]: I1203 00:21:42.823841 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-2sw6r_f616e70d-4131-4ed5-b891-33dcad6a8827/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 00:21:43 crc kubenswrapper[4696]: I1203 00:21:43.065084 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29412001-b2pg6_c211c59c-65a7-4672-8a9e-7b9d20220ef5/keystone-cron/0.log" Dec 03 00:21:43 crc kubenswrapper[4696]: I1203 00:21:43.088847 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-7c784657c6-hdbrw_b414fc10-9d51-456b-aaa9-d6b4dd08af99/horizon-log/0.log" Dec 03 00:21:43 crc kubenswrapper[4696]: I1203 00:21:43.317538 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_c200fd15-55ea-4c23-a8d4-22c362deedee/kube-state-metrics/0.log" Dec 03 00:21:43 crc kubenswrapper[4696]: I1203 00:21:43.384561 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-526g5_5697ae7a-9589-4939-a3a5-5613ee6094ab/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 00:21:43 crc kubenswrapper[4696]: I1203 00:21:43.436217 4696 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_keystone-76d45f5d76-ptzqb_6cc29833-0849-46ec-bc06-1c980ec2dc02/keystone-api/0.log" Dec 03 00:21:44 crc kubenswrapper[4696]: I1203 00:21:44.187430 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-67dbcf9bdf-2hr7m_797ad679-555c-4599-bc0c-21c0254a3a5a/neutron-httpd/0.log" Dec 03 00:21:44 crc kubenswrapper[4696]: I1203 00:21:44.199677 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-f94wg_81a46783-f2f6-464b-a1cd-d859d59e0c99/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 00:21:44 crc kubenswrapper[4696]: I1203 00:21:44.285800 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-67dbcf9bdf-2hr7m_797ad679-555c-4599-bc0c-21c0254a3a5a/neutron-api/0.log" Dec 03 00:21:44 crc kubenswrapper[4696]: I1203 00:21:44.975104 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_5f458554-1460-4379-95af-2313d4df2320/nova-cell0-conductor-conductor/0.log" Dec 03 00:21:45 crc kubenswrapper[4696]: I1203 00:21:45.082979 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-db-purge-29412000-ws52s_8f44b5e9-136f-4cba-9f87-6bb1d73fb496/nova-manage/0.log" Dec 03 00:21:45 crc kubenswrapper[4696]: I1203 00:21:45.576888 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_4c7be3d4-52ad-4671-8b48-8cc19cf98b4c/nova-cell1-conductor-conductor/0.log" Dec 03 00:21:45 crc kubenswrapper[4696]: I1203 00:21:45.660635 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_6cb033f1-9348-4822-b022-daef2e06af49/nova-api-log/0.log" Dec 03 00:21:45 crc kubenswrapper[4696]: I1203 00:21:45.778126 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-db-purge-29412000-fd8ck_0a2082a5-4293-40c5-ad8d-bb7a4bc43626/nova-manage/0.log" Dec 03 00:21:46 crc 
kubenswrapper[4696]: I1203 00:21:46.312464 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_2ed66bd7-4f5d-4501-b81f-51939db42c64/nova-cell1-novncproxy-novncproxy/0.log" Dec 03 00:21:46 crc kubenswrapper[4696]: I1203 00:21:46.332178 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_6cb033f1-9348-4822-b022-daef2e06af49/nova-api-api/0.log" Dec 03 00:21:46 crc kubenswrapper[4696]: I1203 00:21:46.480650 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-2dd9x_7b36c51f-9889-4191-a7ea-b54a79542e0b/nova-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 00:21:46 crc kubenswrapper[4696]: I1203 00:21:46.708466 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_60554cd9-644e-40c0-90c9-57610b92846e/nova-metadata-log/0.log" Dec 03 00:21:47 crc kubenswrapper[4696]: I1203 00:21:47.073831 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_dff19c4d-2106-4034-8c29-39429553a062/mysql-bootstrap/0.log" Dec 03 00:21:47 crc kubenswrapper[4696]: I1203 00:21:47.248132 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_90b9eaeb-1865-4418-8665-1e65f0fb8151/nova-scheduler-scheduler/0.log" Dec 03 00:21:47 crc kubenswrapper[4696]: I1203 00:21:47.314409 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_dff19c4d-2106-4034-8c29-39429553a062/mysql-bootstrap/0.log" Dec 03 00:21:47 crc kubenswrapper[4696]: I1203 00:21:47.358918 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_dff19c4d-2106-4034-8c29-39429553a062/galera/0.log" Dec 03 00:21:47 crc kubenswrapper[4696]: I1203 00:21:47.602080 4696 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstack-galera-0_af9932e1-c721-45b3-a213-93da4e130d05/mysql-bootstrap/0.log" Dec 03 00:21:47 crc kubenswrapper[4696]: I1203 00:21:47.808883 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_af9932e1-c721-45b3-a213-93da4e130d05/galera/0.log" Dec 03 00:21:47 crc kubenswrapper[4696]: I1203 00:21:47.822261 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_af9932e1-c721-45b3-a213-93da4e130d05/mysql-bootstrap/0.log" Dec 03 00:21:48 crc kubenswrapper[4696]: I1203 00:21:48.056373 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_3e0a050b-c652-4ef2-8f1a-19c8f4732a0c/openstackclient/0.log" Dec 03 00:21:48 crc kubenswrapper[4696]: I1203 00:21:48.255304 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-btsm6_b13b6998-c04a-4ac8-9615-5078f1169ecb/ovn-controller/0.log" Dec 03 00:21:48 crc kubenswrapper[4696]: I1203 00:21:48.320458 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-f6gwc_5f108f43-c0d4-4026-9f97-3a2fc3698626/openstack-network-exporter/0.log" Dec 03 00:21:48 crc kubenswrapper[4696]: I1203 00:21:48.573781 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-h54st_d004daa2-5ad8-49b8-9f27-cc0552d409de/ovsdb-server-init/0.log" Dec 03 00:21:48 crc kubenswrapper[4696]: I1203 00:21:48.795310 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-h54st_d004daa2-5ad8-49b8-9f27-cc0552d409de/ovs-vswitchd/0.log" Dec 03 00:21:48 crc kubenswrapper[4696]: I1203 00:21:48.823677 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-h54st_d004daa2-5ad8-49b8-9f27-cc0552d409de/ovsdb-server-init/0.log" Dec 03 00:21:48 crc kubenswrapper[4696]: I1203 00:21:48.852360 4696 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-ovs-h54st_d004daa2-5ad8-49b8-9f27-cc0552d409de/ovsdb-server/0.log" Dec 03 00:21:49 crc kubenswrapper[4696]: I1203 00:21:49.083548 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-d7r49_9ac484be-e201-4b74-a21e-502131efc1e3/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 00:21:49 crc kubenswrapper[4696]: I1203 00:21:49.349715 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_60554cd9-644e-40c0-90c9-57610b92846e/nova-metadata-metadata/0.log" Dec 03 00:21:49 crc kubenswrapper[4696]: I1203 00:21:49.652315 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_87dfe190-5b7f-48c2-bfa0-97ca227eabb2/ovn-northd/0.log" Dec 03 00:21:49 crc kubenswrapper[4696]: I1203 00:21:49.684352 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_87dfe190-5b7f-48c2-bfa0-97ca227eabb2/openstack-network-exporter/0.log" Dec 03 00:21:49 crc kubenswrapper[4696]: I1203 00:21:49.794180 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_bd18044c-bd73-4166-83ac-e555f2a587b3/openstack-network-exporter/0.log" Dec 03 00:21:49 crc kubenswrapper[4696]: I1203 00:21:49.920148 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_bd18044c-bd73-4166-83ac-e555f2a587b3/ovsdbserver-nb/0.log" Dec 03 00:21:49 crc kubenswrapper[4696]: I1203 00:21:49.946049 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_bd863731-0190-4818-90bb-a7b5b781e616/openstack-network-exporter/0.log" Dec 03 00:21:50 crc kubenswrapper[4696]: I1203 00:21:50.070097 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_bd863731-0190-4818-90bb-a7b5b781e616/ovsdbserver-sb/0.log" Dec 03 00:21:50 crc kubenswrapper[4696]: I1203 00:21:50.465964 4696 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack_placement-599746d6dd-mg2dx_fe44184e-95f9-4a2e-a6a4-e2534c44e933/placement-api/0.log" Dec 03 00:21:50 crc kubenswrapper[4696]: I1203 00:21:50.502652 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_ef5ae851-44bb-46fa-9245-abc5b46b1771/init-config-reloader/0.log" Dec 03 00:21:50 crc kubenswrapper[4696]: I1203 00:21:50.547322 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-599746d6dd-mg2dx_fe44184e-95f9-4a2e-a6a4-e2534c44e933/placement-log/0.log" Dec 03 00:21:50 crc kubenswrapper[4696]: I1203 00:21:50.710439 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_ef5ae851-44bb-46fa-9245-abc5b46b1771/init-config-reloader/0.log" Dec 03 00:21:50 crc kubenswrapper[4696]: I1203 00:21:50.715635 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_ef5ae851-44bb-46fa-9245-abc5b46b1771/config-reloader/0.log" Dec 03 00:21:50 crc kubenswrapper[4696]: I1203 00:21:50.751775 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_ef5ae851-44bb-46fa-9245-abc5b46b1771/prometheus/0.log" Dec 03 00:21:50 crc kubenswrapper[4696]: I1203 00:21:50.831213 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_ef5ae851-44bb-46fa-9245-abc5b46b1771/thanos-sidecar/0.log" Dec 03 00:21:51 crc kubenswrapper[4696]: I1203 00:21:51.381707 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_fcde5666-44ba-4867-a0ed-afb36ecfafc9/setup-container/0.log" Dec 03 00:21:51 crc kubenswrapper[4696]: I1203 00:21:51.597215 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_fcde5666-44ba-4867-a0ed-afb36ecfafc9/setup-container/0.log" Dec 03 00:21:51 crc kubenswrapper[4696]: I1203 00:21:51.621284 4696 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_fcde5666-44ba-4867-a0ed-afb36ecfafc9/rabbitmq/0.log" Dec 03 00:21:51 crc kubenswrapper[4696]: I1203 00:21:51.704168 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_fbc07453-3ac7-469b-ab0e-23ca695250e6/setup-container/0.log" Dec 03 00:21:51 crc kubenswrapper[4696]: I1203 00:21:51.962515 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_fbc07453-3ac7-469b-ab0e-23ca695250e6/rabbitmq/0.log" Dec 03 00:21:51 crc kubenswrapper[4696]: I1203 00:21:51.966462 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_fbc07453-3ac7-469b-ab0e-23ca695250e6/setup-container/0.log" Dec 03 00:21:52 crc kubenswrapper[4696]: I1203 00:21:52.069888 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-sttwb_00408801-09ea-4d50-a657-b01117a2f51b/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 00:21:52 crc kubenswrapper[4696]: I1203 00:21:52.566350 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-ndfd4_d6b40a87-ecaf-4f50-a3e0-04235ce0f029/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 00:21:52 crc kubenswrapper[4696]: I1203 00:21:52.570407 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-fjhhb_ba5e6341-d5c7-41b8-adf8-59f3036d3838/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 00:21:52 crc kubenswrapper[4696]: I1203 00:21:52.748861 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-6qlp7_96d33f70-c859-4df1-9e0c-94fa64d60a41/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 00:21:52 crc kubenswrapper[4696]: I1203 00:21:52.868944 4696 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-dkdq9_540fd942-5964-4e7f-a40f-66102876bd8c/ssh-known-hosts-edpm-deployment/0.log" Dec 03 00:21:52 crc kubenswrapper[4696]: I1203 00:21:52.973381 4696 patch_prober.go:28] interesting pod/machine-config-daemon-chq65 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 00:21:52 crc kubenswrapper[4696]: I1203 00:21:52.973451 4696 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 00:21:53 crc kubenswrapper[4696]: I1203 00:21:53.141867 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-5964f98dd9-7q2kj_a76ff35f-36d6-48df-94ed-337199547cd5/proxy-server/0.log" Dec 03 00:21:53 crc kubenswrapper[4696]: I1203 00:21:53.357443 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-kq468_955c99b3-ad42-4e65-a391-47eda1c4130a/swift-ring-rebalance/0.log" Dec 03 00:21:53 crc kubenswrapper[4696]: I1203 00:21:53.363231 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-5964f98dd9-7q2kj_a76ff35f-36d6-48df-94ed-337199547cd5/proxy-httpd/0.log" Dec 03 00:21:53 crc kubenswrapper[4696]: I1203 00:21:53.465383 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d5594f21-8f1d-4105-ad47-c065a9fc468b/account-auditor/0.log" Dec 03 00:21:53 crc kubenswrapper[4696]: I1203 00:21:53.566207 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d5594f21-8f1d-4105-ad47-c065a9fc468b/account-reaper/0.log" Dec 03 
00:21:53 crc kubenswrapper[4696]: I1203 00:21:53.648211 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d5594f21-8f1d-4105-ad47-c065a9fc468b/account-replicator/0.log" Dec 03 00:21:53 crc kubenswrapper[4696]: I1203 00:21:53.725201 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d5594f21-8f1d-4105-ad47-c065a9fc468b/account-server/0.log" Dec 03 00:21:53 crc kubenswrapper[4696]: I1203 00:21:53.794968 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d5594f21-8f1d-4105-ad47-c065a9fc468b/container-auditor/0.log" Dec 03 00:21:53 crc kubenswrapper[4696]: I1203 00:21:53.819082 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d5594f21-8f1d-4105-ad47-c065a9fc468b/container-replicator/0.log" Dec 03 00:21:53 crc kubenswrapper[4696]: I1203 00:21:53.920446 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d5594f21-8f1d-4105-ad47-c065a9fc468b/container-server/0.log" Dec 03 00:21:54 crc kubenswrapper[4696]: I1203 00:21:54.033755 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d5594f21-8f1d-4105-ad47-c065a9fc468b/container-updater/0.log" Dec 03 00:21:54 crc kubenswrapper[4696]: I1203 00:21:54.085488 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d5594f21-8f1d-4105-ad47-c065a9fc468b/object-expirer/0.log" Dec 03 00:21:54 crc kubenswrapper[4696]: I1203 00:21:54.140074 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d5594f21-8f1d-4105-ad47-c065a9fc468b/object-auditor/0.log" Dec 03 00:21:54 crc kubenswrapper[4696]: I1203 00:21:54.149100 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d5594f21-8f1d-4105-ad47-c065a9fc468b/object-replicator/0.log" Dec 03 00:21:54 crc kubenswrapper[4696]: I1203 00:21:54.262154 4696 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d5594f21-8f1d-4105-ad47-c065a9fc468b/object-server/0.log" Dec 03 00:21:54 crc kubenswrapper[4696]: I1203 00:21:54.341275 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d5594f21-8f1d-4105-ad47-c065a9fc468b/object-updater/0.log" Dec 03 00:21:54 crc kubenswrapper[4696]: I1203 00:21:54.407844 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d5594f21-8f1d-4105-ad47-c065a9fc468b/swift-recon-cron/0.log" Dec 03 00:21:54 crc kubenswrapper[4696]: I1203 00:21:54.407913 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d5594f21-8f1d-4105-ad47-c065a9fc468b/rsync/0.log" Dec 03 00:21:54 crc kubenswrapper[4696]: I1203 00:21:54.676878 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-4ljbq_3c9ec356-4712-4484-9b78-9e5d4831dac1/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 00:21:54 crc kubenswrapper[4696]: I1203 00:21:54.699022 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_4881d1aa-7494-45fe-b21b-5cae7bfe2f41/tempest-tests-tempest-tests-runner/0.log" Dec 03 00:21:54 crc kubenswrapper[4696]: I1203 00:21:54.928760 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_26aca1cb-cb0c-4cf6-8f7d-833b8da6ed88/test-operator-logs-container/0.log" Dec 03 00:21:55 crc kubenswrapper[4696]: I1203 00:21:55.028509 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-tfsrw_130877a2-12e6-4731-9f64-675fcfd8a1ce/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 00:21:56 crc kubenswrapper[4696]: I1203 00:21:56.161032 4696 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_watcher-applier-0_88764a17-d8c0-447f-923a-4afd6c522e43/watcher-applier/0.log" Dec 03 00:21:56 crc kubenswrapper[4696]: I1203 00:21:56.681593 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_3ae28daa-ab18-478f-ac27-6be4b2d632d3/watcher-api-log/0.log" Dec 03 00:21:57 crc kubenswrapper[4696]: I1203 00:21:57.570890 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-decision-engine-0_9703e3e9-39e6-4c7f-a1ea-324a4f26c18a/watcher-decision-engine/0.log" Dec 03 00:22:00 crc kubenswrapper[4696]: I1203 00:22:00.683751 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_3ae28daa-ab18-478f-ac27-6be4b2d632d3/watcher-api/0.log" Dec 03 00:22:06 crc kubenswrapper[4696]: I1203 00:22:06.595808 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_87c53ac9-38ab-43a7-b99e-29c47a69f818/memcached/0.log" Dec 03 00:22:22 crc kubenswrapper[4696]: I1203 00:22:22.973961 4696 patch_prober.go:28] interesting pod/machine-config-daemon-chq65 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 00:22:22 crc kubenswrapper[4696]: I1203 00:22:22.974879 4696 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 00:22:22 crc kubenswrapper[4696]: I1203 00:22:22.974951 4696 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-chq65" Dec 03 00:22:22 crc kubenswrapper[4696]: I1203 00:22:22.976093 4696 
kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1063405401fd3e6749f5e5ad03d2ffa1ed3069796fcfb6e3add08f006a670077"} pod="openshift-machine-config-operator/machine-config-daemon-chq65" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 00:22:22 crc kubenswrapper[4696]: I1203 00:22:22.976155 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" containerName="machine-config-daemon" containerID="cri-o://1063405401fd3e6749f5e5ad03d2ffa1ed3069796fcfb6e3add08f006a670077" gracePeriod=600 Dec 03 00:22:23 crc kubenswrapper[4696]: E1203 00:22:23.609552 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 03 00:22:23 crc kubenswrapper[4696]: I1203 00:22:23.663175 4696 generic.go:334] "Generic (PLEG): container finished" podID="53353260-c7c9-435c-91eb-3d5a1b441c4a" containerID="1063405401fd3e6749f5e5ad03d2ffa1ed3069796fcfb6e3add08f006a670077" exitCode=0 Dec 03 00:22:23 crc kubenswrapper[4696]: I1203 00:22:23.663241 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-chq65" event={"ID":"53353260-c7c9-435c-91eb-3d5a1b441c4a","Type":"ContainerDied","Data":"1063405401fd3e6749f5e5ad03d2ffa1ed3069796fcfb6e3add08f006a670077"} Dec 03 00:22:23 crc kubenswrapper[4696]: I1203 00:22:23.663317 4696 scope.go:117] "RemoveContainer" 
containerID="945ab243f0a97339bf35954545bf3af34070f99574e9ea3276f59a116fa55000" Dec 03 00:22:23 crc kubenswrapper[4696]: I1203 00:22:23.664370 4696 scope.go:117] "RemoveContainer" containerID="1063405401fd3e6749f5e5ad03d2ffa1ed3069796fcfb6e3add08f006a670077" Dec 03 00:22:23 crc kubenswrapper[4696]: E1203 00:22:23.664791 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 03 00:22:24 crc kubenswrapper[4696]: I1203 00:22:24.865825 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4b44fd46acbfb58b875116701dfb531286e4b634e6faeebc661bfd385ezk8mc_e0a517f4-97e3-42f4-a78c-4ba5b8bfcba7/util/0.log" Dec 03 00:22:25 crc kubenswrapper[4696]: I1203 00:22:25.036031 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4b44fd46acbfb58b875116701dfb531286e4b634e6faeebc661bfd385ezk8mc_e0a517f4-97e3-42f4-a78c-4ba5b8bfcba7/util/0.log" Dec 03 00:22:25 crc kubenswrapper[4696]: I1203 00:22:25.040130 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4b44fd46acbfb58b875116701dfb531286e4b634e6faeebc661bfd385ezk8mc_e0a517f4-97e3-42f4-a78c-4ba5b8bfcba7/pull/0.log" Dec 03 00:22:25 crc kubenswrapper[4696]: I1203 00:22:25.080102 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4b44fd46acbfb58b875116701dfb531286e4b634e6faeebc661bfd385ezk8mc_e0a517f4-97e3-42f4-a78c-4ba5b8bfcba7/pull/0.log" Dec 03 00:22:25 crc kubenswrapper[4696]: I1203 00:22:25.267008 4696 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_4b44fd46acbfb58b875116701dfb531286e4b634e6faeebc661bfd385ezk8mc_e0a517f4-97e3-42f4-a78c-4ba5b8bfcba7/pull/0.log" Dec 03 00:22:25 crc kubenswrapper[4696]: I1203 00:22:25.280767 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4b44fd46acbfb58b875116701dfb531286e4b634e6faeebc661bfd385ezk8mc_e0a517f4-97e3-42f4-a78c-4ba5b8bfcba7/util/0.log" Dec 03 00:22:25 crc kubenswrapper[4696]: I1203 00:22:25.301567 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4b44fd46acbfb58b875116701dfb531286e4b634e6faeebc661bfd385ezk8mc_e0a517f4-97e3-42f4-a78c-4ba5b8bfcba7/extract/0.log" Dec 03 00:22:25 crc kubenswrapper[4696]: I1203 00:22:25.463485 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-xxm5w_351a13fb-8e8e-4393-adef-28523ab05ccb/kube-rbac-proxy/0.log" Dec 03 00:22:25 crc kubenswrapper[4696]: I1203 00:22:25.523483 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-xxm5w_351a13fb-8e8e-4393-adef-28523ab05ccb/manager/0.log" Dec 03 00:22:25 crc kubenswrapper[4696]: I1203 00:22:25.637843 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-rjlxh_755f9574-a31b-430c-a2a2-92554020d96b/kube-rbac-proxy/0.log" Dec 03 00:22:25 crc kubenswrapper[4696]: I1203 00:22:25.772313 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-rjlxh_755f9574-a31b-430c-a2a2-92554020d96b/manager/0.log" Dec 03 00:22:25 crc kubenswrapper[4696]: I1203 00:22:25.830606 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-tzgnf_5706d5c2-8bbe-40b3-8820-0d547363fa96/kube-rbac-proxy/0.log" Dec 03 00:22:25 crc kubenswrapper[4696]: 
I1203 00:22:25.871778 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-tzgnf_5706d5c2-8bbe-40b3-8820-0d547363fa96/manager/0.log" Dec 03 00:22:25 crc kubenswrapper[4696]: I1203 00:22:25.987171 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987cd8cd-nvqf8_e693a226-52c3-413c-b607-c0050ab5e553/kube-rbac-proxy/0.log" Dec 03 00:22:26 crc kubenswrapper[4696]: I1203 00:22:26.116731 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987cd8cd-nvqf8_e693a226-52c3-413c-b607-c0050ab5e553/manager/0.log" Dec 03 00:22:26 crc kubenswrapper[4696]: I1203 00:22:26.263153 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-s9pk4_5315d589-3bb7-4776-b842-ffc18e1a89e1/manager/0.log" Dec 03 00:22:26 crc kubenswrapper[4696]: I1203 00:22:26.291399 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-s9pk4_5315d589-3bb7-4776-b842-ffc18e1a89e1/kube-rbac-proxy/0.log" Dec 03 00:22:26 crc kubenswrapper[4696]: I1203 00:22:26.402117 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-tng2n_7d7b7caa-1ec3-4e66-9273-36cae02cbe8e/kube-rbac-proxy/0.log" Dec 03 00:22:26 crc kubenswrapper[4696]: I1203 00:22:26.495347 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-tng2n_7d7b7caa-1ec3-4e66-9273-36cae02cbe8e/manager/0.log" Dec 03 00:22:26 crc kubenswrapper[4696]: I1203 00:22:26.566502 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-kz6bs_66d51ef3-89ba-4653-ae46-5469bfc5232e/kube-rbac-proxy/0.log" 
Dec 03 00:22:26 crc kubenswrapper[4696]: I1203 00:22:26.760304 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-kz6bs_66d51ef3-89ba-4653-ae46-5469bfc5232e/manager/0.log" Dec 03 00:22:26 crc kubenswrapper[4696]: I1203 00:22:26.782203 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-4crn9_bbbbf87d-fa5c-4e9b-9bbc-b19ed043e094/kube-rbac-proxy/0.log" Dec 03 00:22:26 crc kubenswrapper[4696]: I1203 00:22:26.851887 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-4crn9_bbbbf87d-fa5c-4e9b-9bbc-b19ed043e094/manager/0.log" Dec 03 00:22:27 crc kubenswrapper[4696]: I1203 00:22:27.019976 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-vg9kf_9207b2f0-999a-45e4-8234-982f796f7801/kube-rbac-proxy/0.log" Dec 03 00:22:27 crc kubenswrapper[4696]: I1203 00:22:27.024074 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-vg9kf_9207b2f0-999a-45e4-8234-982f796f7801/manager/0.log" Dec 03 00:22:27 crc kubenswrapper[4696]: I1203 00:22:27.452250 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7c79b5df47-k2p9j_5436ce3c-34d6-47eb-81b1-3b4dc1c2d794/kube-rbac-proxy/0.log" Dec 03 00:22:27 crc kubenswrapper[4696]: I1203 00:22:27.516087 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7c79b5df47-k2p9j_5436ce3c-34d6-47eb-81b1-3b4dc1c2d794/manager/0.log" Dec 03 00:22:27 crc kubenswrapper[4696]: I1203 00:22:27.553194 4696 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-tp7td_baad852a-374a-460e-9d5c-cb5418291849/kube-rbac-proxy/0.log" Dec 03 00:22:27 crc kubenswrapper[4696]: I1203 00:22:27.684909 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-tp7td_baad852a-374a-460e-9d5c-cb5418291849/manager/0.log" Dec 03 00:22:27 crc kubenswrapper[4696]: I1203 00:22:27.755335 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-d6qww_1beb3e53-4faf-475f-b5b0-57b8cd32c529/kube-rbac-proxy/0.log" Dec 03 00:22:27 crc kubenswrapper[4696]: I1203 00:22:27.863127 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-d6qww_1beb3e53-4faf-475f-b5b0-57b8cd32c529/manager/0.log" Dec 03 00:22:28 crc kubenswrapper[4696]: I1203 00:22:28.089593 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-hfcc6_9c1744f3-fc58-4653-a7e0-4fcdfdfca485/kube-rbac-proxy/0.log" Dec 03 00:22:28 crc kubenswrapper[4696]: I1203 00:22:28.131132 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-hfcc6_9c1744f3-fc58-4653-a7e0-4fcdfdfca485/manager/0.log" Dec 03 00:22:28 crc kubenswrapper[4696]: I1203 00:22:28.250916 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-bpk25_060c8046-7775-413d-9797-ef0edcee01dd/kube-rbac-proxy/0.log" Dec 03 00:22:28 crc kubenswrapper[4696]: I1203 00:22:28.311856 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-bpk25_060c8046-7775-413d-9797-ef0edcee01dd/manager/0.log" Dec 03 00:22:28 crc kubenswrapper[4696]: I1203 00:22:28.362878 
4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd4jnwwg_6e335d65-9d0f-4ace-97cc-70a4a2bb2291/kube-rbac-proxy/0.log" Dec 03 00:22:28 crc kubenswrapper[4696]: I1203 00:22:28.481061 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd4jnwwg_6e335d65-9d0f-4ace-97cc-70a4a2bb2291/manager/0.log" Dec 03 00:22:28 crc kubenswrapper[4696]: I1203 00:22:28.844261 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-dfb58c988-g96v2_f7e88453-0fd4-401a-92cd-f75809f14f21/operator/0.log" Dec 03 00:22:28 crc kubenswrapper[4696]: I1203 00:22:28.946427 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-k25sp_2bbe83e8-36bc-401e-84b6-917b6aeb6398/registry-server/0.log" Dec 03 00:22:29 crc kubenswrapper[4696]: I1203 00:22:29.150423 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-pwblb_77131fa7-a611-46bf-b0fe-d05d909dfd4c/manager/0.log" Dec 03 00:22:29 crc kubenswrapper[4696]: I1203 00:22:29.180198 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-pwblb_77131fa7-a611-46bf-b0fe-d05d909dfd4c/kube-rbac-proxy/0.log" Dec 03 00:22:29 crc kubenswrapper[4696]: I1203 00:22:29.320624 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-g5pxw_3d6a00c3-b537-414a-8ba4-2797d7bc88f8/kube-rbac-proxy/0.log" Dec 03 00:22:29 crc kubenswrapper[4696]: I1203 00:22:29.421245 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-g5pxw_3d6a00c3-b537-414a-8ba4-2797d7bc88f8/manager/0.log" Dec 03 00:22:29 
crc kubenswrapper[4696]: I1203 00:22:29.496939 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-b2wbs_e53eb416-2701-4080-b0a3-bbeae35013a4/operator/0.log" Dec 03 00:22:29 crc kubenswrapper[4696]: I1203 00:22:29.692307 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-m472w_9ab3fbd9-1610-4d0c-aa5e-2c298e5dcc3e/kube-rbac-proxy/0.log" Dec 03 00:22:29 crc kubenswrapper[4696]: I1203 00:22:29.722533 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-m472w_9ab3fbd9-1610-4d0c-aa5e-2c298e5dcc3e/manager/0.log" Dec 03 00:22:29 crc kubenswrapper[4696]: I1203 00:22:29.876329 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-744c6b777f-bjtk5_0508daaa-b26a-4f05-9abc-f63ac69fd1d5/manager/0.log" Dec 03 00:22:29 crc kubenswrapper[4696]: I1203 00:22:29.939981 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-58fzc_45f0f590-24f6-4f01-98a0-a41508a59f5a/kube-rbac-proxy/0.log" Dec 03 00:22:30 crc kubenswrapper[4696]: I1203 00:22:30.096539 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-jfcsj_796c18e3-0c33-4393-aba8-2ad03aad4b93/manager/0.log" Dec 03 00:22:30 crc kubenswrapper[4696]: I1203 00:22:30.107026 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-58fzc_45f0f590-24f6-4f01-98a0-a41508a59f5a/manager/0.log" Dec 03 00:22:30 crc kubenswrapper[4696]: I1203 00:22:30.129792 4696 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-jfcsj_796c18e3-0c33-4393-aba8-2ad03aad4b93/kube-rbac-proxy/0.log" Dec 03 00:22:30 crc kubenswrapper[4696]: I1203 00:22:30.244345 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-d4477bdf4-lxz2l_d417ecee-aebb-4154-ac0c-2c321bd78182/kube-rbac-proxy/0.log" Dec 03 00:22:30 crc kubenswrapper[4696]: I1203 00:22:30.373336 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-d4477bdf4-lxz2l_d417ecee-aebb-4154-ac0c-2c321bd78182/manager/0.log" Dec 03 00:22:32 crc kubenswrapper[4696]: I1203 00:22:32.388148 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-hgg2d"] Dec 03 00:22:32 crc kubenswrapper[4696]: E1203 00:22:32.390160 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a791a193-528d-4d3e-81d9-101299290965" containerName="container-00" Dec 03 00:22:32 crc kubenswrapper[4696]: I1203 00:22:32.390690 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="a791a193-528d-4d3e-81d9-101299290965" containerName="container-00" Dec 03 00:22:32 crc kubenswrapper[4696]: I1203 00:22:32.391110 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="a791a193-528d-4d3e-81d9-101299290965" containerName="container-00" Dec 03 00:22:32 crc kubenswrapper[4696]: I1203 00:22:32.393996 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hgg2d" Dec 03 00:22:32 crc kubenswrapper[4696]: I1203 00:22:32.406862 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hgg2d"] Dec 03 00:22:32 crc kubenswrapper[4696]: I1203 00:22:32.440939 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hsfhn\" (UniqueName: \"kubernetes.io/projected/a65078bf-36b0-484a-9392-96fe7c180c2a-kube-api-access-hsfhn\") pod \"community-operators-hgg2d\" (UID: \"a65078bf-36b0-484a-9392-96fe7c180c2a\") " pod="openshift-marketplace/community-operators-hgg2d" Dec 03 00:22:32 crc kubenswrapper[4696]: I1203 00:22:32.441328 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a65078bf-36b0-484a-9392-96fe7c180c2a-utilities\") pod \"community-operators-hgg2d\" (UID: \"a65078bf-36b0-484a-9392-96fe7c180c2a\") " pod="openshift-marketplace/community-operators-hgg2d" Dec 03 00:22:32 crc kubenswrapper[4696]: I1203 00:22:32.441549 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a65078bf-36b0-484a-9392-96fe7c180c2a-catalog-content\") pod \"community-operators-hgg2d\" (UID: \"a65078bf-36b0-484a-9392-96fe7c180c2a\") " pod="openshift-marketplace/community-operators-hgg2d" Dec 03 00:22:32 crc kubenswrapper[4696]: I1203 00:22:32.544379 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a65078bf-36b0-484a-9392-96fe7c180c2a-catalog-content\") pod \"community-operators-hgg2d\" (UID: \"a65078bf-36b0-484a-9392-96fe7c180c2a\") " pod="openshift-marketplace/community-operators-hgg2d" Dec 03 00:22:32 crc kubenswrapper[4696]: I1203 00:22:32.545186 4696 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a65078bf-36b0-484a-9392-96fe7c180c2a-catalog-content\") pod \"community-operators-hgg2d\" (UID: \"a65078bf-36b0-484a-9392-96fe7c180c2a\") " pod="openshift-marketplace/community-operators-hgg2d" Dec 03 00:22:32 crc kubenswrapper[4696]: I1203 00:22:32.545543 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hsfhn\" (UniqueName: \"kubernetes.io/projected/a65078bf-36b0-484a-9392-96fe7c180c2a-kube-api-access-hsfhn\") pod \"community-operators-hgg2d\" (UID: \"a65078bf-36b0-484a-9392-96fe7c180c2a\") " pod="openshift-marketplace/community-operators-hgg2d" Dec 03 00:22:32 crc kubenswrapper[4696]: I1203 00:22:32.545695 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a65078bf-36b0-484a-9392-96fe7c180c2a-utilities\") pod \"community-operators-hgg2d\" (UID: \"a65078bf-36b0-484a-9392-96fe7c180c2a\") " pod="openshift-marketplace/community-operators-hgg2d" Dec 03 00:22:32 crc kubenswrapper[4696]: I1203 00:22:32.546239 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a65078bf-36b0-484a-9392-96fe7c180c2a-utilities\") pod \"community-operators-hgg2d\" (UID: \"a65078bf-36b0-484a-9392-96fe7c180c2a\") " pod="openshift-marketplace/community-operators-hgg2d" Dec 03 00:22:32 crc kubenswrapper[4696]: I1203 00:22:32.571214 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hsfhn\" (UniqueName: \"kubernetes.io/projected/a65078bf-36b0-484a-9392-96fe7c180c2a-kube-api-access-hsfhn\") pod \"community-operators-hgg2d\" (UID: \"a65078bf-36b0-484a-9392-96fe7c180c2a\") " pod="openshift-marketplace/community-operators-hgg2d" Dec 03 00:22:32 crc kubenswrapper[4696]: I1203 00:22:32.731360 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hgg2d" Dec 03 00:22:33 crc kubenswrapper[4696]: I1203 00:22:33.351327 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hgg2d"] Dec 03 00:22:33 crc kubenswrapper[4696]: I1203 00:22:33.792122 4696 generic.go:334] "Generic (PLEG): container finished" podID="a65078bf-36b0-484a-9392-96fe7c180c2a" containerID="d43168ce64ad4b6491f8d986a70d613e80d4741ce177e0d8d81dda4e48f88421" exitCode=0 Dec 03 00:22:33 crc kubenswrapper[4696]: I1203 00:22:33.792208 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hgg2d" event={"ID":"a65078bf-36b0-484a-9392-96fe7c180c2a","Type":"ContainerDied","Data":"d43168ce64ad4b6491f8d986a70d613e80d4741ce177e0d8d81dda4e48f88421"} Dec 03 00:22:33 crc kubenswrapper[4696]: I1203 00:22:33.792287 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hgg2d" event={"ID":"a65078bf-36b0-484a-9392-96fe7c180c2a","Type":"ContainerStarted","Data":"ed582aad8e4d4ecbf7f40e5c768302464913ff627d9ad52fe6409968597f8738"} Dec 03 00:22:35 crc kubenswrapper[4696]: I1203 00:22:35.431994 4696 scope.go:117] "RemoveContainer" containerID="1063405401fd3e6749f5e5ad03d2ffa1ed3069796fcfb6e3add08f006a670077" Dec 03 00:22:35 crc kubenswrapper[4696]: E1203 00:22:35.435559 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 03 00:22:35 crc kubenswrapper[4696]: I1203 00:22:35.811449 4696 generic.go:334] "Generic (PLEG): container finished" podID="a65078bf-36b0-484a-9392-96fe7c180c2a" 
containerID="b8f886f3bb41a8d569ff64f49eb79b3f0351c48af4d039142f55856183d11a13" exitCode=0 Dec 03 00:22:35 crc kubenswrapper[4696]: I1203 00:22:35.811509 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hgg2d" event={"ID":"a65078bf-36b0-484a-9392-96fe7c180c2a","Type":"ContainerDied","Data":"b8f886f3bb41a8d569ff64f49eb79b3f0351c48af4d039142f55856183d11a13"} Dec 03 00:22:36 crc kubenswrapper[4696]: I1203 00:22:36.824032 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hgg2d" event={"ID":"a65078bf-36b0-484a-9392-96fe7c180c2a","Type":"ContainerStarted","Data":"c90bbea2847afa2e2d2f3948a0d9ed76b2625f4567bcb5d0243f2f36b0ac5369"} Dec 03 00:22:36 crc kubenswrapper[4696]: I1203 00:22:36.847138 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hgg2d" podStartSLOduration=2.272552024 podStartE2EDuration="4.847114926s" podCreationTimestamp="2025-12-03 00:22:32 +0000 UTC" firstStartedPulling="2025-12-03 00:22:33.793789741 +0000 UTC m=+6016.674469742" lastFinishedPulling="2025-12-03 00:22:36.368352643 +0000 UTC m=+6019.249032644" observedRunningTime="2025-12-03 00:22:36.843912196 +0000 UTC m=+6019.724592197" watchObservedRunningTime="2025-12-03 00:22:36.847114926 +0000 UTC m=+6019.727794917" Dec 03 00:22:42 crc kubenswrapper[4696]: I1203 00:22:42.732466 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-hgg2d" Dec 03 00:22:42 crc kubenswrapper[4696]: I1203 00:22:42.733314 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-hgg2d" Dec 03 00:22:42 crc kubenswrapper[4696]: I1203 00:22:42.787464 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hgg2d" Dec 03 00:22:42 crc kubenswrapper[4696]: I1203 
00:22:42.933948 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hgg2d" Dec 03 00:22:43 crc kubenswrapper[4696]: I1203 00:22:43.030531 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hgg2d"] Dec 03 00:22:44 crc kubenswrapper[4696]: I1203 00:22:44.899068 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-hgg2d" podUID="a65078bf-36b0-484a-9392-96fe7c180c2a" containerName="registry-server" containerID="cri-o://c90bbea2847afa2e2d2f3948a0d9ed76b2625f4567bcb5d0243f2f36b0ac5369" gracePeriod=2 Dec 03 00:22:45 crc kubenswrapper[4696]: I1203 00:22:45.922850 4696 generic.go:334] "Generic (PLEG): container finished" podID="a65078bf-36b0-484a-9392-96fe7c180c2a" containerID="c90bbea2847afa2e2d2f3948a0d9ed76b2625f4567bcb5d0243f2f36b0ac5369" exitCode=0 Dec 03 00:22:45 crc kubenswrapper[4696]: I1203 00:22:45.922918 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hgg2d" event={"ID":"a65078bf-36b0-484a-9392-96fe7c180c2a","Type":"ContainerDied","Data":"c90bbea2847afa2e2d2f3948a0d9ed76b2625f4567bcb5d0243f2f36b0ac5369"} Dec 03 00:22:46 crc kubenswrapper[4696]: I1203 00:22:46.110239 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hgg2d" Dec 03 00:22:46 crc kubenswrapper[4696]: I1203 00:22:46.179648 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a65078bf-36b0-484a-9392-96fe7c180c2a-catalog-content\") pod \"a65078bf-36b0-484a-9392-96fe7c180c2a\" (UID: \"a65078bf-36b0-484a-9392-96fe7c180c2a\") " Dec 03 00:22:46 crc kubenswrapper[4696]: I1203 00:22:46.179765 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a65078bf-36b0-484a-9392-96fe7c180c2a-utilities\") pod \"a65078bf-36b0-484a-9392-96fe7c180c2a\" (UID: \"a65078bf-36b0-484a-9392-96fe7c180c2a\") " Dec 03 00:22:46 crc kubenswrapper[4696]: I1203 00:22:46.179826 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hsfhn\" (UniqueName: \"kubernetes.io/projected/a65078bf-36b0-484a-9392-96fe7c180c2a-kube-api-access-hsfhn\") pod \"a65078bf-36b0-484a-9392-96fe7c180c2a\" (UID: \"a65078bf-36b0-484a-9392-96fe7c180c2a\") " Dec 03 00:22:46 crc kubenswrapper[4696]: I1203 00:22:46.181211 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a65078bf-36b0-484a-9392-96fe7c180c2a-utilities" (OuterVolumeSpecName: "utilities") pod "a65078bf-36b0-484a-9392-96fe7c180c2a" (UID: "a65078bf-36b0-484a-9392-96fe7c180c2a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 00:22:46 crc kubenswrapper[4696]: I1203 00:22:46.187984 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a65078bf-36b0-484a-9392-96fe7c180c2a-kube-api-access-hsfhn" (OuterVolumeSpecName: "kube-api-access-hsfhn") pod "a65078bf-36b0-484a-9392-96fe7c180c2a" (UID: "a65078bf-36b0-484a-9392-96fe7c180c2a"). InnerVolumeSpecName "kube-api-access-hsfhn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 00:22:46 crc kubenswrapper[4696]: I1203 00:22:46.246621 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a65078bf-36b0-484a-9392-96fe7c180c2a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a65078bf-36b0-484a-9392-96fe7c180c2a" (UID: "a65078bf-36b0-484a-9392-96fe7c180c2a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 00:22:46 crc kubenswrapper[4696]: I1203 00:22:46.281996 4696 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a65078bf-36b0-484a-9392-96fe7c180c2a-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 00:22:46 crc kubenswrapper[4696]: I1203 00:22:46.282491 4696 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a65078bf-36b0-484a-9392-96fe7c180c2a-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 00:22:46 crc kubenswrapper[4696]: I1203 00:22:46.282510 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hsfhn\" (UniqueName: \"kubernetes.io/projected/a65078bf-36b0-484a-9392-96fe7c180c2a-kube-api-access-hsfhn\") on node \"crc\" DevicePath \"\"" Dec 03 00:22:46 crc kubenswrapper[4696]: I1203 00:22:46.938498 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hgg2d" event={"ID":"a65078bf-36b0-484a-9392-96fe7c180c2a","Type":"ContainerDied","Data":"ed582aad8e4d4ecbf7f40e5c768302464913ff627d9ad52fe6409968597f8738"} Dec 03 00:22:46 crc kubenswrapper[4696]: I1203 00:22:46.938559 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hgg2d" Dec 03 00:22:46 crc kubenswrapper[4696]: I1203 00:22:46.938577 4696 scope.go:117] "RemoveContainer" containerID="c90bbea2847afa2e2d2f3948a0d9ed76b2625f4567bcb5d0243f2f36b0ac5369" Dec 03 00:22:46 crc kubenswrapper[4696]: I1203 00:22:46.966798 4696 scope.go:117] "RemoveContainer" containerID="b8f886f3bb41a8d569ff64f49eb79b3f0351c48af4d039142f55856183d11a13" Dec 03 00:22:47 crc kubenswrapper[4696]: I1203 00:22:47.010908 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hgg2d"] Dec 03 00:22:47 crc kubenswrapper[4696]: I1203 00:22:47.012010 4696 scope.go:117] "RemoveContainer" containerID="d43168ce64ad4b6491f8d986a70d613e80d4741ce177e0d8d81dda4e48f88421" Dec 03 00:22:47 crc kubenswrapper[4696]: I1203 00:22:47.021691 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-hgg2d"] Dec 03 00:22:47 crc kubenswrapper[4696]: I1203 00:22:47.442211 4696 scope.go:117] "RemoveContainer" containerID="1063405401fd3e6749f5e5ad03d2ffa1ed3069796fcfb6e3add08f006a670077" Dec 03 00:22:47 crc kubenswrapper[4696]: E1203 00:22:47.443077 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 03 00:22:47 crc kubenswrapper[4696]: I1203 00:22:47.445673 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a65078bf-36b0-484a-9392-96fe7c180c2a" path="/var/lib/kubelet/pods/a65078bf-36b0-484a-9392-96fe7c180c2a/volumes" Dec 03 00:22:50 crc kubenswrapper[4696]: I1203 00:22:50.919813 4696 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-58dc7_bfd55522-63bd-40f3-a429-eb0c85fe5b9c/control-plane-machine-set-operator/0.log" Dec 03 00:22:51 crc kubenswrapper[4696]: I1203 00:22:51.121320 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-rttcw_ffa64292-b071-4bfc-93d6-70d65b00847d/kube-rbac-proxy/0.log" Dec 03 00:22:51 crc kubenswrapper[4696]: I1203 00:22:51.131301 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-rttcw_ffa64292-b071-4bfc-93d6-70d65b00847d/machine-api-operator/0.log" Dec 03 00:23:00 crc kubenswrapper[4696]: I1203 00:23:00.432944 4696 scope.go:117] "RemoveContainer" containerID="1063405401fd3e6749f5e5ad03d2ffa1ed3069796fcfb6e3add08f006a670077" Dec 03 00:23:00 crc kubenswrapper[4696]: E1203 00:23:00.434082 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 03 00:23:03 crc kubenswrapper[4696]: I1203 00:23:03.861785 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-jbtws_cbe42e42-7252-40cb-bfe8-7484eb822ff9/cert-manager-controller/0.log" Dec 03 00:23:04 crc kubenswrapper[4696]: I1203 00:23:04.010711 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-n8ntr_2af9e90d-fb84-4f01-9ed3-c0c1eaef6369/cert-manager-cainjector/0.log" Dec 03 00:23:04 crc kubenswrapper[4696]: I1203 00:23:04.079605 4696 log.go:25] "Finished parsing log file" 
path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-jlqs7_72d4a613-3c9c-4b7d-a840-3c76247572f6/cert-manager-webhook/0.log" Dec 03 00:23:12 crc kubenswrapper[4696]: I1203 00:23:12.431842 4696 scope.go:117] "RemoveContainer" containerID="1063405401fd3e6749f5e5ad03d2ffa1ed3069796fcfb6e3add08f006a670077" Dec 03 00:23:12 crc kubenswrapper[4696]: E1203 00:23:12.432802 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 03 00:23:16 crc kubenswrapper[4696]: I1203 00:23:16.511239 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7fbb5f6569-nl682_16ae587c-763d-46f6-b211-e9b3752339c9/nmstate-console-plugin/0.log" Dec 03 00:23:16 crc kubenswrapper[4696]: I1203 00:23:16.707960 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-8mz82_e69b657f-75dd-418a-80f8-1e3820f1ff88/nmstate-handler/0.log" Dec 03 00:23:16 crc kubenswrapper[4696]: I1203 00:23:16.737465 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-lcb5l_52790af0-09aa-4b8f-8350-054135e80896/kube-rbac-proxy/0.log" Dec 03 00:23:16 crc kubenswrapper[4696]: I1203 00:23:16.740259 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-lcb5l_52790af0-09aa-4b8f-8350-054135e80896/nmstate-metrics/0.log" Dec 03 00:23:16 crc kubenswrapper[4696]: I1203 00:23:16.979377 4696 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-operator-5b5b58f5c8-cts9d_98262b9a-2be3-48d1-becc-84c3e9585c46/nmstate-operator/0.log" Dec 03 00:23:17 crc kubenswrapper[4696]: I1203 00:23:17.024951 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f6d4c5ccb-srjgk_0301b6ea-801b-41a5-b96a-018412c37fc8/nmstate-webhook/0.log" Dec 03 00:23:24 crc kubenswrapper[4696]: I1203 00:23:24.432490 4696 scope.go:117] "RemoveContainer" containerID="1063405401fd3e6749f5e5ad03d2ffa1ed3069796fcfb6e3add08f006a670077" Dec 03 00:23:24 crc kubenswrapper[4696]: E1203 00:23:24.434803 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 03 00:23:31 crc kubenswrapper[4696]: I1203 00:23:31.608327 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-86pps_e4a5e393-9801-4de5-86b3-ac2cb60bcdae/kube-rbac-proxy/0.log" Dec 03 00:23:31 crc kubenswrapper[4696]: I1203 00:23:31.674204 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-86pps_e4a5e393-9801-4de5-86b3-ac2cb60bcdae/controller/0.log" Dec 03 00:23:31 crc kubenswrapper[4696]: I1203 00:23:31.862004 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jwtm6_a4a3825b-89ac-43dc-b2cf-6f9df48d98d9/cp-frr-files/0.log" Dec 03 00:23:32 crc kubenswrapper[4696]: I1203 00:23:32.064681 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jwtm6_a4a3825b-89ac-43dc-b2cf-6f9df48d98d9/cp-metrics/0.log" Dec 03 00:23:32 crc kubenswrapper[4696]: I1203 00:23:32.066237 4696 log.go:25] 
"Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jwtm6_a4a3825b-89ac-43dc-b2cf-6f9df48d98d9/cp-reloader/0.log" Dec 03 00:23:32 crc kubenswrapper[4696]: I1203 00:23:32.108660 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jwtm6_a4a3825b-89ac-43dc-b2cf-6f9df48d98d9/cp-reloader/0.log" Dec 03 00:23:32 crc kubenswrapper[4696]: I1203 00:23:32.135303 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jwtm6_a4a3825b-89ac-43dc-b2cf-6f9df48d98d9/cp-frr-files/0.log" Dec 03 00:23:32 crc kubenswrapper[4696]: I1203 00:23:32.302430 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jwtm6_a4a3825b-89ac-43dc-b2cf-6f9df48d98d9/cp-reloader/0.log" Dec 03 00:23:32 crc kubenswrapper[4696]: I1203 00:23:32.333763 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jwtm6_a4a3825b-89ac-43dc-b2cf-6f9df48d98d9/cp-frr-files/0.log" Dec 03 00:23:32 crc kubenswrapper[4696]: I1203 00:23:32.357151 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jwtm6_a4a3825b-89ac-43dc-b2cf-6f9df48d98d9/cp-metrics/0.log" Dec 03 00:23:32 crc kubenswrapper[4696]: I1203 00:23:32.373767 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jwtm6_a4a3825b-89ac-43dc-b2cf-6f9df48d98d9/cp-metrics/0.log" Dec 03 00:23:32 crc kubenswrapper[4696]: I1203 00:23:32.590052 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jwtm6_a4a3825b-89ac-43dc-b2cf-6f9df48d98d9/cp-frr-files/0.log" Dec 03 00:23:32 crc kubenswrapper[4696]: I1203 00:23:32.602638 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jwtm6_a4a3825b-89ac-43dc-b2cf-6f9df48d98d9/cp-reloader/0.log" Dec 03 00:23:32 crc kubenswrapper[4696]: I1203 00:23:32.607134 4696 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-jwtm6_a4a3825b-89ac-43dc-b2cf-6f9df48d98d9/cp-metrics/0.log" Dec 03 00:23:32 crc kubenswrapper[4696]: I1203 00:23:32.648693 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jwtm6_a4a3825b-89ac-43dc-b2cf-6f9df48d98d9/controller/0.log" Dec 03 00:23:32 crc kubenswrapper[4696]: I1203 00:23:32.785257 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jwtm6_a4a3825b-89ac-43dc-b2cf-6f9df48d98d9/frr-metrics/0.log" Dec 03 00:23:32 crc kubenswrapper[4696]: I1203 00:23:32.790737 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jwtm6_a4a3825b-89ac-43dc-b2cf-6f9df48d98d9/kube-rbac-proxy/0.log" Dec 03 00:23:32 crc kubenswrapper[4696]: I1203 00:23:32.876063 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jwtm6_a4a3825b-89ac-43dc-b2cf-6f9df48d98d9/kube-rbac-proxy-frr/0.log" Dec 03 00:23:33 crc kubenswrapper[4696]: I1203 00:23:33.045766 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jwtm6_a4a3825b-89ac-43dc-b2cf-6f9df48d98d9/reloader/0.log" Dec 03 00:23:33 crc kubenswrapper[4696]: I1203 00:23:33.095847 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7fcb986d4-p5cdf_bc16ac0a-e284-468e-b6a9-a8b78572ac06/frr-k8s-webhook-server/0.log" Dec 03 00:23:33 crc kubenswrapper[4696]: I1203 00:23:33.309255 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-c67fd5d6c-gjrcg_978a6167-34da-4d05-a693-a9f7f4d865b2/manager/0.log" Dec 03 00:23:33 crc kubenswrapper[4696]: I1203 00:23:33.584021 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-6c7867ffbb-nxw6n_4e9c6038-441a-483a-b7e3-ff298010cf18/webhook-server/0.log" Dec 03 00:23:33 crc kubenswrapper[4696]: I1203 00:23:33.676190 4696 log.go:25] 
"Finished parsing log file" path="/var/log/pods/metallb-system_speaker-7xqrp_58e7c36f-4f09-4ae1-99ce-e18c2612b6ec/kube-rbac-proxy/0.log" Dec 03 00:23:34 crc kubenswrapper[4696]: I1203 00:23:34.344392 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-7xqrp_58e7c36f-4f09-4ae1-99ce-e18c2612b6ec/speaker/0.log" Dec 03 00:23:34 crc kubenswrapper[4696]: I1203 00:23:34.841254 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jwtm6_a4a3825b-89ac-43dc-b2cf-6f9df48d98d9/frr/0.log" Dec 03 00:23:39 crc kubenswrapper[4696]: I1203 00:23:39.432518 4696 scope.go:117] "RemoveContainer" containerID="1063405401fd3e6749f5e5ad03d2ffa1ed3069796fcfb6e3add08f006a670077" Dec 03 00:23:39 crc kubenswrapper[4696]: E1203 00:23:39.433600 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 03 00:23:47 crc kubenswrapper[4696]: I1203 00:23:47.238020 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7f559_5c7fcd36-0d45-4703-89ab-df95e3ff5804/util/0.log" Dec 03 00:23:47 crc kubenswrapper[4696]: I1203 00:23:47.483578 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7f559_5c7fcd36-0d45-4703-89ab-df95e3ff5804/pull/0.log" Dec 03 00:23:47 crc kubenswrapper[4696]: I1203 00:23:47.488736 4696 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7f559_5c7fcd36-0d45-4703-89ab-df95e3ff5804/util/0.log" Dec 03 00:23:47 crc kubenswrapper[4696]: I1203 00:23:47.503827 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7f559_5c7fcd36-0d45-4703-89ab-df95e3ff5804/pull/0.log" Dec 03 00:23:47 crc kubenswrapper[4696]: I1203 00:23:47.670784 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7f559_5c7fcd36-0d45-4703-89ab-df95e3ff5804/util/0.log" Dec 03 00:23:47 crc kubenswrapper[4696]: I1203 00:23:47.734327 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7f559_5c7fcd36-0d45-4703-89ab-df95e3ff5804/pull/0.log" Dec 03 00:23:47 crc kubenswrapper[4696]: I1203 00:23:47.736356 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7f559_5c7fcd36-0d45-4703-89ab-df95e3ff5804/extract/0.log" Dec 03 00:23:47 crc kubenswrapper[4696]: I1203 00:23:47.882741 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210tj6d4_62846e70-8410-4122-8d1a-f05e0ac36cc9/util/0.log" Dec 03 00:23:48 crc kubenswrapper[4696]: I1203 00:23:48.066078 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210tj6d4_62846e70-8410-4122-8d1a-f05e0ac36cc9/pull/0.log" Dec 03 00:23:48 crc kubenswrapper[4696]: I1203 00:23:48.088720 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210tj6d4_62846e70-8410-4122-8d1a-f05e0ac36cc9/util/0.log" Dec 03 
00:23:48 crc kubenswrapper[4696]: I1203 00:23:48.107499 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210tj6d4_62846e70-8410-4122-8d1a-f05e0ac36cc9/pull/0.log" Dec 03 00:23:48 crc kubenswrapper[4696]: I1203 00:23:48.279334 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210tj6d4_62846e70-8410-4122-8d1a-f05e0ac36cc9/util/0.log" Dec 03 00:23:48 crc kubenswrapper[4696]: I1203 00:23:48.309957 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210tj6d4_62846e70-8410-4122-8d1a-f05e0ac36cc9/extract/0.log" Dec 03 00:23:48 crc kubenswrapper[4696]: I1203 00:23:48.319013 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210tj6d4_62846e70-8410-4122-8d1a-f05e0ac36cc9/pull/0.log" Dec 03 00:23:48 crc kubenswrapper[4696]: I1203 00:23:48.464961 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83pt4w2_97c440c0-e159-4a54-a3b4-eb53d72ae698/util/0.log" Dec 03 00:23:48 crc kubenswrapper[4696]: I1203 00:23:48.701399 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83pt4w2_97c440c0-e159-4a54-a3b4-eb53d72ae698/util/0.log" Dec 03 00:23:48 crc kubenswrapper[4696]: I1203 00:23:48.702885 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83pt4w2_97c440c0-e159-4a54-a3b4-eb53d72ae698/pull/0.log" Dec 03 00:23:48 crc kubenswrapper[4696]: I1203 00:23:48.720701 4696 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83pt4w2_97c440c0-e159-4a54-a3b4-eb53d72ae698/pull/0.log" Dec 03 00:23:48 crc kubenswrapper[4696]: I1203 00:23:48.872645 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83pt4w2_97c440c0-e159-4a54-a3b4-eb53d72ae698/util/0.log" Dec 03 00:23:48 crc kubenswrapper[4696]: I1203 00:23:48.873318 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83pt4w2_97c440c0-e159-4a54-a3b4-eb53d72ae698/pull/0.log" Dec 03 00:23:48 crc kubenswrapper[4696]: I1203 00:23:48.905455 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83pt4w2_97c440c0-e159-4a54-a3b4-eb53d72ae698/extract/0.log" Dec 03 00:23:49 crc kubenswrapper[4696]: I1203 00:23:49.070186 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wq2sm_74551327-8467-4679-9951-5dd7042e2a45/extract-utilities/0.log" Dec 03 00:23:49 crc kubenswrapper[4696]: I1203 00:23:49.265938 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wq2sm_74551327-8467-4679-9951-5dd7042e2a45/extract-content/0.log" Dec 03 00:23:49 crc kubenswrapper[4696]: I1203 00:23:49.277316 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wq2sm_74551327-8467-4679-9951-5dd7042e2a45/extract-content/0.log" Dec 03 00:23:49 crc kubenswrapper[4696]: I1203 00:23:49.287538 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wq2sm_74551327-8467-4679-9951-5dd7042e2a45/extract-utilities/0.log" Dec 03 00:23:49 crc kubenswrapper[4696]: I1203 00:23:49.512994 4696 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-wq2sm_74551327-8467-4679-9951-5dd7042e2a45/extract-utilities/0.log" Dec 03 00:23:49 crc kubenswrapper[4696]: I1203 00:23:49.522603 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wq2sm_74551327-8467-4679-9951-5dd7042e2a45/extract-content/0.log" Dec 03 00:23:49 crc kubenswrapper[4696]: I1203 00:23:49.789127 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-lq2n7_288c2740-d410-436e-a43f-e9522208e1f1/extract-utilities/0.log" Dec 03 00:23:50 crc kubenswrapper[4696]: I1203 00:23:50.025523 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-lq2n7_288c2740-d410-436e-a43f-e9522208e1f1/extract-content/0.log" Dec 03 00:23:50 crc kubenswrapper[4696]: I1203 00:23:50.051954 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-lq2n7_288c2740-d410-436e-a43f-e9522208e1f1/extract-content/0.log" Dec 03 00:23:50 crc kubenswrapper[4696]: I1203 00:23:50.058264 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-lq2n7_288c2740-d410-436e-a43f-e9522208e1f1/extract-utilities/0.log" Dec 03 00:23:50 crc kubenswrapper[4696]: I1203 00:23:50.336518 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wq2sm_74551327-8467-4679-9951-5dd7042e2a45/registry-server/0.log" Dec 03 00:23:50 crc kubenswrapper[4696]: I1203 00:23:50.392796 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-lq2n7_288c2740-d410-436e-a43f-e9522208e1f1/extract-utilities/0.log" Dec 03 00:23:50 crc kubenswrapper[4696]: I1203 00:23:50.407681 4696 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-lq2n7_288c2740-d410-436e-a43f-e9522208e1f1/extract-content/0.log" Dec 03 00:23:50 crc kubenswrapper[4696]: I1203 00:23:50.759450 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-fqzv2_09917c11-8312-4f5a-9597-ad0570d0aeb0/marketplace-operator/0.log" Dec 03 00:23:50 crc kubenswrapper[4696]: I1203 00:23:50.948159 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-tnm82_f3c3ea23-4c15-4817-a777-29afe63f580f/extract-utilities/0.log" Dec 03 00:23:51 crc kubenswrapper[4696]: I1203 00:23:51.191171 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-tnm82_f3c3ea23-4c15-4817-a777-29afe63f580f/extract-content/0.log" Dec 03 00:23:51 crc kubenswrapper[4696]: I1203 00:23:51.227205 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-tnm82_f3c3ea23-4c15-4817-a777-29afe63f580f/extract-content/0.log" Dec 03 00:23:51 crc kubenswrapper[4696]: I1203 00:23:51.248691 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-tnm82_f3c3ea23-4c15-4817-a777-29afe63f580f/extract-utilities/0.log" Dec 03 00:23:51 crc kubenswrapper[4696]: I1203 00:23:51.505441 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-tnm82_f3c3ea23-4c15-4817-a777-29afe63f580f/extract-content/0.log" Dec 03 00:23:51 crc kubenswrapper[4696]: I1203 00:23:51.515560 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-lq2n7_288c2740-d410-436e-a43f-e9522208e1f1/registry-server/0.log" Dec 03 00:23:51 crc kubenswrapper[4696]: I1203 00:23:51.519633 4696 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-tnm82_f3c3ea23-4c15-4817-a777-29afe63f580f/extract-utilities/0.log" Dec 03 00:23:51 crc kubenswrapper[4696]: I1203 00:23:51.721778 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-nl95f_cae1c862-711b-4fe3-b6a3-f2fefa39b14c/extract-utilities/0.log" Dec 03 00:23:51 crc kubenswrapper[4696]: I1203 00:23:51.831007 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-tnm82_f3c3ea23-4c15-4817-a777-29afe63f580f/registry-server/0.log" Dec 03 00:23:51 crc kubenswrapper[4696]: I1203 00:23:51.939084 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-nl95f_cae1c862-711b-4fe3-b6a3-f2fefa39b14c/extract-utilities/0.log" Dec 03 00:23:51 crc kubenswrapper[4696]: I1203 00:23:51.945006 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-nl95f_cae1c862-711b-4fe3-b6a3-f2fefa39b14c/extract-content/0.log" Dec 03 00:23:51 crc kubenswrapper[4696]: I1203 00:23:51.947216 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-nl95f_cae1c862-711b-4fe3-b6a3-f2fefa39b14c/extract-content/0.log" Dec 03 00:23:52 crc kubenswrapper[4696]: I1203 00:23:52.165428 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-nl95f_cae1c862-711b-4fe3-b6a3-f2fefa39b14c/extract-content/0.log" Dec 03 00:23:52 crc kubenswrapper[4696]: I1203 00:23:52.175722 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-nl95f_cae1c862-711b-4fe3-b6a3-f2fefa39b14c/extract-utilities/0.log" Dec 03 00:23:52 crc kubenswrapper[4696]: I1203 00:23:52.901002 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-nl95f_cae1c862-711b-4fe3-b6a3-f2fefa39b14c/registry-server/0.log" Dec 03 
00:23:54 crc kubenswrapper[4696]: I1203 00:23:54.432583 4696 scope.go:117] "RemoveContainer" containerID="1063405401fd3e6749f5e5ad03d2ffa1ed3069796fcfb6e3add08f006a670077" Dec 03 00:23:54 crc kubenswrapper[4696]: E1203 00:23:54.433304 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 03 00:24:04 crc kubenswrapper[4696]: I1203 00:24:04.966006 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-668cf9dfbb-4rjbb_ea569052-61d0-4847-90f1-3e085d6a5363/prometheus-operator/0.log" Dec 03 00:24:05 crc kubenswrapper[4696]: I1203 00:24:05.165242 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-bfc855f8c-28cln_e25bae4c-bb72-4fe7-8f1b-f6e61100727c/prometheus-operator-admission-webhook/0.log" Dec 03 00:24:05 crc kubenswrapper[4696]: I1203 00:24:05.197906 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-bfc855f8c-b9nzn_6cef468e-8250-42c5-8ae4-75dccc1b10a5/prometheus-operator-admission-webhook/0.log" Dec 03 00:24:05 crc kubenswrapper[4696]: I1203 00:24:05.398256 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-d8bb48f5d-rc522_9ee2e5e1-ad58-448a-973e-2207d5cde11b/operator/0.log" Dec 03 00:24:05 crc kubenswrapper[4696]: I1203 00:24:05.454018 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5446b9c989-t8wvk_20a6ffb2-7272-4be4-9ed2-ba78389166d6/perses-operator/0.log" Dec 03 00:24:07 crc 
kubenswrapper[4696]: I1203 00:24:07.441903 4696 scope.go:117] "RemoveContainer" containerID="1063405401fd3e6749f5e5ad03d2ffa1ed3069796fcfb6e3add08f006a670077" Dec 03 00:24:07 crc kubenswrapper[4696]: E1203 00:24:07.442793 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 03 00:24:19 crc kubenswrapper[4696]: I1203 00:24:19.431660 4696 scope.go:117] "RemoveContainer" containerID="1063405401fd3e6749f5e5ad03d2ffa1ed3069796fcfb6e3add08f006a670077" Dec 03 00:24:19 crc kubenswrapper[4696]: E1203 00:24:19.432724 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 03 00:24:30 crc kubenswrapper[4696]: I1203 00:24:30.432174 4696 scope.go:117] "RemoveContainer" containerID="1063405401fd3e6749f5e5ad03d2ffa1ed3069796fcfb6e3add08f006a670077" Dec 03 00:24:30 crc kubenswrapper[4696]: E1203 00:24:30.433226 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 
03 00:24:41 crc kubenswrapper[4696]: I1203 00:24:41.439460 4696 scope.go:117] "RemoveContainer" containerID="1063405401fd3e6749f5e5ad03d2ffa1ed3069796fcfb6e3add08f006a670077" Dec 03 00:24:41 crc kubenswrapper[4696]: E1203 00:24:41.440924 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 03 00:24:56 crc kubenswrapper[4696]: I1203 00:24:56.431919 4696 scope.go:117] "RemoveContainer" containerID="1063405401fd3e6749f5e5ad03d2ffa1ed3069796fcfb6e3add08f006a670077" Dec 03 00:24:56 crc kubenswrapper[4696]: E1203 00:24:56.433089 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 03 00:25:07 crc kubenswrapper[4696]: I1203 00:25:07.438951 4696 scope.go:117] "RemoveContainer" containerID="1063405401fd3e6749f5e5ad03d2ffa1ed3069796fcfb6e3add08f006a670077" Dec 03 00:25:07 crc kubenswrapper[4696]: E1203 00:25:07.440220 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" 
podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 03 00:25:19 crc kubenswrapper[4696]: I1203 00:25:19.433915 4696 scope.go:117] "RemoveContainer" containerID="1063405401fd3e6749f5e5ad03d2ffa1ed3069796fcfb6e3add08f006a670077" Dec 03 00:25:19 crc kubenswrapper[4696]: E1203 00:25:19.438626 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 03 00:25:34 crc kubenswrapper[4696]: I1203 00:25:34.432147 4696 scope.go:117] "RemoveContainer" containerID="1063405401fd3e6749f5e5ad03d2ffa1ed3069796fcfb6e3add08f006a670077" Dec 03 00:25:34 crc kubenswrapper[4696]: E1203 00:25:34.433298 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 03 00:25:46 crc kubenswrapper[4696]: I1203 00:25:46.431830 4696 scope.go:117] "RemoveContainer" containerID="1063405401fd3e6749f5e5ad03d2ffa1ed3069796fcfb6e3add08f006a670077" Dec 03 00:25:46 crc kubenswrapper[4696]: E1203 00:25:46.432820 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 03 00:25:57 crc kubenswrapper[4696]: I1203 00:25:57.443566 4696 scope.go:117] "RemoveContainer" containerID="1063405401fd3e6749f5e5ad03d2ffa1ed3069796fcfb6e3add08f006a670077" Dec 03 00:25:57 crc kubenswrapper[4696]: E1203 00:25:57.444573 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 03 00:26:09 crc kubenswrapper[4696]: I1203 00:26:09.432415 4696 scope.go:117] "RemoveContainer" containerID="1063405401fd3e6749f5e5ad03d2ffa1ed3069796fcfb6e3add08f006a670077" Dec 03 00:26:09 crc kubenswrapper[4696]: E1203 00:26:09.435531 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 03 00:26:15 crc kubenswrapper[4696]: I1203 00:26:15.240842 4696 generic.go:334] "Generic (PLEG): container finished" podID="9c7e85f5-b3bb-4e2b-b8a9-61a4a7a0993a" containerID="4efb3f55a69d97207ef5806c36d94320a3f1394ee243ae0d10e956d73401d7ff" exitCode=0 Dec 03 00:26:15 crc kubenswrapper[4696]: I1203 00:26:15.241036 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-85248/must-gather-mjfr4" 
event={"ID":"9c7e85f5-b3bb-4e2b-b8a9-61a4a7a0993a","Type":"ContainerDied","Data":"4efb3f55a69d97207ef5806c36d94320a3f1394ee243ae0d10e956d73401d7ff"} Dec 03 00:26:15 crc kubenswrapper[4696]: I1203 00:26:15.243322 4696 scope.go:117] "RemoveContainer" containerID="4efb3f55a69d97207ef5806c36d94320a3f1394ee243ae0d10e956d73401d7ff" Dec 03 00:26:15 crc kubenswrapper[4696]: I1203 00:26:15.474216 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-85248_must-gather-mjfr4_9c7e85f5-b3bb-4e2b-b8a9-61a4a7a0993a/gather/0.log" Dec 03 00:26:21 crc kubenswrapper[4696]: I1203 00:26:21.432015 4696 scope.go:117] "RemoveContainer" containerID="1063405401fd3e6749f5e5ad03d2ffa1ed3069796fcfb6e3add08f006a670077" Dec 03 00:26:21 crc kubenswrapper[4696]: E1203 00:26:21.433105 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 03 00:26:26 crc kubenswrapper[4696]: I1203 00:26:26.501622 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-85248/must-gather-mjfr4"] Dec 03 00:26:26 crc kubenswrapper[4696]: I1203 00:26:26.502819 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-85248/must-gather-mjfr4" podUID="9c7e85f5-b3bb-4e2b-b8a9-61a4a7a0993a" containerName="copy" containerID="cri-o://fd73ac95957c34e8e1be107936aa4ed1efcd4e585891b3475163fb2e00ce4aec" gracePeriod=2 Dec 03 00:26:26 crc kubenswrapper[4696]: I1203 00:26:26.518002 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-85248/must-gather-mjfr4"] Dec 03 00:26:27 crc kubenswrapper[4696]: I1203 00:26:27.001788 4696 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-85248_must-gather-mjfr4_9c7e85f5-b3bb-4e2b-b8a9-61a4a7a0993a/copy/0.log" Dec 03 00:26:27 crc kubenswrapper[4696]: I1203 00:26:27.003168 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-85248/must-gather-mjfr4" Dec 03 00:26:27 crc kubenswrapper[4696]: I1203 00:26:27.175401 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-25khs\" (UniqueName: \"kubernetes.io/projected/9c7e85f5-b3bb-4e2b-b8a9-61a4a7a0993a-kube-api-access-25khs\") pod \"9c7e85f5-b3bb-4e2b-b8a9-61a4a7a0993a\" (UID: \"9c7e85f5-b3bb-4e2b-b8a9-61a4a7a0993a\") " Dec 03 00:26:27 crc kubenswrapper[4696]: I1203 00:26:27.175533 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9c7e85f5-b3bb-4e2b-b8a9-61a4a7a0993a-must-gather-output\") pod \"9c7e85f5-b3bb-4e2b-b8a9-61a4a7a0993a\" (UID: \"9c7e85f5-b3bb-4e2b-b8a9-61a4a7a0993a\") " Dec 03 00:26:27 crc kubenswrapper[4696]: I1203 00:26:27.197618 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c7e85f5-b3bb-4e2b-b8a9-61a4a7a0993a-kube-api-access-25khs" (OuterVolumeSpecName: "kube-api-access-25khs") pod "9c7e85f5-b3bb-4e2b-b8a9-61a4a7a0993a" (UID: "9c7e85f5-b3bb-4e2b-b8a9-61a4a7a0993a"). InnerVolumeSpecName "kube-api-access-25khs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 00:26:27 crc kubenswrapper[4696]: I1203 00:26:27.278374 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-25khs\" (UniqueName: \"kubernetes.io/projected/9c7e85f5-b3bb-4e2b-b8a9-61a4a7a0993a-kube-api-access-25khs\") on node \"crc\" DevicePath \"\"" Dec 03 00:26:27 crc kubenswrapper[4696]: I1203 00:26:27.366043 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c7e85f5-b3bb-4e2b-b8a9-61a4a7a0993a-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "9c7e85f5-b3bb-4e2b-b8a9-61a4a7a0993a" (UID: "9c7e85f5-b3bb-4e2b-b8a9-61a4a7a0993a"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 00:26:27 crc kubenswrapper[4696]: I1203 00:26:27.381697 4696 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9c7e85f5-b3bb-4e2b-b8a9-61a4a7a0993a-must-gather-output\") on node \"crc\" DevicePath \"\"" Dec 03 00:26:27 crc kubenswrapper[4696]: I1203 00:26:27.424873 4696 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-85248_must-gather-mjfr4_9c7e85f5-b3bb-4e2b-b8a9-61a4a7a0993a/copy/0.log" Dec 03 00:26:27 crc kubenswrapper[4696]: I1203 00:26:27.425577 4696 generic.go:334] "Generic (PLEG): container finished" podID="9c7e85f5-b3bb-4e2b-b8a9-61a4a7a0993a" containerID="fd73ac95957c34e8e1be107936aa4ed1efcd4e585891b3475163fb2e00ce4aec" exitCode=143 Dec 03 00:26:27 crc kubenswrapper[4696]: I1203 00:26:27.425641 4696 scope.go:117] "RemoveContainer" containerID="fd73ac95957c34e8e1be107936aa4ed1efcd4e585891b3475163fb2e00ce4aec" Dec 03 00:26:27 crc kubenswrapper[4696]: I1203 00:26:27.425809 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-85248/must-gather-mjfr4" Dec 03 00:26:27 crc kubenswrapper[4696]: I1203 00:26:27.449340 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c7e85f5-b3bb-4e2b-b8a9-61a4a7a0993a" path="/var/lib/kubelet/pods/9c7e85f5-b3bb-4e2b-b8a9-61a4a7a0993a/volumes" Dec 03 00:26:27 crc kubenswrapper[4696]: I1203 00:26:27.455955 4696 scope.go:117] "RemoveContainer" containerID="4efb3f55a69d97207ef5806c36d94320a3f1394ee243ae0d10e956d73401d7ff" Dec 03 00:26:27 crc kubenswrapper[4696]: I1203 00:26:27.554068 4696 scope.go:117] "RemoveContainer" containerID="fd73ac95957c34e8e1be107936aa4ed1efcd4e585891b3475163fb2e00ce4aec" Dec 03 00:26:27 crc kubenswrapper[4696]: E1203 00:26:27.555224 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd73ac95957c34e8e1be107936aa4ed1efcd4e585891b3475163fb2e00ce4aec\": container with ID starting with fd73ac95957c34e8e1be107936aa4ed1efcd4e585891b3475163fb2e00ce4aec not found: ID does not exist" containerID="fd73ac95957c34e8e1be107936aa4ed1efcd4e585891b3475163fb2e00ce4aec" Dec 03 00:26:27 crc kubenswrapper[4696]: I1203 00:26:27.555296 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd73ac95957c34e8e1be107936aa4ed1efcd4e585891b3475163fb2e00ce4aec"} err="failed to get container status \"fd73ac95957c34e8e1be107936aa4ed1efcd4e585891b3475163fb2e00ce4aec\": rpc error: code = NotFound desc = could not find container \"fd73ac95957c34e8e1be107936aa4ed1efcd4e585891b3475163fb2e00ce4aec\": container with ID starting with fd73ac95957c34e8e1be107936aa4ed1efcd4e585891b3475163fb2e00ce4aec not found: ID does not exist" Dec 03 00:26:27 crc kubenswrapper[4696]: I1203 00:26:27.555340 4696 scope.go:117] "RemoveContainer" containerID="4efb3f55a69d97207ef5806c36d94320a3f1394ee243ae0d10e956d73401d7ff" Dec 03 00:26:27 crc kubenswrapper[4696]: E1203 00:26:27.555762 4696 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4efb3f55a69d97207ef5806c36d94320a3f1394ee243ae0d10e956d73401d7ff\": container with ID starting with 4efb3f55a69d97207ef5806c36d94320a3f1394ee243ae0d10e956d73401d7ff not found: ID does not exist" containerID="4efb3f55a69d97207ef5806c36d94320a3f1394ee243ae0d10e956d73401d7ff" Dec 03 00:26:27 crc kubenswrapper[4696]: I1203 00:26:27.555829 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4efb3f55a69d97207ef5806c36d94320a3f1394ee243ae0d10e956d73401d7ff"} err="failed to get container status \"4efb3f55a69d97207ef5806c36d94320a3f1394ee243ae0d10e956d73401d7ff\": rpc error: code = NotFound desc = could not find container \"4efb3f55a69d97207ef5806c36d94320a3f1394ee243ae0d10e956d73401d7ff\": container with ID starting with 4efb3f55a69d97207ef5806c36d94320a3f1394ee243ae0d10e956d73401d7ff not found: ID does not exist" Dec 03 00:26:34 crc kubenswrapper[4696]: I1203 00:26:34.432047 4696 scope.go:117] "RemoveContainer" containerID="1063405401fd3e6749f5e5ad03d2ffa1ed3069796fcfb6e3add08f006a670077" Dec 03 00:26:34 crc kubenswrapper[4696]: E1203 00:26:34.433461 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 03 00:26:39 crc kubenswrapper[4696]: I1203 00:26:39.969858 4696 scope.go:117] "RemoveContainer" containerID="f2c1ac2b8a20135badaa3fccd1c2aa56f295871b700907caf8e205237c5fbeaf" Dec 03 00:26:45 crc kubenswrapper[4696]: I1203 00:26:45.433837 4696 scope.go:117] "RemoveContainer" 
containerID="1063405401fd3e6749f5e5ad03d2ffa1ed3069796fcfb6e3add08f006a670077" Dec 03 00:26:45 crc kubenswrapper[4696]: E1203 00:26:45.434688 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 03 00:27:00 crc kubenswrapper[4696]: I1203 00:27:00.432680 4696 scope.go:117] "RemoveContainer" containerID="1063405401fd3e6749f5e5ad03d2ffa1ed3069796fcfb6e3add08f006a670077" Dec 03 00:27:00 crc kubenswrapper[4696]: E1203 00:27:00.434217 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 03 00:27:14 crc kubenswrapper[4696]: I1203 00:27:14.432309 4696 scope.go:117] "RemoveContainer" containerID="1063405401fd3e6749f5e5ad03d2ffa1ed3069796fcfb6e3add08f006a670077" Dec 03 00:27:14 crc kubenswrapper[4696]: E1203 00:27:14.433515 4696 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-chq65_openshift-machine-config-operator(53353260-c7c9-435c-91eb-3d5a1b441c4a)\"" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" Dec 03 00:27:28 crc kubenswrapper[4696]: I1203 00:27:28.431825 4696 scope.go:117] 
"RemoveContainer" containerID="1063405401fd3e6749f5e5ad03d2ffa1ed3069796fcfb6e3add08f006a670077" Dec 03 00:27:29 crc kubenswrapper[4696]: I1203 00:27:29.156163 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-chq65" event={"ID":"53353260-c7c9-435c-91eb-3d5a1b441c4a","Type":"ContainerStarted","Data":"4430a0f2dbcecfd18b1ce07b91629ecc4137f2747289b08d9ea3380f913bb917"} Dec 03 00:27:40 crc kubenswrapper[4696]: I1203 00:27:40.047634 4696 scope.go:117] "RemoveContainer" containerID="0b62c3d535756f8b952fb5b4f775e98cc65784afcf2f0c3153dc94b3b9e3c5bb" Dec 03 00:28:59 crc kubenswrapper[4696]: I1203 00:28:59.296364 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-jx5sg"] Dec 03 00:28:59 crc kubenswrapper[4696]: E1203 00:28:59.297751 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a65078bf-36b0-484a-9392-96fe7c180c2a" containerName="registry-server" Dec 03 00:28:59 crc kubenswrapper[4696]: I1203 00:28:59.297768 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="a65078bf-36b0-484a-9392-96fe7c180c2a" containerName="registry-server" Dec 03 00:28:59 crc kubenswrapper[4696]: E1203 00:28:59.297781 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c7e85f5-b3bb-4e2b-b8a9-61a4a7a0993a" containerName="gather" Dec 03 00:28:59 crc kubenswrapper[4696]: I1203 00:28:59.297787 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c7e85f5-b3bb-4e2b-b8a9-61a4a7a0993a" containerName="gather" Dec 03 00:28:59 crc kubenswrapper[4696]: E1203 00:28:59.297822 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c7e85f5-b3bb-4e2b-b8a9-61a4a7a0993a" containerName="copy" Dec 03 00:28:59 crc kubenswrapper[4696]: I1203 00:28:59.297829 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c7e85f5-b3bb-4e2b-b8a9-61a4a7a0993a" containerName="copy" Dec 03 00:28:59 crc kubenswrapper[4696]: E1203 00:28:59.297842 
4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a65078bf-36b0-484a-9392-96fe7c180c2a" containerName="extract-content" Dec 03 00:28:59 crc kubenswrapper[4696]: I1203 00:28:59.297848 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="a65078bf-36b0-484a-9392-96fe7c180c2a" containerName="extract-content" Dec 03 00:28:59 crc kubenswrapper[4696]: E1203 00:28:59.297858 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a65078bf-36b0-484a-9392-96fe7c180c2a" containerName="extract-utilities" Dec 03 00:28:59 crc kubenswrapper[4696]: I1203 00:28:59.297864 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="a65078bf-36b0-484a-9392-96fe7c180c2a" containerName="extract-utilities" Dec 03 00:28:59 crc kubenswrapper[4696]: I1203 00:28:59.298053 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c7e85f5-b3bb-4e2b-b8a9-61a4a7a0993a" containerName="copy" Dec 03 00:28:59 crc kubenswrapper[4696]: I1203 00:28:59.298068 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c7e85f5-b3bb-4e2b-b8a9-61a4a7a0993a" containerName="gather" Dec 03 00:28:59 crc kubenswrapper[4696]: I1203 00:28:59.298077 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="a65078bf-36b0-484a-9392-96fe7c180c2a" containerName="registry-server" Dec 03 00:28:59 crc kubenswrapper[4696]: I1203 00:28:59.299829 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jx5sg" Dec 03 00:28:59 crc kubenswrapper[4696]: I1203 00:28:59.312343 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jx5sg"] Dec 03 00:28:59 crc kubenswrapper[4696]: I1203 00:28:59.469600 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-qpwsg"] Dec 03 00:28:59 crc kubenswrapper[4696]: I1203 00:28:59.471993 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qpwsg" Dec 03 00:28:59 crc kubenswrapper[4696]: I1203 00:28:59.482271 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qpwsg"] Dec 03 00:28:59 crc kubenswrapper[4696]: I1203 00:28:59.483634 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dfd18528-bcc0-4f1b-9c1c-b68ae9c4c4d0-catalog-content\") pod \"redhat-marketplace-jx5sg\" (UID: \"dfd18528-bcc0-4f1b-9c1c-b68ae9c4c4d0\") " pod="openshift-marketplace/redhat-marketplace-jx5sg" Dec 03 00:28:59 crc kubenswrapper[4696]: I1203 00:28:59.484106 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dfd18528-bcc0-4f1b-9c1c-b68ae9c4c4d0-utilities\") pod \"redhat-marketplace-jx5sg\" (UID: \"dfd18528-bcc0-4f1b-9c1c-b68ae9c4c4d0\") " pod="openshift-marketplace/redhat-marketplace-jx5sg" Dec 03 00:28:59 crc kubenswrapper[4696]: I1203 00:28:59.484237 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fhkd\" (UniqueName: \"kubernetes.io/projected/dfd18528-bcc0-4f1b-9c1c-b68ae9c4c4d0-kube-api-access-2fhkd\") pod \"redhat-marketplace-jx5sg\" (UID: \"dfd18528-bcc0-4f1b-9c1c-b68ae9c4c4d0\") " pod="openshift-marketplace/redhat-marketplace-jx5sg" Dec 03 00:28:59 crc kubenswrapper[4696]: I1203 00:28:59.586793 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a650bdd8-c903-45cf-8e50-32d7bed2272a-utilities\") pod \"redhat-operators-qpwsg\" (UID: \"a650bdd8-c903-45cf-8e50-32d7bed2272a\") " pod="openshift-marketplace/redhat-operators-qpwsg" Dec 03 00:28:59 crc kubenswrapper[4696]: I1203 00:28:59.586880 4696 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-2fhkd\" (UniqueName: \"kubernetes.io/projected/dfd18528-bcc0-4f1b-9c1c-b68ae9c4c4d0-kube-api-access-2fhkd\") pod \"redhat-marketplace-jx5sg\" (UID: \"dfd18528-bcc0-4f1b-9c1c-b68ae9c4c4d0\") " pod="openshift-marketplace/redhat-marketplace-jx5sg" Dec 03 00:28:59 crc kubenswrapper[4696]: I1203 00:28:59.586947 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nn4dc\" (UniqueName: \"kubernetes.io/projected/a650bdd8-c903-45cf-8e50-32d7bed2272a-kube-api-access-nn4dc\") pod \"redhat-operators-qpwsg\" (UID: \"a650bdd8-c903-45cf-8e50-32d7bed2272a\") " pod="openshift-marketplace/redhat-operators-qpwsg" Dec 03 00:28:59 crc kubenswrapper[4696]: I1203 00:28:59.586993 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a650bdd8-c903-45cf-8e50-32d7bed2272a-catalog-content\") pod \"redhat-operators-qpwsg\" (UID: \"a650bdd8-c903-45cf-8e50-32d7bed2272a\") " pod="openshift-marketplace/redhat-operators-qpwsg" Dec 03 00:28:59 crc kubenswrapper[4696]: I1203 00:28:59.587126 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dfd18528-bcc0-4f1b-9c1c-b68ae9c4c4d0-catalog-content\") pod \"redhat-marketplace-jx5sg\" (UID: \"dfd18528-bcc0-4f1b-9c1c-b68ae9c4c4d0\") " pod="openshift-marketplace/redhat-marketplace-jx5sg" Dec 03 00:28:59 crc kubenswrapper[4696]: I1203 00:28:59.587237 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dfd18528-bcc0-4f1b-9c1c-b68ae9c4c4d0-utilities\") pod \"redhat-marketplace-jx5sg\" (UID: \"dfd18528-bcc0-4f1b-9c1c-b68ae9c4c4d0\") " pod="openshift-marketplace/redhat-marketplace-jx5sg" Dec 03 00:28:59 crc kubenswrapper[4696]: I1203 00:28:59.588256 4696 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dfd18528-bcc0-4f1b-9c1c-b68ae9c4c4d0-utilities\") pod \"redhat-marketplace-jx5sg\" (UID: \"dfd18528-bcc0-4f1b-9c1c-b68ae9c4c4d0\") " pod="openshift-marketplace/redhat-marketplace-jx5sg" Dec 03 00:28:59 crc kubenswrapper[4696]: I1203 00:28:59.588372 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dfd18528-bcc0-4f1b-9c1c-b68ae9c4c4d0-catalog-content\") pod \"redhat-marketplace-jx5sg\" (UID: \"dfd18528-bcc0-4f1b-9c1c-b68ae9c4c4d0\") " pod="openshift-marketplace/redhat-marketplace-jx5sg" Dec 03 00:28:59 crc kubenswrapper[4696]: I1203 00:28:59.612036 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fhkd\" (UniqueName: \"kubernetes.io/projected/dfd18528-bcc0-4f1b-9c1c-b68ae9c4c4d0-kube-api-access-2fhkd\") pod \"redhat-marketplace-jx5sg\" (UID: \"dfd18528-bcc0-4f1b-9c1c-b68ae9c4c4d0\") " pod="openshift-marketplace/redhat-marketplace-jx5sg" Dec 03 00:28:59 crc kubenswrapper[4696]: I1203 00:28:59.640318 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jx5sg" Dec 03 00:28:59 crc kubenswrapper[4696]: I1203 00:28:59.689594 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a650bdd8-c903-45cf-8e50-32d7bed2272a-utilities\") pod \"redhat-operators-qpwsg\" (UID: \"a650bdd8-c903-45cf-8e50-32d7bed2272a\") " pod="openshift-marketplace/redhat-operators-qpwsg" Dec 03 00:28:59 crc kubenswrapper[4696]: I1203 00:28:59.689665 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nn4dc\" (UniqueName: \"kubernetes.io/projected/a650bdd8-c903-45cf-8e50-32d7bed2272a-kube-api-access-nn4dc\") pod \"redhat-operators-qpwsg\" (UID: \"a650bdd8-c903-45cf-8e50-32d7bed2272a\") " pod="openshift-marketplace/redhat-operators-qpwsg" Dec 03 00:28:59 crc kubenswrapper[4696]: I1203 00:28:59.689718 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a650bdd8-c903-45cf-8e50-32d7bed2272a-catalog-content\") pod \"redhat-operators-qpwsg\" (UID: \"a650bdd8-c903-45cf-8e50-32d7bed2272a\") " pod="openshift-marketplace/redhat-operators-qpwsg" Dec 03 00:28:59 crc kubenswrapper[4696]: I1203 00:28:59.690339 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a650bdd8-c903-45cf-8e50-32d7bed2272a-utilities\") pod \"redhat-operators-qpwsg\" (UID: \"a650bdd8-c903-45cf-8e50-32d7bed2272a\") " pod="openshift-marketplace/redhat-operators-qpwsg" Dec 03 00:28:59 crc kubenswrapper[4696]: I1203 00:28:59.690605 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a650bdd8-c903-45cf-8e50-32d7bed2272a-catalog-content\") pod \"redhat-operators-qpwsg\" (UID: \"a650bdd8-c903-45cf-8e50-32d7bed2272a\") " 
pod="openshift-marketplace/redhat-operators-qpwsg" Dec 03 00:28:59 crc kubenswrapper[4696]: I1203 00:28:59.709105 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nn4dc\" (UniqueName: \"kubernetes.io/projected/a650bdd8-c903-45cf-8e50-32d7bed2272a-kube-api-access-nn4dc\") pod \"redhat-operators-qpwsg\" (UID: \"a650bdd8-c903-45cf-8e50-32d7bed2272a\") " pod="openshift-marketplace/redhat-operators-qpwsg" Dec 03 00:28:59 crc kubenswrapper[4696]: I1203 00:28:59.801499 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qpwsg" Dec 03 00:29:00 crc kubenswrapper[4696]: I1203 00:29:00.251662 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jx5sg"] Dec 03 00:29:00 crc kubenswrapper[4696]: I1203 00:29:00.458050 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qpwsg"] Dec 03 00:29:00 crc kubenswrapper[4696]: W1203 00:29:00.465535 4696 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda650bdd8_c903_45cf_8e50_32d7bed2272a.slice/crio-62241253421a00b8ecc2515803d93ccdd17e76c889472847ca151da4e0116d51 WatchSource:0}: Error finding container 62241253421a00b8ecc2515803d93ccdd17e76c889472847ca151da4e0116d51: Status 404 returned error can't find the container with id 62241253421a00b8ecc2515803d93ccdd17e76c889472847ca151da4e0116d51 Dec 03 00:29:01 crc kubenswrapper[4696]: I1203 00:29:01.133315 4696 generic.go:334] "Generic (PLEG): container finished" podID="dfd18528-bcc0-4f1b-9c1c-b68ae9c4c4d0" containerID="1a202e402000f9a1d80b38316ae8926e23909760ebe4b178d5c8cca3e2960ed4" exitCode=0 Dec 03 00:29:01 crc kubenswrapper[4696]: I1203 00:29:01.133409 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jx5sg" 
event={"ID":"dfd18528-bcc0-4f1b-9c1c-b68ae9c4c4d0","Type":"ContainerDied","Data":"1a202e402000f9a1d80b38316ae8926e23909760ebe4b178d5c8cca3e2960ed4"} Dec 03 00:29:01 crc kubenswrapper[4696]: I1203 00:29:01.133466 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jx5sg" event={"ID":"dfd18528-bcc0-4f1b-9c1c-b68ae9c4c4d0","Type":"ContainerStarted","Data":"1a8ea24c65ee4672055a45f85982a112105f1ba871cadb557187795b18358093"} Dec 03 00:29:01 crc kubenswrapper[4696]: I1203 00:29:01.136375 4696 generic.go:334] "Generic (PLEG): container finished" podID="a650bdd8-c903-45cf-8e50-32d7bed2272a" containerID="909d595c77c44fdd301bfbd21824da8a1a87eb4318de985a6359fd71022d9e48" exitCode=0 Dec 03 00:29:01 crc kubenswrapper[4696]: I1203 00:29:01.136403 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qpwsg" event={"ID":"a650bdd8-c903-45cf-8e50-32d7bed2272a","Type":"ContainerDied","Data":"909d595c77c44fdd301bfbd21824da8a1a87eb4318de985a6359fd71022d9e48"} Dec 03 00:29:01 crc kubenswrapper[4696]: I1203 00:29:01.136434 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qpwsg" event={"ID":"a650bdd8-c903-45cf-8e50-32d7bed2272a","Type":"ContainerStarted","Data":"62241253421a00b8ecc2515803d93ccdd17e76c889472847ca151da4e0116d51"} Dec 03 00:29:01 crc kubenswrapper[4696]: I1203 00:29:01.137066 4696 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 00:29:04 crc kubenswrapper[4696]: I1203 00:29:04.173766 4696 generic.go:334] "Generic (PLEG): container finished" podID="dfd18528-bcc0-4f1b-9c1c-b68ae9c4c4d0" containerID="9f4372f3ed3fac9369abf7c7bc24e77c45de4147ff80059f4f2f946dd0212e84" exitCode=0 Dec 03 00:29:04 crc kubenswrapper[4696]: I1203 00:29:04.173875 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jx5sg" 
event={"ID":"dfd18528-bcc0-4f1b-9c1c-b68ae9c4c4d0","Type":"ContainerDied","Data":"9f4372f3ed3fac9369abf7c7bc24e77c45de4147ff80059f4f2f946dd0212e84"} Dec 03 00:29:05 crc kubenswrapper[4696]: I1203 00:29:05.190542 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qpwsg" event={"ID":"a650bdd8-c903-45cf-8e50-32d7bed2272a","Type":"ContainerStarted","Data":"105c8fa81cb5b45aa7692f522cb1314db9abfa0e5e08a40e208314c483ceea19"} Dec 03 00:29:05 crc kubenswrapper[4696]: I1203 00:29:05.193917 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jx5sg" event={"ID":"dfd18528-bcc0-4f1b-9c1c-b68ae9c4c4d0","Type":"ContainerStarted","Data":"65e8213a961ba6595a40ca914b66cff4ada178c883c5f43d1ba5a584b44a3227"} Dec 03 00:29:05 crc kubenswrapper[4696]: I1203 00:29:05.241460 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-jx5sg" podStartSLOduration=2.753315516 podStartE2EDuration="6.241438153s" podCreationTimestamp="2025-12-03 00:28:59 +0000 UTC" firstStartedPulling="2025-12-03 00:29:01.136815202 +0000 UTC m=+6404.017495203" lastFinishedPulling="2025-12-03 00:29:04.624937839 +0000 UTC m=+6407.505617840" observedRunningTime="2025-12-03 00:29:05.237580144 +0000 UTC m=+6408.118260145" watchObservedRunningTime="2025-12-03 00:29:05.241438153 +0000 UTC m=+6408.122118154" Dec 03 00:29:07 crc kubenswrapper[4696]: I1203 00:29:07.239347 4696 generic.go:334] "Generic (PLEG): container finished" podID="a650bdd8-c903-45cf-8e50-32d7bed2272a" containerID="105c8fa81cb5b45aa7692f522cb1314db9abfa0e5e08a40e208314c483ceea19" exitCode=0 Dec 03 00:29:07 crc kubenswrapper[4696]: I1203 00:29:07.239720 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qpwsg" 
event={"ID":"a650bdd8-c903-45cf-8e50-32d7bed2272a","Type":"ContainerDied","Data":"105c8fa81cb5b45aa7692f522cb1314db9abfa0e5e08a40e208314c483ceea19"} Dec 03 00:29:08 crc kubenswrapper[4696]: I1203 00:29:08.251662 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qpwsg" event={"ID":"a650bdd8-c903-45cf-8e50-32d7bed2272a","Type":"ContainerStarted","Data":"e95c93510ee93d29591299b01b8422fcbe1b5a2ddab58c4d2c44b94989a99cd7"} Dec 03 00:29:08 crc kubenswrapper[4696]: I1203 00:29:08.279802 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-qpwsg" podStartSLOduration=3.83351415 podStartE2EDuration="9.279780506s" podCreationTimestamp="2025-12-03 00:28:59 +0000 UTC" firstStartedPulling="2025-12-03 00:29:02.438000402 +0000 UTC m=+6405.318680403" lastFinishedPulling="2025-12-03 00:29:07.884266758 +0000 UTC m=+6410.764946759" observedRunningTime="2025-12-03 00:29:08.274348392 +0000 UTC m=+6411.155028393" watchObservedRunningTime="2025-12-03 00:29:08.279780506 +0000 UTC m=+6411.160460507" Dec 03 00:29:09 crc kubenswrapper[4696]: I1203 00:29:09.640853 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-jx5sg" Dec 03 00:29:09 crc kubenswrapper[4696]: I1203 00:29:09.640934 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-jx5sg" Dec 03 00:29:09 crc kubenswrapper[4696]: I1203 00:29:09.712099 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-jx5sg" Dec 03 00:29:09 crc kubenswrapper[4696]: I1203 00:29:09.802520 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-qpwsg" Dec 03 00:29:09 crc kubenswrapper[4696]: I1203 00:29:09.802602 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-operators-qpwsg" Dec 03 00:29:10 crc kubenswrapper[4696]: I1203 00:29:10.334468 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-jx5sg" Dec 03 00:29:10 crc kubenswrapper[4696]: I1203 00:29:10.652464 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jx5sg"] Dec 03 00:29:10 crc kubenswrapper[4696]: I1203 00:29:10.870243 4696 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-qpwsg" podUID="a650bdd8-c903-45cf-8e50-32d7bed2272a" containerName="registry-server" probeResult="failure" output=< Dec 03 00:29:10 crc kubenswrapper[4696]: timeout: failed to connect service ":50051" within 1s Dec 03 00:29:10 crc kubenswrapper[4696]: > Dec 03 00:29:12 crc kubenswrapper[4696]: I1203 00:29:12.301939 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-jx5sg" podUID="dfd18528-bcc0-4f1b-9c1c-b68ae9c4c4d0" containerName="registry-server" containerID="cri-o://65e8213a961ba6595a40ca914b66cff4ada178c883c5f43d1ba5a584b44a3227" gracePeriod=2 Dec 03 00:29:14 crc kubenswrapper[4696]: I1203 00:29:14.345130 4696 generic.go:334] "Generic (PLEG): container finished" podID="dfd18528-bcc0-4f1b-9c1c-b68ae9c4c4d0" containerID="65e8213a961ba6595a40ca914b66cff4ada178c883c5f43d1ba5a584b44a3227" exitCode=0 Dec 03 00:29:14 crc kubenswrapper[4696]: I1203 00:29:14.345199 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jx5sg" event={"ID":"dfd18528-bcc0-4f1b-9c1c-b68ae9c4c4d0","Type":"ContainerDied","Data":"65e8213a961ba6595a40ca914b66cff4ada178c883c5f43d1ba5a584b44a3227"} Dec 03 00:29:14 crc kubenswrapper[4696]: I1203 00:29:14.712417 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jx5sg" Dec 03 00:29:14 crc kubenswrapper[4696]: I1203 00:29:14.840029 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dfd18528-bcc0-4f1b-9c1c-b68ae9c4c4d0-catalog-content\") pod \"dfd18528-bcc0-4f1b-9c1c-b68ae9c4c4d0\" (UID: \"dfd18528-bcc0-4f1b-9c1c-b68ae9c4c4d0\") " Dec 03 00:29:14 crc kubenswrapper[4696]: I1203 00:29:14.840118 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dfd18528-bcc0-4f1b-9c1c-b68ae9c4c4d0-utilities\") pod \"dfd18528-bcc0-4f1b-9c1c-b68ae9c4c4d0\" (UID: \"dfd18528-bcc0-4f1b-9c1c-b68ae9c4c4d0\") " Dec 03 00:29:14 crc kubenswrapper[4696]: I1203 00:29:14.840228 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2fhkd\" (UniqueName: \"kubernetes.io/projected/dfd18528-bcc0-4f1b-9c1c-b68ae9c4c4d0-kube-api-access-2fhkd\") pod \"dfd18528-bcc0-4f1b-9c1c-b68ae9c4c4d0\" (UID: \"dfd18528-bcc0-4f1b-9c1c-b68ae9c4c4d0\") " Dec 03 00:29:14 crc kubenswrapper[4696]: I1203 00:29:14.842343 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dfd18528-bcc0-4f1b-9c1c-b68ae9c4c4d0-utilities" (OuterVolumeSpecName: "utilities") pod "dfd18528-bcc0-4f1b-9c1c-b68ae9c4c4d0" (UID: "dfd18528-bcc0-4f1b-9c1c-b68ae9c4c4d0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 00:29:14 crc kubenswrapper[4696]: I1203 00:29:14.850393 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfd18528-bcc0-4f1b-9c1c-b68ae9c4c4d0-kube-api-access-2fhkd" (OuterVolumeSpecName: "kube-api-access-2fhkd") pod "dfd18528-bcc0-4f1b-9c1c-b68ae9c4c4d0" (UID: "dfd18528-bcc0-4f1b-9c1c-b68ae9c4c4d0"). InnerVolumeSpecName "kube-api-access-2fhkd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 00:29:14 crc kubenswrapper[4696]: I1203 00:29:14.883719 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dfd18528-bcc0-4f1b-9c1c-b68ae9c4c4d0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dfd18528-bcc0-4f1b-9c1c-b68ae9c4c4d0" (UID: "dfd18528-bcc0-4f1b-9c1c-b68ae9c4c4d0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 00:29:14 crc kubenswrapper[4696]: I1203 00:29:14.942726 4696 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dfd18528-bcc0-4f1b-9c1c-b68ae9c4c4d0-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 00:29:14 crc kubenswrapper[4696]: I1203 00:29:14.942789 4696 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dfd18528-bcc0-4f1b-9c1c-b68ae9c4c4d0-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 00:29:14 crc kubenswrapper[4696]: I1203 00:29:14.942803 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2fhkd\" (UniqueName: \"kubernetes.io/projected/dfd18528-bcc0-4f1b-9c1c-b68ae9c4c4d0-kube-api-access-2fhkd\") on node \"crc\" DevicePath \"\"" Dec 03 00:29:15 crc kubenswrapper[4696]: I1203 00:29:15.360123 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jx5sg" event={"ID":"dfd18528-bcc0-4f1b-9c1c-b68ae9c4c4d0","Type":"ContainerDied","Data":"1a8ea24c65ee4672055a45f85982a112105f1ba871cadb557187795b18358093"} Dec 03 00:29:15 crc kubenswrapper[4696]: I1203 00:29:15.360187 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jx5sg" Dec 03 00:29:15 crc kubenswrapper[4696]: I1203 00:29:15.360221 4696 scope.go:117] "RemoveContainer" containerID="65e8213a961ba6595a40ca914b66cff4ada178c883c5f43d1ba5a584b44a3227" Dec 03 00:29:15 crc kubenswrapper[4696]: I1203 00:29:15.391335 4696 scope.go:117] "RemoveContainer" containerID="9f4372f3ed3fac9369abf7c7bc24e77c45de4147ff80059f4f2f946dd0212e84" Dec 03 00:29:15 crc kubenswrapper[4696]: I1203 00:29:15.407872 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jx5sg"] Dec 03 00:29:15 crc kubenswrapper[4696]: I1203 00:29:15.420266 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-jx5sg"] Dec 03 00:29:15 crc kubenswrapper[4696]: I1203 00:29:15.420486 4696 scope.go:117] "RemoveContainer" containerID="1a202e402000f9a1d80b38316ae8926e23909760ebe4b178d5c8cca3e2960ed4" Dec 03 00:29:15 crc kubenswrapper[4696]: I1203 00:29:15.480598 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dfd18528-bcc0-4f1b-9c1c-b68ae9c4c4d0" path="/var/lib/kubelet/pods/dfd18528-bcc0-4f1b-9c1c-b68ae9c4c4d0/volumes" Dec 03 00:29:19 crc kubenswrapper[4696]: I1203 00:29:19.869169 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-qpwsg" Dec 03 00:29:19 crc kubenswrapper[4696]: I1203 00:29:19.922597 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-qpwsg" Dec 03 00:29:20 crc kubenswrapper[4696]: I1203 00:29:20.112796 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qpwsg"] Dec 03 00:29:21 crc kubenswrapper[4696]: I1203 00:29:21.425986 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-qpwsg" podUID="a650bdd8-c903-45cf-8e50-32d7bed2272a" 
containerName="registry-server" containerID="cri-o://e95c93510ee93d29591299b01b8422fcbe1b5a2ddab58c4d2c44b94989a99cd7" gracePeriod=2 Dec 03 00:29:21 crc kubenswrapper[4696]: I1203 00:29:21.888615 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qpwsg" Dec 03 00:29:22 crc kubenswrapper[4696]: I1203 00:29:22.009492 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a650bdd8-c903-45cf-8e50-32d7bed2272a-catalog-content\") pod \"a650bdd8-c903-45cf-8e50-32d7bed2272a\" (UID: \"a650bdd8-c903-45cf-8e50-32d7bed2272a\") " Dec 03 00:29:22 crc kubenswrapper[4696]: I1203 00:29:22.010068 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a650bdd8-c903-45cf-8e50-32d7bed2272a-utilities\") pod \"a650bdd8-c903-45cf-8e50-32d7bed2272a\" (UID: \"a650bdd8-c903-45cf-8e50-32d7bed2272a\") " Dec 03 00:29:22 crc kubenswrapper[4696]: I1203 00:29:22.010338 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nn4dc\" (UniqueName: \"kubernetes.io/projected/a650bdd8-c903-45cf-8e50-32d7bed2272a-kube-api-access-nn4dc\") pod \"a650bdd8-c903-45cf-8e50-32d7bed2272a\" (UID: \"a650bdd8-c903-45cf-8e50-32d7bed2272a\") " Dec 03 00:29:22 crc kubenswrapper[4696]: I1203 00:29:22.010892 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a650bdd8-c903-45cf-8e50-32d7bed2272a-utilities" (OuterVolumeSpecName: "utilities") pod "a650bdd8-c903-45cf-8e50-32d7bed2272a" (UID: "a650bdd8-c903-45cf-8e50-32d7bed2272a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 00:29:22 crc kubenswrapper[4696]: I1203 00:29:22.011175 4696 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a650bdd8-c903-45cf-8e50-32d7bed2272a-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 00:29:22 crc kubenswrapper[4696]: I1203 00:29:22.023239 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a650bdd8-c903-45cf-8e50-32d7bed2272a-kube-api-access-nn4dc" (OuterVolumeSpecName: "kube-api-access-nn4dc") pod "a650bdd8-c903-45cf-8e50-32d7bed2272a" (UID: "a650bdd8-c903-45cf-8e50-32d7bed2272a"). InnerVolumeSpecName "kube-api-access-nn4dc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 00:29:22 crc kubenswrapper[4696]: I1203 00:29:22.117370 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nn4dc\" (UniqueName: \"kubernetes.io/projected/a650bdd8-c903-45cf-8e50-32d7bed2272a-kube-api-access-nn4dc\") on node \"crc\" DevicePath \"\"" Dec 03 00:29:22 crc kubenswrapper[4696]: I1203 00:29:22.132296 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a650bdd8-c903-45cf-8e50-32d7bed2272a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a650bdd8-c903-45cf-8e50-32d7bed2272a" (UID: "a650bdd8-c903-45cf-8e50-32d7bed2272a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 00:29:22 crc kubenswrapper[4696]: I1203 00:29:22.219961 4696 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a650bdd8-c903-45cf-8e50-32d7bed2272a-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 00:29:22 crc kubenswrapper[4696]: I1203 00:29:22.441582 4696 generic.go:334] "Generic (PLEG): container finished" podID="a650bdd8-c903-45cf-8e50-32d7bed2272a" containerID="e95c93510ee93d29591299b01b8422fcbe1b5a2ddab58c4d2c44b94989a99cd7" exitCode=0 Dec 03 00:29:22 crc kubenswrapper[4696]: I1203 00:29:22.441637 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qpwsg" Dec 03 00:29:22 crc kubenswrapper[4696]: I1203 00:29:22.441639 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qpwsg" event={"ID":"a650bdd8-c903-45cf-8e50-32d7bed2272a","Type":"ContainerDied","Data":"e95c93510ee93d29591299b01b8422fcbe1b5a2ddab58c4d2c44b94989a99cd7"} Dec 03 00:29:22 crc kubenswrapper[4696]: I1203 00:29:22.441774 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qpwsg" event={"ID":"a650bdd8-c903-45cf-8e50-32d7bed2272a","Type":"ContainerDied","Data":"62241253421a00b8ecc2515803d93ccdd17e76c889472847ca151da4e0116d51"} Dec 03 00:29:22 crc kubenswrapper[4696]: I1203 00:29:22.441799 4696 scope.go:117] "RemoveContainer" containerID="e95c93510ee93d29591299b01b8422fcbe1b5a2ddab58c4d2c44b94989a99cd7" Dec 03 00:29:22 crc kubenswrapper[4696]: I1203 00:29:22.470907 4696 scope.go:117] "RemoveContainer" containerID="105c8fa81cb5b45aa7692f522cb1314db9abfa0e5e08a40e208314c483ceea19" Dec 03 00:29:22 crc kubenswrapper[4696]: I1203 00:29:22.480710 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qpwsg"] Dec 03 00:29:22 crc kubenswrapper[4696]: I1203 
00:29:22.491079 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-qpwsg"] Dec 03 00:29:22 crc kubenswrapper[4696]: I1203 00:29:22.500522 4696 scope.go:117] "RemoveContainer" containerID="909d595c77c44fdd301bfbd21824da8a1a87eb4318de985a6359fd71022d9e48" Dec 03 00:29:22 crc kubenswrapper[4696]: I1203 00:29:22.543676 4696 scope.go:117] "RemoveContainer" containerID="e95c93510ee93d29591299b01b8422fcbe1b5a2ddab58c4d2c44b94989a99cd7" Dec 03 00:29:22 crc kubenswrapper[4696]: E1203 00:29:22.544169 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e95c93510ee93d29591299b01b8422fcbe1b5a2ddab58c4d2c44b94989a99cd7\": container with ID starting with e95c93510ee93d29591299b01b8422fcbe1b5a2ddab58c4d2c44b94989a99cd7 not found: ID does not exist" containerID="e95c93510ee93d29591299b01b8422fcbe1b5a2ddab58c4d2c44b94989a99cd7" Dec 03 00:29:22 crc kubenswrapper[4696]: I1203 00:29:22.544199 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e95c93510ee93d29591299b01b8422fcbe1b5a2ddab58c4d2c44b94989a99cd7"} err="failed to get container status \"e95c93510ee93d29591299b01b8422fcbe1b5a2ddab58c4d2c44b94989a99cd7\": rpc error: code = NotFound desc = could not find container \"e95c93510ee93d29591299b01b8422fcbe1b5a2ddab58c4d2c44b94989a99cd7\": container with ID starting with e95c93510ee93d29591299b01b8422fcbe1b5a2ddab58c4d2c44b94989a99cd7 not found: ID does not exist" Dec 03 00:29:22 crc kubenswrapper[4696]: I1203 00:29:22.544225 4696 scope.go:117] "RemoveContainer" containerID="105c8fa81cb5b45aa7692f522cb1314db9abfa0e5e08a40e208314c483ceea19" Dec 03 00:29:22 crc kubenswrapper[4696]: E1203 00:29:22.544539 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"105c8fa81cb5b45aa7692f522cb1314db9abfa0e5e08a40e208314c483ceea19\": container with ID 
starting with 105c8fa81cb5b45aa7692f522cb1314db9abfa0e5e08a40e208314c483ceea19 not found: ID does not exist" containerID="105c8fa81cb5b45aa7692f522cb1314db9abfa0e5e08a40e208314c483ceea19" Dec 03 00:29:22 crc kubenswrapper[4696]: I1203 00:29:22.544562 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"105c8fa81cb5b45aa7692f522cb1314db9abfa0e5e08a40e208314c483ceea19"} err="failed to get container status \"105c8fa81cb5b45aa7692f522cb1314db9abfa0e5e08a40e208314c483ceea19\": rpc error: code = NotFound desc = could not find container \"105c8fa81cb5b45aa7692f522cb1314db9abfa0e5e08a40e208314c483ceea19\": container with ID starting with 105c8fa81cb5b45aa7692f522cb1314db9abfa0e5e08a40e208314c483ceea19 not found: ID does not exist" Dec 03 00:29:22 crc kubenswrapper[4696]: I1203 00:29:22.544576 4696 scope.go:117] "RemoveContainer" containerID="909d595c77c44fdd301bfbd21824da8a1a87eb4318de985a6359fd71022d9e48" Dec 03 00:29:22 crc kubenswrapper[4696]: E1203 00:29:22.544808 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"909d595c77c44fdd301bfbd21824da8a1a87eb4318de985a6359fd71022d9e48\": container with ID starting with 909d595c77c44fdd301bfbd21824da8a1a87eb4318de985a6359fd71022d9e48 not found: ID does not exist" containerID="909d595c77c44fdd301bfbd21824da8a1a87eb4318de985a6359fd71022d9e48" Dec 03 00:29:22 crc kubenswrapper[4696]: I1203 00:29:22.544834 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"909d595c77c44fdd301bfbd21824da8a1a87eb4318de985a6359fd71022d9e48"} err="failed to get container status \"909d595c77c44fdd301bfbd21824da8a1a87eb4318de985a6359fd71022d9e48\": rpc error: code = NotFound desc = could not find container \"909d595c77c44fdd301bfbd21824da8a1a87eb4318de985a6359fd71022d9e48\": container with ID starting with 909d595c77c44fdd301bfbd21824da8a1a87eb4318de985a6359fd71022d9e48 not found: 
ID does not exist" Dec 03 00:29:23 crc kubenswrapper[4696]: I1203 00:29:23.449155 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a650bdd8-c903-45cf-8e50-32d7bed2272a" path="/var/lib/kubelet/pods/a650bdd8-c903-45cf-8e50-32d7bed2272a/volumes" Dec 03 00:29:42 crc kubenswrapper[4696]: I1203 00:29:42.548631 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-rbv8w"] Dec 03 00:29:42 crc kubenswrapper[4696]: E1203 00:29:42.549831 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a650bdd8-c903-45cf-8e50-32d7bed2272a" containerName="extract-utilities" Dec 03 00:29:42 crc kubenswrapper[4696]: I1203 00:29:42.549851 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="a650bdd8-c903-45cf-8e50-32d7bed2272a" containerName="extract-utilities" Dec 03 00:29:42 crc kubenswrapper[4696]: E1203 00:29:42.549880 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a650bdd8-c903-45cf-8e50-32d7bed2272a" containerName="extract-content" Dec 03 00:29:42 crc kubenswrapper[4696]: I1203 00:29:42.549889 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="a650bdd8-c903-45cf-8e50-32d7bed2272a" containerName="extract-content" Dec 03 00:29:42 crc kubenswrapper[4696]: E1203 00:29:42.549911 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfd18528-bcc0-4f1b-9c1c-b68ae9c4c4d0" containerName="extract-utilities" Dec 03 00:29:42 crc kubenswrapper[4696]: I1203 00:29:42.549922 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfd18528-bcc0-4f1b-9c1c-b68ae9c4c4d0" containerName="extract-utilities" Dec 03 00:29:42 crc kubenswrapper[4696]: E1203 00:29:42.549946 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfd18528-bcc0-4f1b-9c1c-b68ae9c4c4d0" containerName="registry-server" Dec 03 00:29:42 crc kubenswrapper[4696]: I1203 00:29:42.549954 4696 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="dfd18528-bcc0-4f1b-9c1c-b68ae9c4c4d0" containerName="registry-server" Dec 03 00:29:42 crc kubenswrapper[4696]: E1203 00:29:42.549966 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfd18528-bcc0-4f1b-9c1c-b68ae9c4c4d0" containerName="extract-content" Dec 03 00:29:42 crc kubenswrapper[4696]: I1203 00:29:42.549974 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfd18528-bcc0-4f1b-9c1c-b68ae9c4c4d0" containerName="extract-content" Dec 03 00:29:42 crc kubenswrapper[4696]: E1203 00:29:42.550005 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a650bdd8-c903-45cf-8e50-32d7bed2272a" containerName="registry-server" Dec 03 00:29:42 crc kubenswrapper[4696]: I1203 00:29:42.550013 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="a650bdd8-c903-45cf-8e50-32d7bed2272a" containerName="registry-server" Dec 03 00:29:42 crc kubenswrapper[4696]: I1203 00:29:42.550280 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfd18528-bcc0-4f1b-9c1c-b68ae9c4c4d0" containerName="registry-server" Dec 03 00:29:42 crc kubenswrapper[4696]: I1203 00:29:42.550301 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="a650bdd8-c903-45cf-8e50-32d7bed2272a" containerName="registry-server" Dec 03 00:29:42 crc kubenswrapper[4696]: I1203 00:29:42.552784 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rbv8w" Dec 03 00:29:42 crc kubenswrapper[4696]: I1203 00:29:42.570996 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rbv8w"] Dec 03 00:29:42 crc kubenswrapper[4696]: I1203 00:29:42.626726 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrsn8\" (UniqueName: \"kubernetes.io/projected/abde34d1-0fd3-4c3f-891f-fc47810f27ea-kube-api-access-zrsn8\") pod \"certified-operators-rbv8w\" (UID: \"abde34d1-0fd3-4c3f-891f-fc47810f27ea\") " pod="openshift-marketplace/certified-operators-rbv8w" Dec 03 00:29:42 crc kubenswrapper[4696]: I1203 00:29:42.626873 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abde34d1-0fd3-4c3f-891f-fc47810f27ea-utilities\") pod \"certified-operators-rbv8w\" (UID: \"abde34d1-0fd3-4c3f-891f-fc47810f27ea\") " pod="openshift-marketplace/certified-operators-rbv8w" Dec 03 00:29:42 crc kubenswrapper[4696]: I1203 00:29:42.626958 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abde34d1-0fd3-4c3f-891f-fc47810f27ea-catalog-content\") pod \"certified-operators-rbv8w\" (UID: \"abde34d1-0fd3-4c3f-891f-fc47810f27ea\") " pod="openshift-marketplace/certified-operators-rbv8w" Dec 03 00:29:42 crc kubenswrapper[4696]: I1203 00:29:42.729135 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abde34d1-0fd3-4c3f-891f-fc47810f27ea-utilities\") pod \"certified-operators-rbv8w\" (UID: \"abde34d1-0fd3-4c3f-891f-fc47810f27ea\") " pod="openshift-marketplace/certified-operators-rbv8w" Dec 03 00:29:42 crc kubenswrapper[4696]: I1203 00:29:42.729628 4696 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abde34d1-0fd3-4c3f-891f-fc47810f27ea-catalog-content\") pod \"certified-operators-rbv8w\" (UID: \"abde34d1-0fd3-4c3f-891f-fc47810f27ea\") " pod="openshift-marketplace/certified-operators-rbv8w" Dec 03 00:29:42 crc kubenswrapper[4696]: I1203 00:29:42.729776 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abde34d1-0fd3-4c3f-891f-fc47810f27ea-utilities\") pod \"certified-operators-rbv8w\" (UID: \"abde34d1-0fd3-4c3f-891f-fc47810f27ea\") " pod="openshift-marketplace/certified-operators-rbv8w" Dec 03 00:29:42 crc kubenswrapper[4696]: I1203 00:29:42.729821 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrsn8\" (UniqueName: \"kubernetes.io/projected/abde34d1-0fd3-4c3f-891f-fc47810f27ea-kube-api-access-zrsn8\") pod \"certified-operators-rbv8w\" (UID: \"abde34d1-0fd3-4c3f-891f-fc47810f27ea\") " pod="openshift-marketplace/certified-operators-rbv8w" Dec 03 00:29:42 crc kubenswrapper[4696]: I1203 00:29:42.730008 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abde34d1-0fd3-4c3f-891f-fc47810f27ea-catalog-content\") pod \"certified-operators-rbv8w\" (UID: \"abde34d1-0fd3-4c3f-891f-fc47810f27ea\") " pod="openshift-marketplace/certified-operators-rbv8w" Dec 03 00:29:42 crc kubenswrapper[4696]: I1203 00:29:42.753508 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrsn8\" (UniqueName: \"kubernetes.io/projected/abde34d1-0fd3-4c3f-891f-fc47810f27ea-kube-api-access-zrsn8\") pod \"certified-operators-rbv8w\" (UID: \"abde34d1-0fd3-4c3f-891f-fc47810f27ea\") " pod="openshift-marketplace/certified-operators-rbv8w" Dec 03 00:29:42 crc kubenswrapper[4696]: I1203 00:29:42.886491 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rbv8w" Dec 03 00:29:43 crc kubenswrapper[4696]: I1203 00:29:43.419259 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rbv8w"] Dec 03 00:29:43 crc kubenswrapper[4696]: I1203 00:29:43.704931 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rbv8w" event={"ID":"abde34d1-0fd3-4c3f-891f-fc47810f27ea","Type":"ContainerStarted","Data":"34dff6cb9b1193dc0dc1a094b84d2c92d70bdbfb82828d590002b4aff990a862"} Dec 03 00:29:44 crc kubenswrapper[4696]: I1203 00:29:44.717990 4696 generic.go:334] "Generic (PLEG): container finished" podID="abde34d1-0fd3-4c3f-891f-fc47810f27ea" containerID="3840e358863e45ec77db62b9ded1dfbcbadb5505e7cd347d20e91ee7ca57c414" exitCode=0 Dec 03 00:29:44 crc kubenswrapper[4696]: I1203 00:29:44.718095 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rbv8w" event={"ID":"abde34d1-0fd3-4c3f-891f-fc47810f27ea","Type":"ContainerDied","Data":"3840e358863e45ec77db62b9ded1dfbcbadb5505e7cd347d20e91ee7ca57c414"} Dec 03 00:29:45 crc kubenswrapper[4696]: I1203 00:29:45.730670 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rbv8w" event={"ID":"abde34d1-0fd3-4c3f-891f-fc47810f27ea","Type":"ContainerStarted","Data":"89657d86c1c4926f94e444a115d35ac7cbe916cba129778a008b41f45ae47091"} Dec 03 00:29:46 crc kubenswrapper[4696]: I1203 00:29:46.744975 4696 generic.go:334] "Generic (PLEG): container finished" podID="abde34d1-0fd3-4c3f-891f-fc47810f27ea" containerID="89657d86c1c4926f94e444a115d35ac7cbe916cba129778a008b41f45ae47091" exitCode=0 Dec 03 00:29:46 crc kubenswrapper[4696]: I1203 00:29:46.745606 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rbv8w" 
event={"ID":"abde34d1-0fd3-4c3f-891f-fc47810f27ea","Type":"ContainerDied","Data":"89657d86c1c4926f94e444a115d35ac7cbe916cba129778a008b41f45ae47091"} Dec 03 00:29:47 crc kubenswrapper[4696]: I1203 00:29:47.782983 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rbv8w" event={"ID":"abde34d1-0fd3-4c3f-891f-fc47810f27ea","Type":"ContainerStarted","Data":"2c6bfc45ccd6b4fb49e310130775452fbf73efe82ac843093121b01620e5a21d"} Dec 03 00:29:47 crc kubenswrapper[4696]: I1203 00:29:47.812244 4696 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-rbv8w" podStartSLOduration=3.257668115 podStartE2EDuration="5.81221486s" podCreationTimestamp="2025-12-03 00:29:42 +0000 UTC" firstStartedPulling="2025-12-03 00:29:44.719871449 +0000 UTC m=+6447.600551440" lastFinishedPulling="2025-12-03 00:29:47.274418184 +0000 UTC m=+6450.155098185" observedRunningTime="2025-12-03 00:29:47.810034108 +0000 UTC m=+6450.690714129" watchObservedRunningTime="2025-12-03 00:29:47.81221486 +0000 UTC m=+6450.692894881" Dec 03 00:29:52 crc kubenswrapper[4696]: I1203 00:29:52.887274 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-rbv8w" Dec 03 00:29:52 crc kubenswrapper[4696]: I1203 00:29:52.888179 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-rbv8w" Dec 03 00:29:52 crc kubenswrapper[4696]: I1203 00:29:52.951392 4696 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-rbv8w" Dec 03 00:29:52 crc kubenswrapper[4696]: I1203 00:29:52.973551 4696 patch_prober.go:28] interesting pod/machine-config-daemon-chq65 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Dec 03 00:29:52 crc kubenswrapper[4696]: I1203 00:29:52.973625 4696 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-chq65" podUID="53353260-c7c9-435c-91eb-3d5a1b441c4a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 00:29:53 crc kubenswrapper[4696]: I1203 00:29:53.895065 4696 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-rbv8w" Dec 03 00:29:53 crc kubenswrapper[4696]: I1203 00:29:53.951299 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rbv8w"] Dec 03 00:29:55 crc kubenswrapper[4696]: I1203 00:29:55.859904 4696 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-rbv8w" podUID="abde34d1-0fd3-4c3f-891f-fc47810f27ea" containerName="registry-server" containerID="cri-o://2c6bfc45ccd6b4fb49e310130775452fbf73efe82ac843093121b01620e5a21d" gracePeriod=2 Dec 03 00:29:56 crc kubenswrapper[4696]: I1203 00:29:56.642150 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rbv8w" Dec 03 00:29:56 crc kubenswrapper[4696]: I1203 00:29:56.758721 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zrsn8\" (UniqueName: \"kubernetes.io/projected/abde34d1-0fd3-4c3f-891f-fc47810f27ea-kube-api-access-zrsn8\") pod \"abde34d1-0fd3-4c3f-891f-fc47810f27ea\" (UID: \"abde34d1-0fd3-4c3f-891f-fc47810f27ea\") " Dec 03 00:29:56 crc kubenswrapper[4696]: I1203 00:29:56.758855 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abde34d1-0fd3-4c3f-891f-fc47810f27ea-catalog-content\") pod \"abde34d1-0fd3-4c3f-891f-fc47810f27ea\" (UID: \"abde34d1-0fd3-4c3f-891f-fc47810f27ea\") " Dec 03 00:29:56 crc kubenswrapper[4696]: I1203 00:29:56.759067 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abde34d1-0fd3-4c3f-891f-fc47810f27ea-utilities\") pod \"abde34d1-0fd3-4c3f-891f-fc47810f27ea\" (UID: \"abde34d1-0fd3-4c3f-891f-fc47810f27ea\") " Dec 03 00:29:56 crc kubenswrapper[4696]: I1203 00:29:56.760984 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/abde34d1-0fd3-4c3f-891f-fc47810f27ea-utilities" (OuterVolumeSpecName: "utilities") pod "abde34d1-0fd3-4c3f-891f-fc47810f27ea" (UID: "abde34d1-0fd3-4c3f-891f-fc47810f27ea"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 00:29:56 crc kubenswrapper[4696]: I1203 00:29:56.766221 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abde34d1-0fd3-4c3f-891f-fc47810f27ea-kube-api-access-zrsn8" (OuterVolumeSpecName: "kube-api-access-zrsn8") pod "abde34d1-0fd3-4c3f-891f-fc47810f27ea" (UID: "abde34d1-0fd3-4c3f-891f-fc47810f27ea"). InnerVolumeSpecName "kube-api-access-zrsn8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 00:29:56 crc kubenswrapper[4696]: I1203 00:29:56.819674 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/abde34d1-0fd3-4c3f-891f-fc47810f27ea-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "abde34d1-0fd3-4c3f-891f-fc47810f27ea" (UID: "abde34d1-0fd3-4c3f-891f-fc47810f27ea"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 00:29:56 crc kubenswrapper[4696]: I1203 00:29:56.861103 4696 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abde34d1-0fd3-4c3f-891f-fc47810f27ea-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 00:29:56 crc kubenswrapper[4696]: I1203 00:29:56.861138 4696 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abde34d1-0fd3-4c3f-891f-fc47810f27ea-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 00:29:56 crc kubenswrapper[4696]: I1203 00:29:56.861150 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zrsn8\" (UniqueName: \"kubernetes.io/projected/abde34d1-0fd3-4c3f-891f-fc47810f27ea-kube-api-access-zrsn8\") on node \"crc\" DevicePath \"\"" Dec 03 00:29:56 crc kubenswrapper[4696]: I1203 00:29:56.871001 4696 generic.go:334] "Generic (PLEG): container finished" podID="abde34d1-0fd3-4c3f-891f-fc47810f27ea" containerID="2c6bfc45ccd6b4fb49e310130775452fbf73efe82ac843093121b01620e5a21d" exitCode=0 Dec 03 00:29:56 crc kubenswrapper[4696]: I1203 00:29:56.871050 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rbv8w" event={"ID":"abde34d1-0fd3-4c3f-891f-fc47810f27ea","Type":"ContainerDied","Data":"2c6bfc45ccd6b4fb49e310130775452fbf73efe82ac843093121b01620e5a21d"} Dec 03 00:29:56 crc kubenswrapper[4696]: I1203 00:29:56.871081 4696 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-rbv8w" event={"ID":"abde34d1-0fd3-4c3f-891f-fc47810f27ea","Type":"ContainerDied","Data":"34dff6cb9b1193dc0dc1a094b84d2c92d70bdbfb82828d590002b4aff990a862"} Dec 03 00:29:56 crc kubenswrapper[4696]: I1203 00:29:56.871100 4696 scope.go:117] "RemoveContainer" containerID="2c6bfc45ccd6b4fb49e310130775452fbf73efe82ac843093121b01620e5a21d" Dec 03 00:29:56 crc kubenswrapper[4696]: I1203 00:29:56.871115 4696 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rbv8w" Dec 03 00:29:56 crc kubenswrapper[4696]: I1203 00:29:56.902459 4696 scope.go:117] "RemoveContainer" containerID="89657d86c1c4926f94e444a115d35ac7cbe916cba129778a008b41f45ae47091" Dec 03 00:29:56 crc kubenswrapper[4696]: I1203 00:29:56.928058 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rbv8w"] Dec 03 00:29:56 crc kubenswrapper[4696]: I1203 00:29:56.940172 4696 scope.go:117] "RemoveContainer" containerID="3840e358863e45ec77db62b9ded1dfbcbadb5505e7cd347d20e91ee7ca57c414" Dec 03 00:29:56 crc kubenswrapper[4696]: I1203 00:29:56.940854 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-rbv8w"] Dec 03 00:29:56 crc kubenswrapper[4696]: I1203 00:29:56.980888 4696 scope.go:117] "RemoveContainer" containerID="2c6bfc45ccd6b4fb49e310130775452fbf73efe82ac843093121b01620e5a21d" Dec 03 00:29:56 crc kubenswrapper[4696]: E1203 00:29:56.981976 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c6bfc45ccd6b4fb49e310130775452fbf73efe82ac843093121b01620e5a21d\": container with ID starting with 2c6bfc45ccd6b4fb49e310130775452fbf73efe82ac843093121b01620e5a21d not found: ID does not exist" containerID="2c6bfc45ccd6b4fb49e310130775452fbf73efe82ac843093121b01620e5a21d" Dec 03 00:29:56 crc kubenswrapper[4696]: I1203 
00:29:56.982037 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c6bfc45ccd6b4fb49e310130775452fbf73efe82ac843093121b01620e5a21d"} err="failed to get container status \"2c6bfc45ccd6b4fb49e310130775452fbf73efe82ac843093121b01620e5a21d\": rpc error: code = NotFound desc = could not find container \"2c6bfc45ccd6b4fb49e310130775452fbf73efe82ac843093121b01620e5a21d\": container with ID starting with 2c6bfc45ccd6b4fb49e310130775452fbf73efe82ac843093121b01620e5a21d not found: ID does not exist" Dec 03 00:29:56 crc kubenswrapper[4696]: I1203 00:29:56.982078 4696 scope.go:117] "RemoveContainer" containerID="89657d86c1c4926f94e444a115d35ac7cbe916cba129778a008b41f45ae47091" Dec 03 00:29:56 crc kubenswrapper[4696]: E1203 00:29:56.983143 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89657d86c1c4926f94e444a115d35ac7cbe916cba129778a008b41f45ae47091\": container with ID starting with 89657d86c1c4926f94e444a115d35ac7cbe916cba129778a008b41f45ae47091 not found: ID does not exist" containerID="89657d86c1c4926f94e444a115d35ac7cbe916cba129778a008b41f45ae47091" Dec 03 00:29:56 crc kubenswrapper[4696]: I1203 00:29:56.983178 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89657d86c1c4926f94e444a115d35ac7cbe916cba129778a008b41f45ae47091"} err="failed to get container status \"89657d86c1c4926f94e444a115d35ac7cbe916cba129778a008b41f45ae47091\": rpc error: code = NotFound desc = could not find container \"89657d86c1c4926f94e444a115d35ac7cbe916cba129778a008b41f45ae47091\": container with ID starting with 89657d86c1c4926f94e444a115d35ac7cbe916cba129778a008b41f45ae47091 not found: ID does not exist" Dec 03 00:29:56 crc kubenswrapper[4696]: I1203 00:29:56.983200 4696 scope.go:117] "RemoveContainer" containerID="3840e358863e45ec77db62b9ded1dfbcbadb5505e7cd347d20e91ee7ca57c414" Dec 03 00:29:56 crc 
kubenswrapper[4696]: E1203 00:29:56.983824 4696 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3840e358863e45ec77db62b9ded1dfbcbadb5505e7cd347d20e91ee7ca57c414\": container with ID starting with 3840e358863e45ec77db62b9ded1dfbcbadb5505e7cd347d20e91ee7ca57c414 not found: ID does not exist" containerID="3840e358863e45ec77db62b9ded1dfbcbadb5505e7cd347d20e91ee7ca57c414" Dec 03 00:29:56 crc kubenswrapper[4696]: I1203 00:29:56.983880 4696 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3840e358863e45ec77db62b9ded1dfbcbadb5505e7cd347d20e91ee7ca57c414"} err="failed to get container status \"3840e358863e45ec77db62b9ded1dfbcbadb5505e7cd347d20e91ee7ca57c414\": rpc error: code = NotFound desc = could not find container \"3840e358863e45ec77db62b9ded1dfbcbadb5505e7cd347d20e91ee7ca57c414\": container with ID starting with 3840e358863e45ec77db62b9ded1dfbcbadb5505e7cd347d20e91ee7ca57c414 not found: ID does not exist" Dec 03 00:29:57 crc kubenswrapper[4696]: I1203 00:29:57.449486 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="abde34d1-0fd3-4c3f-891f-fc47810f27ea" path="/var/lib/kubelet/pods/abde34d1-0fd3-4c3f-891f-fc47810f27ea/volumes" Dec 03 00:30:00 crc kubenswrapper[4696]: I1203 00:30:00.156497 4696 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412030-txhl5"] Dec 03 00:30:00 crc kubenswrapper[4696]: E1203 00:30:00.157791 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abde34d1-0fd3-4c3f-891f-fc47810f27ea" containerName="extract-content" Dec 03 00:30:00 crc kubenswrapper[4696]: I1203 00:30:00.157807 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="abde34d1-0fd3-4c3f-891f-fc47810f27ea" containerName="extract-content" Dec 03 00:30:00 crc kubenswrapper[4696]: E1203 00:30:00.157842 4696 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="abde34d1-0fd3-4c3f-891f-fc47810f27ea" containerName="extract-utilities" Dec 03 00:30:00 crc kubenswrapper[4696]: I1203 00:30:00.157849 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="abde34d1-0fd3-4c3f-891f-fc47810f27ea" containerName="extract-utilities" Dec 03 00:30:00 crc kubenswrapper[4696]: E1203 00:30:00.157878 4696 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abde34d1-0fd3-4c3f-891f-fc47810f27ea" containerName="registry-server" Dec 03 00:30:00 crc kubenswrapper[4696]: I1203 00:30:00.157884 4696 state_mem.go:107] "Deleted CPUSet assignment" podUID="abde34d1-0fd3-4c3f-891f-fc47810f27ea" containerName="registry-server" Dec 03 00:30:00 crc kubenswrapper[4696]: I1203 00:30:00.158102 4696 memory_manager.go:354] "RemoveStaleState removing state" podUID="abde34d1-0fd3-4c3f-891f-fc47810f27ea" containerName="registry-server" Dec 03 00:30:00 crc kubenswrapper[4696]: I1203 00:30:00.158962 4696 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412030-txhl5" Dec 03 00:30:00 crc kubenswrapper[4696]: I1203 00:30:00.161840 4696 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 03 00:30:00 crc kubenswrapper[4696]: I1203 00:30:00.164934 4696 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 03 00:30:00 crc kubenswrapper[4696]: I1203 00:30:00.168052 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412030-txhl5"] Dec 03 00:30:00 crc kubenswrapper[4696]: I1203 00:30:00.238957 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/07fd37c4-dd88-42a6-8d31-431b30f1be33-secret-volume\") pod 
\"collect-profiles-29412030-txhl5\" (UID: \"07fd37c4-dd88-42a6-8d31-431b30f1be33\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412030-txhl5" Dec 03 00:30:00 crc kubenswrapper[4696]: I1203 00:30:00.239054 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v49gb\" (UniqueName: \"kubernetes.io/projected/07fd37c4-dd88-42a6-8d31-431b30f1be33-kube-api-access-v49gb\") pod \"collect-profiles-29412030-txhl5\" (UID: \"07fd37c4-dd88-42a6-8d31-431b30f1be33\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412030-txhl5" Dec 03 00:30:00 crc kubenswrapper[4696]: I1203 00:30:00.239184 4696 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/07fd37c4-dd88-42a6-8d31-431b30f1be33-config-volume\") pod \"collect-profiles-29412030-txhl5\" (UID: \"07fd37c4-dd88-42a6-8d31-431b30f1be33\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412030-txhl5" Dec 03 00:30:00 crc kubenswrapper[4696]: I1203 00:30:00.340076 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v49gb\" (UniqueName: \"kubernetes.io/projected/07fd37c4-dd88-42a6-8d31-431b30f1be33-kube-api-access-v49gb\") pod \"collect-profiles-29412030-txhl5\" (UID: \"07fd37c4-dd88-42a6-8d31-431b30f1be33\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412030-txhl5" Dec 03 00:30:00 crc kubenswrapper[4696]: I1203 00:30:00.340983 4696 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/07fd37c4-dd88-42a6-8d31-431b30f1be33-config-volume\") pod \"collect-profiles-29412030-txhl5\" (UID: \"07fd37c4-dd88-42a6-8d31-431b30f1be33\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412030-txhl5" Dec 03 00:30:00 crc kubenswrapper[4696]: I1203 00:30:00.341125 4696 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/07fd37c4-dd88-42a6-8d31-431b30f1be33-secret-volume\") pod \"collect-profiles-29412030-txhl5\" (UID: \"07fd37c4-dd88-42a6-8d31-431b30f1be33\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412030-txhl5" Dec 03 00:30:00 crc kubenswrapper[4696]: I1203 00:30:00.343002 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/07fd37c4-dd88-42a6-8d31-431b30f1be33-config-volume\") pod \"collect-profiles-29412030-txhl5\" (UID: \"07fd37c4-dd88-42a6-8d31-431b30f1be33\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412030-txhl5" Dec 03 00:30:00 crc kubenswrapper[4696]: I1203 00:30:00.349010 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/07fd37c4-dd88-42a6-8d31-431b30f1be33-secret-volume\") pod \"collect-profiles-29412030-txhl5\" (UID: \"07fd37c4-dd88-42a6-8d31-431b30f1be33\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412030-txhl5" Dec 03 00:30:00 crc kubenswrapper[4696]: I1203 00:30:00.356992 4696 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v49gb\" (UniqueName: \"kubernetes.io/projected/07fd37c4-dd88-42a6-8d31-431b30f1be33-kube-api-access-v49gb\") pod \"collect-profiles-29412030-txhl5\" (UID: \"07fd37c4-dd88-42a6-8d31-431b30f1be33\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412030-txhl5" Dec 03 00:30:00 crc kubenswrapper[4696]: I1203 00:30:00.487107 4696 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412030-txhl5" Dec 03 00:30:00 crc kubenswrapper[4696]: I1203 00:30:00.975410 4696 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412030-txhl5"] Dec 03 00:30:01 crc kubenswrapper[4696]: I1203 00:30:01.942839 4696 generic.go:334] "Generic (PLEG): container finished" podID="07fd37c4-dd88-42a6-8d31-431b30f1be33" containerID="5e12d347da6414524676fcc56638c1f64f39a4995324eed6082435a52a7e2d4e" exitCode=0 Dec 03 00:30:01 crc kubenswrapper[4696]: I1203 00:30:01.942830 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412030-txhl5" event={"ID":"07fd37c4-dd88-42a6-8d31-431b30f1be33","Type":"ContainerDied","Data":"5e12d347da6414524676fcc56638c1f64f39a4995324eed6082435a52a7e2d4e"} Dec 03 00:30:01 crc kubenswrapper[4696]: I1203 00:30:01.943330 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412030-txhl5" event={"ID":"07fd37c4-dd88-42a6-8d31-431b30f1be33","Type":"ContainerStarted","Data":"bb71bd908553a28a2e0c38fa693e1bbe4ade85ccd3eecf5aafdecd61f7802479"} Dec 03 00:30:03 crc kubenswrapper[4696]: I1203 00:30:03.315930 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412030-txhl5" Dec 03 00:30:03 crc kubenswrapper[4696]: I1203 00:30:03.511412 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/07fd37c4-dd88-42a6-8d31-431b30f1be33-config-volume\") pod \"07fd37c4-dd88-42a6-8d31-431b30f1be33\" (UID: \"07fd37c4-dd88-42a6-8d31-431b30f1be33\") " Dec 03 00:30:03 crc kubenswrapper[4696]: I1203 00:30:03.511887 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v49gb\" (UniqueName: \"kubernetes.io/projected/07fd37c4-dd88-42a6-8d31-431b30f1be33-kube-api-access-v49gb\") pod \"07fd37c4-dd88-42a6-8d31-431b30f1be33\" (UID: \"07fd37c4-dd88-42a6-8d31-431b30f1be33\") " Dec 03 00:30:03 crc kubenswrapper[4696]: I1203 00:30:03.511945 4696 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/07fd37c4-dd88-42a6-8d31-431b30f1be33-secret-volume\") pod \"07fd37c4-dd88-42a6-8d31-431b30f1be33\" (UID: \"07fd37c4-dd88-42a6-8d31-431b30f1be33\") " Dec 03 00:30:03 crc kubenswrapper[4696]: I1203 00:30:03.512502 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07fd37c4-dd88-42a6-8d31-431b30f1be33-config-volume" (OuterVolumeSpecName: "config-volume") pod "07fd37c4-dd88-42a6-8d31-431b30f1be33" (UID: "07fd37c4-dd88-42a6-8d31-431b30f1be33"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 00:30:03 crc kubenswrapper[4696]: I1203 00:30:03.513286 4696 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/07fd37c4-dd88-42a6-8d31-431b30f1be33-config-volume\") on node \"crc\" DevicePath \"\"" Dec 03 00:30:03 crc kubenswrapper[4696]: I1203 00:30:03.519499 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07fd37c4-dd88-42a6-8d31-431b30f1be33-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "07fd37c4-dd88-42a6-8d31-431b30f1be33" (UID: "07fd37c4-dd88-42a6-8d31-431b30f1be33"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 00:30:03 crc kubenswrapper[4696]: I1203 00:30:03.520253 4696 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07fd37c4-dd88-42a6-8d31-431b30f1be33-kube-api-access-v49gb" (OuterVolumeSpecName: "kube-api-access-v49gb") pod "07fd37c4-dd88-42a6-8d31-431b30f1be33" (UID: "07fd37c4-dd88-42a6-8d31-431b30f1be33"). InnerVolumeSpecName "kube-api-access-v49gb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 00:30:03 crc kubenswrapper[4696]: I1203 00:30:03.639855 4696 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v49gb\" (UniqueName: \"kubernetes.io/projected/07fd37c4-dd88-42a6-8d31-431b30f1be33-kube-api-access-v49gb\") on node \"crc\" DevicePath \"\"" Dec 03 00:30:03 crc kubenswrapper[4696]: I1203 00:30:03.639914 4696 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/07fd37c4-dd88-42a6-8d31-431b30f1be33-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 03 00:30:03 crc kubenswrapper[4696]: I1203 00:30:03.966183 4696 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412030-txhl5" event={"ID":"07fd37c4-dd88-42a6-8d31-431b30f1be33","Type":"ContainerDied","Data":"bb71bd908553a28a2e0c38fa693e1bbe4ade85ccd3eecf5aafdecd61f7802479"} Dec 03 00:30:03 crc kubenswrapper[4696]: I1203 00:30:03.966532 4696 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bb71bd908553a28a2e0c38fa693e1bbe4ade85ccd3eecf5aafdecd61f7802479" Dec 03 00:30:03 crc kubenswrapper[4696]: I1203 00:30:03.966324 4696 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412030-txhl5" Dec 03 00:30:04 crc kubenswrapper[4696]: I1203 00:30:04.395258 4696 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411985-59q7x"] Dec 03 00:30:04 crc kubenswrapper[4696]: I1203 00:30:04.407359 4696 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411985-59q7x"] Dec 03 00:30:05 crc kubenswrapper[4696]: I1203 00:30:05.442559 4696 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="190fc3cd-f0ad-4e7e-83b6-d5c20b3372a0" path="/var/lib/kubelet/pods/190fc3cd-f0ad-4e7e-83b6-d5c20b3372a0/volumes"